WO2019016888A1 - Terminal, display system and display method - Google Patents

Terminal, display system and display method Download PDF

Info

Publication number
WO2019016888A1
WO2019016888A1 (international application PCT/JP2017/026094)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
processor
display
searcher
type
Prior art date
Application number
PCT/JP2017/026094
Other languages
French (fr)
Japanese (ja)
Inventor
香子 米澤
敏郎 清水
オリオル ガスケス
Original Assignee
株式会社ispace
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ispace
Priority to PCT/JP2017/026094
Publication of WO2019016888A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G - COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G 1/00 - Cosmonautic vehicles
    • B64G 1/16 - Extraterrestrial cars
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01V - GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V 9/00 - Prospecting or detecting by methods not provided for in groups G01V 1/00 - G01V 8/00
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q - SELECTING
    • H04Q 9/00 - Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • the present disclosure relates to a terminal, a display system, and a display method.
  • The terminal is a terminal that displays the situation around the probe. It has an operation unit that receives, from the user, a designation of the position of a target existing around the probe and of the type of that target; a processor that performs control so that an object corresponding to the type or form of the target is displayed on a display unit at the position received by the operation unit; and a communication unit. The processor may perform control so that the position and type received by the operation unit are transmitted from the communication unit to another terminal, such that an object corresponding to the type or form of the target is displayed on the other terminal at the position received by the operation unit.
  • A terminal according to another aspect of this disclosure is a terminal that displays the situation around the probe, and may include a communication unit that receives, from another terminal, the position of a target existing around the probe and the type or form of that target, and a processor that performs control so that an object corresponding to the received type or form is displayed on a display unit at the received position.
  • A display system according to another aspect is a display system that has a first terminal and a second terminal and displays the situation around the probe. The first terminal may include a first operation unit that receives, from a first user, the position of a target existing around the probe and a designation of the type or form of that target, a first communication unit, and a first processor that performs control so that the position and the type or form received by the first operation unit are transmitted from the first communication unit. The second terminal may include a second communication unit that receives the position of the target and the type or form of the target, and a second processor that performs control so that an object corresponding to the received type or form is displayed on a corresponding display unit at the received position.
  • A display method according to another aspect is a display method for displaying the situation of the probe, and may include: a step in which a first terminal receives, from a first user, a designation of the position of a target existing around the probe and of the type or form of that target; a step in which the first terminal transmits the received position and type or form; a step in which a second terminal receives the position and the type or form; and a step in which the second terminal performs control so that an object corresponding to the designated type or form is displayed, on a corresponding display unit, at the position received by the first terminal.
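The claim language above describes a simple data flow: the first terminal accepts a position plus a type or form and transmits them, and the second terminal renders a matching object at that position. Purely as a minimal illustrative sketch (nothing below appears in the patent; `TargetMarker` and both function names are assumptions), the exchanged payload could look like this:

```python
# Hypothetical payload for the first-terminal -> second-terminal transfer described above.
from dataclasses import dataclass, asdict
from typing import Optional
import json

@dataclass
class TargetMarker:
    latitude: float             # position of the target on the explored celestial body
    longitude: float
    kind: str                   # e.g. "obstacle", "danger_zone", "resource", "marker"
    form: Optional[str] = None  # e.g. "convex" or "concave" for obstacles

def encode_marker(marker: TargetMarker) -> bytes:
    """First terminal: serialize the accepted position and type/form for transmission."""
    return json.dumps(asdict(marker)).encode("utf-8")

def decode_marker(payload: bytes) -> TargetMarker:
    """Second terminal: restore the marker so the matching object can be drawn at its position."""
    return TargetMarker(**json.loads(payload.decode("utf-8")))
```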
  • FIG. 1 is a block diagram showing a schematic configuration of the display system according to this embodiment. FIG. 2 is a block diagram showing a schematic configuration of the first terminal according to this embodiment. FIG. 3 is a block diagram showing a schematic configuration of the second terminal according to this embodiment. FIG. 4 is a block diagram showing a schematic configuration of the information processing apparatus according to this embodiment. FIG. 5 shows an outline of the screen G1 displayed on the first terminal 1. FIGS. 6 to 9 are examples of the screen areas A1 to A4. FIG. 10 shows an outline of the screen G2 displayed on the second terminal 2. FIGS. 11 to 14 are examples of the screen areas A5 to A8.
  • Each embodiment provides a display system and a display method that make it easier to grasp the situation around the probe.
  • The terminal according to the first aspect of the embodiment is a terminal that displays the situation around the probe. It has an operation unit that receives, from the user, a designation of the position of a target existing around the probe and of the type of that target; a processor that performs control so that an object corresponding to the type or form of the target is displayed on the display unit at the position received by the operation unit; and a communication unit. The processor performs control so that the position and type received by the operation unit are transmitted from the communication unit to another terminal, such that an object corresponding to the type or form of the target is displayed on the other terminal at the position received by the operation unit.
  • The terminal according to the second aspect of the embodiment is the terminal according to the first aspect. When the target is an obstacle, the operation unit accepts the user's selection of an icon representing the form of the obstacle as the designation of the form of the target and accepts the designation of the position of the obstacle from the user. The processor performs control so that an object representing the form of the obstacle is displayed on the display unit at the position received by the operation unit, and so that the position received by the operation unit and the form of the obstacle are transmitted from the communication unit to the other terminal, such that an object representing the form of the obstacle is displayed on the other terminal at that position.
  • According to this configuration, an object representing the form of the obstacle is displayed on the other terminal at the position of the obstacle designated by the user of this terminal. Another user operating the other terminal (for example, a pilot) therefore does not need to input the position and form of the obstacle, and can easily grasp the obstacles around the probe. For example, when the other user is the pilot, the pilot can concentrate on operating the probe while still knowing the position and form of the obstacles, and can therefore avoid them when deciding the traveling direction.
  • The terminal according to the third aspect of the embodiment is the terminal according to the second aspect, and further has a conversion means that converts the position received by the operation unit into a latitude and longitude on the celestial body on which the probe is present. The processor performs control so that the object is displayed at the position corresponding to the latitude and longitude, and so that the latitude and longitude are transmitted from the communication unit to the other terminal as the position received by the operation unit.
  • The terminal according to the fourth aspect of the embodiment is the terminal according to any one of the first to third aspects, and has a sun direction determining means that determines the direction of the sun according to the solar power generation voltage of a solar panel provided on the probe. The processor performs control so that the direction of the sun is displayed on the display unit.
  • The terminal according to the fifth aspect of the embodiment is the terminal according to any one of the first to fourth aspects. The processor acquires the latitude and longitude of the lander and performs control so that the position or direction of the lander is displayed on the display unit using that latitude and longitude.
  • The terminal according to the sixth aspect of the embodiment has a communication unit that receives, from another terminal, the position of a target existing around the probe and the type or form of that target, and a processor that performs control so that an object corresponding to the received type or form is displayed on the display unit at the received position.
  • According to this configuration, the terminal can display, at the position of the target received from the other terminal, an object corresponding to the received type or form of that target, so the user operating the terminal can grasp the situation around the probe without extra effort.
  • The terminal according to the seventh aspect of the embodiment is the terminal according to the sixth aspect. The processor performs control so that a camera image from the probe is displayed and lines representing distance are displayed on the camera image.
  • According to this configuration, lines representing distance are displayed on the camera image, so the user can grasp the distance of objects in the camera image.
  • The terminal according to the eighth aspect of the embodiment is the terminal according to the seventh aspect. The processor determines the forward/backward inclination of the probe from the probe's acceleration sensor values and corrects the positions of the lines representing distance according to that inclination.
  • The terminal according to the ninth aspect of the embodiment is the terminal according to the seventh or eighth aspect. The processor corrects the lines representing distance based on distortion information of the lens mounted on the probe's camera.
  • The terminal according to the tenth aspect of the embodiment is the terminal according to any one of the sixth to ninth aspects, and further has an operation unit that receives the setting of traveling parameters from the user. The processor controls the communication unit to transmit the set command to other terminals in such a way that the other terminals can select whether or not to permit the command.
  • The terminal according to the eleventh aspect of the embodiment is the terminal according to the tenth aspect. When the processor receives permission for the traveling parameters from at least one other terminal, it performs control so that an object for command transmission is displayed in a display mode in which an instruction to transmit the traveling parameters to the probe can be accepted. When the operation unit receives the instruction to transmit to the probe, the processor controls the communication unit to transmit a travel command including the traveling parameters to the probe.
  • According to this configuration, the user of the terminal can transmit the traveling parameters to the probe only when permission for them has been received from at least one other terminal, so the probability of transmitting incorrect traveling parameters can be reduced.
  • The terminal according to the twelfth aspect of the embodiment is the terminal according to any one of the first to eleventh aspects. The processor performs control so that obstacle objects within the field of view of the probe's camera are displayed in a display mode different from that of obstacle objects outside the camera's field of view.
  • According to this configuration, the user of the terminal can easily distinguish obstacle objects within the field of view of the probe's camera from those outside it.
  • The terminal according to the thirteenth aspect of the embodiment is the terminal according to any one of the first to twelfth aspects. The processor updates the movement trajectory of the probe on the screen according to the traveling direction designated by the user operating the terminal and information on the traveling distance and/or speed.
  • The terminal according to the fourteenth aspect of the embodiment is the terminal according to the thirteenth aspect. The processor updates the movement trajectory of the probe further taking into account the slip ratio of the probe's tires.
  • The terminal according to the fifteenth aspect of the embodiment is the terminal according to the fourteenth aspect. The slip ratio is set according to the difference between the traveling distance that the probe is instructed to travel and the distance actually traveled, on Earth, on soil of the celestial body explored by the probe.
  • The terminal according to the sixteenth aspect of the embodiment is the terminal according to the fourteenth aspect. The slip ratio is set according to the difference between the traveling distance that the probe is instructed to travel and the distance actually traveled on the celestial body explored by the probe.
  • A display system according to the embodiment is a display system that has a first terminal and a second terminal and displays the situation around the probe. The first terminal has a first operation unit that receives, from a first user, the position of a target existing around the probe and a designation of the type or form of that target, a first communication unit, and a first processor that performs control so that the position and the type or form received by the first operation unit are transmitted from the first communication unit. The second terminal has a second communication unit that receives the position of the target and the type or form of the target, and a second processor that performs control so that an object corresponding to the received type or form is displayed on a corresponding display unit at the received position.
  • According to this configuration, the second terminal displays an object corresponding to the type of the target at the position of the target designated by the user of the first terminal, so the user of the second terminal can easily grasp the situation around the probe.
  • A display method according to the embodiment is a display method for displaying the situation of the probe. The first terminal receives, from a first user, a designation of the position of a target existing around the probe and of the type or form of that target; the first terminal transmits the received position and type or form; the second terminal receives the position and the type or form; and the second terminal performs control so that an object corresponding to the designated type or form is displayed, on a corresponding display unit, at the position received by the first terminal.
  • According to this method as well, the second terminal displays an object corresponding to the type of the target at the position of the target designated by the user of the first terminal, so the user of the second terminal can easily grasp the situation around the probe.
  • A pilot operating a probe must control the probe while constantly watching the camera image sent from the probe to monitor whether danger to the probe is approaching. For this reason, little attention has been left over for grasping the situation around the probe.
  • In the following, a display system for grasping the situation around a probe is described, taking as an example a probe that explores the Moon, which is one example of a celestial body.
  • FIG. 1 is a block diagram showing a schematic configuration of a display system according to the present embodiment.
  • the display system S includes a first terminal 1, a second terminal 2, a third terminal 3, a fourth terminal 4, an information processing device 5, and a ground station 6.
  • the first terminal 1, the second terminal 2, the third terminal 3, and the fourth terminal 4 can communicate with the information processing device 5 via the communication network N.
  • Communication by the first terminal 1, the second terminal 2, the third terminal 3, and the fourth terminal 4 may be wireless or wired.
  • Here, wireless communication is described as an example.
  • The first terminal 1 is a terminal operated by a first user (here, as an example, a co-pilot who assists the pilot operating the probe).
  • The second terminal 2 is a terminal operated by a second user (here, as an example, the pilot operating the probe).
  • The third terminal 3 is a terminal operated by a third user (for example, a project manager).
  • The fourth terminal 4 is a terminal operated by a fourth user (for example, a system manager).
  • In this example, the team operating the probe consists of four people: a pilot, a co-pilot, a project manager, and a system manager.
  • the ground station 6 can communicate with the lander 7 by radio.
  • the ground station 6 can acquire the latitude and longitude of the lander from the lander 7.
  • The probe 8 can communicate with the lander 7 by radio.
  • The ground station 6 can receive information from the probe 8 via the lander 7, for example image data captured by the probe 8 and the solar power generation voltage of the solar panel provided on the probe 8.
  • The information processing device 5 is, for example, a server, and can communicate with the ground station 6.
  • The information processing device 5 can acquire information from the ground station 6 (for example, image data captured by the probe 8, the solar power generation voltage of the solar panel provided on the probe, and the latitude and longitude of the lander).
  • FIG. 2 is a block diagram showing a schematic configuration of the first terminal 1 according to the present embodiment.
  • As shown in FIG. 2, the first terminal 1 includes a first operation unit 11, a first communication unit 12, a storage unit 13, a random access memory (RAM) 14, a first processor 15, and a display unit 16.
  • each part is connected via a bus.
  • the first operation unit 11 receives an operation by the first user.
  • the first operation unit 11 is, for example, a touch panel.
  • the first communication unit 12 communicates with another terminal or the information processing apparatus 5. This communication may be wired or wireless.
  • the storage unit 13 stores a program to be executed by the first processor 15.
  • the RAM 14 temporarily stores information.
  • the first processor 15 reads a program from the storage unit 13 and executes the program.
  • FIG. 3 is a block diagram showing a schematic configuration of a second terminal according to the present embodiment.
  • As shown in FIG. 3, the second terminal 2 includes a second operation unit 21, a second communication unit 22, a storage unit 23, a random access memory (RAM) 24, a second processor 25, and a display unit 26.
  • each part is connected via a bus.
  • the second operation unit 21 receives an operation by the second user.
  • the second operation unit 21 is, for example, a touch panel.
  • the second communication unit 22 communicates with another terminal or the information processing apparatus 5. This communication may be wired or wireless.
  • the storage unit 23 stores a program to be executed by the second processor 25.
  • the RAM 24 temporarily stores information.
  • the second processor 25 reads the program from the storage unit 23 and executes the program.
  • FIG. 4 is a block diagram showing a schematic configuration of the information processing apparatus according to the present embodiment.
  • As shown in FIG. 4, the information processing apparatus 5 includes an input unit 51, a communication unit 52, a storage unit 53, a random access memory (RAM) 54, and a processor 55. Each part is connected via a bus.
  • the input unit 51 receives an input from the user.
  • the communication unit 52 communicates with other terminals. This communication may be wired or wireless.
  • the storage unit 53 stores a program for the processor 55 to execute.
  • the RAM 54 temporarily stores information.
  • the processor 55 reads the program from the storage unit 53 and executes the program.
  • FIG. 5 is a diagram showing an outline of the screen G1 displayed on the first terminal 1. As shown in FIG. 5, the screen G1 displayed on the first terminal 1 includes screen areas A1 to A4.
  • FIG. 6 is an example of the screen area A1.
  • An object R10 indicating the probe is displayed.
  • Objects R11 to R15 indicating obstacles outside the field of view of the probe's camera are displayed.
  • Objects R16 and R17 indicating obstacles within the field of view of the probe's camera are displayed in a display mode different from that of the objects R11 to R15 indicating obstacles outside the camera's field of view.
  • In other words, the first processor 15 performs control so that obstacle objects within the field of view of the camera of the probe 8 are displayed in a display mode different from that of obstacle objects outside the camera's field of view.
  • an arrow object R181 indicating the direction of the lander is displayed.
  • the first processor 15 obtains the latitude and longitude of the lander 7 and controls the direction of the lander to be displayed on the corresponding display unit 16 using the latitude and longitude of the lander 7.
  • the first processor 15 may display not only the direction of the lander 7 but also the position of the lander 7.
  • the second processor 25 may perform the same processing to display the direction or position of the lander 7.
  • an arrow object R182 indicating the direction of the sun is displayed.
  • For example, the first processor 15, the second processor 25, or the processor 55 may function as a sun direction determining means that determines the direction of the sun according to the solar power generation voltage of the solar panel provided on the probe. Specifically, for example, the sun direction determining means may determine the direction normal to the solar panel at the moment the generation voltage is at its maximum as the direction of the sun.
  • the first processor 15 controls to display the direction of the sun on the corresponding display unit 16.
  • the second processor 25 may perform the same processing to display the direction of the sun.
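As a rough sketch of the sun-direction rule described above (the panel-normal direction at the moment the generation voltage peaks is taken as the sun direction), assuming voltage samples tagged with the panel-normal azimuth are available; all names are illustrative:

```python
# Illustrative sun-direction estimate: pick the panel-normal azimuth at peak generation voltage.
from typing import Iterable, Tuple

def estimate_sun_azimuth(voltage_samples: Iterable[Tuple[float, float]]) -> float:
    """voltage_samples: (panel_normal_azimuth_deg, generated_voltage) pairs collected while
    the probe, and hence the panel normal, points in different directions.
    Returns the azimuth at which the voltage peaked, used as the direction of the sun."""
    best_azimuth, best_voltage = None, float("-inf")
    for azimuth_deg, voltage in voltage_samples:
        if voltage > best_voltage:
            best_azimuth, best_voltage = azimuth_deg, voltage
    if best_azimuth is None:
        raise ValueError("no voltage samples")
    return best_azimuth
```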
  • an arrow object R183 indicating the direction of the earth is displayed.
  • an arrow object R184 indicating the direction of the next destination is displayed.
  • FIG. 7 is an example of the screen area A2.
  • In the screen area A2, an icon R21 representing a convex obstacle, an icon R22 representing a concave obstacle, an icon R23 representing an object requiring attention, and icons R24, R25, and R26 representing markers are displayed.
  • The first operation unit 11 receives, from the first user, a designation of the position of a target existing around the probe and of the type or form of that target.
  • The target here is an obstacle, a dangerous spot or danger zone, a resource (for example, minerals or water), or another probe.
  • Dangerous spots include the boundary between sunlight and shade: because the potential difference is large at this boundary, a failure or abnormality of the electronic devices mounted on the probe 8 may occur there.
  • the first operation unit 11 receives the selection of the icon representing the form of the obstacle by the first user as designation of the form of the target.
  • For example, the icon R21 of FIG. 7 is selected by the first user for a convex obstacle, and the icon R22 of FIG. 7 for a concave obstacle. In this way, the form of the target obstacle is designated.
  • The first processor 15 performs control so that the position received by the first operation unit 11 and the form of the obstacle are transmitted from the first communication unit 12 to the second terminal 2, such that an object representing the form of the obstacle is displayed on the second terminal 2 at the position received by the first operation unit 11.
  • the first user specifies the position of the obstacle by specifying the icon R21 or the icon R22, and then touching the position on the screen corresponding to the position of the obstacle in the screen area A1.
  • the first operation unit 11 receives designation of the position of the obstacle from the first user.
  • The first processor 15 performs control so that, for example, an object corresponding to the designated type or form of the obstacle is displayed on the corresponding display unit 16 at the designated position. At that time, the first processor 15 may function as a conversion means that converts the position received by the first operation unit 11 into a latitude and longitude on the celestial body on which the probe 8 is present (here, the Moon as an example). The first processor 15 may then perform control so that the object is displayed at the position corresponding to that latitude and longitude. The screen area A1 of FIG. 6 is thereby obtained.
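A minimal sketch of such a conversion means, assuming the map view is centred on the probe, oriented north-up, and has a known metres-per-pixel scale (these assumptions and all names are illustrative, not taken from the patent):

```python
# Illustrative touch-position -> lunar latitude/longitude conversion using a local flat-surface
# approximation around the probe's current position.
import math

MOON_RADIUS_M = 1_737_400.0  # mean lunar radius

def screen_to_lat_lon(px: float, py: float,
                      centre_px: float, centre_py: float,
                      metres_per_pixel: float,
                      probe_lat_deg: float, probe_lon_deg: float):
    """Convert a touched screen point (px, py) into (lat, lon) on the Moon."""
    east_m = (px - centre_px) * metres_per_pixel    # +x on screen assumed to point east
    north_m = (centre_py - py) * metres_per_pixel   # screen y grows downwards
    dlat = math.degrees(north_m / MOON_RADIUS_M)
    dlon = math.degrees(east_m / (MOON_RADIUS_M * math.cos(math.radians(probe_lat_deg))))
    return probe_lat_deg + dlat, probe_lon_deg + dlon
```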
  • FIG. 8 is an example of the screen area A3.
  • In the screen area A3, an object R31 of the probe 8 and an object R32 of the lander 7 are displayed. The screen area A3 also shows a movement trajectory L1 of the probe 8, as well as an image area R33 and an image area R34, each representing a danger zone.
  • The first processor 15 in the first terminal 1 may, for example, update the movement trajectory of the probe 8 on the screen according to the traveling direction designated by the second user (for example, the pilot) operating the second terminal 2 and the number of wheel revolutions per unit time (for example, revolutions per minute: RPM).
  • The number of wheel revolutions per unit time is one example of information related to speed.
  • In this way, the movement trajectory of the probe 8 on the screen can be updated in real time.
  • The update is not limited to this; for example, the first processor 15 may instead update the movement trajectory of the probe 8 on the screen according to the traveling direction and traveling distance each time a traveling session according to the traveling parameters is completed.
  • The first processor 15 may also update the movement trajectory of the probe taking into account, for example, the slip ratio of the tires of the probe 8.
  • This slip ratio may be set according to the difference between the distance the probe 8 was instructed to travel and the distance actually traveled, on Earth, on soil of the celestial body explored by the probe 8 (here, the Moon as an example).
  • Alternatively, this slip ratio may be set according to the difference between the distance the probe 8 was instructed to travel and the distance actually traveled on the celestial body explored by the probe 8.
  • These settings may be made by the first processor 15, the second processor 25, or the processor 55.
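A minimal sketch of this dead-reckoning update, assuming a known wheel circumference and a slip ratio expressed as the fractional shortfall between commanded and actual distance (all parameter names are illustrative assumptions):

```python
# Illustrative trajectory update from heading, wheel RPM and an optional slip ratio.
import math

def advance_trajectory(x_m: float, y_m: float, heading_deg: float,
                       wheel_rpm: float, wheel_circumference_m: float,
                       dt_s: float, slip_ratio: float = 0.0):
    """Return the new (x, y) map position after dt_s seconds of travel.
    slip_ratio = 1 - (actual distance / commanded distance), e.g. estimated from soil tests."""
    commanded_distance = wheel_rpm / 60.0 * wheel_circumference_m * dt_s
    actual_distance = commanded_distance * (1.0 - slip_ratio)
    x_m += actual_distance * math.sin(math.radians(heading_deg))  # heading measured from north
    y_m += actual_distance * math.cos(math.radians(heading_deg))
    return x_m, y_m
```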
  • FIG. 9 is an example of the screen area A4.
  • buttons R41 and R42 are displayed.
  • A travel command set by the second user (in this case, the pilot) is displayed.
  • In this example, a travel command for traveling in the direction of 2.85°, at a speed of 30 rpm, for a distance of 0.85 m is displayed.
  • A button R41 for permitting the traveling parameters and a button R42 for not permitting them are also shown.
  • FIG. 10 is a diagram showing an outline of the screen G2 displayed on the second terminal 2. As shown in FIG. 10, the screen G2 displayed on the second terminal 2 includes screen areas A5 to A8.
  • FIG. 11 is an example of the screen area A5.
  • An object R50 indicating the probe is displayed.
  • Objects R51 to R57 indicating obstacles outside the field of view of the probe's camera are displayed.
  • Objects R58 and R59 indicating obstacles within the field of view of the probe's camera are displayed in a display mode different from that of the objects R51 to R57 indicating obstacles outside the camera's field of view.
  • As described above, the first processor 15 performs control so that, for example, the position received by the first operation unit 11 and the form of the obstacle are transmitted from the first communication unit 12.
  • The form of the obstacle is one example of the type of target.
  • The second communication unit 22 receives the position of the obstacle and the form of the obstacle transmitted from the first terminal 1. At this time, the second communication unit 22 may receive them via the information processing device 5. The second processor 25 then controls the display unit 26 to display an object representing the form of the obstacle at the position received by the first operation unit 11.
  • The second processor 25 may receive, as the position received by the first operation unit 11, a latitude and longitude on the celestial body on which the probe 8 is present (here, the Moon as an example). The second processor 25 may then perform control so that the object is displayed at the position corresponding to that latitude and longitude.
  • In this way, objects representing the obstacles can be displayed on the second terminal 2. Therefore, for example, the second user (for example, the pilot) does not have to input the position and form of the obstacles, can concentrate on operating the probe while still grasping the position and form of the obstacles, and can avoid the obstacles when deciding the traveling direction.
  • Here an obstacle is displayed as an example of the target, but the target is not limited to this.
  • A dangerous spot or danger zone, a place where a resource (for example, water or minerals) exists, or the location of another probe may also be displayed as an object.
  • The second processor 25 also performs control so that obstacle objects within the field of view of the camera of the probe 8 are displayed in a display manner different from that of obstacle objects outside the camera's field of view. At that time, since the view angle of the camera of the probe 8 is known in advance, the second processor 25 may determine, for each object, whether it falls within the range of the view angle, and thereby identify the obstacle objects that fall within the field of view of the camera of the probe 8.
  • In this way, the second user (for example, the pilot) can separately recognize obstacles within the camera's field of view and obstacles outside it, which helps prevent the probe 8 from colliding with obstacles outside the camera's field of view.
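A minimal sketch of such a view-angle test, assuming a 2-D map frame, a known camera heading, and a known horizontal view angle (the geometry and all names are illustrative assumptions):

```python
# Illustrative check of whether an obstacle lies within the camera's horizontal view angle.
import math

def in_camera_view(obstacle_x: float, obstacle_y: float,
                   probe_x: float, probe_y: float,
                   camera_heading_deg: float, view_angle_deg: float) -> bool:
    """True if the obstacle's bearing from the probe falls within the camera's view angle."""
    # Bearing from north, clockwise, using east (x) and north (y) offsets.
    bearing = math.degrees(math.atan2(obstacle_x - probe_x, obstacle_y - probe_y))
    offset = (bearing - camera_heading_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return abs(offset) <= view_angle_deg / 2.0
```

Objects for which this test is true could then be drawn in the distinct display mode described above.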
  • FIG. 12 is an example of the screen area A6.
  • In the screen area A6, an image captured by the camera of the probe 8 is displayed. This image is updated sequentially.
  • Lines L2 to L6 indicating distances are displayed.
  • In other words, the second processor 25 performs control so that lines representing distance are displayed on the camera image of the probe 8. The positions of the lines representing distance are determined in advance on the ground, by imaging marks (for example, lines) placed at predetermined distance intervals with the camera and determining, for each distance, the position at which the line should be displayed in the image.
  • In this way, the second user (for example, the pilot) can grasp the distance of objects in the camera image.
  • The second processor 25 may also determine the forward/backward inclination of the probe 8 from the acceleration sensor values of the probe 8 and correct the positions of the lines representing distance according to that inclination. This is because, when the probe 8 is tilted forward or backward, the apparent distance changes accordingly, so the second processor 25 may, for example, correct the lines representing distance toward the near side. With this configuration, the accuracy of the positions of the lines representing distance can be improved even when the probe 8 is tilted forward or backward.
  • The second processor 25 may also correct the lines representing distance based on distortion information of the lens mounted on the camera of the probe 8. For example, since the image becomes more distorted toward the edge of the lens, the second processor 25 may correct the positions of the lines toward the edge of the image according to the distortion. With this configuration, the accuracy of the positions of the lines representing distance can be improved even when the lens has distortion.
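The two corrections could be sketched as follows, assuming a simple pinhole camera of known height and focal length and a single-coefficient radial distortion model; the sign conventions and all names are assumptions rather than anything specified in the patent:

```python
# Illustrative computation of the image row at which a ground-distance line should be drawn,
# with a pitch correction from the accelerometer and a simple radial distortion correction.
import math

def distance_line_row(distance_m: float, camera_height_m: float,
                      focal_length_px: float, principal_row: float,
                      pitch_deg: float = 0.0, k1: float = 0.0) -> float:
    """Image row (pixel y) at which a ground line `distance_m` ahead should be drawn."""
    # Angle below the horizon to the ground point, shifted by the probe's forward/backward tilt.
    depression = math.atan2(camera_height_m, distance_m) - math.radians(pitch_deg)
    row = principal_row + focal_length_px * math.tan(depression)
    # First-order radial distortion correction around the principal row.
    r = (row - principal_row) / focal_length_px
    row += focal_length_px * k1 * r ** 3
    return row
```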
  • FIG. 13 is an example of the screen area A7.
  • the screen area A7 is a screen for setting a travel command.
  • In the screen area A7, a slider R71 for setting the direction, a slider R72 for setting the traveling distance, and a slider R73 for setting the number of wheel rotations of the probe 8 are displayed.
  • A button R74 for setting a travel command is also displayed.
  • The second user (for example, the pilot) sets the traveling parameters using these controls.
  • In other words, the second operation unit 21 in the second terminal 2 receives the setting of the traveling parameters from the second user (for example, the pilot).
  • The traveling parameters include, as an example, the direction, the distance, and the number of wheel rotations.
  • A button R76 for transmitting the set traveling parameters to the other users and a button R77 for retransmitting the traveling parameters are also displayed.
  • When transmission is instructed, the second processor 25 controls the second communication unit 22 to transmit the set traveling parameters to the other terminals, that is, the first terminal 1, the third terminal 3, and the fourth terminal 4.
  • The first terminal 1, the third terminal 3, and the fourth terminal 4 receive the traveling parameters and display them, for example, as shown in FIG. 9.
  • When the button R41 for permitting the traveling parameters shown in FIG. 9 is pressed, the first processor 15 controls the first communication unit 12 to transmit a signal to the second terminal 2 indicating that the traveling parameters are permitted. On the other hand, when the button R42 for not permitting the traveling parameters shown in FIG. 9 is pressed, the first processor 15 controls the first communication unit 12 to transmit a signal to the second terminal 2 indicating that the traveling parameters are not permitted.
  • FIG. 13 also shows an image area R75 indicating whether permission for the traveling parameters has been granted by the other users.
  • A display box R751 indicates whether the third user (for example, the project manager) has granted permission for the traveling parameters.
  • A display box R752 indicates whether the co-pilot has granted permission for the traveling parameters.
  • A display box R753 indicates whether the fourth user (for example, the system manager) has granted permission for the traveling parameters.
  • The display boxes R751 to R753 are displayed, for example, in green when permission has been granted and in red when it has not.
  • When permission has been granted, the travel command can be issued.
  • Specifically, the send button R78 changes to a state in which it can be pressed.
  • In other words, the second processor 25 performs control so that an object for command transmission (here, as an example, the command transmission button R78) is displayed in a display mode in which an instruction to transmit the travel command to the probe 8 can be accepted.
  • For example, the second processor 25 may perform this control when permission for the traveling parameters has been received from at least the first terminal 1.
  • With this configuration, the travel command can be output to the probe 8 only after permission has been obtained, so the probability of transmitting an erroneous travel command can be reduced.
  • When the transmission instruction is received (for example, when the command transmission button R78 is pressed), the second processor 25 controls the second communication unit 22 to transmit the travel command including the traveling parameters to the probe 8. As a result, the travel command is transmitted to the probe 8, which receives it and travels according to it.
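A minimal sketch of this permission-gated transmission, assuming permissions are tracked per terminal and `comm` is some transport object toward the probe via the lander (all names and data shapes are assumptions):

```python
# Illustrative approval gate: the travel command is only sent once at least one
# other terminal has granted permission for the proposed traveling parameters.
from dataclasses import dataclass
from typing import Dict

@dataclass
class TravelParameters:
    direction_deg: float
    distance_m: float
    wheel_rpm: float

def try_send_travel_command(params: TravelParameters,
                            permissions: Dict[str, bool],
                            comm) -> bool:
    """permissions maps terminal id -> True (permitted) / False (not permitted).
    comm.send_travel_command is an assumed transport toward the probe via the lander."""
    if not any(permissions.values()):
        return False  # send button stays disabled; nothing is transmitted
    comm.send_travel_command(direction_deg=params.direction_deg,
                             distance_m=params.distance_m,
                             wheel_rpm=params.wheel_rpm)
    return True
```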
  • FIG. 13 shows an emergency stop button R79.
  • The second user (for example, the pilot) can make the probe 8 stop immediately by pressing the emergency stop button R79 when, for example, an obstacle such as a crevasse is found ahead while the probe 8 is traveling.
  • In that case, the second processor 25 controls the second communication unit 22 to transmit an emergency stop command to the probe 8.
  • As a result, the emergency stop command is transmitted to the probe 8.
  • The probe 8 receives this emergency stop command and performs an emergency stop according to it.
  • FIG. 14 is an example of the screen area A8.
  • An object R80 indicating the probe 8 is displayed.
  • Objects R811 to R817 indicating obstacles within the field of view of the camera on the side of the probe 8 are displayed.
  • Objects R821 to R825 indicating obstacles within the field of view of the camera in front of the probe 8 are displayed.
  • The objects R821 to R825 indicating obstacles within the field of view of the front camera of the probe 8 are displayed in a display mode different from that of the objects R811 to R817 indicating obstacles within the field of view of the side camera.
  • Objects R831 to R837 indicating obstacles outside the field of view of the front camera of the probe 8 and outside the field of view of the side camera of the probe 8 are also displayed.
  • These objects R831 to R837 are displayed in a display mode different from that of the objects R821 to R825 (obstacles within the field of view of the front camera) and the objects R811 to R817 (obstacles within the field of view of the side camera).
  • A movement trajectory L7 of the probe 8 is also shown.
  • Like the first processor 15, the second processor 25 in the second terminal 2 may, for example, update the movement trajectory of the probe 8 on the screen according to the traveling direction designated by the second user (for example, the pilot) and the number of wheel revolutions per unit time (for example, revolutions per minute: RPM).
  • The number of wheel revolutions per unit time is one example of information related to speed.
  • In this way, the movement trajectory of the probe 8 on the screen can be updated in real time.
  • Alternatively, the second processor 25 may update the movement trajectory of the probe 8 on the screen according to the traveling direction and traveling distance each time a traveling session according to a travel command is completed.
  • The second processor 25 may also update the movement trajectory of the probe 8 taking into account, for example, the slip ratio of the tires of the probe 8.
  • This slip ratio may be set according to the difference between the distance the probe 8 was instructed to travel and the distance actually traveled, on Earth, on soil of the celestial body explored by the probe 8 (here, the Moon as an example).
  • Alternatively, this slip ratio may be set according to the difference between the distance the probe 8 was instructed to travel and the distance actually traveled on the celestial body explored by the probe 8. These settings may be made by the first processor 15, the second processor 25, or the processor 55.
  • As described above, the display system S according to this embodiment is a display system that has the first terminal 1 and the second terminal 2 and displays the situation around the probe.
  • The first terminal 1 has the first operation unit 11, which receives from the first user a designation of the position of a target existing around the probe 8 and of the type of that target, and the first communication unit 12.
  • The first processor 15 performs control so that the position and type received by the first operation unit 11 are transmitted from the first communication unit 12.
  • The second terminal 2 has a second communication unit that receives the position of the target and the type of the target, and the second processor 25, which performs control so that an object corresponding to the received type is displayed on the corresponding display unit 26 at the received position.
  • With this configuration, the display unit 26 displays an object corresponding to the type of the target at the position of the target designated by the first user (for example, the co-pilot) on the first terminal 1.
  • The second user (for example, the pilot) operating the second terminal 2 can therefore easily grasp the situation around the probe 8.
  • In this embodiment, the probe has been described as exploring the Moon, but the probe may instead explore other moons, planets, or asteroids, or areas to be explored in space or on Earth (for example, danger zones such as volcanoes, high-temperature zones, cold zones, disaster sites, and accident sites).
  • A program for executing each process of the first terminal 1 and/or the second terminal 2 of this embodiment may be recorded on a computer-readable recording medium, and the various processes described above relating to the first terminal 1 and/or the second terminal 2 may be performed by having a computer system read the program recorded on the recording medium and execute it on a processor.
  • the present disclosure is not limited to the above embodiment as it is, and in the implementation stage, the components can be modified and embodied without departing from the scope of the invention.
  • various inventions can be formed by appropriate combinations of a plurality of constituent elements disclosed in the above embodiment. For example, some components may be deleted from all the components shown in the embodiment. Furthermore, components in different embodiments may be combined as appropriate.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Geophysics (AREA)
  • Navigation (AREA)

Abstract

Provided is a terminal for displaying the surroundings of a probe vehicle, the terminal having: an operating unit that receives, from a user, the position of a target present around the probe vehicle and designation of the type of said target; a processor that performs control so as to cause a display unit to display an object corresponding to the type or shape of the target at the position received by the operating unit; and a communication unit, wherein the processor performs control so as to cause the communication unit to transmit the position and type of the target received by the operating unit to another terminal such that the object corresponding to the type or shape of the target gets displayed at the position received by the operating unit on the other terminal.

Description

Terminal, display system and display method
The present disclosure relates to a terminal, a display system, and a display method.
Wheel-type probes (so-called rovers), which move by driving their wheels, are known as probes used on the ground and in outer space. User interfaces for remotely operating such probes have been developed.
Overview
The terminal according to one aspect of the present disclosure is a terminal that displays the situation around the probe. It has an operation unit that receives, from the user, a designation of the position of a target existing around the probe and of the type of that target; a processor that performs control so that an object corresponding to the type or form of the target is displayed on a display unit at the position received by the operation unit; and a communication unit. The processor may perform control so that the position and type received by the operation unit are transmitted from the communication unit to another terminal, such that an object corresponding to the type or form of the target is displayed on the other terminal at the position received by the operation unit.
A terminal according to another aspect of the present disclosure is a terminal that displays the situation around the probe, and may include a communication unit that receives, from another terminal, the position of a target existing around the probe and the type or form of that target, and a processor that performs control so that an object corresponding to the received type or form is displayed on a display unit at the received position.
A display system according to another aspect of the present disclosure is a display system that has a first terminal and a second terminal and displays the situation around the probe. The first terminal may include a first operation unit that receives, from a first user, the position of a target existing around the probe and a designation of the type or form of that target, a first communication unit, and a first processor that performs control so that the position and the type or form received by the first operation unit are transmitted from the first communication unit. The second terminal may include a second communication unit that receives the position of the target and the type or form of the target, and a second processor that performs control so that an object corresponding to the received type or form is displayed on a corresponding display unit at the received position.
A display method according to another aspect of the present disclosure is a display method for displaying the situation of the probe, and may include: a step in which a first terminal receives, from a first user, a designation of the position of a target existing around the probe and of the type or form of that target; a step in which the first terminal transmits the received position and type or form; a step in which a second terminal receives the position and the type or form; and a step in which the second terminal performs control so that an object corresponding to the designated type or form is displayed, on a corresponding display unit, at the position received by the first terminal.
FIG. 1 is a block diagram showing a schematic configuration of the display system according to this embodiment. FIG. 2 is a block diagram showing a schematic configuration of the first terminal according to this embodiment. FIG. 3 is a block diagram showing a schematic configuration of the second terminal according to this embodiment. FIG. 4 is a block diagram showing a schematic configuration of the information processing apparatus according to this embodiment. FIG. 5 shows an outline of the screen G1 displayed on the first terminal 1. FIGS. 6 to 9 are examples of the screen areas A1 to A4. FIG. 10 shows an outline of the screen G2 displayed on the second terminal 2. FIGS. 11 to 14 are examples of the screen areas A5 to A8.
Embodiment
Each embodiment provides a display system and a display method that make it easier to grasp the situation around the probe.
The terminal according to the first aspect of the embodiment is a terminal that displays the situation around the probe. It has an operation unit that receives, from the user, a designation of the position of a target existing around the probe and of the type of that target; a processor that performs control so that an object corresponding to the type or form of the target is displayed on the display unit at the position received by the operation unit; and a communication unit. The processor performs control so that the position and type received by the operation unit are transmitted from the communication unit to another terminal, such that an object corresponding to the type or form of the target is displayed on the other terminal at the position received by the operation unit.
According to this configuration, an object corresponding to the type or form of the target is displayed on the other terminal at the position of the target designated by the user of this terminal, so a user operating the other terminal can easily grasp the situation around the probe.
The terminal according to the second aspect of the embodiment is the terminal according to the first aspect. When the target is an obstacle, the operation unit accepts the user's selection of an icon representing the form of the obstacle as the designation of the form of the target and accepts the designation of the position of the obstacle from the user. The processor performs control so that an object representing the form of the obstacle is displayed on the display unit at the position received by the operation unit, and so that the position received by the operation unit and the form of the obstacle are transmitted from the communication unit to the other terminal, such that an object representing the form of the obstacle is displayed on the other terminal at that position.
According to this configuration, an object representing the form of the obstacle is displayed on the other terminal at the position of the obstacle designated by the user of this terminal. Another user operating the other terminal (for example, a pilot) therefore does not need to input the position and form of the obstacle, and can easily grasp the obstacles around the probe. For example, when the other user is the pilot, the pilot can concentrate on operating the probe while still knowing the position and form of the obstacles, and can therefore avoid them when deciding the traveling direction.
The terminal according to the third aspect of the embodiment is the terminal according to the second aspect, and further has a conversion means that converts the position received by the operation unit into a latitude and longitude on the celestial body on which the probe is present. The processor performs control so that the object is displayed at the position corresponding to the latitude and longitude, and so that the latitude and longitude are transmitted from the communication unit to the other terminal as the position received by the operation unit.
According to this configuration, the object can be displayed at the position corresponding to the latitude and longitude on this terminal and on the other terminal.
The terminal according to the fourth aspect of the embodiment is the terminal according to any one of the first to third aspects, and has a sun direction determining means that determines the direction of the sun according to the solar power generation voltage of a solar panel provided on the probe. The processor performs control so that the direction of the sun is displayed on the display unit.
According to this configuration, the direction of the sun can be displayed, so the user can easily grasp the direction of the sun.
The terminal according to the fifth aspect of the embodiment is the terminal according to any one of the first to fourth aspects. The processor acquires the latitude and longitude of the lander and performs control so that the position or direction of the lander is displayed on the display unit using that latitude and longitude.
According to this configuration, the position or direction of the lander can be displayed, so the user can easily grasp the position or direction of the lander.
The terminal according to the sixth aspect of the embodiment has a communication unit that receives, from another terminal, the position of a target existing around the probe and the type or form of that target, and a processor that performs control so that an object corresponding to the received type or form is displayed on the display unit at the received position.
According to this configuration, the terminal can display, at the position of the target received from the other terminal, an object corresponding to the received type or form of that target, so the user operating the terminal can grasp the situation around the probe without extra effort.
The terminal according to the seventh aspect of the embodiment is the terminal according to the sixth aspect. The processor performs control so that a camera image from the probe is displayed and lines representing distance are displayed on the camera image.
According to this configuration, lines representing distance are displayed on the camera image, so the user can grasp the distance of objects in the camera image.
The terminal according to an eighth aspect of the embodiment is the terminal according to the seventh aspect, wherein the processor determines the forward/backward tilt of the rover from an acceleration sensor value of the rover and corrects the positions of the lines representing distance according to the forward/backward tilt of the rover.
With this configuration, the accuracy of the distances indicated by the distance lines can be improved even when the rover is tilted forward or backward.
The terminal according to a ninth aspect of the embodiment is the terminal according to the seventh or eighth aspect, wherein the processor corrects the lines representing distance based on distortion information of a lens mounted on the camera of the rover.
With this configuration, the accuracy of the distances indicated by the distance lines can be improved even when the lens has distortion.
The terminal according to a tenth aspect of the embodiment is the terminal according to any one of the sixth to ninth aspects, further comprising an operation unit that receives setting of travel parameters by a user, wherein the processor controls the communication unit to transmit the set travel parameters to another terminal such that whether to permit the travel parameters can be selected at the other terminal.
With this configuration, another user operating the other terminal can choose whether to permit the travel parameters, and by refusing permission for parameters set in error, the probability that an erroneously set command is transmitted to the rover can be reduced.
The terminal according to an eleventh aspect of the embodiment is the terminal according to the tenth aspect, wherein, when permission for the travel parameters is received from at least one other terminal, the processor performs control so that an object for command transmission is displayed in a display mode in which an instruction to transmit the travel parameters to the rover can be received, and, when the operation unit receives the instruction to transmit the travel parameters to the rover, the processor controls the communication unit to transmit a travel command including the travel parameters to the rover.
With this configuration, the user of the terminal can transmit the travel parameters to the rover only after permission for them has been received from at least one other terminal, so the probability of transmitting incorrect travel parameters can be reduced.
The terminal according to a twelfth aspect of the embodiment is the terminal according to any one of the first to eleventh aspects, wherein the processor performs control so that objects of obstacles within the field of view of the camera of the rover are displayed in a display mode different from that of objects of obstacles outside the field of view of the camera.
With this configuration, the user of the terminal can easily distinguish the objects of obstacles within the field of view of the rover's camera from the objects of obstacles outside the camera's field of view.
The terminal according to a thirteenth aspect of the embodiment is the terminal according to any one of the first to twelfth aspects, wherein the processor updates the movement trajectory of the rover on the screen according to a travel direction designated by the user operating the terminal and information on the travel distance and/or speed.
With this configuration, the movement trajectory of the rover on the screen is updated, so the current position of the rover can be easily grasped.
The terminal according to a fourteenth aspect of the embodiment is the terminal according to the thirteenth aspect, wherein the processor further takes the slip ratio of the tires of the rover into consideration when updating the movement trajectory of the rover.
With this configuration, the estimation accuracy of the current position of the rover can be improved.
The terminal according to a fifteenth aspect of the embodiment is the terminal according to the fourteenth aspect, wherein the slip ratio is set according to the difference between a travel distance commanded to the rover and the distance actually travelled on Earth over soil sampled from the celestial body that the rover explores.
With this configuration, the accuracy of the slip ratio can be improved, so the estimation accuracy of the current position of the rover can be improved.
The terminal according to a sixteenth aspect of the embodiment is the terminal according to the fourteenth aspect, wherein the slip ratio is set according to the difference between a travel distance commanded to the rover and the distance actually travelled on the celestial body that the rover explores.
With this configuration, the accuracy of the slip ratio can be improved, so the estimation accuracy of the current position of the rover can be improved.
The display system according to a seventeenth aspect of the embodiment is a display system that has a first terminal and a second terminal and displays the situation around a rover, wherein the first terminal has a first operation unit that receives, from a first user, designation of the position of a target existing around the rover and the type or form of the target, a first communication unit, and a first processor that performs control so that the position and the type or form received by the first operation unit are transmitted from the first communication unit, and the second terminal has a second communication unit that receives the position of the target and the type or form of the target, and a second processor that performs control so that an object corresponding to the received type or form of the target is displayed on a corresponding display unit at the received position of the target.
With this configuration, an object corresponding to the type of the target is displayed on the second terminal at the position of the target designated by the user of the first terminal, so the user operating the second terminal can easily grasp the situation around the rover.
The display method according to an eighteenth aspect of the embodiment is a display method for displaying the situation of a rover, comprising: a step in which a first terminal receives, from a first user, designation of the position of a target existing around the rover and the type or form of the target; a step in which the first terminal transmits the received position and type or form; a step in which a second terminal receives the position and the type or form; and a step in which the second terminal performs control so that an object corresponding to the designated type or form of the target is displayed on a corresponding display unit at the position received by the first terminal.
With this configuration, an object corresponding to the type of the target is displayed on the second terminal at the position of the target designated by the user of the first terminal, so the user operating the second terminal can easily grasp the situation around the rover.
The present embodiment will now be described with reference to the drawings. The embodiment described below is an example of how the present technology may be implemented and does not limit the present technology to the specific configuration described below; in implementing the present technology, a specific configuration suited to the embodiment may be adopted as appropriate.
A pilot steering a rover has to operate the rover while constantly watching the camera images sent from the rover to check whether danger is approaching. Their attention therefore cannot extend to the wider surroundings, and it has been difficult to grasp the situation around the rover. One object of the present embodiment is thus to make it easier for the pilot to grasp the situation around the rover. In the present embodiment, the rover explores the Moon, which is an example of a celestial body, and a display system for grasping the situation around this rover is described.
(Embodiment)
First, the configuration of the display system according to the present embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing a schematic configuration of the display system according to the present embodiment. The display system S includes a first terminal 1, a second terminal 2, a third terminal 3, a fourth terminal 4, an information processing device 5, and a ground station 6.
The first terminal 1, the second terminal 2, the third terminal 3, and the fourth terminal 4 can communicate with the information processing device 5 via a communication network N. The communication performed by these terminals may be wireless or wired; here, as an example, it is described as wireless.
The first terminal 1 is a terminal operated by a first user (here, as an example, a co-pilot who assists the pilot steering the rover). The second terminal 2 is a terminal operated by a second user (here, as an example, the pilot steering the rover). The third terminal 3 is a terminal operated by a third user (for example, a project manager). The fourth terminal 4 is a terminal operated by a fourth user (for example, a system manager). In the present embodiment, as an example, the rover is operated by a team of four: the pilot, the co-pilot, the project manager, and the system manager.
The ground station 6 can communicate with the lander 7 by radio and can therefore acquire the latitude and longitude of the lander from the lander 7.
The rover 8 can communicate with the lander 7 by radio. The ground station 6 can therefore receive information from the rover 8 via the lander 7, for example image data captured by the rover 8 and the photovoltaic voltage of the solar panel provided on the rover 8.
The information processing device 5 is, for example, a server and can communicate with the ground station 6. The information processing device 5 can acquire information from the ground station 6 (for example, image data captured by the rover 8, the photovoltaic voltage of the solar panel provided on the rover, and the latitude and longitude of the lander).
(Configuration of the first terminal)
Next, the configuration of the first terminal will be described with reference to FIG. 2. FIG. 2 is a block diagram showing a schematic configuration of the first terminal 1 according to the present embodiment. As shown in FIG. 2, the first terminal 1 includes a first operation unit 11, a first communication unit 12, a storage unit 13, a RAM (Random Access Memory) 14, a first processor 15, and a display unit 16, and these units are connected to one another via a bus.
The first operation unit 11 receives operations by the first user and is, for example, a touch panel.
The first communication unit 12 communicates with the other terminals or with the information processing device 5; this communication may be wired or wireless.
The storage unit 13 stores programs to be executed by the first processor 15.
The RAM 14 temporarily stores information.
The first processor 15 reads programs from the storage unit 13 and executes them.
(Configuration of the second terminal)
Next, the configuration of the second terminal will be described with reference to FIG. 3. FIG. 3 is a block diagram showing a schematic configuration of the second terminal according to the present embodiment. As shown in FIG. 3, the second terminal 2 includes a second operation unit 21, a second communication unit 22, a storage unit 23, a RAM (Random Access Memory) 24, a second processor 25, and a display unit 26, and these units are connected to one another via a bus.
The second operation unit 21 receives operations by the second user and is, for example, a touch panel.
The second communication unit 22 communicates with the other terminals or with the information processing device 5; this communication may be wired or wireless.
The storage unit 23 stores programs to be executed by the second processor 25.
The RAM 24 temporarily stores information.
The second processor 25 reads programs from the storage unit 23 and executes them.
(Configuration of the information processing device)
Next, the configuration of the information processing device will be described with reference to FIG. 4. FIG. 4 is a block diagram showing a schematic configuration of the information processing device according to the present embodiment. As shown in FIG. 4, the information processing device 5 includes an input unit 51, a communication unit 52, a storage unit 53, a RAM (Random Access Memory) 54, and a processor 55, and these units are connected to one another via a bus.
The input unit 51 receives input from a user.
The communication unit 52 communicates with the terminals; this communication may be wired or wireless.
The storage unit 53 stores programs to be executed by the processor 55.
The RAM 54 temporarily stores information.
The processor 55 reads programs from the storage unit 53 and executes them.
FIG. 5 shows an overview of a screen G1 displayed on the first terminal 1. As shown in FIG. 5, the screen G1 displayed on the first terminal 1 includes screen areas A1 to A4.
FIG. 6 is an example of the screen area A1. In the screen area A1, an object R10 representing the rover is displayed, together with objects R11 to R15 representing obstacles outside the field of view of the rover's camera. Objects R16 to R17, representing obstacles within the field of view of the rover's camera, are displayed in a display mode different from that of the objects R11 to R15.
To realize this display, the first processor 15 performs control so that the objects of obstacles within the field of view of the camera of the rover 8 are displayed in a display mode different from that of the objects of obstacles outside the camera's field of view.
In the screen area A1, an arrow object R181 indicating the direction of the lander is also displayed.
To realize this display, the first processor 15 acquires the latitude and longitude of the lander 7 and performs control so that the direction of the lander is displayed on the corresponding display unit 16 using the latitude and longitude of the lander 7. The first processor 15 may display the position of the lander 7 rather than only its direction. The second processor 25 may perform similar processing to display the direction or position of the lander 7.
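A bearing from the rover to the lander can be derived from the two latitude/longitude pairs. The following is a minimal illustrative sketch in Python; the function name, the local flat-surface approximation, and the example coordinates are assumptions for illustration and are not part of the embodiment.
    import math

    def bearing_to_lander(rover_lat, rover_lon, lander_lat, lander_lon):
        """Approximate compass bearing (degrees, 0 = north) from the rover to the lander.
        Assumes short distances, so a locally flat surface approximation is sufficient."""
        d_lat = math.radians(lander_lat - rover_lat)
        d_lon = math.radians(lander_lon - rover_lon) * math.cos(math.radians(rover_lat))
        return math.degrees(math.atan2(d_lon, d_lat)) % 360.0

    # Example: a lander slightly north-east of the rover gives a bearing of about 45 degrees.
    print(bearing_to_lander(0.0, 0.0, 0.01, 0.01))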
In the screen area A1, an arrow object R182 indicating the direction of the sun is also displayed.
To realize this display, the first processor 15, the second processor 25, or the processor 55 may function as sun direction determining means that determines the direction of the sun according to the photovoltaic voltage of the solar panel provided on the rover. Specifically, for example, the sun direction determining means may take the direction of the normal to the solar panel at the moment the photovoltaic voltage reaches its maximum as the direction of the sun. The first processor 15 performs control so that the direction of the sun is displayed on the corresponding display unit 16. The second processor 25 may perform similar processing to display the direction of the sun.
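As a rough sketch of this voltage-based determination (the data format and function name are assumptions for illustration, not the actual implementation), the panel orientation at which the measured photovoltaic voltage peaks can simply be selected:
    def estimate_sun_azimuth(samples):
        """samples: list of (panel_normal_azimuth_deg, photovoltaic_voltage) pairs
        collected while the panel orientation relative to the sun changes.
        The azimuth at which the voltage peaks is taken as the sun direction."""
        best_azimuth, _ = max(samples, key=lambda s: s[1])
        return best_azimuth

    # Example readings: the voltage peaks when the panel normal points to azimuth 90.
    print(estimate_sun_azimuth([(0, 3.1), (90, 4.8), (180, 2.0), (270, 1.2)]))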
In the screen area A1, an arrow object R183 indicating the direction of the Earth and an arrow object R184 indicating the direction of the next destination are also displayed.
FIG. 7 is an example of the screen area A2. In the screen area A2, an icon R21 representing a convex obstacle, an icon R22 representing a concave obstacle, an icon R23 representing a target requiring caution, and icons R24, R25, and R26 representing markers are displayed.
The first operation unit 11 receives, from the first user, designation of the position of a target existing around the rover and of the type or form of the target. The target here is, for example, an obstacle, a dangerous point or zone, a resource (for example, a mineral or water), or another rover. Dangerous points include the boundary between sunlight and shade: the potential difference across such a boundary is large, so electronic equipment mounted on the rover 8 may fail or malfunction there.
For example, when the target is an obstacle, the first operation unit 11 receives the first user's selection of an icon representing the form of the obstacle as the designation of the form of the target. Specifically, in FIG. 7, when the obstacle has a convex form (for example, a rock or a hill), the first user selects the icon R21; when the obstacle has a concave form (for example, a cliff), the first user selects the icon R22. In this way, the form of the target obstacle is designated.
The first processor 15 performs control so that the position received by the first operation unit 11 and the form of the obstacle are transmitted from the first communication unit 12 to the second terminal 2 so that an object representing the form of the obstacle is displayed on the second terminal 2 at the position received by the first operation unit 11.
After designating the icon R21 or R22, the first user designates the position of the obstacle by touching the position on the screen in the screen area A1 that corresponds to the obstacle. In this way, the first operation unit 11 receives the designation of the position of the obstacle from the first user.
The first processor 15 then performs control so that, for example, an object corresponding to the designated type or form of the obstacle is displayed on the corresponding display unit 16 at the designated position. At that time, the first processor 15 may function as conversion means that converts the position received by the first operation unit 11 into the latitude and longitude of the celestial body on which the rover 8 is present (here, the Moon as an example), and may perform control so that the object is displayed at the position corresponding to that latitude and longitude. The screen area A1 of FIG. 6 is obtained in this way.
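One possible form of such a conversion means is sketched below, under the assumption that the map in the screen area A1 is drawn north-up around a known centre with a known scale; the function name, parameters, and example values are illustrative only.
    def screen_to_lat_lon(tap_x, tap_y, center_lat, center_lon,
                          screen_w, screen_h, deg_per_px):
        """Convert a tap position (pixels) on the map area into latitude/longitude,
        assuming a north-up map centred on (center_lat, center_lon)."""
        dx = tap_x - screen_w / 2.0   # pixels east of the map centre
        dy = screen_h / 2.0 - tap_y   # pixels north of the centre (screen y grows downwards)
        return center_lat + dy * deg_per_px, center_lon + dx * deg_per_px

    # Example: a tap right and above the centre maps to a point north-east of it.
    print(screen_to_lat_lon(600, 200, -43.0, 23.5, 800, 600, 0.0001))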
FIG. 8 is an example of the screen area A3. In the screen area A3, an object R31 of the rover 8 and an object R32 of the lander 7 are displayed, together with a movement trajectory L1 of the rover 8 and image areas R33 and R34 representing danger zones.
To update the movement trajectory of the rover 8 shown in FIG. 8, the first processor 15 of the first terminal 1 may, for example, update the trajectory on the screen according to the travel direction designated by the second user (for example, the pilot) operating the second terminal 2 and the number of wheel revolutions per unit time (for example, revolutions per minute: RPM). The number of wheel revolutions per unit time is an example of information on speed. This allows the movement trajectory of the rover 8 on the screen to be updated in real time.
Alternatively, the first processor 15 may update the movement trajectory of the rover 8 on the screen according to the travel direction and the travel distance each time a driving session that follows the travel parameters is completed.
The first processor 15 may also take the slip ratio of the tires of the rover 8 into consideration when updating the movement trajectory. This slip ratio may be set according to the difference between the travel distance commanded to the rover 8 and the distance actually travelled on Earth over soil sampled from the celestial body that the rover 8 explores (here, the Moon as an example), or according to the difference between the commanded travel distance and the distance actually travelled on the celestial body itself. These settings may be made by the first processor 15, the second processor 25, or the processor 55.
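A minimal dead-reckoning sketch of this trajectory update is shown below, assuming a hypothetical wheel circumference and treating the slip ratio as the fraction of the commanded distance that is lost; all names and values are illustrative and not taken from the embodiment.
    import math

    WHEEL_CIRCUMFERENCE_M = 0.60   # assumed wheel circumference for illustration

    def estimate_slip_ratio(commanded_m, actual_m):
        """Slip ratio from a commanded distance and the distance actually travelled
        (measured, e.g., on Earth over sampled soil, or on the celestial body itself)."""
        return (commanded_m - actual_m) / commanded_m

    def update_trajectory(trail, heading_deg, rpm, dt_s, slip_ratio):
        """Append the rover's next estimated position to `trail`.
        trail: list of (x, y) in metres; heading_deg: commanded travel direction;
        rpm: wheel revolutions per minute; dt_s: elapsed seconds; slip_ratio: 0.0-1.0."""
        commanded = WHEEL_CIRCUMFERENCE_M * rpm * dt_s / 60.0
        effective = commanded * (1.0 - slip_ratio)          # shorten by the slip
        x, y = trail[-1]
        heading = math.radians(heading_deg)
        trail.append((x + effective * math.sin(heading),    # x: east, y: north
                      y + effective * math.cos(heading)))

    print(estimate_slip_ratio(1.0, 0.9))                    # ~0.1
    trail = [(0.0, 0.0)]
    update_trajectory(trail, heading_deg=2.85, rpm=30, dt_s=60, slip_ratio=0.1)
    print(trail[-1])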
FIG. 9 is an example of the screen area A4, in which buttons R41 and R42 are displayed. As an example of a travel command set by the second user (here, the pilot), a command to travel in the direction of 2.85°, over a distance of 0.85 m, at a speed of 30 rpm is shown. The button R41 is for permitting these travel parameters and the button R42 is for not permitting them.
FIG. 10 shows an overview of a screen G2 displayed on the second terminal 2. As shown in FIG. 10, the screen G2 displayed on the second terminal 2 includes screen areas A5 to A8.
FIG. 11 is an example of the screen area A5. In the screen area A5, an object R50 representing the rover is displayed, together with objects R51 to R57 representing obstacles outside the field of view of the rover's camera. Objects R58 to R59, representing obstacles within the field of view of the rover's camera, are displayed in a display mode different from that of the objects R51 to R57.
To realize the display of the screen area A5 in FIG. 11, the first processor 15 performs control so that, for example, the position received by the first operation unit 11 and the form of the obstacle are transmitted from the first communication unit 12. Here, the form of the obstacle is an example of the type of the target. The second communication unit 22 receives the position and form of the obstacle transmitted from the first terminal 1, possibly via the information processing device 5. The second processor 25 then performs control so that an object representing the form of the obstacle is displayed on the display unit 26 at the position received by the first operation unit 11.
At that time, the second processor 25 may receive, as the position received by the first operation unit 11, a latitude and longitude on the celestial body on which the rover 8 is present (here, the Moon as an example), and may perform control so that the object is displayed at the position corresponding to that latitude and longitude.
As a result, the display unit 26 of the second terminal 2 displays, at the position designated by the first user on the first terminal 1, an object representing the form of the obstacle designated by the first user on the first terminal 1. The second user (for example, the pilot) therefore does not have to enter the position and form of obstacles personally and can concentrate on steering the rover, while still grasping the position and form of obstacles and avoiding them when deciding the direction of travel.
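The exact message format exchanged between the terminals is not specified in the embodiment; as a purely hypothetical sketch, the shared information could be serialised as follows, where the field names and values are assumptions for illustration only.
    import json

    def build_target_message(target_type, lat, lon):
        """Build a payload the first terminal could send (directly or via the
        information processing device 5) so other terminals can draw the object."""
        return json.dumps({"type": target_type,   # e.g. "convex_obstacle", "concave_obstacle"
                           "latitude": lat,
                           "longitude": lon})

    msg = build_target_message("convex_obstacle", -43.001, 23.502)
    received = json.loads(msg)   # a receiving terminal would place the matching icon here
    print(received["type"], received["latitude"], received["longitude"])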
In the screen area A5, an obstacle is displayed as an example of the target, but the display is not limited to obstacles; a dangerous point or zone, a location of a resource (for example, water or minerals), or the location of another rover may also be displayed as a target.
The second processor 25 also performs control so that the objects of obstacles within the field of view of the camera of the rover 8 are displayed in a display mode different from that of the objects of obstacles outside the camera's field of view. Since the viewing angle of the camera of the rover 8 is known in advance, the second processor 25 may determine, for each object, whether it falls within that viewing angle and thereby identify the obstacle objects within the camera's field of view.
With this configuration, the second user (for example, the pilot) can distinguish obstacles within the camera's field of view from obstacles outside it, which helps prevent the rover 8 from colliding with obstacles outside the camera's field of view.
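The per-object check described above can be sketched as a simple angular comparison; the planar coordinates, the assumption that the camera points along the rover heading, and the function name are illustrative only.
    import math

    def in_camera_view(rover_x, rover_y, rover_heading_deg, obj_x, obj_y, fov_deg):
        """Return True if an obstacle at (obj_x, obj_y) lies within the camera's
        horizontal field of view, assuming the camera looks along the rover heading."""
        bearing = math.degrees(math.atan2(obj_x - rover_x, obj_y - rover_y))
        diff = (bearing - rover_heading_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
        return abs(diff) <= fov_deg / 2.0

    print(in_camera_view(0, 0, 0, 1, 5, 90))   # True: almost straight ahead
    print(in_camera_view(0, 0, 0, -5, 1, 90))  # False: far to the left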
FIG. 12 is an example of the screen area A6. In the screen area A6, an image captured by the camera of the rover 8 is displayed and updated sequentially, and lines L2 to L6 representing distance are displayed. To realize this display, the second processor 25 performs control so that lines representing distance are displayed on the camera image of the rover 8. The positions of the distance lines are determined in advance for each distance by photographing, with the camera on the ground, marks (for example, lines) placed at predetermined distance intervals. With this configuration, the second user (for example, the pilot) can grasp the distance of objects in the camera image by referring to the distance lines.
The second processor 25 may also determine the forward/backward tilt of the rover 8 from the acceleration sensor value of the rover 8 and correct the positions of the distance lines according to that tilt. When the rover 8 is tilted forward or backward, the slope stretches the distances accordingly, so the second processor 25 may correct the positions of the distance lines towards the near side of the image. This improves the accuracy of the positions of the distance lines even when the rover 8 is tilted forward or backward.
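A sketch of such a correction is given below, assuming the tilt is estimated from a static 3-axis accelerometer reading and that each calibrated line is shifted by a fixed number of pixels per degree of pitch; the axis convention, the scale factor, and the function names are assumptions.
    import math

    def pitch_from_accel(ax, ay, az):
        """Rough forward/backward tilt (pitch, degrees) from a 3-axis accelerometer
        at rest, assuming x points forward and z points up."""
        return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

    def corrected_line_row(calibrated_row, pitch_deg, px_per_deg):
        """Shift the pre-calibrated image row of a distance line; with a forward
        pitch the line moves towards the near side, assuming the row index grows
        towards the bottom of the image."""
        return int(round(calibrated_row + pitch_deg * px_per_deg))

    pitch = pitch_from_accel(0.17, 0.0, 9.8)            # roughly 1 degree nose-down
    print(corrected_line_row(420, pitch, px_per_deg=12))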
The second processor 25 may also correct the distance lines based on distortion information of the lens mounted on the camera of the rover 8. For example, since the image is distorted towards the edges of the lens, the second processor 25 may correct the positions of the lines near the edges of the image according to that distortion. This improves the accuracy of the positions of the distance lines even when the lens has distortion.
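As one possible sketch of this correction, a single-coefficient radial model could be applied around the optical centre; the model, the coefficient, and the optical centre used here are assumptions for illustration and would in practice come from the lens calibration data.
    def radial_adjust(x, y, cx, cy, k1):
        """Adjust a point on a calibrated distance line for simple radial lens
        distortion: points far from the optical centre (cx, cy) are moved more than
        points near it; the sign and magnitude of k1 depend on the calibration."""
        dx, dy = x - cx, y - cy
        r2 = dx * dx + dy * dy
        scale = 1.0 + k1 * r2
        return cx + dx * scale, cy + dy * scale

    # A point near the image edge shifts noticeably; a point near the centre barely moves.
    print(radial_adjust(620, 240, 320, 240, k1=1e-7))
    print(radial_adjust(340, 240, 320, 240, k1=1e-7))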
FIG. 13 is an example of the screen area A7, which is a screen for setting a travel command. In the screen area A7, a slider R71 for setting the direction, a slider R72 for setting the distance to be travelled next, and a slider R73 for setting the wheel rotation speed of the rover 8 are displayed, together with a button R74 for setting the travel command.
The second user (for example, the pilot) operates the sliders R71, R72, and R73 to change the respective parameters and presses the button R74 to set the parameter values at that moment as the travel parameters. In this way, the second operation unit 21 of the second terminal 2 receives the setting of the travel parameters by the second user (for example, the pilot). As an example, the travel parameters include the direction, the distance, and the wheel rotation speed.
A button R76 for sending the set travel parameters to the other users and a button R77 for resending the travel parameters are also displayed.
When the second user (for example, the pilot) presses the button R76 for sending the set travel parameters to the other users, the second processor 25 controls the second communication unit 22 to transmit the set travel parameters to the other terminals, that is, the first terminal 1, the third terminal 3, and the fourth terminal 4. In response, the first terminal 1, the third terminal 3, and the fourth terminal 4 receive the travel parameters and display them, for example as shown in FIG. 9.
When the button R41 for permitting the travel parameters shown in FIG. 9 is then pressed on the first terminal 1, the first processor 15 controls the first communication unit 12 to transmit to the second terminal 2 a signal indicating that the travel parameters are permitted. Conversely, when the button R42 for not permitting the travel parameters shown in FIG. 9 is pressed, the first processor 15 controls the first communication unit 12 to transmit to the second terminal 2 a signal indicating that the travel parameters are not permitted.
FIG. 13 also shows an image area R75 indicating whether the other users have permitted the travel parameters. A display box R751 indicates whether the third user (for example, the project manager) has given permission, a display box R752 indicates whether the first user (for example, the co-pilot) has given permission, and a display box R753 indicates whether the fourth user (for example, the system manager) has given permission. The display boxes R751 to R753 are shown, for example, in green when permission has been given and in red when it has been refused.
In the present embodiment, as an example, the command send button R78 is changed to a pressable state when all of the first user (for example, the co-pilot), the third user (for example, the project manager), and the fourth user (for example, the system manager) have permitted the travel parameters.
To this end, in the present embodiment, as an example, when the second processor 25 has received permission for the travel parameters from the first terminal 1, the third terminal 3, and the fourth terminal 4, it performs control so that the object for command transmission (here, the command send button R78 as an example) is displayed in a display mode in which an instruction to transmit the travel command to the rover 8 can be received. With this configuration, the travel command can be output to the rover 8 only when all of the first, third, and fourth users have permitted the travel parameters, which reduces the probability of transmitting an erroneous travel command.
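A minimal sketch of this gating logic, assuming each terminal's response is tracked as approved, rejected, or not yet received (the names are illustrative, not taken from the embodiment):
    def can_send_command(approvals):
        """approvals: mapping from terminal name to True (approved) / False (rejected)
        / None (no response yet). The send button is enabled only when every other
        terminal has approved the travel parameters."""
        return all(v is True for v in approvals.values())

    approvals = {"copilot": True, "project_manager": True, "system_manager": None}
    print(can_send_command(approvals))   # False: still waiting for one approval
    approvals["system_manager"] = True
    print(can_send_command(approvals))   # True: the command send button becomes pressable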
Alternatively, the second processor 25 may perform control so that the object for the travel command is displayed in a display mode in which an instruction to transmit the travel command to the rover 8 can be received once permission for the travel parameters has been received from at least the first terminal 1. With this configuration, the travel command can be output to the rover 8 when the first user (for example, the co-pilot) operating the first terminal 1 has permitted the travel parameters, which also reduces the probability of transmitting an erroneous travel command.
When the command send button R78 is pressed after it has been changed to the pressable state, the second processor 25 controls the second communication unit 22 to transmit a travel command including the travel parameters to the rover 8. The travel command is thereby transmitted to the rover 8, which receives it and drives in accordance with it.
FIG. 13 also shows an emergency stop button R79. When, for example, an obstacle to travel such as a crevasse is found ahead while the rover 8 is driving, the second user (for example, the pilot) can stop the rover 8 immediately by pressing the emergency stop button R79.
When the emergency stop button R79 is pressed, the second processor 25 controls the second communication unit 22 to transmit an emergency stop command to the rover 8. The emergency stop command is thereby transmitted to the rover 8, which receives it and performs an emergency stop in accordance with it.
FIG. 14 is an example of the screen area A8. In the screen area A8, an object R80 representing the rover 8 is displayed, together with objects R811 to R817 representing obstacles within the field of view of the side camera of the rover 8 and objects R821 to R825 representing obstacles within the field of view of the forward camera of the rover 8. The objects R821 to R825 are displayed in a display mode different from that of the objects R811 to R817.
In the screen area A8, objects R831 to R837 representing obstacles outside the fields of view of both the forward camera and the side camera of the rover 8 are also displayed. These objects R831 to R837 are displayed in a display mode different from both the objects R821 to R825 within the field of view of the forward camera and the objects R811 to R817 within the field of view of the side camera.
A movement trajectory L7 of the rover 8 is also shown in the screen area A8. To update the movement trajectory of the rover 8 shown in FIG. 8, the second processor 25 of the second terminal 2 may, for example, update the trajectory on the screen according to the travel direction designated by the second user (for example, the pilot) and the number of wheel revolutions per unit time (for example, revolutions per minute: RPM). The number of wheel revolutions per unit time is an example of information on speed. This allows the movement trajectory of the rover 8 on the screen to be updated in real time.
Alternatively, the second processor 25 may update the movement trajectory of the rover 8 on the screen according to the travel direction and the travel distance each time a driving session that follows the travel command is completed.
The second processor 25 may also take the slip ratio of the tires of the rover 8 into consideration when updating the movement trajectory. As above, this slip ratio may be set according to the difference between the travel distance commanded to the rover 8 and the distance actually travelled on Earth over soil sampled from the celestial body that the rover 8 explores (here, the Moon as an example), or according to the difference between the commanded travel distance and the distance actually travelled on the celestial body itself. These settings may be made by the first processor 15, the second processor 25, or the processor 55.
As described above, the display system S according to the present embodiment is a display system that has the first terminal 1 and the second terminal 2 and displays the situation around the rover. The first terminal 1 has the first operation unit 11 that receives, from the first user, designation of the position of a target existing around the rover 8 and the type of the target, the first communication unit 12, and the first processor 15 that performs control so that the position and type received by the first operation unit 11 are transmitted from the first communication unit 12. The second terminal 2 has the second communication unit that receives the position and type of the target, and the second processor 25 that performs control so that an object corresponding to the received type of the target is displayed on the corresponding display unit 26 at the received position of the target.
With this configuration, an object corresponding to the type of the target is displayed on the display unit 26 at the position designated by the first user (for example, the co-pilot) on the first terminal 1, so the second user (for example, the pilot) operating the second terminal 2 can easily grasp the situation around the rover 8.
In the present embodiment, the rover has been described, as an example, as exploring the Moon, but it may instead explore another satellite, a planet, an asteroid, outer space, or an area to be explored on the Earth (for example, a danger zone such as a volcano, a high-temperature zone, a low-temperature zone, a disaster site, or an accident site).
A program for executing each process of the first terminal 1 and/or the second terminal 2 of the present embodiment may be recorded on a computer-readable recording medium, and the various processes described above relating to the first terminal 1 and/or the second terminal 2 may be performed by causing a computer system to read the program recorded on that recording medium and a processor to execute it.
The present disclosure is not limited to the above embodiment as it is; at the implementation stage, the constituent elements may be modified and embodied without departing from the gist of the disclosure. Various inventions may also be formed by appropriately combining the plurality of constituent elements disclosed in the above embodiment. For example, some constituent elements may be deleted from all the constituent elements shown in the embodiment, and constituent elements of different embodiments may be combined as appropriate.
1 first terminal
11 first operation unit
12 first communication unit
13 storage unit
14 RAM
15 first processor
16 display unit
2 second terminal
21 second operation unit
22 second communication unit
23 storage unit
24 RAM
25 second processor
26 display unit
3 third terminal
4 fourth terminal
5 information processing device
51 input unit
52 communication unit
53 storage unit
54 RAM
55 processor
6 ground station
7 lander
8 rover

Claims (18)

1. A terminal that displays the situation around a rover, comprising:
an operation unit that receives, from a user, designation of the position of a target existing around the rover and the type of the target;
a processor that performs control so that an object corresponding to the type or form of the target is displayed on a display unit at the position received by the operation unit; and
a communication unit,
wherein the processor performs control so that the position and type received by the operation unit are transmitted from the communication unit to another terminal such that an object corresponding to the type or form of the target is displayed on the other terminal at the position received by the operation unit.
2. The terminal according to claim 1, wherein, when the target is an obstacle, the operation unit receives the user's selection of an icon representing the form of the obstacle as the designation of the form of the target and receives designation of the position of the obstacle from the user, and
the processor performs control so that an object representing the form of the obstacle is displayed on the display unit at the position received by the operation unit, and so that the position received by the operation unit and the form of the obstacle are transmitted from the communication unit to the other terminal such that an object representing the form of the obstacle is displayed on the other terminal at the position received by the operation unit.
3. The terminal according to claim 1 or 2, further comprising conversion means for converting the position received by the operation unit into a latitude and a longitude of the celestial body on which the rover is present,
wherein the processor performs control so that the object is displayed at the position corresponding to the latitude and the longitude, and so that the latitude and the longitude are transmitted from the communication unit to the other terminal as the position received by the operation unit.
4. The terminal according to any one of claims 1 to 3, further comprising sun direction determining means for determining a direction of the sun according to a photovoltaic voltage of a solar panel provided on the rover,
wherein the processor performs control so that the direction of the sun is displayed on the display unit.
5. The terminal according to any one of claims 1 to 4, wherein the processor acquires a latitude and a longitude of a lander and performs control so that a position or direction of the lander is displayed on the display unit using the latitude and the longitude of the lander.
  6.  探査機の周囲の状況を表示する端末であって、
     前記探査機の周囲に存在する対象の位置と当該対象の種類または形態を他の端末から受信する通信部と、
     前記受信された対象の位置に、当該受信された対象の種類または形態に対応するオブジェクトを表示部に表示するよう制御するプロセッサと、
     を有する端末。
    A terminal that displays the situation around the spacecraft, and
    A position of an object existing around the probe and a communication unit for receiving the type or form of the object from another terminal;
    A processor that controls an object corresponding to the type or form of the received object to be displayed on a display unit at the position of the received object;
    Terminal with.
  7.  前記プロセッサは、探査機のカメラ画像を表示するとともに、当該カメラ画像に距離を表す線を表示するよう制御する
     請求項6に記載の端末。
    The terminal according to claim 6, wherein the processor controls to display a camera image of the searcher and display a line indicating a distance on the camera image.
  8.  前記プロセッサは、前記探査機の加速度センサ値から当該探査機の前後の傾きを決定し、当該探査機の前後の傾きに応じて、前記距離を表す線の位置を補正する
     請求項7に記載の端末。
    8. The processor according to claim 7, wherein the processor determines an inclination of the probe back and forth from an acceleration sensor value of the probe and corrects a position of a line representing the distance according to the inclination of the probe back and forth. Terminal.
  9.  The terminal according to claim 7 or 8, wherein the processor corrects the lines representing distance based on distortion information of the lens mounted on the camera of the probe.
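As an illustrative sketch only (not taken from the disclosure), the distance lines of claims 7 to 9 could be drawn by projecting ground points ahead of a camera mounted at a known height, correcting the projection for the tilt obtained from the accelerometer, and applying a simple radial model for the lens distortion information; the camera model, parameter names, and coefficients below are all assumptions.

```python
import math

def pitch_from_accel(ax: float, ay: float, az: float) -> float:
    """Forward/backward tilt (degrees, positive = nose up) from a body-frame accelerometer at rest."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def distance_line_row(distance_m: float, cam_height_m: float, pitch_deg: float,
                      focal_px: float, cy: float, k1: float = 0.0) -> float:
    """Image row (pixels) at which a line marking `distance_m` on flat ground appears.

    pitch_deg: tilt of the probe from its acceleration sensor, used to shift the line (claim 8).
    k1: first radial distortion coefficient from the lens distortion information (claim 9),
        applied so the overlay lands on the raw (distorted) camera image.
    """
    angle = math.atan2(cam_height_m, distance_m) + math.radians(pitch_deg)
    y_norm = math.tan(angle)                  # normalised image coordinate below the optical axis
    y_norm *= 1.0 + k1 * y_norm * y_norm      # simple radial distortion
    return cy + focal_px * y_norm             # pixel row below the principal point

# Example: markers for 1 m, 2 m and 5 m ahead of the camera.
pitch = pitch_from_accel(0.5, 0.0, 9.7)
for d in (1.0, 2.0, 5.0):
    print(d, distance_line_row(d, cam_height_m=0.3, pitch_deg=pitch,
                               focal_px=600.0, cy=240.0, k1=-0.05))
```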
  10.  The terminal according to any one of claims 6 to 9, further comprising an operation unit that receives the setting of traveling parameters by a user, wherein
      the processor controls the communication unit to transmit the set parameters as a command to another terminal so that whether or not to permit the command can be selected at the other terminal.
  11.  The terminal according to claim 10, wherein, when permission for the traveling parameters is received from at least one other terminal, the processor controls an object for command transmission to be displayed in a display mode in which an instruction to transmit the traveling parameters to the probe can be accepted, and
      when the operation unit accepts the instruction to transmit the traveling parameters to the probe, the processor controls the communication unit to transmit a traveling command including the traveling parameters to the probe.
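As an illustrative sketch only (not taken from the disclosure), the approval flow of claims 10 and 11 could be tracked with a small in-memory record on the proposing terminal; the class, field, and method names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DriveCommandProposal:
    """Traveling parameters proposed on one terminal and reviewed on the others."""
    heading_deg: float
    distance_m: float
    speed_mps: float
    approvals: set[str] = field(default_factory=set)  # IDs of terminals that permitted it

    def record_approval(self, terminal_id: str) -> None:
        self.approvals.add(terminal_id)

    def send_button_enabled(self) -> bool:
        # At least one other terminal must have permitted the parameters before
        # the command-transmission object becomes selectable (claim 11).
        return len(self.approvals) >= 1

proposal = DriveCommandProposal(heading_deg=45.0, distance_m=3.0, speed_mps=0.05)
assert not proposal.send_button_enabled()
proposal.record_approval("terminal-B")
assert proposal.send_button_enabled()  # the travel command may now be sent to the probe
```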
  12.  The terminal according to any one of claims 1 to 11, wherein the processor controls obstacle objects within the field of view of the camera of the probe to be displayed in a display mode different from that of obstacle objects outside the field of view of the camera.
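As an illustrative sketch only (not taken from the disclosure), deciding whether an obstacle object falls within the camera's field of view as in claim 12 can reduce to a bearing test against the camera's horizontal field of view; the parameter names and the 90-degree default are assumptions.

```python
import math

def in_camera_fov(obstacle_bearing_deg: float, rover_heading_deg: float,
                  horizontal_fov_deg: float = 90.0) -> bool:
    """True if the obstacle's bearing lies within the camera's horizontal field of view."""
    diff = (obstacle_bearing_deg - rover_heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= horizontal_fov_deg / 2.0

# Obstacles inside the field of view could be drawn highlighted, the rest dimmed.
print(in_camera_fov(obstacle_bearing_deg=30.0, rover_heading_deg=10.0))   # True
print(in_camera_fov(obstacle_bearing_deg=200.0, rover_heading_deg=10.0))  # False
```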
  13.  The terminal according to any one of claims 1 to 12, wherein the processor updates the movement trajectory of the probe on the screen according to the traveling direction designated by the user operating the terminal and information on the traveling distance and/or speed.
  14.  The terminal according to claim 13, wherein the processor further takes into account the slip ratio of the probe's tires when updating the movement trajectory of the probe.
  15.  The terminal according to claim 14, wherein the slip ratio is set according to the difference between the travel distance commanded to the probe and the distance actually traveled on Earth over soil collected on the celestial body explored by the probe.
  16.  The terminal according to claim 14, wherein the slip ratio is set according to the difference between the travel distance commanded to the probe and the distance actually traveled on the celestial body explored by the probe.
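As an illustrative sketch only (not taken from the disclosure), the trajectory update of claims 13 to 16 could be a dead-reckoning step in a local flat frame, with the slip ratio derived from the ratio of commanded distance to distance actually covered; the function names and example figures are hypothetical.

```python
import math

def slip_ratio(commanded_m: float, actually_traveled_m: float) -> float:
    """Slip ratio from commanded vs. actually traveled distance (claims 15 and 16)."""
    return 1.0 - actually_traveled_m / commanded_m

def update_trajectory(x_m: float, y_m: float, heading_deg: float,
                      commanded_m: float, slip: float) -> tuple[float, float]:
    """Advance the displayed position by the commanded distance reduced by wheel slip."""
    effective_m = commanded_m * (1.0 - slip)
    x_m += effective_m * math.sin(math.radians(heading_deg))  # east
    y_m += effective_m * math.cos(math.radians(heading_deg))  # north
    return x_m, y_m

# Example: ground tests over sampled soil showed 2.0 m commanded -> 1.7 m covered.
s = slip_ratio(2.0, 1.7)  # 0.15
print(update_trajectory(0.0, 0.0, heading_deg=90.0, commanded_m=2.0, slip=s))
```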
  17.  A display system comprising a first terminal and a second terminal and displaying the situation around a probe, wherein
      the first terminal comprises:
      a first operation unit that receives, from a first user, designation of the position of a target existing around the probe and the type or form of the target;
      a first communication unit; and
      a first processor that controls the first communication unit to transmit the position and the type or form received by the first operation unit; and
      the second terminal comprises:
      a second communication unit that receives the position of the target and the type or form of the target; and
      a second processor that controls a corresponding display unit to display, at the received position of the target, an object corresponding to the received type or form of the target.
  18.  A display method for displaying the situation of a probe, comprising:
      a step in which a first terminal receives, from a first user, designation of the position of a target existing around the probe and the type or form of the target;
      a step in which the first terminal transmits the received position and type or form;
      a step in which a second terminal receives the position and the type or form; and
      a step in which the second terminal controls a corresponding display unit to display, at the position received by the first terminal, an object corresponding to the designated type or form of the target.
PCT/JP2017/026094 2017-07-19 2017-07-19 Terminal, display system and display method WO2019016888A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/026094 WO2019016888A1 (en) 2017-07-19 2017-07-19 Terminal, display system and display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/026094 WO2019016888A1 (en) 2017-07-19 2017-07-19 Terminal, display system and display method

Publications (1)

Publication Number Publication Date
WO2019016888A1 true WO2019016888A1 (en) 2019-01-24

Family

ID=65015695

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/026094 WO2019016888A1 (en) 2017-07-19 2017-07-19 Terminal, display system and display method

Country Status (1)

Country Link
WO (1) WO2019016888A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003532218A (en) * 2000-05-01 2003-10-28 iRobot Corporation Method and system for remotely controlling a mobile robot
US20120072052A1 (en) * 2010-05-11 2012-03-22 Aaron Powers Navigation Portals for a Remote Vehicle Control User Interface
US20140152822A1 (en) * 2012-12-05 2014-06-05 Florida Institute for Human and Machine Cognition User Display Providing Obstacle Avoidance
US20150019043A1 (en) * 2013-07-12 2015-01-15 Jaybridge Robotics, Inc. Computer-implemented method and system for controlling operation of an autonomous driverless vehicle in response to obstacle detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DE FILIPPIS LUCA ET AL.: "Remote Control Station Design and Testing for Tele-Operated Space-Missions", INTERNATIONAL JOURNAL OF AEROSPACE SCIENCES, vol. 2, no. 3, 2013, pages 92 - 105, XP055677912 *
DEANS MATTHEW C. ET AL.: "Robotic Scouting for Human Exploration", AIAA SPACE 2009 CONFERENCE & EXPOSITION, 14 September 2009 (2009-09-14), pages 1 - 15, XP055677909 *

Similar Documents

Publication Publication Date Title
US11217112B2 (en) System and method for supporting simulated movement
KR101117207B1 (en) Auto and manual control system for unmanned aerial vehicle via smart phone
CN107077113B (en) Unmanned aerial vehicle flight display
CN110325939B (en) System and method for operating an unmanned aerial vehicle
US9104202B2 (en) Remote vehicle missions and systems for supporting remote vehicle missions
EP2895819B1 (en) Sensor fusion
EP3206768B1 (en) Inspection vehicle control device, control method, and computer program
US8521339B2 (en) Method and system for directing unmanned vehicles
US20210278834A1 (en) Method for Exploration and Mapping Using an Aerial Vehicle
US20190251851A1 (en) Navigation method and device based on three-dimensional map
JP6815479B2 (en) Display control device, display control method and storage medium
WO2015034390A1 (en) Control device for cyber-physical systems
WO2020107454A1 (en) Method and apparatus for accurately locating obstacle, and computer readable storage medium
WO2017169841A1 (en) Display device and display control method
CN110785720A (en) Information processing device, information presentation instruction method, program, and recording medium
WO2019016888A1 (en) Terminal, display system and display method
JP5969903B2 (en) Control method of unmanned moving object
JP6560479B1 (en) Unmanned aircraft control system, unmanned aircraft control method, and program
JP6684012B1 (en) Information processing apparatus and information processing method
AU2011293447B2 (en) Remote vehicle missions and systems for supporting remote vehicle missions
US20240153390A1 (en) Flight management system, flight management method, and flight management program for multiple aerial vehicles
CN114895713B (en) Aircraft landing method, aircraft landing system, aircraft and storage medium
Kevin et al. 3D terrain mapping vehicle for search and rescue
WO2020054245A1 (en) Image information combination device
KR20170109759A (en) Apparatus for operating unmanned dron

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17918625

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17918625

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP