WO2019016888A1 - Terminal, display system, and display method - Google Patents

Terminal, display system, and display method

Info

Publication number
WO2019016888A1
WO2019016888A1 (PCT/JP2017/026094; JP2017026094W)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
processor
display
searcher
type
Prior art date
Application number
PCT/JP2017/026094
Other languages
English (en)
Japanese (ja)
Inventor
香子 米澤
敏郎 清水
オリオル ガスケス
Original Assignee
株式会社ispace
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ispace
Priority to PCT/JP2017/026094
Publication of WO2019016888A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G - COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 - Cosmonautic vehicles
    • B64G1/16 - Extraterrestrial cars
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01V - GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V9/00 - Prospecting or detecting by methods not provided for in groups G01V1/00 - G01V8/00
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q - SELECTING
    • H04Q9/00 - Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom

Definitions

  • the present disclosure relates to a terminal, a display system, and a display method.
  • The terminal according to one aspect of the present disclosure is a terminal that displays the situation around a searcher. It has an operation unit that receives from a user the designation of the position of a target existing around the searcher and of the type or form of the target, a processor that controls a display unit to display, at the position received by the operation unit, an object corresponding to the type or form of the target, and a communication unit. The processor may control the communication unit to transmit the position and the type or form received by the operation unit to another terminal so that an object corresponding to the type or form of the target is displayed on the other terminal at the received position.
  • The terminal according to another aspect of the present disclosure is a terminal that displays the situation around a searcher. It may have a communication unit that receives, from another terminal, the position of a target existing around the searcher and the type or form of the target, and a processor that controls a display unit to display, at the received position of the target, an object corresponding to the received type or form of the target.
  • A display system according to another aspect is a display system that has a first terminal and a second terminal and displays the situation around a searcher. The first terminal may have a first operation unit that receives from a first user the designation of the position of a target existing around the searcher and of the type or form of the target, a first communication unit, and a first processor that controls the first communication unit to transmit the position and the type or form received by the first operation unit. The second terminal may have a second communication unit that receives the position of the target and the type or form of the target, and a second processor that controls a corresponding display unit to display, at the received position of the target, an object corresponding to the received type or form of the target.
  • A display method according to another aspect is a display method for displaying the situation around a searcher. It may have a step in which a first terminal receives, from a first user, the designation of the position of a target existing around the searcher and of the type or form of the target, a step in which the first terminal transmits the received position and type or form, a step in which a second terminal receives the position and the type or form, and a step in which the second terminal controls a corresponding display unit to display, at the position received by the first terminal, an object corresponding to the designated type or form of the target.
  • FIG. 1 is a block diagram showing a schematic configuration of the display system according to this embodiment. FIG. 2 is a block diagram showing a schematic configuration of the first terminal according to this embodiment. FIG. 3 is a block diagram showing a schematic configuration of the second terminal according to this embodiment. FIG. 4 is a block diagram showing a schematic configuration of the information processing apparatus according to this embodiment. FIG. 5 is a diagram showing an outline of the screen displayed on the first terminal.
  • FIG. 6 is an example of screen area A1. FIG. 7 is an example of screen area A2. FIG. 8 is an example of screen area A3. FIG. 9 is an example of screen area A4. FIG. 10 is a diagram showing an outline of the screen displayed on the second terminal.
  • FIG. 11 is an example of screen area A5. FIG. 12 is an example of screen area A6. FIG. 13 is an example of screen area A7. FIG. 14 is an example of screen area A8.
  • According to the present disclosure, a display system and a display method that make it easier to grasp the situation around the searcher are provided.
  • The terminal according to a first aspect of the embodiment is a terminal for displaying the situation around a searcher. It has an operation unit that receives from a user the designation of the position of a target existing around the searcher and of the type or form of the target, a processor configured to control a display unit to display, at the position received by the operation unit, an object corresponding to the type or form of the target, and a communication unit. The processor controls the communication unit to transmit the position and the type or form received by the operation unit to another terminal so that an object corresponding to the type or form of the target is displayed on the other terminal at the received position.
  • The terminal according to a second aspect of the embodiment is the terminal according to the first aspect. When the target is an obstacle, the operation unit accepts the user's selection of an icon representing the form of the obstacle as the designation of the form of the target, and accepts the designation of the position of the obstacle from the user. The processor controls the display unit to display an object representing the form of the obstacle at the position accepted by the operation unit, and controls the communication unit to transmit the accepted position and the form of the obstacle to the other terminal so that an object representing the form of the obstacle is displayed on the other terminal at that position.
  • According to this configuration, an object representing the form of the obstacle is displayed at the position of the obstacle designated by the user at the terminal, so another user operating another terminal (for example, a pilot) does not need to input the position and the form of the obstacle and can easily grasp the obstacles around the searcher. For example, when the other user is a pilot, the pilot can concentrate on operating the searcher while still knowing the position and the form of the obstacle, and can therefore avoid the obstacle when deciding the traveling direction.
  • The terminal according to a third aspect of the embodiment is the terminal according to the second aspect, further having conversion means for converting the position received by the operation unit into the latitude and longitude of the star on which the searcher is present. The processor controls the object to be displayed at a position corresponding to that latitude and longitude, and controls the communication unit to transmit the latitude and longitude to the other terminal as the position received by the operation unit.
  • The terminal according to a fourth aspect of the embodiment is the terminal according to any one of the first to third aspects, having sun direction determining means for determining the direction of the sun according to the solar power generation in the solar panel provided on the searcher, and the processor controls the display unit to display the direction of the sun.
  • The terminal according to a fifth aspect of the embodiment is the terminal according to any one of the first to fourth aspects, wherein the processor acquires the latitude and longitude of the lander and controls the display unit to display the position or direction of the lander using the latitude and longitude of the lander.
  • The terminal according to a sixth aspect of the embodiment has a communication unit that receives, from another terminal, the position of a target present around the searcher and the type or form of the target, and a processor that controls the display unit to display, at the received position of the target, an object corresponding to the received type or form of the target.
  • According to this configuration, the terminal can display an object corresponding to the received type or form of the target at the position of the target received from the other terminal, so its user can grasp the situation around the searcher without inputting that information himself or herself.
  • The terminal according to a seventh aspect of the embodiment is the terminal according to the sixth aspect, wherein the processor controls to display a camera image of the searcher and to display lines indicating distance on the camera image.
  • According to this configuration, lines representing distance are displayed on the camera image, so the user can grasp the distance to objects in the camera image.
  • The terminal according to an eighth aspect of the embodiment is the terminal according to the seventh aspect, wherein the processor determines the forward/backward inclination of the searcher from the acceleration sensor value of the searcher and corrects the positions of the lines representing distance according to that inclination.
  • The terminal according to a ninth aspect of the embodiment is the terminal according to the seventh or eighth aspect, wherein the processor corrects the lines representing distance using distortion information of the lens mounted on the camera of the searcher.
  • The terminal according to a tenth aspect of the embodiment is the terminal according to any one of the sixth to ninth aspects, further having an operation unit that receives the setting of travel parameters by the user, and the processor controls the communication unit to transmit the set travel parameters to another terminal so that the other terminal can select whether or not to permit them.
  • The terminal according to an eleventh aspect of the embodiment is the terminal according to the tenth aspect. When the processor receives permission for the travel parameters from at least one other terminal, it controls to display an object for command transmission in a display mode that allows a transmission instruction to the searcher to be received, and when the operation unit receives the transmission instruction to the searcher, the processor controls the communication unit to transmit a travel command including the travel parameters to the searcher.
  • According to this configuration, the user of the terminal can transmit the travel parameters to the searcher only when permission for them has been received from at least one other terminal, so the probability of transmitting incorrect travel parameters can be reduced.
  • The terminal according to a twelfth aspect of the embodiment is the terminal according to any one of the first to eleventh aspects, wherein the processor controls to display the objects of obstacles within the field of view of the camera of the searcher in a display mode different from that of the objects of obstacles outside the field of view of the camera.
  • According to this configuration, the user of the terminal can easily distinguish the objects of obstacles within the field of view of the camera of the searcher from the objects of obstacles outside the field of view of the camera.
  • The terminal according to a thirteenth aspect of the embodiment is the terminal according to any one of the first to twelfth aspects, wherein the processor updates the movement trajectory of the searcher on the screen according to the traveling direction designated by the user operating the terminal and information on the traveling distance and/or the speed.
  • The terminal according to a fourteenth aspect of the embodiment is the terminal according to the thirteenth aspect, wherein the processor updates the movement trajectory of the searcher further considering the slip ratio of the tires of the searcher.
  • The terminal according to a fifteenth aspect of the embodiment is the terminal according to the fourteenth aspect, wherein the slip ratio is set according to the difference between the traveling distance the searcher was instructed to travel and the distance actually traveled on the earth on soil corresponding to that of the star explored by the searcher.
  • The terminal according to a sixteenth aspect of the embodiment is the terminal according to the fourteenth aspect, wherein the slip ratio is set according to the difference between the traveling distance the searcher was instructed to travel and the distance actually traveled on the star explored by the searcher.
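  • By way of illustration only, one common way to set such a slip ratio is as the relative shortfall between the commanded travel distance and the distance actually traveled; the exact formula is not given in the disclosure, so the following minimal Python sketch is an assumption.

        def slip_ratio(commanded_distance_m: float, actual_distance_m: float) -> float:
            """Slip ratio as the relative shortfall of actual travel vs. commanded travel.

            The disclosure only states that the ratio is set according to the difference
            between the two distances; this particular formula is an assumption.
            """
            if commanded_distance_m <= 0.0:
                return 0.0
            return (commanded_distance_m - actual_distance_m) / commanded_distance_m

        # Example: commanded 0.85 m, but the searcher (or a test rover on simulant soil)
        # actually advanced 0.70 m -> slip ratio of about 0.176.
        print(slip_ratio(0.85, 0.70))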
  • The display system according to an aspect of the embodiment is a display system that has a first terminal and a second terminal and displays the situation around a searcher. The first terminal has a first operation unit that receives, from a first user, the designation of the position of a target existing around the searcher and of the type or form of the target, a first communication unit, and a first processor that controls the first communication unit to transmit the position and the type or form received by the first operation unit. The second terminal has a second communication unit that receives the position of the target and the type or form of the target, and a second processor that controls a corresponding display unit to display, at the received position of the target, an object corresponding to the received type or form of the target.
  • According to this configuration, the second terminal displays an object corresponding to the type of the target at the position of the target designated by the user of the first terminal, so the user of the second terminal can easily grasp the situation around the searcher.
  • The display method according to an aspect of the embodiment is a display method for displaying the situation around a searcher. It has a step in which a first terminal receives, from a first user, the designation of the position of a target existing around the searcher and of the type or form of the target, a step in which the first terminal transmits the received position and type or form, a step in which a second terminal receives the position and the type or form, and a step in which the second terminal controls a corresponding display unit to display, at the position received by the first terminal, an object corresponding to the designated type or form of the target.
  • According to this configuration as well, the second terminal displays an object corresponding to the type of the target at the position of the target designated by the user of the first terminal, so the user of the second terminal can easily grasp the situation around the searcher.
  • Conventionally, the pilot operating a searcher has had to control the searcher while constantly monitoring, through the camera images sent from the searcher, whether danger is approaching the searcher. For this reason, the pilot could not pay sufficient attention to grasping the situation around the searcher.
  • In the present embodiment, the searcher explores the moon, which is an example of a star, and a display system for grasping the situation around the searcher will be described.
  • FIG. 1 is a block diagram showing a schematic configuration of a display system according to the present embodiment.
  • the display system S includes a first terminal 1, a second terminal 2, a third terminal 3, a fourth terminal 4, an information processing device 5, and a ground station 6.
  • the first terminal 1, the second terminal 2, the third terminal 3, and the fourth terminal 4 can communicate with the information processing device 5 via the communication network N.
  • Communication by the first terminal 1, the second terminal 2, the third terminal 3, and the fourth terminal 4 may be wireless or wired.
  • In the present embodiment, these communications will be described as wireless, as an example.
  • The first terminal 1 is a terminal operated by a first user (here, as an example, a co-pilot who assists the pilot operating the searcher).
  • The second terminal 2 is a terminal operated by a second user (here, as an example, a pilot operating the searcher).
  • the third terminal 3 is a terminal operated by a third user (for example, a project manager or the like).
  • the fourth terminal 4 is a terminal operated by a fourth user (for example, a system manager).
  • In this way, a team for operating the searcher is formed by four people: a pilot, a co-pilot, a project manager, and a system manager.
  • the ground station 6 can communicate with the lander 7 by radio.
  • the ground station 6 can acquire the latitude and longitude of the lander from the lander 7.
  • the searcher 8 can communicate with the lander 7 by radio.
  • The ground station 6 can receive information from the searcher 8 via the lander 7. For example, it can receive image data captured by the searcher 8 and the voltage of solar power generation in the solar panel provided on the searcher 8.
  • the information processing device 5 is, for example, a server, and can communicate with the ground station 6.
  • The information processing apparatus 5 can acquire information from the ground station 6 (for example, image data captured by the searcher 8, the voltage of solar power generation in the solar panel provided on the searcher, the latitude and longitude of the lander, etc.).
  • FIG. 2 is a block diagram showing a schematic configuration of the first terminal 1 according to the present embodiment.
  • As shown in FIG. 2, the first terminal 1 includes the first operation unit 11, the first communication unit 12, the storage unit 13, a random access memory (RAM) 14, the first processor 15, and the display unit 16.
  • each part is connected via a bus.
  • the first operation unit 11 receives an operation by the first user.
  • the first operation unit 11 is, for example, a touch panel.
  • the first communication unit 12 communicates with another terminal or the information processing apparatus 5. This communication may be wired or wireless.
  • the storage unit 13 stores a program to be executed by the first processor 15.
  • the RAM 14 temporarily stores information.
  • the first processor 15 reads a program from the storage unit 13 and executes the program.
  • FIG. 3 is a block diagram showing a schematic configuration of a second terminal according to the present embodiment.
  • As shown in FIG. 3, the second terminal 2 includes the second operation unit 21, the second communication unit 22, the storage unit 23, a random access memory (RAM) 24, the second processor 25, and the display unit 26.
  • each part is connected via a bus.
  • the second operation unit 21 receives an operation by the second user.
  • the second operation unit 21 is, for example, a touch panel.
  • the second communication unit 22 communicates with another terminal or the information processing apparatus 5. This communication may be wired or wireless.
  • the storage unit 23 stores a program to be executed by the second processor 25.
  • the RAM 24 temporarily stores information.
  • the second processor 25 reads the program from the storage unit 23 and executes the program.
  • FIG. 4 is a block diagram showing a schematic configuration of the information processing apparatus according to the present embodiment.
  • As shown in FIG. 4, the information processing apparatus 5 includes an input unit 51, a communication unit 52, a storage unit 53, a random access memory (RAM) 54, and a processor 55, each connected via a bus.
  • the input unit 51 receives an input from the user.
  • the communication unit 52 communicates with other terminals. This communication may be wired or wireless.
  • the storage unit 53 stores a program for the processor 55 to execute.
  • the RAM 54 temporarily stores information.
  • the processor 55 reads the program from the storage unit 53 and executes the program.
  • FIG. 5 is a diagram showing an outline of the screen G1 displayed on the first terminal 1. As shown in FIG. 5, the screen G1 displayed on the first terminal 1 includes screen areas A1 to A4.
  • FIG. 6 is an example of the screen area A1.
  • an object R10 indicating a searcher is displayed.
  • objects R11 to R15 indicating obstacles outside the field of view of the camera of the searcher are displayed.
  • Objects R16 and R17 indicating obstacles within the field of view of the camera of the searcher are displayed in a display mode different from that of objects R11 to R15 indicating obstacles outside the field of view of the camera.
  • That is, the first processor 15 controls to display the objects of obstacles within the field of view of the camera of the searcher 8 in a display mode different from that of the objects of obstacles outside the field of view of the camera.
  • an arrow object R181 indicating the direction of the lander is displayed.
  • the first processor 15 obtains the latitude and longitude of the lander 7 and controls the direction of the lander to be displayed on the corresponding display unit 16 using the latitude and longitude of the lander 7.
  • the first processor 15 may display not only the direction of the lander 7 but also the position of the lander 7.
  • the second processor 25 may perform the same processing to display the direction or position of the lander 7.
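  • Purely as an illustration (not part of the disclosure), the direction of the lander shown by the arrow object could be derived from the two latitude/longitude pairs as an initial great-circle bearing; the spherical-geometry formula and the parameter names below are assumptions.

        import math

        def bearing_deg(lat1_deg, lon1_deg, lat2_deg, lon2_deg):
            """Initial great-circle bearing from point 1 (searcher) to point 2 (lander),
            clockwise from north in degrees; radius-independent, so it applies to the
            moon as well as the earth."""
            lat1, lon1, lat2, lon2 = map(math.radians, (lat1_deg, lon1_deg, lat2_deg, lon2_deg))
            dlon = lon2 - lon1
            x = math.sin(dlon) * math.cos(lat2)
            y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
            return math.degrees(math.atan2(x, y)) % 360.0

        # The arrow object R181 would then be rotated by (bearing - searcher heading)
        # so that it points toward the lander in the searcher-centred screen area A1.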
  • an arrow object R182 indicating the direction of the sun is displayed.
  • For example, the first processor 15, the second processor 25, or the processor 55 may function as sun direction determining means that determines the direction of the sun according to the voltage of solar power generation in the solar panel provided on the searcher. Specifically, for example, the sun direction determining means may determine the direction perpendicular to the solar panel as the direction of the sun when the voltage of solar power generation is at its maximum.
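  • A minimal sketch of such sun direction determining means follows; the idea of sampling (panel-normal azimuth, generated voltage) pairs while the panel orientation changes, and the numeric values, are assumptions for illustration only.

        def estimate_sun_azimuth(samples):
            """samples: list of (panel_normal_azimuth_deg, generated_voltage_V) pairs.
            Returns the azimuth of the panel normal at the maximum voltage, which the
            text above takes as the direction of the sun."""
            if not samples:
                return None
            best_azimuth, _ = max(samples, key=lambda s: s[1])
            return best_azimuth

        sun_azimuth = estimate_sun_azimuth([(0, 18.2), (90, 27.9), (180, 21.4), (270, 12.3)])
        # -> 90; the display would then rotate the arrow object R182 accordingly.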
  • the first processor 15 controls to display the direction of the sun on the corresponding display unit 16.
  • the second processor 25 may perform the same processing to display the direction of the sun.
  • an arrow object R183 indicating the direction of the earth is displayed.
  • an arrow object R184 indicating the direction of the next destination is displayed.
  • FIG. 7 is an example of the screen area A2.
  • In the screen area A2, an icon R21 representing an obstacle with a convex form, an icon R22 representing an obstacle with a concave form, an icon R23 representing an object requiring attention, and icons R24, R25, and R26 representing markers are displayed.
  • the first operation unit 11 receives from the first user the designation of the position of an object present around the searcher and the type or form of the object.
  • The target here is an obstacle, a dangerous spot or danger zone, a resource (for example, minerals, water, etc.), or another searcher.
  • Dangerous spots include the boundary between sunlight and shade. Because the potential difference is large at the boundary between sunlight and shade, a failure or abnormality of the electronic devices mounted on the searcher 8 may occur.
  • the first operation unit 11 receives the selection of the icon representing the form of the obstacle by the first user as designation of the form of the target.
  • For example, when the obstacle has a convex form, the icon R21 of FIG. 7 is selected by the first user; when it has a concave form, the icon R22 of FIG. 7 is selected. In this way, the form of the target obstacle is specified.
  • Then, the first processor 15 controls the first communication unit 12 to transmit the position received by the first operation unit 11 and the form of the obstacle to the second terminal 2 so that an object representing the form of the obstacle is displayed on the second terminal 2 at that position.
  • The first user specifies the position of the obstacle by selecting the icon R21 or the icon R22 and then touching the position on the screen corresponding to the position of the obstacle in the screen area A1.
  • the first operation unit 11 receives designation of the position of the obstacle from the first user.
  • For example, the first processor 15 controls the corresponding display unit 16 to display an object corresponding to the specified type or form of the obstacle at the specified position. At that time, the first processor 15 may function as conversion means for converting the position received by the first operation unit 11 into the latitude and longitude of the star on which the searcher 8 is present (here, the moon as an example). The first processor 15 may then control to display the object at a position corresponding to that latitude and longitude. Thereby, the screen area A1 of FIG. 6 is obtained.
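  • Under simplifying assumptions (searcher-centred map, local equirectangular projection, mean lunar radius), the conversion from a touched screen position to latitude and longitude could look like the following illustrative Python sketch; the constants and names are assumptions, not the disclosed implementation.

        import math

        MOON_RADIUS_M = 1_737_400.0  # mean lunar radius, assumed for this sketch

        def touch_to_lat_lon(touch_px, searcher_px, searcher_lat_deg, searcher_lon_deg,
                             meters_per_pixel, searcher_heading_deg=0.0):
            """Convert a touched pixel in screen area A1 into latitude/longitude,
            assuming the map is centred on the searcher and drawn heading-up."""
            dx_px = touch_px[0] - searcher_px[0]      # screen x grows to the right
            dy_px = searcher_px[1] - touch_px[1]      # screen y grows downward -> flip
            h = math.radians(searcher_heading_deg)
            east_m = (dx_px * math.cos(h) + dy_px * math.sin(h)) * meters_per_pixel
            north_m = (-dx_px * math.sin(h) + dy_px * math.cos(h)) * meters_per_pixel
            lat = searcher_lat_deg + math.degrees(north_m / MOON_RADIUS_M)
            lon = searcher_lon_deg + math.degrees(
                east_m / (MOON_RADIUS_M * math.cos(math.radians(searcher_lat_deg))))
            return lat, lon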
  • FIG. 8 is an example of the screen area A3.
  • In the screen area A3, an object R31 of the searcher 8 and an object R32 of the lander 7 are displayed. Further, a movement trajectory L1 of the searcher 8 is shown. Further, image areas R33 and R34 representing danger zones are shown.
  • The first processor 15 in the first terminal 1 may update the movement trajectory of the searcher 8 on the screen according to, for example, the traveling direction designated by the second user (for example, the pilot) operating the second terminal 2 and the number of rotations of the wheels per unit time (for example, rotations per minute: RPM).
  • the number of revolutions per unit time of the wheel is an example of information related to the speed.
  • the movement trajectory of the searcher 8 on the screen can be updated in real time.
  • However, the present invention is not limited to this; the first processor 15 may instead update the movement trajectory of the searcher 8 on the screen according to the traveling direction and the traveling distance each time traveling of a session according to the travel parameters is completed.
  • the first processor 15 may update the movement trajectory of the searcher in consideration of, for example, the slip ratio of the tire of the searcher 8.
  • This slip ratio may be set according to the difference between the distance the searcher 8 was instructed to travel and the distance actually traveled on the earth on soil corresponding to that of the star (here, the moon as an example) explored by the searcher 8.
  • Alternatively, this slip ratio may be set according to the difference between the distance the searcher 8 was instructed to travel and the distance actually traveled on the star explored by the searcher 8.
  • These settings may be made by the first processor 15, by the second processor 25, or by the processor 55.
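  • By way of illustration only, a dead-reckoning update of the on-screen trajectory from the commanded direction, the wheel rotation speed, and the slip ratio could look like the sketch below; the wheel radius, the time step, and the simple kinematic model are assumptions.

        import math

        def update_trajectory(trajectory, heading_deg, wheel_rpm, wheel_radius_m,
                              dt_s, slip_ratio=0.0):
            """Append the next estimated position (local east/north metres) to the
            on-screen movement trajectory, reducing the commanded distance by the
            tire slip ratio."""
            wheel_speed_mps = wheel_rpm / 60.0 * 2.0 * math.pi * wheel_radius_m
            distance_m = wheel_speed_mps * dt_s * (1.0 - slip_ratio)
            x, y = trajectory[-1]
            h = math.radians(heading_deg)
            trajectory.append((x + distance_m * math.sin(h), y + distance_m * math.cos(h)))
            return trajectory

        track = [(0.0, 0.0)]
        for _ in range(60):  # one minute of travel at 30 rpm in the 2.85 deg direction
            update_trajectory(track, heading_deg=2.85, wheel_rpm=30,
                              wheel_radius_m=0.10, dt_s=1.0, slip_ratio=0.15)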
  • FIG. 9 is an example of the screen area A4.
  • buttons R41 and R42 are displayed.
  • In the screen area A4, a travel command set by the second user (in this case, the pilot) is displayed.
  • In the example of FIG. 9, a travel command for traveling in the direction of 2.85°, at a speed of 30 rpm, for a distance of 0.85 m is displayed.
  • Further, a button R41 for permitting the travel parameters and a button R42 for not permitting them are shown.
  • FIG. 10 is a diagram showing an outline of the screen G2 displayed on the second terminal 2. As shown in FIG. 10, the screen G2 displayed on the second terminal 2 includes screen areas A5 to A8.
  • FIG. 11 is an example of the screen area A5.
  • an object R50 indicating a searcher is displayed.
  • objects R51 to R57 indicating obstacles outside the field of view of the camera of the searcher are displayed.
  • Objects R58 and R59 indicating obstacles within the field of view of the camera of the searcher are displayed in a display mode different from that of objects R51 to R57 indicating obstacles outside the field of view of the camera of the searcher.
  • As described above, the first processor 15 controls the first communication unit 12 to transmit, for example, the position received by the first operation unit 11 and the form of the obstacle.
  • the form of the obstacle is an example of the type of object.
  • the second communication unit 22 receives the position of the obstacle and the form of the obstacle transmitted from the first terminal 1. At this time, the second communication unit 22 may receive the position of the obstacle and the form of the obstacle via the information processing device 5. Then, the second processor 25 controls the display unit 26 to display an object representing the form of the obstacle at the position received by the first operation unit 11.
  • The second processor 25 may receive, as the position received by the first operation unit 11, the latitude and longitude of the star on which the searcher 8 is present (here, the moon as an example). The second processor 25 may then control to display the object at a position corresponding to that latitude and longitude.
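  • One possible shape for the message exchanged between the terminals (directly or via the information processing device 5) is sketched below; the field names and the display.add_object helper are hypothetical, not taken from the disclosure.

        import json

        def make_object_marker(lat_deg, lon_deg, kind, form):
            """Payload the first terminal could send when the co-pilot marks a target."""
            return json.dumps({
                "type": "object_marker",
                "latitude": lat_deg,
                "longitude": lon_deg,
                "kind": kind,    # e.g. "obstacle", "danger_zone", "resource"
                "form": form,    # e.g. "convex", "concave"
            })

        def handle_object_marker(payload, display):
            """On the second terminal: place an object of the received form at the
            position corresponding to the received latitude/longitude."""
            msg = json.loads(payload)
            if msg.get("type") == "object_marker":
                display.add_object(lat=msg["latitude"], lon=msg["longitude"],
                                   kind=msg["kind"], form=msg["form"])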
  • According to this configuration, an object representing the form of the obstacle can be displayed on the second terminal 2 at the position of the obstacle designated at the first terminal 1. Therefore, for example, the second user (for example, the pilot) does not have to input the position and the form of the obstacle himself or herself, can concentrate on operating the searcher while still grasping the position and the form of the obstacle, and can therefore avoid the obstacle when deciding the traveling direction.
  • an obstacle is displayed as an example of the object, but the present invention is not limited to this.
  • A dangerous spot or danger zone, a place where resources (for example, water, minerals, etc.) exist, or the position of another searcher may also be displayed as an object.
  • The second processor 25 also controls to display the objects of obstacles within the field of view of the camera of the searcher 8 in a display mode different from that of the objects of obstacles outside the field of view of the camera. At that time, since the view angle of the camera of the searcher 8 is known in advance, the second processor 25 may determine, for each object, whether it falls within the range of that view angle, and thereby determine the obstacle objects that fall within the field of view of the camera of the searcher 8.
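  • A minimal sketch of such an in-view test follows, assuming local equirectangular geometry around the searcher and a known camera azimuth and horizontal view angle; it is an illustration, not the disclosed implementation.

        import math

        MOON_RADIUS_M = 1_737_400.0

        def in_camera_view(obj_lat, obj_lon, searcher_lat, searcher_lon,
                           camera_azimuth_deg, horizontal_fov_deg):
            """True when the bearing from the searcher to the object lies within half
            of the camera's view angle on either side of the camera azimuth."""
            north_m = math.radians(obj_lat - searcher_lat) * MOON_RADIUS_M
            east_m = (math.radians(obj_lon - searcher_lon)
                      * MOON_RADIUS_M * math.cos(math.radians(searcher_lat)))
            bearing = math.degrees(math.atan2(east_m, north_m)) % 360.0
            offset = (bearing - camera_azimuth_deg + 180.0) % 360.0 - 180.0
            return abs(offset) <= horizontal_fov_deg / 2.0

        # Objects for which this returns True would be drawn like R58-R59,
        # the others like R51-R57.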
  • Thereby, the second user (for example, the pilot) can distinguish the obstacles within the field of view of the camera from the obstacles outside the field of view of the camera. This helps prevent the searcher 8 from colliding with obstacles outside the camera's field of view.
  • FIG. 12 is an example of the screen area A6.
  • an image captured by the camera of the searcher 8 is displayed. This image is sequentially updated.
  • lines L2 to L6 indicating distances are displayed.
  • The second processor 25 controls to display lines representing distance on the camera image of the searcher 8. The positions of these lines are determined in advance on the ground by photographing marks (for example, lines) placed at predetermined distance intervals with the camera, so that the position at which each line should be displayed in the image is determined for each distance.
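  • Such a ground calibration could, for illustration only, be represented as a lookup from distance to pixel row with linear interpolation between the photographed marks; the numeric values below are placeholders, not measured data.

        # Distance ahead of the searcher (m) -> pixel row where the mark appeared
        # when photographed on the ground in advance (placeholder values).
        DISTANCE_TO_ROW = {0.5: 690, 1.0: 560, 2.0: 470, 3.0: 430, 5.0: 400}

        def row_for_distance(distance_m):
            """Pixel row for a distance line, linearly interpolated between marks."""
            pts = sorted(DISTANCE_TO_ROW.items())
            if distance_m <= pts[0][0]:
                return pts[0][1]
            for (d0, r0), (d1, r1) in zip(pts, pts[1:]):
                if d0 <= distance_m <= d1:
                    t = (distance_m - d0) / (d1 - d0)
                    return round(r0 + t * (r1 - r0))
            return pts[-1][1]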
  • Thereby, the second user (for example, the pilot) can grasp the distance to obstacles and other objects appearing in the camera image.
  • The second processor 25 may determine the forward/backward inclination of the searcher 8 from the acceleration sensor value of the searcher 8 and correct the positions of the lines representing distance according to that inclination. This is because, when the searcher 8 is inclined forward or backward, the distances in the image are stretched accordingly, so the second processor 25 may, for example, correct the lines representing distance to be closer to the near side. With this configuration, the accuracy of the positions of the lines representing distance can be improved even when the searcher 8 is inclined forward or backward.
  • The second processor 25 may also correct the lines representing distance using distortion information of the lens mounted on the camera of the searcher 8. For example, since the image is distorted toward the edges of the lens, the second processor 25 may correct the position of a line toward the edges of the image depending on the distortion. With this configuration, the accuracy of the positions of the lines representing distance can be improved even when the lens has distortion.
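  • The two corrections just described (forward/backward inclination estimated from the acceleration sensor, and lens distortion toward the image edges) could be sketched as follows; the linear pitch model, the single distortion coefficient, and the axis convention are assumptions for illustration.

        import math

        def pitch_from_accelerometer(ax, ay, az):
            """Forward/backward inclination (deg) from a 3-axis acceleration sensor at
            rest, taking x as the forward axis and z as up (axis convention assumed)."""
            return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

        def corrected_row(base_row, pitch_deg, pixels_per_deg_pitch,
                          column, image_width, distortion_coeff=0.0):
            """Shift a distance line up or down for pitch, then bend it toward the
            image edges as a simple function of the distance from the image centre."""
            row = base_row + pitch_deg * pixels_per_deg_pitch
            half_width = image_width / 2.0
            normalized = (column - half_width) / half_width   # -1 .. +1 across the image
            row += distortion_coeff * (normalized ** 2)       # barrel-like bend
            return round(row)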
  • FIG. 13 is an example of the screen area A7.
  • the screen area A7 is a screen for setting a travel command.
  • In the screen area A7, a slider R71 for setting the direction, a slider R72 for setting the traveling distance from the current position, and a slider R73 for setting the number of rotations of the wheels of the searcher 8 are displayed.
  • a button R74 for setting a travel command is displayed.
  • The second user (for example, the pilot) sets the travel parameters using the sliders R71 to R73. That is, the second operation unit 21 in the second terminal 2 receives the setting of the travel parameters by the second user (for example, the pilot).
  • the travel parameters include, by way of example, the direction, the distance, and the number of rotations of the wheel.
  • Further, a button R76 for transmitting the set travel parameters to the other users and a button R77 for retransmitting the travel parameters are displayed.
  • When the button R76 is pressed, the second processor 25 controls the second communication unit 22 to transmit the set travel parameters to the other terminals, that is, the first terminal 1, the third terminal 3, and the fourth terminal 4.
  • the first terminal 1, the third terminal 3 and the fourth terminal 4 receive the traveling parameters, and display the traveling parameters as shown in FIG. 9, for example.
  • When the button R41 for permitting the travel parameters shown in FIG. 9 is pressed, the first processor 15 controls the first communication unit 12 to transmit a signal indicating that the travel parameters are permitted to the second terminal 2. On the other hand, when the button R42 for not permitting the travel parameters shown in FIG. 9 is pressed, the first processor 15 controls the first communication unit 12 to transmit a signal indicating that the travel parameters are not permitted to the second terminal 2.
  • FIG. 13 shows an image area R75 indicating whether permission for the travel parameters has been granted by the other users.
  • In the image area R75, a display box R751 indicating whether the third user (for example, the project manager) has granted permission for the travel parameters is shown.
  • Further, a display box R752 indicating whether the first user (for example, the co-pilot) has granted permission for the travel parameters is shown.
  • Further, a display box R753 indicating whether the fourth user (for example, the system manager) has granted permission for the travel parameters is shown.
  • The display boxes R751 to R753 are displayed, for example, in green when permission has been granted and in red when it has not.
  • When the permissions have been granted, the travel command can be issued: the command transmission button R78 changes to a state in which it can be pressed. That is, when the second processor 25 receives permission for the travel parameters from the other terminals, it controls to display the object for command transmission (here, as an example, the command transmission button R78) in a display mode in which a transmission instruction to the searcher 8 can be received.
  • Note that the second processor 25 may instead control to display the object for command transmission in the display mode in which the transmission instruction to the searcher 8 can be received when it receives permission for the travel parameters from at least the first terminal 1.
  • In this way, the travel command can be transmitted to the searcher 8 only after permission has been received, so the probability of transmitting an erroneous travel command can be reduced.
  • When the command transmission button R78 is pressed, the second processor 25 controls the second communication unit 22 to transmit a travel command including the travel parameters to the searcher 8. As a result, the travel command is transmitted to the searcher 8. Thereafter, the searcher 8 receives this travel command and travels according to it.
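  • For illustration only, the permission-gated flow around buttons R76, R41/R42, and R78 could be organized as in the following sketch; the class and method names, the terminal identifiers, and the comm object are assumptions, not the disclosure's API.

        class TravelCommandPanel:
            """Minimal sketch of the permission-gated send flow in screen area A7."""

            def __init__(self, comm, required=("terminal1", "terminal3", "terminal4")):
                self.comm = comm              # wraps the second communication unit 22
                self.required = set(required) # terminals whose permission is awaited
                self.granted = set()
                self.params = None

            def propose(self, direction_deg, distance_m, wheel_rpm):
                """Button R76: broadcast the set travel parameters for review."""
                self.params = {"direction_deg": direction_deg,
                               "distance_m": distance_m, "wheel_rpm": wheel_rpm}
                self.granted.clear()
                for terminal in self.required:
                    self.comm.send(terminal, {"type": "travel_proposal", **self.params})

            def on_permission(self, terminal, allowed):
                """Signal produced by button R41 (permit) or R42 (not permit) on a peer."""
                if allowed:
                    self.granted.add(terminal)
                else:
                    self.granted.discard(terminal)

            def send_enabled(self):
                """Button R78 becomes pressable once permission has been received
                (here from every required terminal; 'at least one' is the variant
                mentioned in the text)."""
                return self.params is not None and self.required <= self.granted

            def send(self):
                """Button R78: transmit the travel command to the searcher."""
                if self.send_enabled():
                    self.comm.send("searcher8", {"type": "travel_command", **self.params})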
  • FIG. 13 shows an emergency stop button R79.
  • The second user (for example, the pilot) can press the emergency stop button R79 to make the searcher 8 stop immediately when, for example, an obstacle such as a crack in the ground is found ahead while the searcher 8 is traveling.
  • the second processor 25 controls the second communication unit 22 to transmit an emergency stop command to the search device 8.
  • an emergency stop command is transmitted to the searcher 8.
  • the searcher 8 receives this emergency stop command, and performs an emergency stop according to the emergency stop command.
  • FIG. 14 is an example of the screen area A8.
  • an object R80 indicating the searcher 8 is displayed.
  • objects R811 to R817 indicating obstacles in the field of view of the camera on the side of the searcher 8 are displayed.
  • Objects R821 to R825 indicating obstacles in the field of view of the camera ahead of the searcher 8 are displayed.
  • The objects R821 to R825 indicating obstacles within the field of view of the camera ahead of the searcher 8 are displayed in a display mode different from that of the objects R811 to R817 indicating obstacles within the field of view of the camera on the side of the searcher 8.
  • objects R831 to R837 indicating obstacles outside the field of view of the camera in front of the searcher 8 and outside the field of view of the camera on the side of the searcher 8 are displayed.
  • These objects R831 to R837 are displayed in a display mode different from that of the objects R821 to R825 indicating obstacles within the field of view of the camera ahead of the searcher 8 and the objects R811 to R817 indicating obstacles within the field of view of the camera on the side of the searcher 8.
  • a movement trajectory L7 of the searcher 8 is shown.
  • The second processor 25 in the second terminal 2 may update the movement trajectory of the searcher 8 on the screen according to, for example, the traveling direction designated by the second user (for example, the pilot) and the number of rotations of the wheels per unit time (for example, rotations per minute: RPM).
  • the number of revolutions per unit time of the wheel is an example of information related to the speed.
  • the movement trajectory of the searcher 8 on the screen can be updated in real time.
  • However, the present disclosure is not limited to this; the second processor 25 may instead update the movement trajectory of the searcher 8 on the screen according to the traveling direction and the traveling distance each time traveling of a session according to the travel command is completed.
  • the second processor 25 may update the movement trajectory of the searcher 8 in consideration of, for example, the slip ratio of the tire of the searcher 8.
  • This slip ratio may be set according to the difference between the distance the searcher 8 was instructed to travel and the distance actually traveled on the earth on soil corresponding to that of the star (here, the moon as an example) explored by the searcher 8.
  • Alternatively, this slip ratio may be set according to the difference between the distance the searcher 8 was instructed to travel and the distance actually traveled on the star explored by the searcher 8. These settings may be made by the first processor 15, by the second processor 25, or by the processor 55.
  • As described above, the display system S according to this embodiment is a display system that has the first terminal 1 and the second terminal 2 and displays the situation around the searcher.
  • The first terminal 1 has the first operation unit 11 that receives from the first user the designation of the position of a target existing around the searcher 8 and of the type of the target, the first communication unit 12, and the first processor 15 that controls the first communication unit 12 to transmit the position and type received by the first operation unit 11.
  • The second terminal 2 has the second communication unit 22 that receives the position of the target and the type of the target, and the second processor 25 that controls the corresponding display unit 26 to display, at the received position of the target, an object corresponding to the received type of the target.
  • Thereby, the display unit 26 displays an object corresponding to the type of the target at the position of the target designated by the first user (for example, the co-pilot) at the first terminal 1, so the second user (for example, the pilot) operating the second terminal 2 can easily grasp the situation around the searcher 8.
  • In the present embodiment, the searcher has been described as exploring the moon, but it may instead explore other satellites, planets, or asteroids, or areas to be explored in space or on the earth (for example, dangerous zones such as volcanoes, high-temperature zones, cold zones, disaster sites, accident sites, etc.).
  • A program for executing each process of the first terminal 1 and/or the second terminal 2 of the present embodiment may be recorded on a computer-readable recording medium, and the above-described various processes relating to the first terminal 1 and/or the second terminal 2 may be performed by having a computer system read the program recorded on the recording medium and having a processor execute it.
  • the present disclosure is not limited to the above embodiment as it is, and in the implementation stage, the components can be modified and embodied without departing from the scope of the invention.
  • various inventions can be formed by appropriate combinations of a plurality of constituent elements disclosed in the above embodiment. For example, some components may be deleted from all the components shown in the embodiment. Furthermore, components in different embodiments may be combined as appropriate.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Geophysics (AREA)
  • Navigation (AREA)

Abstract

The present invention relates to a terminal for displaying the surroundings of a searcher (probe vehicle), the terminal comprising: an operation unit that receives, from a user, the position of a target present around the searcher and the designation of the type of the target; a processor that controls a display unit to display an object corresponding to the type or form of the target at the position received by the operation unit; and a communication unit, the processor controlling the communication unit to transmit the position and the type of the target received by the operation unit to another terminal such that the object corresponding to the type or form of the target is displayed on the other terminal at the position received by the operation unit.
PCT/JP2017/026094 2017-07-19 2017-07-19 Terminal, système d'affichage et procédé d'affichage WO2019016888A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/026094 WO2019016888A1 (fr) 2017-07-19 2017-07-19 Terminal, système d'affichage et procédé d'affichage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/026094 WO2019016888A1 (fr) 2017-07-19 2017-07-19 Terminal, système d'affichage et procédé d'affichage

Publications (1)

Publication Number Publication Date
WO2019016888A1 true WO2019016888A1 (fr) 2019-01-24

Family

ID=65015695

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/026094 WO2019016888A1 (fr) 2017-07-19 2017-07-19 Terminal, système d'affichage et procédé d'affichage

Country Status (1)

Country Link
WO (1) WO2019016888A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003532218A (ja) * 2000-05-01 2003-10-28 アイロボット コーポレーション Method and system for remotely operating a mobile robot
US20120072052A1 (en) * 2010-05-11 2012-03-22 Aaron Powers Navigation Portals for a Remote Vehicle Control User Interface
US20140152822A1 (en) * 2012-12-05 2014-06-05 Florida Institute for Human and Machine Cognition User Display Providing Obstacle Avoidance
US20150019043A1 (en) * 2013-07-12 2015-01-15 Jaybridge Robotics, Inc. Computer-implemented method and system for controlling operation of an autonomous driverless vehicle in response to obstacle detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DE FILIPPIS LUCA ET AL.: "Remote Control Station Design and Testing for Tele-Operated Space-Missions", INTERNATIONAL JOURNAL OF AEROSPACE SCIENCES, vol. 2, no. 3, 2013, pages 92 - 105, XP055677912 *
DEANS MATTHEW C. ET AL.: "Robotic Scouting for Human Exploration", AIAA SPACE 2009 CONFERENCE & EXPOSITION, 14 September 2009 (2009-09-14), pages 1 - 15, XP055677909 *

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17918625

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17918625

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP