JP6154563B1 - Image display system, display terminal device, and program - Google Patents


Info

Publication number
JP6154563B1
JP6154563B1 (application JP2017000890A)
Authority
JP
Japan
Prior art keywords
monitoring target
image
current position
terminal device
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2017000890A
Other languages
Japanese (ja)
Other versions
JP2018109925A (en)
Inventor
佳明 澤田
孝幸 齊藤
顕輝 河野
卓登 中村
チュン ファン チャン
峰雄 小野
Original Assignee
株式会社ユニテック
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ユニテック
Priority to JP2017000890A
Application granted
Publication of JP6154563B1
Publication of JP2018109925A
Application status: Active

Abstract

An image close to what a monitoring target holding a mobile terminal device is seeing is displayed without mounting an imaging device such as a CCD camera on the monitoring target. The mobile terminal device 10 calculates the monitoring target's moving distance, moving direction, and line-of-sight elevation angle as posture information. The display terminal device 60 specifies the monitoring target's current position based on the moving distance and moving direction detected by the mobile terminal device 10, acquires from the map information server 50, in which panoramic images of the surroundings of each point are stored in advance, the image corresponding to the line-of-sight direction specified by the current position, posture, and moving direction of the monitoring target, and displays it as an image approximating the image the monitoring target is viewing. [Selected drawing] FIG. 1

Description

  The present invention relates to an image display system, a display terminal device, and a program for displaying an image for monitoring a monitoring target holding a mobile terminal device.

  In recent years, as society ages, elderly people with dementia wandering the streets has become a problem. There is also the problem of children being involved in dangerous incidents in town.

  Therefore, a system has been proposed in which the position of a monitoring target person such as an elderly person or a child is measured by a system such as GPS (Global Positioning System) and the current position of the monitoring target person is displayed on a map.

  However, simply displaying position information indicating the monitoring target's current position does not reveal what the monitoring target is doing or in what state.

  Such a problem can be solved by mounting an imaging device such as a CCD (Charge Coupled Device) camera on the person to be monitored and displaying the captured image.

  For example, systems have been proposed in which a CCD camera is mounted on a device such as a robot or helicopter, or on a person, so that the surroundings of the robot or helicopter, or the image seen by the person, can be viewed from a remote location (see, for example, Patent Documents 1, 2, and 3).

Patent Document 1: JP H10-42282 A; Patent Document 2: JP 2004-255552 A; Patent Document 3: JP 2011-243085 A

  However, the above systems require time and effort to mount an imaging device such as a CCD camera on a person or robot. It is also difficult to mount the imaging device so that the shooting direction does not shift. Furthermore, if video is shot in town without permission, third parties may be captured, raising privacy concerns. In the first place, walking around with a visibly attached camera may make people nearby feel that their privacy is being violated.

  An object of the present invention is to provide an image display system, display terminal device, and program capable of displaying an image close to the image viewed by a monitoring target holding a mobile terminal device, without mounting an imaging device such as a CCD camera on the monitoring target.

The image display system of the present invention includes: a mobile terminal device held by a monitoring target, comprising moving distance detection means for detecting a moving distance of the monitoring target, moving direction detection means for detecting a moving direction of the monitoring target, and posture detection means for detecting a posture of the monitoring target; and
a display terminal device comprising current position specifying means for specifying a current position of the monitoring target based on the moving distance and moving direction detected by the mobile terminal device, and display means for acquiring, from a device in which images of the surroundings of each point are stored in advance, an image corresponding to a line-of-sight direction specified by the current position of the monitoring target specified by the current position specifying means and the posture and moving direction of the monitoring target, and displaying it as an image approximating the image the monitoring target is viewing.

Moreover, in another image display system of the present invention, the display terminal device may further include accepting means for accepting designation of an initial position of the monitoring target, and
the current position specifying means may specify, as the current position, the relative position of the monitoring target determined from the initial position accepted by the accepting means, the moving distance detected by the moving distance detection means, and the moving direction detected by the moving direction detection means.

  Furthermore, in another image display system of the present invention, the display means may acquire, from a device in which images of the surroundings of each point are stored in advance as panoramic images, the panoramic image corresponding to the current position of the monitoring target, and obtain an image approximating the image the monitoring target is viewing by cutting out from the acquired panoramic image the portion corresponding to the posture detected by the posture detection means and the moving direction detected by the moving direction detection means.

  Furthermore, in another image display system of the present invention, the moving distance detection means may measure the number of steps of the monitoring target based on acceleration data detected by an acceleration sensor, and detect the moving distance by multiplying the measured number of steps by a preset stride length of the monitoring target.

  Furthermore, in another image display system of the present invention, the posture detection means may calculate the elevation angle of the monitoring target's line of sight using gyro data detected by a gyro sensor and detect it as posture information of the monitoring target.

  Furthermore, in another image display system of the present invention, the moving direction detection means may detect the moving direction of the monitoring target using direction data detected by a geomagnetic sensor and gyro data detected by a gyro sensor.

  Furthermore, in another image display system of the present invention, the display means may display a map image indicating the current position of the monitoring target specified by the current position specifying means together with the image approximating the image the monitoring target is viewing.

  Furthermore, in another image display system of the present invention, the display means may display a satellite photograph image indicating the current position of the monitoring target specified by the current position specifying means together with the map image and the image approximating the image the monitoring target is viewing.

Furthermore, another image display system of the present invention includes: a mobile terminal device held by a monitoring target, comprising current position specifying means for specifying the current position of the monitoring target, moving direction detection means for detecting the moving direction of the monitoring target, and posture detection means for detecting the posture of the monitoring target; and
a display terminal device comprising display means for acquiring, from a device in which images of the surroundings of each point are stored in advance, an image corresponding to the line-of-sight direction specified by the current position of the monitoring target specified by the current position specifying means and the posture and moving direction of the monitoring target, and displaying it as an image approximating the image the monitoring target is viewing.

Further, the display terminal device of the present invention includes: current position specifying means for specifying the current position of a monitoring target based on the moving distance and moving direction detected by a mobile terminal device; and
display means for acquiring, from a device in which images of the surroundings of each point are stored in advance, an image corresponding to the line-of-sight direction specified by the current position of the monitoring target specified by the current position specifying means and the posture and moving direction of the monitoring target, and displaying it as an image approximating the image the monitoring target is viewing.

Further, the program of the present invention causes a computer to execute: a current position specifying step of specifying the current position of a monitoring target based on the moving distance and moving direction of the monitoring target detected by a mobile terminal device held by the monitoring target;
a posture detection step of detecting the posture of the monitoring target; and
a display step of acquiring, from a device in which images of the surroundings of each point are stored in advance, an image corresponding to the line-of-sight direction specified by the current position of the monitoring target specified in the current position specifying step and the posture and moving direction of the monitoring target, and displaying it as an image approximating the image the monitoring target is viewing.

  According to the present invention, it is possible to display an image close to the image viewed by a monitoring target holding a mobile terminal device without mounting an imaging device such as a CCD camera on the monitoring target.

FIG. 1 is a diagram showing the configuration of an image display system according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the hardware configuration of the mobile terminal device 10 in the image display system of the embodiment.
FIG. 3 is a block diagram showing the functional configuration of the mobile terminal device 10 in the image display system of the embodiment.
FIG. 4 is a diagram for explaining the moving direction information calculated by the data processing unit.
FIG. 5 is a diagram for explaining the posture information (elevation angle) calculated by the data processing unit.
FIG. 6 is a block diagram showing the functional configuration of the display terminal device 60 in the image display system of the embodiment.
FIG. 7 is a diagram for explaining how the control unit 33 calculates the monitoring target's current position from the moving distance and moving direction from the initial position.
FIG. 8 is a flowchart for explaining the operation of the image display system of the embodiment.
FIG. 9 is a diagram showing an example of the field-of-view image, map image, and satellite photograph image displayed on the display terminal device 60.
FIG. 10 is a diagram showing an example of the field-of-view image displayed when the monitoring target is looking in the horizontal direction.
FIG. 11 is a diagram showing an example of the field-of-view image displayed when the monitoring target is looking upward.
FIG. 12 is a diagram showing an example of the field-of-view image displayed when the monitoring target is looking downward.
FIG. 13 is a block diagram showing the functional configuration of a mobile terminal device 10a with a GPS function.

  Next, embodiments of the present invention will be described in detail with reference to the drawings.

  FIG. 1 is a diagram showing a system configuration of an image display system according to an embodiment of the present invention.

  As shown in FIG. 1, the image display system according to an embodiment of the present invention includes a mobile terminal device 10 such as a smartphone, a display terminal device 60 such as a personal computer, and a map information server 50.

  The mobile terminal device 10 is a device to be carried by a monitoring target, such as a child or an elderly person whose behavior is to be monitored, and is configured to connect to the Internet 40 via the mobile communication network 30 by performing mobile communication with the radio base station 20. The mobile terminal device 10 may be carried in the monitoring target's pocket, bag, or the like.

  The display terminal device 60 is a device for displaying an image for monitoring a person to be monitored who holds the mobile terminal device 10, and is realized by installing dedicated software on a terminal device such as a personal computer.

  The map information server 50 stores two-dimensional map image information, satellite photograph images (or aerial photograph images), and panoramic images in which surrounding images at each point are captured in advance.

  As an example of the map information server 50, a server that provides various data such as Google map, Google earth, and Street view (registered trademark) provided by Google (registered trademark) can be cited.

  Next, FIG. 2 shows a hardware configuration of the mobile terminal device 10 in the image display system of the present embodiment.

  As shown in FIG. 2, the mobile terminal device 10 includes a CPU 11, a memory 12, a storage device 13 such as a flash memory, a wireless communication unit 14 that transmits and receives data to and from the wireless base station 20 over a wireless link, a user interface (UI) device 15 such as a touch panel or liquid crystal display, an acceleration sensor 16, a gyro sensor 17, and a geomagnetic sensor 18. These components are connected to each other via a control bus 19.

  The CPU 11 controls the operation of the mobile terminal device 10 by executing predetermined processing based on a control program stored in the memory 12 or the storage device 13. In the present embodiment, the CPU 11 is described as reading and executing a control program stored in the memory 12 or the storage device 13, but the program may also be provided stored on a storage medium such as a CD-ROM. Alternatively, the control program may be downloaded as an application program (hereinafter abbreviated as an app), and the downloaded app executed by the CPU 11.

  FIG. 3 is a block diagram showing a functional configuration of the mobile terminal apparatus 10 realized by executing the control program.

  As shown in FIG. 3, the mobile terminal device 10 of the present embodiment includes a data processing unit 21 in addition to the acceleration sensor 16, gyro sensor 17, geomagnetic sensor 18, and wireless communication unit 14 described above.

  The data processing unit 21 functions as moving distance detection means that measures the number of steps of the monitoring target based on the acceleration data detected by the acceleration sensor 16 and detects the moving distance of the monitoring target by multiplying the measured number of steps by the preset stride length of the monitoring target.
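
For illustration only, a minimal Python sketch of this step-count-based distance detection, assuming raw three-axis accelerometer samples in m/s² and a simple peak-detection heuristic (the patent does not specify a particular step-detection algorithm; the threshold and parameter names here are assumptions):

```python
import math

def count_steps(accel_samples, threshold=11.0, min_gap=10):
    """Count steps from (ax, ay, az) accelerometer samples in m/s^2 using a
    simple peak-detection heuristic (illustrative assumption only)."""
    steps = 0
    last_peak = -min_gap
    for i, (ax, ay, az) in enumerate(accel_samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        # A magnitude peak above the threshold, far enough from the previous
        # detected step, is counted as one step.
        if magnitude > threshold and i - last_peak >= min_gap:
            steps += 1
            last_peak = i
    return steps

def moving_distance(accel_samples, stride_m):
    """Moving distance = measured step count x preset stride length."""
    return count_steps(accel_samples) * stride_m
```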

  The data processing unit 21 also functions as moving direction detection means that detects the moving direction (azimuth angle) of the monitoring target using the direction data detected by the geomagnetic sensor 18 and the gyro data detected by the gyro sensor 17. For example, as shown in FIG. 4, the data processing unit 21 calculates, as moving direction information, the angle α between the monitoring target's gaze direction and a reference direction, with north as the reference direction. The moving direction includes not only the direction in which the monitoring target is moving but also the direction the monitoring target is facing.

  Note that when actually detecting the moving direction, the data processing unit 21 also performs various correction calculations using the acceleration data from the acceleration sensor 16.

  The data processing unit 21 further functions as posture detection means that calculates the elevation angle of the monitoring target's line of sight using the gyro data detected by the gyro sensor 17 and detects it as posture information of the monitoring target. For example, as shown in FIG. 5, the data processing unit 21 detects the elevation angle β, with the horizontal direction as the reference, as the posture information of the monitoring target.
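
As a rough illustration of how the azimuth angle α and the elevation angle β could be derived from the sensor data, a Python sketch follows; it assumes tilt-free horizontal magnetometer components for α and a simple pitch-rate integration for β, which is a simplification of the corrected calculation described above:

```python
import math

def azimuth_from_magnetometer(mx, my):
    """Angle alpha between the facing direction and magnetic north, in
    degrees, from horizontal geomagnetic components (tilt ignored)."""
    return math.degrees(math.atan2(my, mx)) % 360.0

def elevation_from_gyro(pitch_rates_dps, dt, beta0=0.0):
    """Track the line-of-sight elevation angle beta (degrees) relative to
    the horizontal by integrating the gyro pitch rate (deg/s) over time."""
    beta = beta0
    for rate in pitch_rates_dps:
        beta += rate * dt
    return beta
```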

  The data processing unit 21 applies various calculations to the detection data from the acceleration sensor 16, gyro sensor 17, and geomagnetic sensor 18, such as moving-average processing, noise removal using a low-pass filter (LPF), and error correction using a linear Kalman filter, to calculate the moving distance information, moving direction information, and posture information.
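
Of the calculations listed above, the moving average and the low-pass filtering can be sketched briefly in Python (the linear Kalman filter is omitted here; the window size and filter coefficient are illustrative assumptions):

```python
def moving_average(samples, window=5):
    """Smooth a sequence of raw sensor values with a simple moving average."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i - lo + 1))
    return out

def low_pass(samples, alpha=0.2):
    """First-order low-pass filter for noise removal; alpha in (0, 1]."""
    out = []
    prev = samples[0] if samples else 0.0
    for x in samples:
        prev = alpha * x + (1.0 - alpha) * prev
        out.append(prev)
    return out
```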

  In this embodiment, the acceleration sensor 16, gyro sensor 17, and geomagnetic sensor 18 are used to detect the moving distance, moving direction, and posture of the monitoring target. However, the moving distance, moving direction, and posture can also be detected using only the acceleration sensor 16 and gyro sensor 17, without the geomagnetic sensor 18, albeit with degraded detection accuracy.

  The moving distance information, moving direction information, and posture information calculated by the data processing unit 21 are transmitted to the radio base station 20 via the wireless communication unit 14, and then to the display terminal device 60 via the mobile communication network 30 and the Internet 40.

  Next, the functional configuration of the display terminal device 60 that has received the movement distance information, the movement direction information, and the posture information transmitted from the mobile terminal device 10 will be described with reference to FIG.

  As illustrated in FIG. 6, the display terminal device 60 according to the present embodiment includes a communication unit 31, an operation input unit 32, a control unit 33, and a display unit 34.

  Note that the display terminal device 60 in the present embodiment implements various functions by installing a control program in a general-purpose personal computer, for example, and the CPU executing predetermined processing based on the control program.

  The communication unit 31 receives the moving distance information, moving direction information, posture information, and the like transmitted from the mobile terminal device 10 via the Internet 40, and accesses the map information server 50 to receive map image data, satellite photograph image data, panoramic image data, and the like.

  The control unit 33 functions as a current position specifying unit that specifies the current position of the monitoring subject based on the moving distance and moving direction detected by the mobile terminal device 10.

  The control unit 33 then acquires, from the map information server 50, in which panoramic images of the surroundings of each point are stored in advance, the image corresponding to the line-of-sight direction specified by the specified current position of the monitoring target and the monitoring target's posture and moving direction, and displays it on the display unit 34 as an image approximating the image the monitoring target is viewing.

  Specifically, the control unit 33 acquires from the map information server 50, in which images of the surroundings of each point are stored in advance as panoramic images, the panoramic image corresponding to the monitoring target's current position, and obtains an image approximating what the monitoring target is viewing by cutting out from the acquired panoramic image the portion corresponding to the monitoring target's posture and moving direction.
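
A minimal sketch of such a cut-out, assuming the acquired panoramic image is an equirectangular image handled with the Pillow library; the field-of-view values are assumptions, and wrap-around at the 0/360 degree boundary is ignored for brevity:

```python
from PIL import Image  # Pillow; the panorama is assumed to be equirectangular

def crop_view(pano, azimuth_deg, elevation_deg, h_fov=90.0, v_fov=60.0):
    """Cut out of a panoramic image the region corresponding to the
    line-of-sight direction given by azimuth and elevation angles."""
    w, h = pano.size
    # Map azimuth 0..360 deg onto x pixels and elevation +90..-90 deg onto y.
    cx = (azimuth_deg % 360.0) / 360.0 * w
    cy = (90.0 - elevation_deg) / 180.0 * h
    half_w = h_fov / 360.0 * w / 2.0
    half_h = v_fov / 180.0 * h / 2.0
    box = (int(cx - half_w), int(cy - half_h),
           int(cx + half_w), int(cy + half_h))
    return pano.crop(box)
```

For example, crop_view(Image.open("pano.jpg"), azimuth_deg=120.0, elevation_deg=10.0) would return the portion of the panorama roughly in front of someone facing east-southeast and looking slightly upward (the file name is hypothetical).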

  Furthermore, the operation input unit 32 functions as a receiving unit that receives designation of the initial position (start position) of the monitoring target person.

  Then, the control unit 33 calculates the relative position of the monitoring target person from the initial position received by the operation input unit 32, the moving distance and the moving direction of the monitoring target person, and specifies the current position.

  Specifically, as shown in FIG. 7, the control unit 33 calculates the current position (latitude and longitude) of the monitoring target by accumulating the movement from the initial position (latitude and longitude) of the monitoring target using the moving distances and the moving directions α1 to α4.
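
A minimal Python sketch of this relative-position calculation, assuming a flat-earth approximation around the initial latitude and longitude (adequate for walking distances; the embodiment's correction processing is not reproduced here):

```python
import math

EARTH_RADIUS_M = 6371000.0

def dead_reckon(initial_lat, initial_lon, legs):
    """Accumulate (distance_m, azimuth_deg) movement legs onto an initial
    latitude/longitude; azimuth is measured clockwise from north."""
    lat, lon = initial_lat, initial_lon
    for distance_m, azimuth_deg in legs:
        theta = math.radians(azimuth_deg)
        d_north = distance_m * math.cos(theta)
        d_east = distance_m * math.sin(theta)
        lat += math.degrees(d_north / EARTH_RADIUS_M)
        lon += math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat, lon
```

For example, dead_reckon(35.681, 139.767, [(30.0, 90.0), (20.0, 180.0)]) moves the position roughly 30 m east and then 20 m south of the initial point.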

  The control unit 33 then displays the image thus obtained, which approximates the image the monitoring target is viewing, on the display unit 34 of the display terminal device 60 as a field-of-view image.

  Further, the control unit 33 displays a map image and a satellite photograph image indicating the current position of the identified monitoring subject on the display unit 34 of the display terminal device 60 together with the visual field image.

  Next, the operation of the image display system of this embodiment will be described with reference to the flowchart of FIG.

  First, on the display terminal device 60, the monitoring target's departure point is designated as the initial position, for example by specifying a position on a map image (step S101).

  Then, when the monitoring target holding the mobile terminal device 10 moves, the mobile terminal device 10 performs the processing described above to detect the monitoring target's moving distance, moving direction, and posture (elevation angle), and transmits them to the display terminal device 60 (step S102).

  Then, the display terminal device 60 specifies the current position of the monitoring target person from the transmitted moving direction and moving distance (step S103).

  Then, the display terminal device 60 accesses the map information server 50 to acquire a map image around the current position of the monitoring target person, a satellite photograph image, and a panoramic image that captures the periphery of the current position (step S104).

  Furthermore, the display terminal device 60 cuts out from the acquired panoramic image the portion corresponding to the monitoring target's direction (moving direction) and posture (elevation angle), and generates a field-of-view image that approximates the image the monitoring target is viewing (step S105).

  Finally, the display terminal device 60 displays the generated field-of-view image, map image, and satellite photograph image on the display unit 34, such as a display (step S106).

  The processing of steps S102 to S106 is repeated until the monitoring of the monitoring target ends (step S107).
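
The repeated processing of steps S102 to S106 can be summarised schematically as follows; dead_reckon, fetch_panorama, crop_view, and show are placeholders standing in for the position calculation, the access to the map information server 50, the panorama cut-out, and the display, respectively (none of them names an actual API):

```python
def monitoring_loop(initial_lat, initial_lon, motion_updates,
                    dead_reckon, fetch_panorama, crop_view, show):
    """Schematic of steps S102-S106: each motion update from the mobile
    terminal device carries (distance_m, azimuth_deg, elevation_deg)."""
    lat, lon = initial_lat, initial_lon
    for distance_m, azimuth_deg, elevation_deg in motion_updates:        # S102
        lat, lon = dead_reckon(lat, lon, [(distance_m, azimuth_deg)])    # S103
        panorama = fetch_panorama(lat, lon)                              # S104
        view = crop_view(panorama, azimuth_deg, elevation_deg)           # S105
        show(view, lat, lon)                                             # S106
```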

  FIG. 9 shows an example in which the visual field image, the map image, and the satellite photograph image thus obtained are displayed on the display terminal device 60.

  In the display example illustrated in FIG. 9, an image corresponding to the line of sight of the monitoring target person is displayed as a field-of-view image together with a map image indicating the current position of the monitoring target person and a satellite photograph image.

  On the map image and satellite photograph image displayed on the display unit 34 of the display terminal device 60, the monitoring target's current position is shown with an icon or the like, and past movement history is also displayed, making it possible to grasp the route the monitoring target has taken from the initial position and where the target is currently located. It is also possible to prepare a plurality of icons indicating the monitoring target's current position and to select and switch which icon is displayed.

  In particular, by displaying the monitoring target's position not only on the map image but also, in parallel, on the satellite photograph image, it becomes possible to quickly determine whether there is a dangerous place along the monitoring target's path.

  In addition, by displaying an image close to what the monitoring target is viewing as the field-of-view image, it is possible to grasp not only where the monitoring target is but also what kind of place the surroundings are. Furthermore, by displaying the image in the direction the monitoring target is facing, it becomes easier to understand what the monitoring target is looking at and where the target is heading. That is, the monitoring target's current position and state (normal or abnormal), as well as the safety of the surroundings, can be confirmed from the map image, field-of-view image, satellite photograph image, and the like.

  For example, when the monitoring target is looking in the horizontal direction as shown in FIG. 10A, a field-of-view image like that shown in FIG. 10B is displayed. Here, the case where the monitoring target carries the mobile terminal device 10 in a breast pocket is described.

  When the monitoring target looks upward in this state, the inclination of the mobile terminal device 10 changes as shown in FIG. 11A. As a result, the mobile terminal device 10 detects the change in posture with the gyro sensor 17, and the field-of-view image changes to one looking upward at that location, as shown in FIG. 11B.

  Furthermore, when the monitoring target looks downward, the inclination of the mobile terminal device 10 changes as shown in FIG. 12A. As a result, the mobile terminal device 10 detects the change in posture with the gyro sensor 17, and the field-of-view image changes to one looking downward, as shown in FIG. 12B.

  In addition, when the monitoring target turns to look in a different direction at the same place without moving, the displayed field-of-view image also changes according to the rotation.

  Note that the field-of-view image displayed by the image display system of the present embodiment is not an image actually being seen by the monitoring target, but an image such as Street View (registered trademark) provided by Google (registered trademark). That is, an image captured at that location in the past is displayed as the field-of-view image.

  For this reason, it is less real-time than mounting an imaging device such as a CCD camera on the monitoring target and capturing and displaying the image the monitoring target is actually viewing.

  However, unlike attaching an imaging device such as a CCD camera to the monitoring target and displaying captured images, it does not raise privacy concerns such as capturing surrounding third parties.

  In addition, since it is only necessary for the monitoring target to carry the mobile terminal device 10 in a breast pocket, bag, or the like, the system is easy to operate.

  Furthermore, in the image display system of the present embodiment, the mobile terminal device 10 detects the moving distance and moving direction of the monitoring subject, and the display terminal device 60 calculates the relative position from the initial position.

  Therefore, it is possible to specify the monitoring target's position and display an image for grasping the monitoring target's situation without using a positioning device such as GPS (Global Positioning System), which consumes a large amount of power.

  In addition, since the current position of the monitoring target is specified by calculating the relative position from the initial position, any location on earth can be set as the initial position. Therefore, with the image display system of the present embodiment, a place other than the actual location can be set as the initial position to display a virtual experience, enabling a simulated visit to a place one has never been.

  For example, by setting the initial position around the Eiffel Tower in France, it is possible to perform a simulation that displays an image of a stroll around the Eiffel Tower while in Japan.

  Such a simulation can be used, for example, to check in advance the distances involved at a place to be visited or the surrounding scenery. For example, by setting the place to be investigated as the initial position and shaking the mobile terminal device 10 by hand to give it vibrations equivalent to those of a person walking with it, images of the surroundings of that place can be displayed on the display terminal device 60.

  Alternatively, a person on a business trip can carry the mobile terminal device 10 to a certain place, and many people in a conference room can view the images of the surroundings of that place displayed on the display terminal device 60. In such a case, it is also possible to display an image in a desired direction by instructing the traveler which way to face.

  Note that if the monitoring target's position is continuously specified by calculating relative positions, the position error may accumulate. Therefore, when the mobile terminal device 10 is provided with absolute position detection means such as GPS, the position can be corrected periodically to prevent the error from becoming too large.
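
One simple way to realise such a periodic correction, shown only as an assumption (the patent does not specify the correction method), is to pull the dead-reckoned position toward the GPS fix whenever one is available:

```python
def correct_with_gps(dr_lat, dr_lon, gps_lat, gps_lon, weight=1.0):
    """Blend the dead-reckoned position with a GPS fix; weight=1.0 simply
    resets the position to the fix, smaller weights blend the two."""
    return (dr_lat + weight * (gps_lat - dr_lat),
            dr_lon + weight * (gps_lon - dr_lon))
```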

  A functional configuration of such a mobile terminal device 10a with a GPS function is shown in FIG.

  The mobile terminal device 10a shown in FIG. 13 is different from the mobile terminal device 10 shown in FIG. 3 in that a GPS 22 is added.

As described above, when the mobile terminal device 10a is provided with the GPS 22 as current position specifying means for specifying the monitoring target's current position, the display terminal device 60 can specify the monitoring target's current position using the position information measured by the GPS 22.

  Therefore, it is possible to omit the process of setting the initial position of the person to be monitored in the display terminal device 60 and calculating the movement distance in the mobile terminal device 10a.

  However, with such a configuration, although the effort of setting the initial position is saved, it is not possible to perform the simulation described above using a position different from the current position as the initial position.

  In the embodiment described above, the case where a smartphone is used as the mobile terminal device 10 has been described, but a dedicated terminal device can also be used as the mobile terminal device 10. For example, using as the mobile terminal device 10 a dedicated terminal device that has only a communication function and the various sensors, without a touch panel or the like, makes it possible to reduce the size of the device and makes it easier to attach to the monitoring target.

  Further, if a gyro sensor, geomagnetic sensor, acceleration sensor, and the like are provided in an eyeglass-type or wristwatch-type wearable terminal device, such a wearable terminal device can also be used as the mobile terminal device. In particular, using an eyeglass-type wearable terminal makes it possible to specify the monitoring target's gaze direction in detail and to improve the accuracy of the field-of-view image displayed on the display terminal device 60.

  When a wristwatch-type wearable terminal device is used and it is provided with sensors for measuring vital data such as blood pressure and heart rate, the monitoring target's vital data can also be displayed on the display terminal device 60 together with the field-of-view image, map image, satellite photograph image, and the like. Such a display makes it possible to grasp in detail whether the monitoring target's condition is normal or abnormal.

  Furthermore, in addition to the sensors described above, a humidity sensor, temperature sensor, atmospheric pressure sensor, and the like may be provided in the mobile terminal device 10, and the obtained humidity, temperature, atmospheric pressure, and other information displayed on the display terminal device 60. The atmospheric pressure information can also be used to calculate the altitude of the monitoring target's location.
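
The altitude calculation from atmospheric pressure can follow the standard international barometric formula; a small Python sketch, assuming the pressure is given in hPa and the sea-level reference pressure is the standard 1013.25 hPa:

```python
def altitude_from_pressure(pressure_hpa, sea_level_hpa=1013.25):
    """Approximate altitude in metres from atmospheric pressure using the
    international barometric formula."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```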

  In the present embodiment, the case of a single mobile terminal device 10 has been described, but it is also possible to prepare a plurality of mobile terminal devices 10 and display images for monitoring the behavior of a plurality of monitoring targets side by side on the display terminal device 60.

[Modification]
In the embodiment described above, the mobile terminal device 10 is carried by a person to be monitored, such as an elderly person or a child, and the movement route is tracked and the viewed image confirmed. The present invention can be applied in the same way when, for example, the mobile terminal device 10 is attached to an animal such as a pet dog or cat, or to a robot capable of autonomous movement, and that animal or robot is the monitoring target.

DESCRIPTION OF SYMBOLS
10, 10a: mobile terminal device; 11: CPU; 12: memory; 13: storage device; 14: wireless communication unit; 15: user interface (UI) device; 16: acceleration sensor; 17: gyro sensor; 18: geomagnetic sensor; 19: control bus; 20: wireless base station; 21: data processing unit; 22: GPS; 30: mobile communication network; 31: communication unit; 32: operation input unit; 33: control unit; 34: display unit; 40: Internet; 50: map information server; 60: display terminal device

Claims (9)

  1. An image display system comprising:
    a mobile terminal device held by a monitoring target, comprising moving distance detection means for detecting a moving distance of the monitoring target, moving direction detection means for detecting a moving direction of the monitoring target, and posture detection means for detecting a posture of the monitoring target; and
    a display terminal device comprising current position specifying means for specifying a current position of the monitoring target based on the moving distance and moving direction detected by the mobile terminal device, and display means for acquiring, from a device in which images of the surroundings of each point are stored in advance, an image corresponding to a line-of-sight direction specified by the current position of the monitoring target specified by the current position specifying means and the posture and moving direction of the monitoring target, and displaying it as an image approximating an image the monitoring target is viewing,
    wherein the display terminal device comprises accepting means for accepting designation of an initial position of the monitoring target, and
    the current position specifying means specifies, as the current position, a relative position of the monitoring target based on the initial position accepted by the accepting means, the moving distance detected by the moving distance detection means, and the moving direction detected by the moving direction detection means.
  2. The image display system according to claim 1, wherein the display means acquires, from a device in which images of the surroundings of each point are stored in advance as panoramic images, a panoramic image corresponding to the current position of the monitoring target, and obtains the image approximating the image the monitoring target is viewing by cutting out from the acquired panoramic image the portion corresponding to the posture of the monitoring target detected by the posture detection means and the moving direction detected by the moving direction detection means.
  3. The image display system according to claim 1 or 2, wherein the moving distance detection means measures a number of steps of the monitoring target based on acceleration data detected by an acceleration sensor, and detects the moving distance by multiplying the measured number of steps by a preset stride length of the monitoring target.
  4. The image display system according to any one of claims 1 to 3, wherein the posture detection means calculates an elevation angle of the monitoring target's line of sight using gyro data detected by a gyro sensor, and detects it as posture information of the monitoring target.
  5. The image display system according to any one of claims 1 to 4, wherein the moving direction detection means detects the moving direction of the monitoring target using direction data detected by a geomagnetic sensor and gyro data detected by a gyro sensor.
  6. The image display system according to any one of claims 1 to 5, wherein the display means displays a map image indicating the current position of the monitoring target specified by the current position specifying means together with the image approximating the image the monitoring target is viewing.
  7. The image display system according to claim 6, wherein the display means displays a satellite photograph image indicating the current position of the monitoring target specified by the current position specifying means together with the map image and the image approximating the image the monitoring target is viewing.
  8. A display terminal device comprising:
    accepting means for accepting designation of an initial position of a monitoring target;
    current position specifying means for specifying, as a current position, a relative position of the monitoring target based on the initial position accepted by the accepting means and a moving distance and moving direction detected by a mobile terminal device held by the monitoring target; and
    display means for acquiring, from a device in which images of the surroundings of each point are stored in advance, an image corresponding to a line-of-sight direction specified by the current position of the monitoring target specified by the current position specifying means and a posture and moving direction of the monitoring target, and displaying it as an image approximating an image the monitoring target is viewing.
  9. A program for causing a computer to execute:
    an accepting step of accepting designation of an initial position of a monitoring target;
    a current position specifying step of specifying, as a current position, a relative position of the monitoring target based on the initial position accepted in the accepting step and a moving distance and moving direction of the monitoring target detected by a mobile terminal device held by the monitoring target;
    a posture detection step of detecting a posture of the monitoring target; and
    a display step of acquiring, from a device in which images of the surroundings of each point are stored in advance, an image corresponding to a line-of-sight direction specified by the current position of the monitoring target specified in the current position specifying step and the posture and moving direction of the monitoring target, and displaying it as an image approximating an image the monitoring target is viewing.
JP2017000890A 2017-01-06 2017-01-06 Image display system, display terminal device, and program Active JP6154563B1 (en)

Priority Applications (1)

Application Number: JP2017000890A; Priority Date: 2017-01-06; Filing Date: 2017-01-06; Title: Image display system, display terminal device, and program (granted as JP6154563B1)

Publications (2)

Publication Number Publication Date
JP6154563B1 true JP6154563B1 (en) 2017-06-28
JP2018109925A JP2018109925A (en) 2018-07-12

Family

ID=59218521

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2017000890A Active JP6154563B1 (en) 2017-01-06 2017-01-06 Image display system, display terminal device, and program

Country Status (1)

Country Link
JP (1) JP6154563B1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011039130A (en) * 2009-08-07 2011-02-24 Hyogo Prefecture Linked display device, linked display method, and program
JP2015513808A (en) * 2012-01-24 2015-05-14 アクシピター ラダー テクノロジーズ, インコーポレイテッド Personal electronic target vision system, apparatus and method
JP2015230580A (en) * 2014-06-05 2015-12-21 三菱電機ビルテクノサービス株式会社 Program and information processor

Also Published As

Publication number Publication date
JP2018109925A (en) 2018-07-12

Legal Events

Code: A131 (Notification of reasons for refusal); Free format text: JAPANESE INTERMEDIATE CODE: A131; Effective date: 2017-04-20

Code: A521 (Written amendment); Free format text: JAPANESE INTERMEDIATE CODE: A523; Effective date: 2017-04-21

Code: TRDD (Decision of grant or rejection written)

Code: A01 (Written decision to grant a patent or to grant a registration (utility model)); Free format text: JAPANESE INTERMEDIATE CODE: A01; Effective date: 2017-05-30

Code: A61 (First payment of annual fees (during grant procedure)); Free format text: JAPANESE INTERMEDIATE CODE: A61; Effective date: 2017-06-01

Code: R150 (Certificate of patent or registration of utility model); Ref document number: 6154563; Country of ref document: JP; Free format text: JAPANESE INTERMEDIATE CODE: R150