WO2020026414A1 - Information communication system, terminal device, and method of operating an information communication system - Google Patents


Info

Publication number
WO2020026414A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
terminal
detection signal
terminal device
communication system
Prior art date
Application number
PCT/JP2018/029052
Other languages
English (en)
Japanese (ja)
Inventor
大内 敏
瀬尾 欣穂
千代 大野
孝弘 松田
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所 filed Critical 株式会社日立製作所
Priority to PCT/JP2018/029052 priority Critical patent/WO2020026414A1/fr
Priority to JP2020533999A priority patent/JP7227975B2/ja
Publication of WO2020026414A1 publication Critical patent/WO2020026414A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units

Definitions

  • the present invention relates to an information communication system, a terminal device, and an operation method of the information communication system.
  • terminal devices such as wearable terminals that display predetermined AR (Augmented Reality), MR (Mixed Reality), and VR (Virtual Reality) information using a spectacle-type head-mounted display have begun to appear on the market.
  • AR: Augmented Reality
  • MR: Mixed Reality
  • VR: Virtual Reality
  • wearable terminals having interfaces that appeal to human senses, such as voice, vibration, temperature change, smell, taste, electric signals, and physical stimulation, have also begun to appear on the market.
  • Patent Literature 1 proposes a technique for automatically switching a display mode (summary display mode and detailed display mode) of information displayed on a display panel according to a user's action state.
  • when information is transmitted from a server device installed in a cloud or the like to the above-described terminal device and displayed, or when an image captured by a camera of the terminal device is first output to the server device, the server device performs predetermined processing on the image, and the processing result is output from the server device to the terminal device and displayed, the time required for information communication between the terminal device and the server device becomes a problem.
  • because of this communication time, a display delay occurs in the terminal device, or appropriate information is not displayed at an appropriate position in the real world, and the user may find this troublesome.
  • the conventional terminal device cannot accurately determine the user's intention or action and execute a process corresponding to the intention or action.
  • the object of the present invention is to present necessary information to a user who uses a terminal device without delay and with good timing in an information communication system including a terminal device and a server device.
  • one embodiment of the present invention that solves the above problem is an information communication system including a terminal device and a server device that communicate via a network, wherein the terminal device includes a sensor unit including at least one sensor, a terminal processing unit that performs processing based on the detection signal of the sensor unit, a determination unit that determines whether the detection signal of the sensor unit satisfies a preset condition, and a terminal communication unit that communicates with the server device.
  • an information communication system including a terminal device and a server device
  • necessary information is presented to a user who uses the terminal device without delay and with good timing.
  • FIG. 1 is a diagram showing a system configuration example of the information communication system according to one embodiment of the present invention. FIG. 2 is a block diagram showing a functional configuration example of the terminal device and the server device according to one embodiment of the present invention. FIG. 3 is a flowchart illustrating an example of a basic operation of the information communication system according to an embodiment of the present invention.
  • FIG. 1 is a diagram showing a configuration example of an information communication system 1 according to an embodiment of the present invention.
  • the information communication system 1 assumed in the present embodiment includes a terminal device 100 and a server device 300.
  • the terminal device 100 and the server device 300 communicate via the network 200.
  • the network 200 includes communication lines such as a wireless LAN (Local Area Network), a wired line, and an Internet line.
  • a personal computer 400, a tablet terminal 500, a wearable terminal 600, and a cloud server device 700 are connected to the network 200.
  • the connected devices are configured to be able to communicate with each other.
  • the devices connected to the network 200 are not limited to these devices, and other devices may be connected to enable communication.
  • the terminal device 100 of the present embodiment is a device called a wearable terminal that is configured to be detachable from a part of the body of the worker 2.
  • the terminal device 100 according to the present embodiment is a glasses-type wearable terminal that is worn on the head of the worker 2, captures image information covering a field of view similar to or wider than that of the worker 2, and is configured so that the captured image information and various other information are visible to the worker 2.
  • an example of a glasses-type wearable terminal is given as the terminal device 100.
  • the terminal device 100 may be a wearable terminal other than a glasses-type terminal, for example, a wristband-type terminal worn on the wrist of the worker 2.
  • the terminal device 100 may also be configured from a combination of a plurality of types of wearable terminals, such as a clip-type wearable terminal attached to the clothes of the worker 2 by a clip, or a combination of a glasses-type and a wristband-type wearable terminal.
  • the terminal device 100 includes a display unit 110 (corresponding to a notification unit of the present invention) and a frame unit 120.
  • the display unit 110 includes, for example, a pair of left and right see-through liquid crystal devices in front of the eyes of the worker 2, and the captured image information and various other information are displayed on the liquid crystal devices as AR (Augmented Reality).
  • the frame unit 120 is formed in a shape that can be put on the ears of the worker 2 like a spectacle frame, and various electrical components are built in the housing (see FIG. 2).
  • FIG. 2 is a block diagram illustrating a functional configuration example of the terminal device 100 and the server device 300 according to an embodiment of the present invention.
  • the terminal device 100 includes a sensor unit 101, a terminal processing unit 102, a determination unit 103, a terminal communication unit 104, a terminal storage unit 105, and a control unit 106 in the frame unit 120.
  • the sensor unit 101 includes at least one sensor, for example, an image sensor (a complementary metal oxide semiconductor (CMOS) sensor, a charge coupled device (CCD) sensor, or a time of flight (TOF) sensor) that captures the above-described image information, an n-axis sensor, an acceleration sensor, a GPS (Global Positioning System) sensor, and a direction sensor.
  • the sensor unit 101 is not limited to these sensors, and may include any sensor that detects the current state of the worker.
  • a temperature sensor that detects the ambient temperature or body temperature around the worker 2
  • a humidity sensor for detecting the ambient humidity
  • a pressure sensor for detecting the atmospheric pressure around the worker 2
  • a sound sensor (microphone) for detecting ambient sound
  • the terminal processing unit 102 performs processing based on the detection signals detected by the sensors of the sensor unit 101. Specifically, the terminal processing unit 102 generates a video signal that can be displayed on the display unit 110 based on the detection signal of the image sensor, generates an acceleration profile based on the detection signal of the n-axis sensor or the acceleration sensor, and generates position information of the worker 2 and azimuth information, such as the line-of-sight direction and the moving direction, based on the detection signals of the n-axis sensor, the GPS sensor, and the direction sensor.
  • the video signal and various information generated by the terminal processing unit 102 are output to the control unit 106, and the control unit 106 causes the display unit 110 to display the video signal and various information.
  • the video signal and various information are output to the terminal storage unit 105 and stored. Further, the video signal and various information are output to the terminal communication unit 104, and are configured to be output from the terminal communication unit 104 to the network 200.
  • the determination unit 103 determines whether the detection signal detected by the sensor unit 101 satisfies a preset condition. Specifically, the determination unit 103 determines, for example, whether the detection signal of the acceleration sensor is within a preset threshold range, or whether the acceleration profile generated by the terminal processing unit 102 based on the detection signal of the acceleration sensor is within a preset shape range. Note that the acceleration profile is a graph in which the change of acceleration over time is plotted.
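The two checks described above (a threshold range on a single acceleration value, and a shape range on an acceleration profile) can be sketched as follows. This is an illustrative approximation, not the disclosed implementation; all names are hypothetical, and the "shape range" is modeled as a per-sample envelope.

```python
def within_threshold(accel: float, lo: float, hi: float) -> bool:
    """First check: is a single acceleration sample inside the preset range?"""
    return lo <= accel <= hi

def profile_within_band(profile, band_lo, band_hi) -> bool:
    """Second check: does every sample of the acceleration profile stay inside
    a preset per-sample envelope (a crude stand-in for a 'shape range')?"""
    return all(lo <= a <= hi for a, lo, hi in zip(profile, band_lo, band_hi))

# Usage (values are arbitrary)
print(within_threshold(0.8, 0.0, 1.5))            # True
print(profile_within_band([0.2, 0.9, 2.4],
                          [0.0, 0.5, 0.5],
                          [1.0, 1.5, 1.5]))       # False: 2.4 exceeds 1.5
```

In this sketch the determination unit would continue local processing while both checks return `True`, matching the switching behavior described later.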
  • the determination unit 103 also determines whether the position information of the worker 2, generated by the terminal processing unit 102 based on the detection signal of the n-axis sensor or the GPS sensor, is within a preset real three-dimensional space. In addition, the determination unit 103 determines whether the load of the image processing performed on the video signal generated from the detection signal of the image sensor is within a preset range. The load of image processing may be estimated, for example, by performing edge detection on the video signal and calculating the edge amount or edge complexity. The determination processing in the determination unit 103 will be described later in detail.
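The suggested edge-based estimate of image-processing load can be sketched in pure Python as below. The finite-difference edge measure and the load threshold are assumptions for illustration; the text leaves the exact method open.

```python
def edge_amount(gray):
    """Sum of absolute horizontal and vertical pixel differences over a
    grayscale frame (list of rows): a crude proxy for edge content and,
    by extension, image-processing load."""
    total = 0
    rows, cols = len(gray), len(gray[0])
    for y in range(rows):
        for x in range(cols):
            if x + 1 < cols:
                total += abs(gray[y][x + 1] - gray[y][x])  # horizontal edge
            if y + 1 < rows:
                total += abs(gray[y + 1][x] - gray[y][x])  # vertical edge
    return total

def load_within_range(gray, max_edges):
    """Hypothetical determination: is the estimated load within the preset range?"""
    return edge_amount(gray) <= max_edges

frame = [[0, 0, 255],
         [0, 0, 255],
         [0, 0, 255]]
print(edge_amount(frame))  # 765: one strong vertical boundary per row
```

A frame whose edge amount exceeds the preset maximum would be handed off to the server side for processing, per the third application example.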
  • the terminal communication unit 104 communicates, via the network 200, with the server device 300 and other devices connected to the network 200. The terminal communication unit 104 outputs the detection signals detected by the sensor unit 101, the video signal generated by the terminal processing unit 102, and various information to the server device 300 and the other communication-destination devices.
  • the terminal communication unit 104 is configured to be able to communicate with the portable terminal device 130 used by the worker 2.
  • the mobile terminal device 130 is configured by a smartphone or a tablet terminal, and has installed therein an application that is linked to the operation of the terminal device 100.
  • the mobile terminal device 130 receives various setting inputs related to the operation of the terminal device 100 on the interface screen displayed by the application, and either performs all of the processing of the terminal processing unit 102 on its behalf or shares part of the processing with the terminal processing unit 102. The terminal device 100 alone, or the set of the terminal device 100 and the portable terminal device 130, corresponds to the terminal device of the present invention.
  • the terminal storage unit 105 stores a detection signal detected by the sensor unit 101, a video signal generated by the terminal processing unit 102, and various information.
  • the control unit 106 controls the entire terminal device 100.
  • the control unit 106 of the present embodiment switches, based on the determination result of the determination unit 103, between performing processing by the terminal processing unit 102 and performing communication with the server device 300 by the terminal communication unit 104. That is, according to the determination result of the determination unit 103, the control unit 106 switches between continuing the processing based on the detection signal of the sensor unit 101 on the terminal processing unit 102 side, and having the terminal communication unit 104 communicate with the server device 300 so that the processing is performed on the server device 300 side. A specific example will be described later in detail.
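The switching policy of the control unit 106 (process locally while the condition holds, otherwise communicate with the server) can be sketched as follows; the class and callback names are hypothetical.

```python
class Controller:
    """Minimal sketch of the control unit's local/server switch."""

    def __init__(self, condition, local_process, server_process):
        self.condition = condition            # determination unit's check
        self.local_process = local_process    # terminal processing unit
        self.server_process = server_process  # terminal communication + server side

    def handle(self, signal):
        if self.condition(signal):
            return self.local_process(signal)   # low latency, no network traffic
        return self.server_process(signal)      # heavier processing done remotely

# Usage: condition = "acceleration magnitude within 1.0" (arbitrary)
ctrl = Controller(lambda s: abs(s) <= 1.0,
                  lambda s: f"local:{s}",
                  lambda s: f"server:{s}")
print(ctrl.handle(0.5))   # local:0.5
print(ctrl.handle(3.0))   # server:3.0
```

This structure makes the design choice explicit: network communication happens only on the condition-violated branch, which is what limits battery drain and communication fees later in the text.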
  • the terminal processing unit 102, the determination unit 103, the terminal communication unit 104, the terminal storage unit 105, and the control unit 106 are implemented on a control board including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, provided in the housing of the frame unit 120. The functions of each unit of the terminal device 100 are realized by the CPU executing programs stored in the ROM or the RAM.
  • the server device 300 of the present embodiment is a so-called cloud server device, and is configured to be able to communicate with the terminal device 100 and other devices connected to the network 200 via the network 200 as described above.
  • the server device 300 of the present embodiment includes a server communication unit 301, a server processing unit 302, a server storage unit 303, and a control unit 304, as shown in FIG.
  • the server communication unit 301 communicates with the terminal device 100 and other devices connected to the network 200 via the network 200.
  • the server communication unit 301 receives the detection signal of the sensor unit 101 output from the terminal device 100, the video signal generated by the terminal processing unit 102, and various information.
  • the server device 300 of the present embodiment outputs, to the terminal device 100, the conditions used for the determination process in the determination unit 103 of the terminal device 100 via the server communication unit 301.
  • the terminal device 100 receives the condition of the determination process output from the server device 300 and newly sets the received condition in the determination unit 103 or updates the previously set condition.
  • the condition of the determination process output from the server device 300 to the terminal device 100 may be set and input by an administrator or the like in the server device 300, or may be a condition acquired by an arithmetic process by the server device 300.
  • the server device 300 communicates with the personal computer 400, the tablet terminal 500, the wearable terminal 600, and the cloud server device 700 connected to the network 200.
  • for example, the server device 300 may obtain conditions such as acceleration thresholds set in those devices, calculate statistics such as the average value, maximum value, and minimum value of the conditions, and output the statistics to the terminal device 100 as conditions for the determination processing.
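A minimal sketch of this statistic aggregation, assuming the thresholds are collected from the networked devices as plain numbers (here integers, e.g. in milli-g); the function name and return shape are illustrative.

```python
from statistics import mean

def aggregate_conditions(thresholds):
    """Summarize acceleration thresholds reported by networked devices into
    statistics that could be pushed back to the terminal as a new condition."""
    return {
        "average": mean(thresholds),
        "maximum": max(thresholds),
        "minimum": min(thresholds),
    }

# Thresholds reported by three devices (arbitrary values, milli-g)
print(aggregate_conditions([12, 8, 10]))
# {'average': 10, 'maximum': 12, 'minimum': 8}
```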
  • the server device 300 may also obtain, for example, acceleration profiles corresponding to the operations of workers from each of the above-described devices connected to the network 200, perform machine learning on a large number of these acceleration profiles, and output the result of the machine learning to the terminal device 100 as a condition for the determination processing.
  • the server processing unit 302 performs a process based on a detection signal detected by each sensor of the sensor unit 101.
  • the server processing unit 302 of the present embodiment basically performs processing when the determination unit 103 of the terminal device 100 determines that the detection signal detected by the sensor unit 101 does not satisfy the preset condition.
  • for example, when it is determined that the acceleration detected by the acceleration sensor of the terminal device 100 is out of the preset threshold range, or that the acceleration profile is out of the preset shape range, the server processing unit 302 determines that the operation of the worker 2 is abnormal and performs a process of outputting a warning or a predetermined instruction to return to the correct operation.
  • the preset threshold range corresponds to the first condition of the present invention
  • the outside of the preset threshold range corresponds to the second condition of the present invention.
  • the preset shape range corresponds to the first condition of the present invention
  • the outside of the preset shape range corresponds to the second condition of the present invention.
  • the first and second conditions in the present invention may be mutually exclusive conditions as described above, or may be completely different numerical ranges or completely different shape ranges.
  • when the acceleration detected by the acceleration sensor is out of the preset threshold range, or when the acceleration profile is determined to be out of the preset shape range, the server communication unit 301 of the server device 300 may receive the detection signal of the sensor unit 101, the video signal generated by the terminal processing unit 102, and various information, and the server processing unit 302 may store the received detection signal and the like in the server storage unit 303 in association with the information of the worker 2. This makes it possible to record the fact that the worker 2 has performed an abnormal action.
  • the server storage unit 303 includes a storage medium such as a semiconductor memory and a hard disk, and has a storage area with a larger capacity than the terminal storage unit 105 of the terminal device 100.
  • the server storage unit 303 stores the detection signal of the sensor unit 101 output from the terminal device 100, the video signal generated by the terminal processing unit 102, various information, and the like as described above.
  • the control unit 304 controls the entire server device 300.
  • the control unit 304 of the present embodiment controls communication between the server device 300 and the terminal device 100, and controls processing based on a detection signal of the sensor unit 101 output from the terminal device 100.
  • the server communication unit 301, the server processing unit 302, and the control unit 304 include a CPU, a ROM, a RAM, and the like.
  • the functions of each unit of the server device 300 are realized by the CPU executing a program stored in the ROM or the RAM.
  • FIG. 3 is a flowchart illustrating an example of a basic operation of the information communication system 1 according to an embodiment of the present invention.
  • a power switch (not shown) of the terminal device 100 is turned on (step S10), and detection by each sensor of the sensor unit 101 of the terminal device 100 is started (step S12).
  • the detection signals detected by the sensors of the sensor unit 101 are input to the terminal processing unit 102, which performs predetermined processing, and the results are input to the determination unit 103.
  • the determination unit 103 determines whether or not the input detection signal of each sensor satisfies the preset condition (step S14), and while it is determined that the condition is satisfied (step S14, YES), the processing based on the detection signal in the terminal processing unit 102 is continued (step S16).
  • the processing result in the terminal processing unit 102 is displayed on the display unit 110 under the control of the control unit 106. It is desirable that the time T from the start of detection by the sensor unit 101 to the display by the display unit 110 satisfies 10 μs ≤ T ≤ 60 ms.
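The latency budget stated above (10 μs ≤ T ≤ 60 ms from detection to display) can be checked as sketched below; the timing source and function name are assumptions for illustration.

```python
import time

def within_latency_budget(t_detect: float, t_display: float) -> bool:
    """Check the desirable detection-to-display time T, in seconds:
    10 microseconds <= T <= 60 milliseconds."""
    t = t_display - t_detect
    return 10e-6 <= t <= 60e-3

# Usage: a stand-in for local processing taking roughly 5 ms
t0 = time.monotonic()
time.sleep(0.005)
print(within_latency_budget(t0, time.monotonic()))
```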
  • the processing in the terminal processing unit 102 will be described in detail in an application example later.
  • when the determination unit 103 determines that the condition is not satisfied (step S14, NO), the terminal communication unit 104, under the control of the control unit 106, starts communication with the server communication unit 301 of the server device 300 (step S18).
  • the server communication unit 301 receives the detection signal of the sensor unit 101 output from the terminal device 100, and the server processing unit 302 performs a process based on the detection signal of the sensor unit 101 (step S20).
  • the processing in the server processing unit 302 will be described in detail in an application example later.
  • the processing result of the server processing unit 302 is output to the terminal device 100 via the server communication unit 301 (Step S22).
  • the processing result output from the server device 300 is received by the terminal communication unit 104 of the terminal device 100, and is displayed on the display unit 110 under the control of the control unit 106.
  • as described above, the processing is switched between the terminal device 100 and the server device 300 according to the determination result of the determination unit 103 of the terminal device 100, and the processing result is displayed on the display unit 110 of the terminal device 100. In parallel with this processing, the control unit 106 of the terminal device 100 measures the elapsed time from the start of detection by the sensor unit 101 (step S12).
  • while the measured elapsed time does not exceed a preset time, the control unit 106 stores the related information, such as the detection signal of the sensor unit 101 and the processing result of the terminal processing unit 102, in the terminal storage unit 105 (step S26). Thereafter, the process returns to step S12.
  • when the measured elapsed time exceeds the preset time, the control unit 106 controls the terminal communication unit 104 to start communication with the server communication unit 301 of the server device 300 (step S28). Then, under the control of the control unit 106, the above-described related information stored in the terminal storage unit 105 is read and output to the server device 300 via the terminal communication unit 104.
  • the related information stored in the terminal storage unit 105 is deleted, and the storage area of the terminal storage unit 105 is released.
  • the related information output from the terminal device 100 is received by the server communication unit 301 of the server device 300, and is stored in the server storage unit 303 (Step S30).
  • until the power switch is turned off (step S32, NO), the processes of steps S12 to S30 are repeated. That is, the processing is switched between the terminal device 100 side and the server device 300 side according to the determination result of the determination unit 103 of the terminal device 100, and the processing result is displayed on the display unit 110 of the terminal device 100.
  • at every preset elapsed time, the related information is read from the terminal storage unit 105 of the terminal device 100, output from the terminal device 100 to the server device 300, and stored in the server storage unit 303. Then, the terminal device 100 ends the process when the power switch is turned off (step S32, YES).
  • the process is continued on the terminal device 100 side while the determination unit 103 determines that the detection signal of the sensor unit 101 satisfies the preset condition.
  • when the determination unit 103 determines that the condition is not satisfied, the terminal device 100 starts communication with the server device 300, and the processing is performed on the server device 300 side. In this way, necessary information can be presented to the user of the terminal device 100 without delay and with good timing.
  • since the terminal device 100 has a limited physical size and a limited battery capacity, frequent communication between the terminal device 100 and the server device 300 may excessively reduce the available operating time of the terminal device 100. Further, there is a concern that the communication fee for communication between the terminal device 100 and the server device 300 may increase.
  • in the present embodiment, however, the terminal device 100 starts communication with the server device 300 only when the determination unit 103 determines that the detection signal of the sensor unit 101 does not satisfy the preset condition, so the decrease in available time and the increase in communication fee described above can be suppressed.
  • in the present embodiment, the control unit 106 of the terminal device 100 measures the elapsed time from the start of detection by the sensor unit 101, and when the measured elapsed time exceeds a preset time, the control unit 106 reads the related information stored in the terminal storage unit 105 and outputs it to the server device 300.
  • alternatively, the data amount of the related information stored in the terminal storage unit 105 may be monitored, and when the data amount becomes equal to or greater than a preset threshold value, the related information may be read and output to the server device 300.
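The data-amount trigger described in this alternative can be sketched as follows; the buffer structure, byte threshold, and upload callback are hypothetical stand-ins for the terminal storage unit and terminal communication unit.

```python
class TerminalStore:
    """Sketch: buffer related information locally and upload when the stored
    data amount reaches a preset threshold, then release the storage area."""

    def __init__(self, max_bytes, upload):
        self.buffer = []
        self.size = 0
        self.max_bytes = max_bytes
        self.upload = upload  # callback standing in for the terminal communication unit

    def store(self, record: bytes):
        self.buffer.append(record)
        self.size += len(record)
        if self.size >= self.max_bytes:    # threshold reached:
            self.upload(self.buffer)       # output to the server device,
            self.buffer, self.size = [], 0 # then release the storage area

# Usage: an 8-byte threshold for demonstration
sent = []
store = TerminalStore(8, sent.append)
store.store(b"abcd")
store.store(b"efgh")           # 8 bytes total -> upload fires
print(len(sent), store.size)   # 1 0
```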
  • the first application example is an example in which a wristband-type wearable terminal is used as the terminal device 100.
  • the first application example is an example in which a warning or the like is displayed at an appropriate timing on the display unit 110 of the terminal device 100 when the worker 2 performs an abnormal operation on the work target device.
  • the terminal device 100 detects the acceleration of the hand when the worker 2 turns the handle of the work target device.
  • in the determination unit 103, a threshold determined based on the acceleration of the hand when an expert turns the handle is set in advance.
  • the determination unit 103 compares the acceleration profile generated by the terminal processing unit 102 based on the detection signal of the sensor unit 101 with the threshold. While the determination unit 103 determines that the acceleration profile is within the threshold, the terminal processing unit 102 continues generating the acceleration profile and the determination unit 103 continues the determination processing.
  • while the acceleration is within the threshold, the acceleration profile is monitored on the terminal device 100 side without monitoring by the server device 300, so the delay caused by communication with the server device 300 can be eliminated. That is, for example, when the worker 2 performs an abnormal operation, the operation can be detected immediately.
  • when the acceleration exceeds the threshold, the terminal device 100 starts communication with the server device 300 and outputs, to the server device 300, an abnormality occurrence signal indicating that the acceleration has exceeded the threshold and the acceleration profile generated by the terminal processing unit 102.
  • the server device 300 receives the abnormality occurrence signal and the acceleration profile output from the terminal device 100, and in response thereto, the server processing unit 302 generates a warning for the worker 2, the next work content, and the like, and the control unit 304 Outputs these to the terminal device 100 via the server communication unit 301. Then, the warning output from the server device 300 and the next work content are displayed on the display unit 110 of the terminal device 100. Thereby, the worker 2 can obtain necessary information.
  • the control unit 304 of the server device 300 stores, in the server storage unit 303, identification information and an acceleration profile of the worker 2 who has performed abnormal work.
  • the information stored in the server storage unit 303 can be read out and displayed later on the server device 300, for example, and used for training new employees. Further, the information stored in the server storage unit 303 may be used for statistical calculation of the above-described threshold value or for machine learning, and the newly calculated threshold value may be transmitted to the terminal device 100 and set there. As a result, more accurate determination processing can be performed.
  • the acceleration profile generated by the terminal processing unit 102 is temporarily stored in the terminal storage unit 105. Except when the acceleration profile exceeds the threshold, as described above, the stored information is read from the terminal storage unit 105 of the terminal device 100 at every preset elapsed time after the start of acceleration detection by the sensor unit 101, output from the terminal device 100 to the server device 300, and stored in the server storage unit 303. Thereby, the storage area of the terminal storage unit 105 is periodically released, and the server device 300 can manage the information on the operation of the worker 2.
  • the second application example is an example in which a wristband-type wearable terminal is used as the terminal device 100, as in the first application example, and the acceleration sensor of the sensor unit 101 of the terminal device 100 detects the acceleration of the hand when the worker 2 operates a predetermined work target device.
  • the second application example is an example in which the next work content is displayed on the display unit 110 of the terminal device 100 at an appropriate timing when the worker 2 completes a series of work.
  • in the determination unit 103 of the terminal device 100 of the second application example, an acceleration profile of the hand when a series of operations is performed in a predetermined procedure is set in advance. The determination unit 103 compares the acceleration profile generated by the terminal processing unit 102 based on the detection signal of the sensor unit 101 with the preset acceleration profile; until the preset acceleration profile is detected, the terminal processing unit 102 continues generating the acceleration profile and the determination unit 103 continues the determination processing. As described above, in the second application example, while the preset acceleration profile is not detected, the acceleration profile is monitored on the terminal device 100 side without monitoring by the server device 300, so the delay due to communication with the server device 300 can be eliminated. That is, in the second application example, it is possible to detect immediately that the worker 2 has completed the series of operations.
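One simple way to implement the comparison with the preset acceleration profile is sample-by-sample matching within a tolerance, as sketched below; the text does not fix a matching algorithm, so this approach and all names and values are assumptions.

```python
def matches_template(profile, template, tol=0.2):
    """Detect the preset 'series-of-operations' profile: accept the current
    profile if every sample is within `tol` of the stored template sample."""
    if len(profile) != len(template):
        return False
    return all(abs(a - t) <= tol for a, t in zip(profile, template))

# A preset template and two candidate profiles (arbitrary values)
template = [0.0, 0.5, 1.0, 0.5, 0.0]
print(matches_template([0.1, 0.4, 1.1, 0.5, 0.0], template))  # True
print(matches_template([0.1, 0.4, 1.6, 0.5, 0.0], template))  # False: 1.6 is too far
```

A production implementation would more likely use a time-warping or learned matcher, but this tolerance-band comparison captures the decision the determination unit makes: keep processing locally until the template is matched.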
  • when the preset acceleration profile is detected, the terminal device 100 starts communication with the server device 300 and outputs, to the server device 300, an end signal indicating that the series of operations has been completed and the acceleration profile generated by the terminal processing unit 102.
  • the server device 300 receives the end signal and the acceleration profile output from the terminal device 100; in response, the server processing unit 302 generates information indicating the next work content, and the control unit 304 outputs this information to the terminal device 100 via the server communication unit 301. Then, the next work content output from the server device 300 is displayed on the display unit 110 of the terminal device 100. Thereby, the worker 2 can obtain the necessary information.
  • the control unit 304 of the server device 300 stores, in the server storage unit 303, the identification information and the acceleration profile of the worker 2 who has completed the series of operations.
  • The information stored in the server storage unit 303 can later be read out and displayed on the server device 300, for example, making it possible to confirm afterward that the worker 2 performed the work properly.
  • The information stored in the server storage unit 303 may also be used for statistical calculation or for machine learning of the preset acceleration profile, and a newly calculated acceleration profile may be transmitted to the terminal device 100 and set there. As a result, more accurate determination processing can be performed.
  • The acceleration profile generated by the terminal processing unit 102 is temporarily stored in the terminal storage unit 105 until the preset acceleration profile described above is detected by the determination unit 103. The stored acceleration data is read from the terminal storage unit 105 of the terminal device 100 at every preset elapsed time, output from the terminal device 100 to the server device 300, and stored in the server storage unit 303. The storage area of the terminal storage unit 105 is thereby freed periodically, and the server device 300 can manage the work information of the worker 2.
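The periodic hand-off from the terminal storage unit to the server storage unit might look like the following sketch. The buffer class, the callable standing in for the server link, and the time-based trigger are all illustrative assumptions, not names from the specification.

```python
import time
from collections import deque

class TerminalBuffer:
    """Minimal sketch of the terminal storage unit's periodic flush: profiles
    accumulate locally and are pushed to the server every `interval` seconds,
    freeing the local storage area."""
    def __init__(self, interval, send_to_server):
        self.interval = interval
        self.send = send_to_server          # callable standing in for the server link
        self.buffer = deque()
        self.last_flush = time.monotonic()

    def store(self, profile):
        self.buffer.append(profile)
        if time.monotonic() - self.last_flush >= self.interval:
            self.flush()

    def flush(self):
        while self.buffer:
            self.send(self.buffer.popleft())  # server storage unit keeps the data
        self.last_flush = time.monotonic()
```

With `interval` set to the preset elapsed time, each flush empties the terminal-side buffer, which corresponds to the periodic freeing of the storage area described above.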
  • the third application example is an example in which a glasses-type wearable terminal is used as the terminal device 100, and an image of the field of view of the worker 2 is captured by the image sensor of the sensor unit 101.
  • In the third application example, image processing is normally performed by the terminal processing unit 102 of the terminal device 100, and is performed by the server processing unit 302 of the server device 300 only when the load of the image processing is heavy.
  • The space in the housing of the frame unit 120 of the terminal device 100 is narrow, and the processing capacity of the terminal processing unit 102 provided in that narrow space is limited, so it may be difficult to perform heavy-load image processing at high speed on the terminal.
  • In that case, the image processing is performed by the server processing unit 302, which can process faster than the terminal processing unit 102. The terminal device 100 then receives the image processing result output from the server device 300 and causes the display unit 110 to display it, so the result can be displayed at an appropriate timing.
  • The video signal generated by the terminal processing unit 102 of the terminal device 100 is input to the determination unit 103. The determination unit 103 then determines, based on, for example, the edge amount contained in the input video signal, whether the load of the image processing to be performed on the video signal is within a preset range. That is, the determination unit 103 determines whether the load of the image processing performed on the video signal is high.
  • When the load is determined to be within the preset range, the terminal processing unit 102 performs the image processing on the video signal: it extracts an object included in the video signal and generates information on the object. The information on the object is then AR-displayed on the display unit 110 by the control unit 106.
  • The objects extracted from the video signal include, for example, a part of a device used by the worker 2 and a work target existing within the visual field of the worker 2.
  • Information on the object is set in the terminal processing unit 102 in advance in association with the object.
  • As the information on the object, for example, when the object is a part of a device, the operation content for that part is set. When the object is a work target, a description of the work target, the content of the work to be performed on it, and the like are set.
  • When the load of the image processing is determined to be high, the terminal device 100 starts communication with the server device 300 and outputs the video signal to the server device 300.
  • the server device 300 receives the video signal output from the terminal device 100, and the server processing unit 302 performs image processing on the video signal.
  • the content of the image processing is the same as the image processing in the terminal processing unit 102 described above.
  • The control unit 304 of the server device 300 outputs the information on the object generated in the server processing unit 302 to the terminal device 100 via the server communication unit 301.
  • The information about the object output from the server device 300 is received by the terminal device 100 and displayed in AR on the display unit 110 of the terminal device 100.
  • As described above, when the image processing of the video signal is a low-load process, the terminal device 100 performs the image processing itself to generate the information about the object, without communicating with the server device 300, so the information can be presented to the worker 2 immediately.
  • Conversely, when the image processing is a high-load process, the server device 300, which can process faster than the terminal device 100, performs the image processing, generates the information about the object, and outputs it to the terminal device 100. The terminal device 100 can thereby present the information on the object to the worker 2 without delay.
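The load-based routing of this third application example can be sketched as follows. This is only an illustration: the simple edge-amount measure over grayscale rows, the threshold, and the `local_extract`/`remote_extract` callables are assumptions for the sketch and do not come from the specification.

```python
def edge_amount(gray_rows):
    """Rough edge measure: sum of absolute horizontal differences
    over rows of grayscale pixel values."""
    return sum(abs(a - b) for row in gray_rows for a, b in zip(row, row[1:]))

def process_frame(gray_rows, threshold, local_extract, remote_extract):
    """Route one frame: below the edge threshold the load is considered low
    and the terminal extracts objects itself; above it, the frame is handed
    to the faster server-side processor."""
    if edge_amount(gray_rows) <= threshold:
        return "terminal", local_extract(gray_rows)
    return "server", remote_extract(gray_rows)
```

The returned tag ("terminal" or "server") marks where the object information was generated; in either case the result would be AR-displayed on the display unit 110.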
  • the video signal generated by the terminal processing unit 102 and the image processing result thereof are temporarily stored in the terminal storage unit 105.
  • The image data is read from the terminal storage unit 105 of the terminal device 100 at every preset elapsed time, output from the terminal device 100 to the server device 300, and stored in the server storage unit 303. The storage area of the terminal storage unit 105 is thereby freed periodically, and the server device 300 can manage the shooting information of the terminal device 100.
  • the fourth application example is an example in which a glasses-type wearable terminal is used as the terminal device 100.
  • In the fourth application example, the position information of the worker 2 is detected by a GPS sensor or the like of the sensor unit 101, and the determination unit 103 determines, based on the position information, whether the worker 2 has entered a dangerous area.
  • While the determination unit 103 determines that the worker 2 has entered the dangerous area, the terminal processing unit 102 generates a warning for the worker 2, extracts any dangerous object from the video signal detected by the image sensor, and generates information on the dangerous object. The display unit 110 of the terminal device 100 then displays the warning and the information on the dangerous object.
  • On the other hand, while the worker 2 is not in the dangerous area, the terminal device 100 outputs the video signal detected by the image sensor to the server device 300, and the server processing unit 302 of the server device 300 performs image processing on the video signal.
  • the position information of the worker 2 detected by the sensor unit 101 is input to the determination unit 103. Then, the determination unit 103 compares the input position information of the worker 2 with information of the preset dangerous area, and determines whether the position information of the worker 2 is within the dangerous area.
  • the position information of the worker 2 may be two-dimensional position information in the horizontal direction or three-dimensional position information including the vertical direction.
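The two-dimensional case of this danger-area determination can be illustrated with a simple containment test. The axis-aligned rectangular representation of the area is an assumption made for the sketch; the specification does not fix the area's shape.

```python
def in_danger_area(position, area):
    """Check whether a 2-D position (x, y) lies inside an axis-aligned
    rectangular danger area given as ((x_min, y_min), (x_max, y_max))."""
    (x_min, y_min), (x_max, y_max) = area
    x, y = position
    return x_min <= x <= x_max and y_min <= y <= y_max
```

A three-dimensional variant would add a z-coordinate and a vertical range in the same way.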
  • While the determination unit 103 determines that the worker 2 has entered the dangerous area, the terminal processing unit 102 generates the warning to the worker and the information on dangerous objects as described above, and the display unit 110 displays them in AR.
  • While the determination unit 103 determines from the position information that the worker 2 has not entered the dangerous area, the video signal obtained by the image sensor is output from the terminal device 100 to the server device 300.
  • The video signal output from the terminal device 100 is received by the server device 300, and the server processing unit 302 performs image processing on it. Specifically, as in the third application example, the server processing unit 302 performs image processing on the video signal to extract an object included in the video signal and generate information about the object.
  • The control unit 304 of the server device 300 outputs the information on the object generated in the server processing unit 302 to the terminal device 100 via the server communication unit 301.
  • The information about the object output from the server device 300 is received by the terminal device 100 and displayed in AR on the display unit 110 of the terminal device 100.
  • As described above, when the worker 2 has entered the dangerous area, the terminal device 100 itself generates the warning and the information on dangerous objects without communicating with the server device 300, so the warning and the information can be presented to the worker 2 immediately.
  • When the worker 2 is working in a safe area, the server device 300 performs the image processing to generate the information about the object and outputs it to the terminal device 100. The terminal device 100 can thereby present more detailed information on the object to the worker 2.
  • The position information of the worker 2 detected by the sensor unit 101 is temporarily stored in the terminal storage unit 105. After detection of the video signal by the sensor unit 101 is started, the stored position information is read from the terminal storage unit 105 of the terminal device 100 at every preset elapsed time, output from the terminal device 100 to the server device 300, and stored in the server storage unit 303. The person in charge can thereby trace the movement of the worker 2.
  • In addition to determining whether the worker 2 has entered the dangerous area based on the position information of the worker 2, the determination unit 103 may also determine, for example, whether a dangerous object is included in the video signal detected by the sensor unit 101.
  • When the determination unit 103 determines that the worker 2 has entered the dangerous area and that a dangerous object is included in the video signal, the terminal processing unit 102 may generate a warning or the like for the worker 2, and the display unit 110 may display the warning or the like in AR.
  • Further, based on the detection signals of the sensor unit 101, the determination unit 103 may determine whether the worker 2 is moving toward the dangerous object, or whether the line of sight of the worker 2 is directed at the dangerous object. When the determination unit 103 determines that the worker 2 has entered the dangerous area, that the video signal includes a dangerous object, and that the worker 2 is moving toward the dangerous object or the line of sight of the worker 2 is directed at it, the terminal processing unit 102 may generate a warning or the like for the worker 2, and the display unit 110 may display it in AR. By performing the determination process using detection signals from the plurality of sensors included in the sensor unit 101 in this way, the determination accuracy can be improved.
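The movement-direction condition above could, for instance, be evaluated from position and velocity estimates supplied by the sensor unit. The cosine-threshold formulation below is one possible sketch under those assumptions, not the method of the specification.

```python
import math

def moving_toward(position, velocity, hazard, cos_threshold=0.7):
    """True when the 2-D velocity vector points at the hazard within the
    angular tolerance implied by the cosine threshold."""
    dx, dy = hazard[0] - position[0], hazard[1] - position[1]
    dist = math.hypot(dx, dy)
    speed = math.hypot(velocity[0], velocity[1])
    if dist == 0 or speed == 0:
        return False  # standing still or already at the hazard: no direction to judge
    cos = (velocity[0] * dx + velocity[1] * dy) / (dist * speed)
    return cos >= cos_threshold
```

A line-of-sight check could reuse the same function with the gaze direction vector substituted for the velocity, and the overall warning condition would AND these results with the area and object checks.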
  • The control unit 106 may also control the image sensor of the sensor unit 101 to capture a still image in the line of sight and store the still image in the terminal storage unit 105.
  • The communication partner of the terminal device 100 is not limited to the server device 300 described above; the terminal device 100 can also communicate with a personal computer 400, a tablet terminal 500, or a cloud server device 700, and these devices may perform the processing of the server device 300 described above. The terminal device 100 may also select the communication destination device according to the processing content to be performed there: for example, it may select the personal computer 400 when relatively light display-information generation processing is to be performed at the destination, and may select the server device 300 or the cloud server device 700, which have high processing capability, when relatively heavy image processing or the like is to be performed.
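The destination selection by processing content could be sketched as a capability lookup. The device names and capability scores below are purely illustrative assumptions made for this sketch.

```python
def choose_destination(task_load, devices):
    """Pick the lightest-capability device that can still handle the task.
    `devices` maps device name -> capability score; a task is handled by a
    device whose capability is at least the task load."""
    capable = {name: cap for name, cap in devices.items() if cap >= task_load}
    if not capable:
        return max(devices, key=devices.get)  # fall back to the strongest device
    return min(capable, key=capable.get)
```

Light display-information generation would thus land on the weakest sufficient device, while heavy image processing would be routed to a high-capability server.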
  • the present invention is not limited to the above-described embodiment, and includes various modifications.
  • each of the above embodiments has been described in detail in order to explain the present invention in an easily understandable manner, and the present invention is not necessarily limited to a configuration including all the described components.
  • a part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of one embodiment can be added to the configuration of another embodiment.
  • Each of the above-described configurations and functions may be realized by software, with a processor interpreting and executing a program that implements each function.
  • Information such as the programs, tables, and files that realize each function can be stored in a storage device such as a memory, a hard disk, or an SSD (Solid State Drive), or in a storage medium such as an IC (Integrated Circuit) card, an SD card, or a DVD.
  • Only the control lines and information lines considered necessary for the description are shown; not all control lines and information lines of an actual product are necessarily shown. In practice, almost all components may be considered to be interconnected.
  • The present invention can be provided in various aspects: not only as an information communication system, a terminal device, and a method of operating an information communication system, but also, for example, as a computer-readable program executed by the information communication system, the terminal device, or the server device, and as processing methods in the terminal device and the server device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The object of the present invention is to enable an information communication system comprising a terminal device and a server device to present the information needed by the user of the terminal device without delay and with good timing. The terminal device (100) comprises a control unit (106) that performs control such that processing by a terminal processing unit (102) continues while a determination unit (103) determines that a detection signal satisfies a first preset condition, and such that a terminal communication unit (104) communicates with a server communication unit (301) when the determination unit (103) has determined that the detection signal satisfies a second preset condition.
PCT/JP2018/029052 2018-08-02 2018-08-02 Information communication system, terminal device, and method of operating an information communication system WO2020026414A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2018/029052 WO2020026414A1 (fr) 2018-08-02 2018-08-02 Information communication system, terminal device, and method of operating an information communication system
JP2020533999A JP7227975B2 (ja) 2018-08-02 2018-08-02 Information communication system, terminal device, and method of operating an information communication system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/029052 WO2020026414A1 (fr) 2018-08-02 2018-08-02 Information communication system, terminal device, and method of operating an information communication system

Publications (1)

Publication Number Publication Date
WO2020026414A1 true WO2020026414A1 (fr) 2020-02-06

Family

ID=69231536

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/029052 WO2020026414A1 (fr) 2018-08-02 2018-08-02 Information communication system, terminal device, and method of operating an information communication system

Country Status (2)

Country Link
JP (1) JP7227975B2 (fr)
WO (1) WO2020026414A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09204220A (ja) * 1996-01-25 1997-08-05 Yokogawa Electric Corp Equipment diagnosis device
JP2004265009A (ja) * 2003-02-28 2004-09-24 Mitsubishi Electric Corp Diagnosis system
JP2016001425A (ja) * 2014-06-12 2016-01-07 株式会社日立製作所 Monitoring control system and monitoring control method
WO2017199807A1 (fr) * 2016-05-19 2017-11-23 日本電信電話株式会社 Sensor relay device and sensor relay system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3270346A4 (fr) 2015-03-12 2019-03-13 Sony Corporation Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2017038100A1 (fr) 2015-09-03 2017-03-09 日本電気株式会社 Serveur de surveillance, procédé de détermination de traitement distribué, et support lisible par ordinateur non-temporaire stockant un programme
US11030918B2 (en) 2015-09-10 2021-06-08 Kinetic Telemetry, LLC Identification and analysis of movement using sensor devices

Also Published As

Publication number Publication date
JPWO2020026414A1 (ja) 2021-08-02
JP7227975B2 (ja) 2023-02-22

Similar Documents

Publication Publication Date Title
CN110058694B (zh) 视线追踪模型训练的方法、视线追踪的方法及装置
CN108919958B (zh) 一种图像传输方法、装置、终端设备及存储介质
US10152129B2 (en) Electronic device, method and computer program product for providing vibratory feedback
US9798394B2 (en) Camera-assisted motion estimation for application control
US10684469B2 (en) Detecting and mitigating motion sickness in augmented and virtual reality systems
US10474411B2 (en) System and method for alerting VR headset user to real-world objects
US20170160795A1 (en) Method and device for image rendering processing
US10535113B2 (en) Image processing apparatus, image processing method, and storage medium for generating a mask image
CN105978848A (zh) 一种收集传感器数据的处理方法和装置
US20170163958A1 (en) Method and device for image rendering processing
US20150371444A1 (en) Image processing system and control method for the same
EP3742269A1 (fr) Procédé et appareil de réglage d'angle de visualisation, support de stockage et appareil électronique
JP7146087B2 (ja) ニューラルネットワークのトレーニング方法、視線追跡方法及び装置並びに電子機器
US9411162B2 (en) Mixed reality presenting system, virtual reality presenting system, display apparatus, information processing apparatus, control method, and program
US10478079B2 (en) Pulse estimation device, pulse estimation system, and pulse estimation method
JP2022532825A (ja) ヘッドマウントディスプレイに対する動的な障害物衝突警告を生成するシステムおよび方法
KR20160113169A (ko) 이미지 처리 방법 및 장치, 및 컴퓨터 디바이스
US10614590B2 (en) Apparatus for determination of interference between virtual objects, control method of the apparatus, and storage medium
US20180079078A1 (en) Robot simulation device
TWI532377B (zh) 影像感測系統、影像感測方法以及眼球追蹤系統、眼球追蹤方法
US9721538B2 (en) Information display apparatus, information display system, control method thereof, and program
US9619707B2 (en) Gaze position estimation system, control method for gaze position estimation system, gaze position estimation device, control method for gaze position estimation device, program, and information storage medium
US20120050275A1 (en) Image processing apparatus and image processing method
US20140098139A1 (en) Display apparatus, storage medium having stored in information processing program, information processing apparatus, information processing system, and image display method
EP3508951A1 (fr) Dispositif électronique de commande d'affichage d'images sur la base d'une entrée de défilement et procédé associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18928812

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020533999

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18928812

Country of ref document: EP

Kind code of ref document: A1