WO2022249900A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2022249900A1
WO2022249900A1 (PCT/JP2022/020208)
Authority
WO
WIPO (PCT)
Prior art keywords
display
unit
area
information
user
Prior art date
Application number
PCT/JP2022/020208
Other languages
French (fr)
Japanese (ja)
Inventor
徳道 柴田
優紀 北本
隆寿 吉見
敬太郎 稲本
好朗 栃澤
Original Assignee
NEC Solution Innovators, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Solution Innovators, Ltd.
Priority to JP2023523411A (JPWO2022249900A1)
Publication of WO2022249900A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/78Detection of presence or absence of voice signals

Definitions

  • the present invention relates to an information processing device, an information processing method, a recording medium, a display device, and a display method.
  • Patent Literature 1 describes a system for displaying image data on a head-mounted display, which has a view control section, an image generation section, and an output section.
  • a view control unit detects the range of the user's view based on the orientation of the head mounted display.
  • the image generator also generates an image by extracting the range of the user's field of view from the image data.
  • the output unit then outputs the generated image to the head-mounted display for display.
  • an object of the present invention is to provide an information processing apparatus, an information processing method, a recording medium, a display device, and a display method that solve the problem that it is sometimes difficult to smoothly share information in a system that shares information such as image data.
  • an information processing device, which is one aspect of the present disclosure, is an information processing device connected to a plurality of user devices that each display at least a partial area of the entire data, and has: a detection unit that detects a predetermined condition; and an instruction unit that, according to the detection result of the detection unit, instructs at least some of the connected user devices to display the area being displayed by the user device corresponding to the condition.
  • an information processing method that is another aspect of the present disclosure includes: an information processing device connected to a plurality of user devices that each display at least a partial area of the entire data detecting a predetermined condition; and, according to the detection result, instructing at least some of the connected user devices to display the area being displayed by the user device corresponding to the condition.
  • a recording medium that is another aspect of the present disclosure records a program for causing an information processing device connected to a plurality of user devices that each display at least a partial area of the entire data to execute processing of: detecting a predetermined condition; and, according to the detection result, instructing at least some of the connected user devices to display the area being displayed by the user device corresponding to the condition.
  • a display device that is another aspect of the present disclosure has: a receiving unit that receives display area information, which is information for specifying the area being displayed on another user device within the entire data; and a display unit that displays an area corresponding to the state of the user out of the entire data and, based on the display area information, displays the area being displayed on the other user device. The display unit, based on a display instruction transmitted from an external device as a detection result of a predetermined condition, displays the area currently being displayed on the user device indicated by the display instruction out of the entire data.
  • a display method that is another aspect of the present disclosure includes: a user device receiving display area information, which is information for specifying the area being displayed on another user device within the entire data; displaying an area corresponding to the state of the user out of the entire data; and, based on the display area information, displaying the area being displayed on the other user device. When a display instruction transmitted from an external device as a result of detection of a predetermined condition is received, the area currently being displayed on the user device indicated by the display instruction is displayed out of the entire data, based on the received display instruction.
  • an information processing device, information processing method, recording medium, display device, and display method that enable smooth sharing of information in a system for sharing information such as image data are provided.
  • FIG. 1 is a diagram showing a configuration example of an image sharing system according to a first embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing a configuration example of the on-site device shown in FIG. 1.
  • FIG. 3 is a block diagram showing a configuration example of the relay device shown in FIG. 1.
  • FIG. 4 is a block diagram showing a configuration example of the user device shown in FIG. 1.
  • FIGS. 5 to 7 are diagrams showing display examples of a screen display unit.
  • FIG. 8 is a flowchart showing an operation example of the relay device.
  • FIG. 9 is a flowchart showing an operation example of the user device.
  • A diagram showing a hardware configuration example of an information processing device according to a second embodiment of the present disclosure, a block diagram showing a configuration example of the information processing device, and a block diagram showing a configuration example of a display device are also provided.
  • FIG. 1 is a diagram showing a configuration example of an image sharing system 100.
  • FIG. 2 is a block diagram showing a configuration example of the field device 200.
  • FIG. 3 is a block diagram showing a configuration example of the relay device 300.
  • FIG. 4 is a block diagram showing a configuration example of the user device 400.
  • FIGS. 5 to 7 are diagrams showing display examples of the screen display unit.
  • FIG. 8 is a flowchart showing an operation example of the relay device 300.
  • FIG. 9 is a flowchart showing an operation example of the user device 400.
  • an image sharing system 100 in which a plurality of user devices 400 share image data acquired by a 360-degree camera of the field device 200 will be described.
  • in the image sharing system 100, each user device 400 displays at least one region of the entire image data acquired by the 360-degree camera, selected according to the state of the user, such as an image of an area extracted according to the line-of-sight direction of the user. Further, as will be described later, in the case of the present embodiment, the image sharing system 100 detects predetermined conditions such as conditions regarding sound data. Then, the image sharing system 100 causes each user device 400 included in the image sharing system 100 to display the area of the image data being displayed by the user device corresponding to the detected condition.
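The extraction of a region according to the user's line-of-sight direction can be sketched as follows. This is an illustrative assumption, not the patent's implementation: the function name, the use of an equirectangular 360-degree image, and the field-of-view parameters are all hypothetical.

```python
def gaze_to_region(yaw_deg, pitch_deg, img_w, img_h, fov_h=90, fov_v=60):
    """Map a gaze direction to a pixel rectangle in an equirectangular image.

    yaw_deg:   horizontal gaze angle, -180..180 (0 = image center)
    pitch_deg: vertical gaze angle, -90..90 (0 = horizon)
    Returns (left, top, width, height) in pixels, clamped to the image.
    """
    # Center of the viewed region in pixel coordinates.
    cx = (yaw_deg + 180.0) / 360.0 * img_w
    cy = (90.0 - pitch_deg) / 180.0 * img_h
    # Region size derived from the assumed field of view.
    w = fov_h / 360.0 * img_w
    h = fov_v / 180.0 * img_h
    left = max(0.0, min(img_w - w, cx - w / 2))
    top = max(0.0, min(img_h - h, cy - h / 2))
    return int(left), int(top), int(w), int(h)
```

A user device could apply such a mapping each time its sensors report a new gaze direction, then crop and render that rectangle from the received image data.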
  • FIG. 1 shows a configuration example of the image sharing system 100.
  • the image sharing system 100 includes a field device 200, a relay device 300, and user devices 400.
  • the image sharing system 100 includes a plurality of user devices 400 (user device 400-1, user device 400-2, ...).
  • the image sharing system 100 may have a plurality of field devices 200 and relay devices 300.
  • the field device 200 and the relay device 300 are connected so as to be able to communicate with each other by wireless communication or the like. Also, the relay device 300 and the user device 400 are connected so as to be able to communicate with each other by wireless communication or the like.
  • the on-site device 200 is an information processing device that acquires image data.
  • the field device 200 has an imaging device such as a 360-degree camera, and acquires image data using the imaging device.
  • the on-site device 200 has a microphone, a speaker, and the like, and is configured to be capable of inputting and outputting sound data representing human voices and the like.
  • the on-site device 200 is placed around a factory, store, or other site to be imaged, and is operated by a person on site.
  • FIG. 2 shows a configuration example of the on-site device 200.
  • the field device 200 has an image data acquisition unit 210, a sound data input/output unit 220, and a transmission/reception unit 230, for example.
  • the field device 200 has an arithmetic device such as a CPU and a storage device.
  • the on-site device 200 implements each of the above-described processing units by executing a program stored in a storage device by an arithmetic device.
  • the image data acquisition unit 210 acquires image data by, for example, transmitting a predetermined instruction to the imaging device.
  • the image data acquisition unit 210 acquires omnidirectional (or hemispherical) image data from a 360-degree camera that is an imaging device.
  • the image data acquisition unit 210 can store the acquired image data in the storage device of the field device 200 .
  • the sound data input/output unit 220 acquires sound data and outputs sound data by using a microphone, a speaker, or the like.
  • the sound data input/output unit 220 can acquire sound data such as voices uttered by people on site. Further, the sound data input/output unit 220 can output sound data received from the relay device 300 by the transmission/reception unit 230 described later.
  • the transmission/reception unit 230 transmits/receives various data to/from the relay device 300 .
  • the transmission/reception unit 230 transmits image data acquired by the image data acquisition unit 210 and sound data acquired by the sound data input/output unit 220 to the relay device 300, which is an external device. Further, the transmission/reception unit 230 receives sound data and the like from the relay device 300 .
  • the above is a configuration example of the on-site device 200 .
  • the on-site device 200 may be composed of one information processing device, or may be composed of a plurality of communicatively connected devices such as a 360-degree camera, a speaker, and an information processing device.
  • the on-site device 200 may be configured so that the conversation destination for transmitting and receiving sound data can be set by the operation of the on-site person.
  • the on-site device 200 may be configured to be able to have a conversation only with the user device 400 set by the operation of the on-site person. Note that, even in a case where a conversation is performed only with the set user device 400, as will be described later, the sound data transmitted by the field device 200 and by the user device 400 set as the conversation partner can be configured to also be transmitted to user devices 400 that are not set as conversation partners.
  • the relay device 300 is an information processing device such as a server device that transmits and receives data between the field device 200 and the user devices 400. Also, as will be described later, the relay device 300 detects a predetermined condition. Then, the relay device 300 instructs each user device 400 to display the area of the image data being displayed by the user device corresponding to the detected condition.
  • the relay device 300 has, for example, a communication I/F section 310, a storage section 320, and an arithmetic processing section 330 as main components.
  • FIG. 3 illustrates a case where the relay device 300 is realized by one information processing device.
  • the function as the relay device 300 may be realized by a plurality of information processing devices that are communicably connected to each other, such as being realized on a cloud, for example.
  • the communication I/F unit 310 consists of a data communication circuit. Communication I/F section 310 performs data communication with external devices such as field device 200 and user device 400 connected via a communication line.
  • the storage unit 320 is a storage device such as a hard disk or memory.
  • the storage unit 320 stores processing information and programs 324 required for various processes in the arithmetic processing unit 330 .
  • the program 324 realizes various processing units by being read and executed by the arithmetic processing unit 330 .
  • the program 324 is read in advance from an external device or recording medium via a data input/output function such as the communication I/F unit 310 and stored in the storage unit 320 .
  • Main information stored in the storage unit 320 includes, for example, image information 321, sound information 322, display information 323, and the like.
  • the image information 321 is information including image data received from the field device 200 .
  • the image information 321 is updated, for example, when the data receiving unit 331 receives image data from the field device 200 .
  • the sound information 322 is information including sound data received from the field device 200 or the user device 400 .
  • the sound information 322 stores sound data for each field device 200 and user device 400 .
  • the sound information 322 associates identification information for identifying the field device 200 and the user device 400 with sound data.
  • the sound information 322 is updated, for example, when the data receiving unit 331 receives sound data from the field device 200 or the user device 400 .
  • the display information 323 indicates display area information that is information for specifying the area being displayed on the user device 400 in the entire image data.
  • the display information 323 includes display area information for each user device 400 .
  • the display information 323 associates identification information for identifying the user device 400 with display area information.
  • the display information 323 is updated when the data receiving unit 331 acquires display area information from the user device 400 .
  • the display area information is not specifically limited as long as the area being displayed on the user device 400 can be specified.
  • the display area information may be information corresponding to the direction of the user's line of sight or orientation, information indicating the area being displayed on the user device 400 such as coordinates, or any other information for specifying the area.
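Since the format of the display area information is not limited, any representation that identifies the displayed region will do. A minimal sketch in Python, under the assumption that gaze direction is used as the representation (the class name and field names are hypothetical, not from the patent):

```python
from dataclasses import dataclass


@dataclass
class DisplayAreaInfo:
    """One possible shape for display area information."""
    device_id: str    # identification information for the user device
    yaw_deg: float    # horizontal gaze direction
    pitch_deg: float  # vertical gaze direction


# The display information held by the relay device can then be a simple
# mapping from device identifier to the latest display area information.
display_info = {}


def update_display_info(info: DisplayAreaInfo):
    """Overwrite the stored entry for the sending device."""
    display_info[info.device_id] = info
```

Coordinates of the displayed rectangle, or any other region identifier, could be substituted for the gaze angles without changing the surrounding logic.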
  • the arithmetic processing unit 330 has an arithmetic device such as a CPU and its peripheral circuits.
  • the arithmetic processing unit 330 reads the program 324 from the storage unit 320 and executes it, thereby realizing various processing units by cooperating the hardware and the program 324 .
  • Main processing units realized by the arithmetic processing unit 330 include, for example, a data reception unit 331, a data transmission unit 332, a condition detection unit 333, a display instruction unit 334, and the like.
  • the data receiving unit 331 receives various data from the field device 200 and the user device 400 .
  • the data receiving unit 331 receives image data from the field device 200. Then, the data receiving section 331 stores the received image data in the storage section 320 as the image information 321. The data receiving unit 331 also receives sound data from the field device 200. Then, the data receiving section 331 stores the received sound data in the storage section 320 as the sound information 322. At this time, the data receiving unit 331 can store the sound data so that sound data received from the field device 200 can be identified, for example by associating it with identification information for identifying the field device 200.
  • the data receiving unit 331 receives sound data from the user device 400 . Then, the data receiving section 331 stores the received sound data in the storage section 320 as the sound information 322 . At this time, the data receiving unit 331 can store sound data for each user device 400 , such as by associating the sound data with identification information for identifying the user device 400 . Also, the data receiving unit 331 receives display area information from the user device 400 . Then, data receiving section 331 stores the received display area information in storage section 320 as display information 323 . At this time, the data receiving unit 331 can store the display area information for each user device 400 , such as associating the display area information with identification information for identifying the user device 400 .
  • the data transmission unit 332 transmits various data to the field device 200 and the user device 400 .
  • the data transmission unit 332 transmits the received data to the user device 400 or the on-site device 200 each time it receives data from the on-site device 200 or the user device 400 .
  • the data transmission unit 332 transmits sound data received from the user device 400 to the field device 200 .
  • the data transmission unit 332 may transmit the sound data and the identification information in association with each other. It should be noted that the data transmission unit 332 may be configured to transmit only the sound data received from the preset user device 400 to the field device 200 when the conversation partner is set.
  • the data transmission unit 332 also transmits image data and sound data received from the field device 200 to the user device 400 .
  • the data transmission unit 332 also transmits sound data received from another user device 400 to the user device 400 .
  • the data transmission unit 332 may transmit the sound data and the identification information in association with each other.
  • the data transmission unit 332 can transmit display area information received from another user device 400 to the user device 400 .
  • the data transmission unit 332 may be configured to transmit sound data to all the user devices 400 regardless of whether or not a conversation partner is set.
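The relaying behavior described above can be sketched as follows. This is an illustrative assumption about one way a data transmission unit might fan out sound data; the function name, payload shape, and the `"field_device"` identifier are hypothetical.

```python
def relay_sound(sender_id, sound_data, user_ids, conversation_partner=None):
    """Return the list of (recipient, payload) pairs the relay would send.

    Sound data from a user device is delivered to every other user device
    regardless of conversation settings. It is forwarded to the field
    device only when no conversation partner is set, or when the sender
    is the set partner.
    """
    # Tag the sound data with the sender's identification information.
    payload = {"from": sender_id, "sound": sound_data}
    out = [(uid, payload) for uid in user_ids if uid != sender_id]
    if conversation_partner is None or sender_id == conversation_partner:
        out.append(("field_device", payload))
    return out
```

The symmetric direction (field device to user devices) would simply broadcast each received item to all user devices.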
  • the condition detection unit 333 detects a predetermined condition. For example, the condition detection unit 333 detects a condition according to the transmission/reception status of sound data. Also, the condition detection unit 333 identifies the user device 400 corresponding to the detected condition.
  • the condition detection unit 333 detects a condition corresponding to the transmission/reception status of sound data according to the setting of the conversation partner. Specifically, for example, when a conversation is set between the field device 200 and a certain user device 400, the condition detection unit 333 detects that the condition that sound data is being transmitted and received between the field device 200 and that user device 400 is satisfied. Then, the condition detection unit 333 identifies the user device 400 set as the conversation partner as the user device 400 corresponding to the detected condition.
  • the condition detection unit 333 also detects a condition corresponding to the transmission/reception status of sound data according to the state of the sound data included in the sound information 322. Specifically, for example, if it can be determined from the state of the sound data that users, or a user and an on-site person, are having a conversation, the condition detection unit 333 detects that the condition that sound data is being transmitted and received between the on-site device 200 and a user device 400, or between user devices 400, is satisfied. Then, the condition detection unit 333 identifies the user devices 400 determined to be having a conversation as the user devices 400 corresponding to the detected condition.
  • the condition detection unit 333 may determine whether it can be judged that users are having a conversation using arbitrary means, such as the amount or ratio of sound data per predetermined time. For example, the condition detection unit 333 may identify the user device 400 conversing with the field device 200, or the user devices 400 conversing with each other, based on the ratio of the amount of sound data. In addition, when it is determined from the amount of sound data that no one is talking, the condition detection unit 333 may detect that the condition that no one is in a conversation state is satisfied, and may identify a predetermined user device 400 such as that of a moderator. In other words, the condition detection unit 333 may perform detection under the condition that no sound is detected (or the sound is below a predetermined level).
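The amount-of-sound heuristic mentioned above can be sketched as follows. The thresholds, function name, and the fallback to a moderator's device are illustrative assumptions; the patent leaves the concrete means unspecified.

```python
def detect_speaking_device(sound_amounts, moderator_id,
                           ratio_threshold=0.5, silence_floor=1):
    """Identify the device corresponding to the detected condition.

    sound_amounts: dict mapping device_id -> amount of sound data
    received in the most recent window.
    Returns the id of the device judged to be in conversation, or the
    predetermined moderator's device when no one is talking or no
    device dominates.
    """
    total = sum(sound_amounts.values())
    if total < silence_floor:
        # Condition: no one is in a conversation state.
        return moderator_id
    device, amount = max(sound_amounts.items(), key=lambda kv: kv[1])
    if amount / total >= ratio_threshold:
        return device
    return moderator_id
```

A real implementation would feed this from the sound information held by the relay device, recomputing per predetermined time window as the text describes.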
  • condition detection unit 333 detects conditions according to the transmission/reception status of sound data.
  • the condition detection unit 333 may detect conditions other than those exemplified above. For example, the condition detection unit 333 may detect a condition that sound data including a specific keyword is being transmitted and received, for example by converting the sound data into text and detecting the specific keyword. The condition detection unit 333 may detect conditions for sound data other than those exemplified above. Further, the condition detection unit 333 may detect a condition according to information other than sound data, such as PC button operation, mouse movement, or line of sight, to identify the user device.
  • in response to detection of a condition by the condition detection unit 333, the display instruction unit 334 instructs each user device 400, including the specified user device 400, to display the area being displayed on the user device 400 specified by the condition detection unit 333.
  • the display instruction unit 334 may issue the above instruction to each user device 400 included in the image sharing system 100 other than the specified user device 400 .
  • for example, in response to detection of the condition by the condition detection unit 333, the display instruction unit 334 transmits to each user device 400 a display instruction including identification information for identifying the user device 400 specified by the condition detection unit 333. By transmitting such a display instruction, the display instruction unit 334 instructs each user device 400 to display the area being displayed on the specified user device 400.
  • the display instruction may include display area information corresponding to the specified user device 400 in addition to the identification information.
  • when the condition detection unit 333 detects the condition that sound data is being transmitted/received between user devices 400, it identifies the plurality of user devices 400 transmitting/receiving that sound data. Therefore, the display instruction transmitted by the display instruction unit 334 may include a plurality of pieces of identification information.
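A display instruction carrying one or more pieces of identification information, optionally together with the corresponding display area information, might look like the following. The message shape and field names are assumptions for illustration only.

```python
def build_display_instruction(specified_ids, display_info):
    """Assemble a display instruction message.

    specified_ids: list of identified device ids (possibly several, e.g.
    when sound data is exchanged between user devices).
    display_info: dict mapping device_id -> display area information;
    entries for the specified devices are attached when available.
    """
    return {
        "type": "display_instruction",
        "targets": specified_ids,
        "areas": {d: display_info[d] for d in specified_ids
                  if d in display_info},
    }
```

Each receiving user device would then look up the area for a target id (from the instruction itself, or from its own stored display information) and display that region of the entire data.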
  • the function as the data reception unit 331 and the function as the data transmission unit 332 may be realized using known technology.
  • the display instruction unit 334 may be configured to transmit the display instruction only to some of the user devices 400 included in the image sharing system 100 that satisfy a predetermined condition. In other words, the display instruction unit 334 may be configured to transmit a display instruction to those user devices 400, among the user devices 400 other than the specified user device 400, that satisfy a predetermined condition. Thus, the display instruction unit 334 does not necessarily have to transmit display instructions to all user devices 400 included in the image sharing system 100.
  • the display instruction unit 334 may be configured to transmit a display instruction to user devices 400 belonging to a preset group among user devices 400 other than the specified user device 400 .
  • the data transmission unit 332 may be configured to transmit the display area information only to some of the user devices 400 included in the image sharing system 100 that satisfy a predetermined condition.
  • the user device 400 is an information processing device that displays at least a partial area of image data.
  • the user device 400 is an information processing device having a display device such as a head-mounted display, smart phone, or tablet.
  • the user device 400 has a sensor or the like, and can be configured to be able to detect the user's state such as the user's line-of-sight direction.
  • the user device 400 has a microphone, a speaker, and the like, and is configured to be capable of inputting and outputting sound data indicating a person's voice.
  • the user device 400 is owned by each user at a location different from the site, and is operated by the user. Note that at least some of the user devices 400 may exist on site.
  • FIG. 4 shows a configuration example of the user device 400.
  • the user device 400 has main components such as a communication I/F unit 410, a screen display unit 420, an operation input unit 430, a sound input/output unit 440, a storage unit 450, and an arithmetic processing unit 460.
  • the communication I/F unit 410 consists of a data communication circuit. Communication I/F section 410 performs data communication with an external device such as relay device 300 connected via a communication line.
  • the screen display unit 420 consists of a screen display device such as an LCD (Liquid Crystal Display).
  • the screen display unit 420 can display various information stored in the storage unit 450, such as at least a partial area of the image data, according to an instruction from the arithmetic processing unit 460.
  • the screen display unit 420 may be a display with a touch panel that functions as the operation input unit 430, or the like.
  • the operation input unit 430 consists of operation input devices such as a keyboard, mouse, and various sensors.
  • the operation input unit 430 detects the operation of the user who has the user device 400 and outputs it to the arithmetic processing unit 460 .
  • the sound input/output unit 440 consists of sound input/output devices such as a microphone and a speaker.
  • the sound input/output unit 440 acquires sound data and outputs sound data according to instructions from the arithmetic processing unit 460 .
  • the sound input/output unit 440 can acquire sound data such as voice uttered by the user.
  • the sound input/output unit 440 can output sound data and the like included in the sound information 452 described later.
  • the storage unit 450 is a storage device such as a hard disk or memory.
  • the storage unit 450 stores processing information and programs 455 necessary for various processes in the arithmetic processing unit 460 .
  • the program 455 realizes various processing units by being read into the arithmetic processing unit 460 and executed.
  • the program 455 is read in advance from an external device or recording medium via a data input/output function such as the communication I/F unit 410 and stored in the storage unit 450 .
  • Main information stored in the storage unit 450 includes, for example, image information 451, sound information 452, display information 453, instruction information 454, and the like.
  • the image information 451 is information including image data received from the field device 200 via the relay device 300.
  • the image information 451 is updated, for example, when the data receiving unit 461 receives image data from the relay device 300.
  • the sound information 452 is information including sound data received from the field device 200 or the user device 400 via the relay device 300.
  • the sound information 452 stores sound data for each field device 200 and each user device 400.
  • the sound information 452 associates identification information for identifying the field device 200 and the user device 400 with sound data.
  • the sound information 452 is updated, for example, when the data receiving unit 461 receives sound data from the relay device 300.
  • the display information 453 indicates display area information, which is information for specifying the area being displayed on each user device 400 included in the image sharing system 100.
  • the display information 453 includes display area information for each user device 400.
  • the display information 453 associates identification information for identifying the user device 400 with display area information.
  • the display information 453 is updated when the data receiving unit 461 acquires display area information from the relay device 300.
  • the instruction information 454 indicates the display instruction received from the relay device 300.
  • the instruction information 454 is updated when a display instruction is received from the relay device 300 or the like.
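As a concrete way to picture how the storage unit 450 might organize these four stores, the sketch below keys sound and display information by device identification information, mirroring the associations described above. All class, field, and key names here are hypothetical illustrations, not taken from the specification.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Storage:
    """Hypothetical model of storage unit 450 and its four stores."""
    image_info: Optional[bytes] = None                           # image information 451
    sound_info: Dict[str, bytes] = field(default_factory=dict)   # sound information 452
    display_info: Dict[str, dict] = field(default_factory=dict)  # display information 453
    instruction_info: Optional[dict] = None                      # instruction information 454

    def update_sound(self, device_id: str, data: bytes) -> None:
        # sound data is kept in association with identification information
        self.sound_info[device_id] = data

    def update_display_area(self, device_id: str, area: dict) -> None:
        # display area information is kept per user device, keyed by its ID
        self.display_info[device_id] = area

storage = Storage()
storage.update_sound("field-200", b"\x00\x01")
storage.update_display_area("user-400a", {"x": 10, "y": 20, "w": 640, "h": 480})
print(storage.display_info["user-400a"]["w"])  # → 640
```

A relay-facing data receiving unit would call the two update methods each time tagged sound data or display area information arrives.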
  • the arithmetic processing unit 460 has an arithmetic device such as a CPU and its peripheral circuits.
  • the arithmetic processing unit 460 reads the program 455 from the storage unit 450 and executes it, thereby realizing various processing units through cooperation between the hardware and the program 455.
  • Main processing units realized by the arithmetic processing unit 460 include, for example, a data reception unit 461, a data transmission unit 462, an image display unit 463, a display area information acquisition unit 464, a display area information transmission unit 465, and the like.
  • the data receiving unit 461 receives various data from the relay device 300.
  • the data receiving unit 461 receives image data from the relay device 300. Then, the data receiving unit 461 stores the received image data in the storage unit 450 as the image information 451. Also, the data receiving unit 461 receives sound data from the relay device 300. Then, the data receiving unit 461 stores the received sound data in the storage unit 450 as the sound information 452. At this time, the data receiving unit 461 may store the sound data and the identification information in the storage unit 450 in association with each other.
  • the data receiving unit 461 also receives display instructions from the relay device 300. Then, the data receiving unit 461 stores the received display instruction in the storage unit 450 as the instruction information 454.
  • the data transmission unit 462 transmits various data to the relay device 300.
  • the data transmission unit 462 transmits the acquired sound data to the relay device 300 every time it acquires sound data.
  • the data transmission unit 462 may transmit the sound data and the identification information in association with each other.
  • the image display unit 463 causes the screen display unit 420 to display at least a partial area of the image data included in the image information 451.
  • the image display unit 463 can display a partial area according to the state of the user operating the user device 400, and can display the area being displayed on another user device 400 based on the display information 453.
  • the image display unit 463 refers to the image information 451 and extracts from the image data an area corresponding to the user's state, such as the user's gaze direction detected by a sensor. At this time, the image display unit 463 may extract the area by any method. Then, the image display unit 463 causes the screen display unit 420 to display the extracted area. In this manner, the image display unit 463 can cause the screen display unit 420 to display an area corresponding to the state of the user.
  • the image display unit 463 may extract an area according to the user's selection state, such as an operation on the operation input unit 430, instead of or in addition to detecting the line-of-sight direction.
  • the image display unit 463 may specify the user's state using any means and extract an area according to the specified state.
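Since the specification permits any extraction method, the following is only a minimal sketch of pulling a display area out of panoramic image data from a detected gaze direction; the gaze-to-pixel mapping and the dummy frame are assumptions for illustration, not the patented method.

```python
# Hedged sketch: crop a view window from a 360-degree panorama, centered on
# the user's horizontal gaze angle, wrapping around the panorama seam.
def extract_region(frame, yaw_deg, view_w, view_h):
    """frame: 2-D list of pixel rows spanning 360 degrees horizontally.
    Returns a view_h x view_w window centered on gaze angle yaw_deg."""
    h, w = len(frame), len(frame[0])
    center_x = int((yaw_deg % 360.0) / 360.0 * w)   # assumed gaze-to-pixel mapping
    top = max(0, (h - view_h) // 2)
    cols = [(center_x - view_w // 2 + i) % w for i in range(view_w)]
    return [[frame[r][c] for c in cols] for r in range(top, min(h, top + view_h))]

# 100-row x 360-column dummy panorama whose pixel value encodes its column
frame = [[c for c in range(360)] for _ in range(100)]
region = extract_region(frame, yaw_deg=90.0, view_w=120, view_h=60)
print(len(region), len(region[0]))  # → 60 120
print(region[0][60])                # center column of the view → 90
```

A real viewer would apply a proper spherical projection rather than a flat crop; the point here is only that the displayed area is a deterministic function of the user's state.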
  • the image display unit 463 can refer to the image information 451 and the display information 453 and cause the screen display unit 420 to display the area being displayed on another user device 400.
  • the image display unit 463 can cause the screen display unit 420 to display the other person's display area, which is the area being displayed on any user device 400.
  • the image display unit 463 may be configured to switch the user to be displayed on the screen display unit 420 according to the user's operation on the operation input unit 430.
  • the image display unit 463 may be configured to switch the user to be displayed on the screen display unit 420 according to another user's operation on another user device 400.
  • the image display unit 463 may be configured to display the other person's display area corresponding to a user common to all the user devices 400 included in the image sharing system 100, or to display the other person's display area corresponding to a user selected on its own device.
  • switching of the user to be displayed may be performed by one or more predetermined user devices 400, or by one or more user devices 400 determined according to arbitrary conditions.
  • the image display unit 463 can cause the screen display unit 420 to display the area being displayed on the user device 400 indicated by the display instruction, according to the display instruction received by the data reception unit 461.
  • the image display unit 463 can cause the screen display unit 420 to display the designated area, which is the area being displayed on the designated user device 400.
  • the image display unit 463 refers to the display information 453 to specify the display area information corresponding to the identification information included in the display instruction. Then, the image display unit 463 causes the screen display unit 420 to display the area indicated by the specified display area information.
  • the image display unit 463 causes the screen display unit 420 to display the area indicated by the display area information included in the display instruction.
  • the image display unit 463 can cause the screen display unit 420 to display the instruction area instead of the other person's display area in response to the display instruction.
  • the image display section 463 may cause the screen display section 420 to display the designation area together with the other person's display area.
  • the timing at which the image display unit 463 cancels the display according to the display instruction is not particularly limited.
  • the image display unit 463 may cancel the display according to the display instruction at any timing.
  • FIGS. 5 to 7 show display examples on the screen display section 420 displayed by the image display section 463.
  • FIG. 5 shows a display example when no display instruction has been received.
  • the image display unit 463 causes the display section to display the area extracted according to the state of the user.
  • the image display unit 463 causes the other person's display area to be displayed on the other person's display section.
  • the image display unit 463 can display the other person's display area together with the area extracted according to the state of the user when no display instruction is received. It should be noted that it may be configured such that the user can select whether or not to display the other person's display area.
  • the image display unit 463 may display the area extracted according to the state of the user on the entire screen display unit 420.
  • FIG. 6 shows a display example in a state in which a display instruction is received.
  • the image display unit 463 causes the display unit to display the region extracted according to the state of the user.
  • the image display unit 463 causes the other person's display unit to display the designation area.
  • the image display unit 463 can display the designated area instead of the other person's display area. Note that even if the user has selected not to display the other person's display area, upon receiving the display instruction, the image display unit 463 displays the designated area as shown in FIG. 6.
  • FIG. 7 shows a display example when a display instruction includes a plurality of pieces of identification information.
  • the image display unit 463 can divide the other person's display section into a plurality of sections and display a designated area on each of the divided sections.
  • when the divided display as shown in FIG. 7 is performed, the device can be configured such that at least one of the divided areas can be selected according to the user's selection action, such as the user's operation on the operation input unit 430 or the detection result of the user's line-of-sight direction. Further, when the user makes a selection, only the selected one of the designated areas may be displayed on the other person's display section as shown in FIG. 6, for example.
  • the image display unit 463 may cause the screen display unit 420 to display only some of the plurality of designated areas according to the user's selection action.
  • FIGS. 5 to 7 show display examples, and the display on the screen display unit 420 is not limited to the cases shown in FIGS. 5 to 7.
  • the positions and sizes of the display section and the other person's display section may be changed arbitrarily.
  • the display area information acquisition unit 464 acquires display area information, which is information for specifying the area displayed by the image display unit 463 according to the state of the user. Then, the display area information acquisition unit 464 stores the acquired display area information in the storage unit 450 as the display information 453.
  • the display area information acquisition unit 464 acquires, as the display area information, information corresponding to the user's line-of-sight direction and orientation, information such as coordinates indicating the area being displayed on the user device 400, and other information for specifying the area.
  • the display area information acquisition unit 464 may acquire the display area information at arbitrary timing, such as at predetermined intervals.
  • the display area information transmission unit 465 transmits the display area information acquired by the display area information acquisition unit 464 to the relay device 300.
  • the display area information transmission unit 465 can transmit the display area information to the relay device 300 in association with the identification information of its own device.
  • the display area information transmission unit 465 can transmit the display area information to the relay device 300 every time the display area information acquisition unit 464 acquires the display area information.
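The acquisition-and-transmission cycle of the display area information acquisition unit 464 and the transmission unit 465 could be sketched as below; the JSON message format, function names, and fixed polling interval are assumptions for illustration, not part of the specification.

```python
import json
import queue
import threading
import time

# Hypothetical sketch of units 464/465: periodically read the area currently
# being displayed and transmit it together with this device's identification
# information, as the specification describes.
def display_area_loop(device_id, get_current_area, send, stop, interval=0.05):
    while not stop.is_set():
        area = get_current_area()                          # acquisition unit 464
        send(json.dumps({"id": device_id, "area": area}))  # transmission unit 465
        time.sleep(interval)

sent = queue.Queue()  # stands in for the link to the relay device 300
stop = threading.Event()
t = threading.Thread(
    target=display_area_loop,
    args=("user-400a", lambda: {"x": 0, "y": 0, "w": 640, "h": 480}, sent.put, stop),
)
t.start()
msg = json.loads(sent.get(timeout=1.0))  # first message the relay would receive
stop.set()
t.join()
print(msg["id"])  # → user-400a
```

On the relay side, each such message would overwrite the display information entry for the sending device, so the relay always holds the latest area per user device.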
  • an operation example of the relay device 300 and the user device 400 will be described with reference to FIGS. 8 and 9.
  • FIG. 8 shows an operation example of the relay device 300.
  • the condition detection unit 333 confirms whether or not a preset condition is satisfied (step S101). If the condition is not satisfied (step S101, No), the condition detection unit 333 continues detection of the condition. On the other hand, if the condition is satisfied (step S101, Yes), the condition detection unit 333 identifies the user device 400 corresponding to the detected condition (step S102).
  • the display instruction unit 334 instructs the user device 400 to display the area being displayed on the user device 400 specified by the condition detection unit 333 (step S103). In other words, the display instruction unit 334 transmits a display instruction to the user device 400 . For example, the display instruction unit 334 transmits display instructions to all user devices 400 included in the image sharing system 100 .
  • the display instruction unit 334 may be configured to transmit a display instruction to user devices 400 satisfying a predetermined condition among all user devices 400 included in the image sharing system 100 .
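Steps S101 to S103 above can be sketched as follows, assuming, purely for illustration, that the conversation condition is detected from per-device sound levels against a threshold; the function names and the broadcast-to-all policy are hypothetical.

```python
# Hedged sketch of the relay-side flow: S101 check the preset condition,
# S102 identify the corresponding user device, S103 send display instructions.
def relay_step(sound_levels, connected_devices, send_instruction, threshold=0.5):
    # S101: check whether the preset condition (a conversation) is satisfied
    speaking = [dev for dev, level in sound_levels.items() if level > threshold]
    if not speaking:
        return None  # condition not satisfied; continue detection
    # S102: identify the user device corresponding to the detected condition
    target = max(speaking, key=lambda dev: sound_levels[dev])
    # S103: instruct (here: every connected device) to show the target's area
    for dev in connected_devices:
        send_instruction(dev, {"show_area_of": target})
    return target

log = []
target = relay_step(
    {"user-400a": 0.9, "user-400b": 0.1},
    ["user-400a", "user-400b", "user-400c"],
    lambda dev, instr: log.append((dev, instr)),
)
print(target, len(log))  # → user-400a 3
```

Restricting the broadcast to devices satisfying a predetermined condition, as the preceding bullet allows, would just filter `connected_devices` before the loop.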
  • the above is an operation example of the relay device 300 .
  • an operation example of the user device 400 will be described with reference to FIG. 9.
  • the image display unit 463 displays the other person's display area, which is the area being displayed on any user device 400, when no display instruction is received (step S201, No) (step S203).
  • the user device 400 to be displayed may be selectable according to a user's operation or the like.
  • when a display instruction is received (step S201, Yes), the image display unit 463 displays the designated area, which is the area being displayed on the user device 400 designated by the display instruction (step S202). For example, the image display unit 463 displays the designated area instead of the other person's display area. The image display unit 463 may display the designated area together with the other person's display area.
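The branch in steps S201 to S203 amounts to a small selection rule. The sketch below is a hedged illustration with hypothetical names, showing a received display instruction taking precedence over the user's own selection of another device:

```python
# Hedged sketch of the user-device flow of FIG. 9: if a display instruction
# has arrived (S201), show the designated device's area (S202); otherwise
# show the area of the device the user has selected (S203).
def choose_other_display(display_info, instruction, selected_device):
    if instruction is not None:                              # S201: Yes
        return display_info[instruction["show_area_of"]]     # S202: designated area
    return display_info[selected_device]                     # S203: other's area

areas = {"user-400a": {"x": 0, "y": 0}, "user-400b": {"x": 50, "y": 10}}
# no instruction received: fall back to the user-selected device
print(choose_other_display(areas, None, "user-400b"))
# instruction received: the designated device's area takes precedence
print(choose_other_display(areas, {"show_area_of": "user-400a"}, "user-400b"))
```

`display_info` here plays the role of the display information 453, already keyed by device identification information.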
  • the above is an operation example of the user device 400 .
  • the relay device 300 has the condition detection unit 333 and the display instruction unit 334.
  • in response to detection of a condition by the condition detection unit 333, the display instruction unit 334 can instruct each predetermined user device 400 to display the area being displayed on the user device 400 specified by the condition detection unit 333.
  • as a result, the user device 400 that has received the display instruction can display the area being displayed on the user device 400 indicated by the display instruction.
  • in an image sharing system for sharing image data, this makes it possible for the users of the other user devices 400 to also check the area being checked by the user who is the originator of the conversation, thereby facilitating smooth sharing of information between sharers.
  • the image display unit 463 of the user device 400 is configured to display the area being displayed on the user device 400 indicated by the display instruction, according to the display instruction.
  • the user operating the user device 400 can quickly check the area being checked by the user who is the originator of the conversation, and information can be smoothly shared with the originator of the conversation.
  • the present invention can be utilized in the field of education, such as when educating about work details in the manufacturing process, and in the field of business, such as when multiple users remotely check and review display examples in a store.
  • for example, when a teacher such as an expert speaks, it is possible for a user operating another user device 400 to easily confirm the point of view on which the conversation is based, and it is possible to smoothly share information.
  • the present invention may be utilized in situations other than those exemplified above.
  • the present invention may be applied to shared systems other than those exemplified in this embodiment.
  • the present invention may be applied to an image sharing system capable of displaying a portion of two-dimensional image data on each user device, such as by each zooming the image data.
  • the present invention may be applied to an information sharing system other than image data that displays at least part of information on a user device.
  • the present invention may be applied to a system for sharing information such as a website that transmits display information indicating the website being browsed to a relay device.
  • with reference to FIGS. 10 to 12, an outline of the configurations of an information processing device 500 and a display device 600, each connected to a plurality of user devices that display at least a partial area of the entire data, will be described.
  • FIG. 10 shows a hardware configuration example of the information processing device 500 .
  • the information processing apparatus 500 has the following hardware configuration as an example:
  • a CPU (Central Processing Unit) 501 (arithmetic unit)
  • a ROM (Read Only Memory) 502 (storage unit)
  • a RAM (Random Access Memory) 503 (storage unit)
  • a program group 504 loaded into the RAM 503
  • a storage device 505 for storing the program group 504
  • a drive device 506 that reads and writes a recording medium 510 outside the information processing device
  • a communication interface 507 that connects to a communication network 511 outside the information processing apparatus
  • an input/output interface 508 that inputs and outputs data
  • a bus 509 connecting each component
  • the information processing apparatus 500 can realize the functions of the detection unit 521 and the instruction unit 522 shown in FIG. 11.
  • the program group 504 is stored in the storage device 505 or the ROM 502 in advance, for example, and is loaded into the RAM 503 or the like by the CPU 501 as necessary and executed.
  • the program group 504 may be supplied to the CPU 501 via the communication network 511 or may be stored in the recording medium 510 in advance, and the drive device 506 may read the program and supply it to the CPU 501 .
  • note that FIG. 10 shows an example of the hardware configuration of the information processing device 500, and the hardware configuration of the information processing device 500 is not limited to the case described above.
  • for example, the information processing apparatus 500 may be composed of only part of the above-described configuration, such as omitting the drive device 506.
  • the detection unit 521 detects a predetermined condition. For example, the detection unit 521 detects conditions according to the state of transmission and reception of sound data, such as whether or not it is determined that the user is having a conversation based on the sound data.
  • the instruction unit 522 instructs at least some of the connected user devices to display the area displayed by the user device corresponding to the condition according to the detection result of the detection unit 521 .
  • the instruction unit 522 can perform the above instruction by transmitting a display instruction to instruct the user device to be displayed.
  • the information processing device 500 has the detection unit 521 and the instruction unit 522 .
  • the instruction unit 522 can instruct at least some of the connected user devices to display the area displayed by the user device corresponding to the condition, according to the detection result of the detection unit 521. As a result, the user device that has received the instruction can display the area that is being displayed on the target user device. This enables smooth sharing of information in an image sharing system that shares image data.
  • the information processing device 500 described above can be realized by installing a predetermined program in the information processing device 500 .
  • according to a program that is another aspect of the present invention, the information processing device 500 connected to a plurality of user devices displaying at least a partial area of the entire data detects a predetermined condition. Then, according to the detected result, at least some of the connected user devices are instructed to display the area displayed by the user device corresponding to the condition.
  • the present invention can achieve the object of the present invention for the same reason even with a display device 600 having a configuration as shown in FIG. 12.
  • the hardware configuration of the display device 600 may be the same as that of the information processing device 500 described with reference to FIG. 10 . Therefore, description of the hardware configuration of the display device 600 is omitted.
  • the display device 600 can realize the functions of the reception unit 621 and the display unit 622 by having the CPU acquire and execute a group of programs.
  • the receiving unit 621 receives display area information, which is information for specifying an area being displayed on another user device, among the entire data.
  • the display unit 622 displays an area corresponding to the state of the user out of the entire data, and displays an area being displayed on another user device based on the display area information. Further, when the display unit 622 receives a display instruction transmitted from an external device as a detection result of a predetermined condition, the display unit 622 displays the area currently displayed on the user device indicated by the display instruction out of the entire data based on the display instruction.
  • as described above, the display device 600 can also achieve the object of the present invention.
  • (Appendix 1) An information processing device connected to a plurality of user devices that display at least a partial area of all data, the information processing device comprising: a detection unit that detects a predetermined condition; and an instruction unit that instructs at least some of the connected user devices to display the area displayed by the user device corresponding to the condition according to the detection result of the detection unit.
  • (Appendix 2) The information processing apparatus according to appendix 1, wherein the detection unit detects a condition according to a transmission/reception state of sound data.
  • (Appendix 3) The information processing apparatus according to appendix 1 or appendix 2, wherein the detection unit detects that a predetermined condition is satisfied when it can be determined that a conversation is being held according to the transmission/reception status of the sound data.
  • (Appendix 4) The information processing apparatus according to any one of appendices 1 to 3, wherein the detection unit detects that a predetermined condition is satisfied when it can be determined that the field device and the user device are having a conversation, and the instruction unit instructs at least some of the connected user devices to display the area displayed by the user device that can be determined to be having a conversation with the field device, according to the detection result of the detection unit.
  • (Appendix 5) The information processing apparatus according to any one of appendices 1 to 4, further comprising: a receiving unit that receives display area information for specifying the area being displayed on the user device; and a transmitting unit configured to transmit the display area information received by the receiving unit to at least some of the connected user devices.
  • Appendix 6 6.
  • (Appendix 7) An information processing method in which an information processing device connected to a plurality of user devices that display at least a partial area of all data detects a predetermined condition, and instructs at least some of the connected user devices to display the area displayed by the user device corresponding to the condition according to the detection result.
  • (Appendix 8) A program for causing an information processing device connected to a plurality of user devices that display at least a partial area of all data to realize a process of detecting a predetermined condition and instructing at least some of the connected user devices to display the area displayed by the user device corresponding to the condition according to the detection result.
  • (Appendix 9) A display device comprising: a receiving unit that receives display area information, which is information for specifying an area being displayed on another user device in the entire data; and a display unit that displays an area corresponding to the state of the user out of the entire data and displays an area that is being displayed on another user device based on the display area information, wherein the display unit displays, based on a display instruction transmitted from an external device as a detection result of a predetermined condition, the area being displayed on the user device indicated by the display instruction out of the entire data.
  • (Appendix 10) A display method in which a user device receives display area information, which is information for specifying an area being displayed on another user device in the entire data; displays an area corresponding to the state of the user out of the entire data and displays an area being displayed on another user device based on the display area information; and, upon receiving a display instruction transmitted from an external device as a result of detection of a predetermined condition, displays an area currently displayed on the user device indicated by the display instruction out of the entire data, based on the received display instruction.
  • the programs described in each of the above embodiments and supplementary notes are stored in a storage device or recorded in a computer-readable recording medium.
  • the recording medium is a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, and a semiconductor memory.
  • image sharing system 200 field device 210 image data acquisition unit 220 sound data input/output unit 230 transmission/reception unit 300 relay device 310 communication I/F unit 320 storage unit 321 image information 322 sound information 323 display information 324 program 330 arithmetic processing unit 331 data Reception unit 332 Data transmission unit 333 Condition detection unit 334 Display instruction unit 400 User device 410 Communication I/F unit 420 Screen display unit 430 Operation input unit 440 Sound input/output unit 450 Storage unit 451 Image information 452 Sound information 453 Display information 454 Instruction information 455 program 460 arithmetic processing unit 461 data reception unit 462 data transmission unit 463 image display unit 464 display area information acquisition unit 465 display area information transmission unit 500 information processing device 501 CPU 502 ROMs 503 RAM 504 program group 505 storage device 506 drive device 507 communication interface 508 input/output interface 509 bus 510 recording medium 511 communication network 521 detection unit 522 instruction unit 600 display device 621 reception unit 622 display unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Computational Linguistics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Provided is an information processing device that is connected with a plurality of user devices which display at least a partial region of entire data, said information processing device comprising: a detection unit that detects a predetermined condition; and an instruction unit that instructs, according to the result of the detection by the detection unit, at least some of the connected user devices to display a region which a user device corresponding to the condition is displaying.

Description

Information processing device

 The present invention relates to an information processing device, an information processing method, a recording medium, a display device, and a display method.

 Image data acquired by an imaging device such as a 360-degree camera may be distributed. For example, Patent Literature 1 describes a system for displaying image data on a head-mounted display, which has a view control unit, an image generation unit, and an output unit. According to Patent Literature 1, the view control unit detects the range of the user's field of view based on the orientation of the head-mounted display. The image generation unit generates an image by extracting the range of the user's field of view from the image data. The output unit then outputs the generated image to the head-mounted display for display.

JP 2019-139673 A

 In the situation described in Patent Literature 1, when multiple users display image data according to their respective postures, the image data viewed by each user differs. As a result, since the range being displayed differs for each user, it can be difficult for other users to accurately grasp a point that one user has pointed out.

 In this way, in a system that shares information such as image data, there are cases where it is difficult to share information smoothly.

 Accordingly, an object of the present invention is to provide an information processing device, an information processing method, a recording medium, a display device, and a display method that solve the problem that it is sometimes difficult to smoothly share information in a system that shares information such as image data.
 To achieve this object, an information processing device according to one aspect of the present disclosure is an information processing device connected to a plurality of user devices that display at least a partial area of the entire data, the information processing device comprising: a detection unit that detects a predetermined condition; and an instruction unit that instructs at least some of the connected user devices to display the area displayed by the user device corresponding to the condition, according to the detection result of the detection unit.
 An information processing method according to another aspect of the present disclosure is a method in which an information processing device connected to a plurality of user devices that display at least a partial area of the entire data detects a predetermined condition and, according to the detection result, instructs at least some of the connected user devices to display the area displayed by the user device corresponding to the condition.
 A recording medium according to another aspect of the present disclosure is a computer-readable recording medium on which is recorded a program for causing an information processing device connected to a plurality of user devices that display at least a partial area of the entire data to execute a process of detecting a predetermined condition and, according to the detection result, instructing at least some of the connected user devices to display the area displayed by the user device corresponding to the condition.
 A display device according to another aspect of the present disclosure comprises: a receiving unit that receives display area information, which is information for specifying the area of the entire data being displayed on another user device; and a display unit that displays the area of the entire data corresponding to the state of the user and displays, based on the display area information, the area being displayed on another user device, wherein the display unit displays, based on a display instruction transmitted from an external device as a detection result of a predetermined condition, the area of the entire data being displayed on the user device indicated by the display instruction.
A display method according to another aspect of the present disclosure is configured such that a user device:
receives display area information, which is information for specifying the area of the entire data being displayed on another user device;
displays an area of the entire data corresponding to the state of the user and, based on the display area information, displays the area being displayed on the other user device; and
upon receiving a display instruction transmitted from an external device as a result of detection of a predetermined condition, displays, based on the received display instruction, the area of the entire data being displayed on the user device indicated by the display instruction.
According to each of the configurations described above, it is possible to provide an information processing device, information processing method, recording medium, display device, and display method that enable smooth sharing of information in a system for sharing information such as image data.
FIG. 1 is a diagram showing a configuration example of an image sharing system according to a first embodiment of the present disclosure.
FIG. 2 is a block diagram showing a configuration example of the on-site device shown in FIG. 1.
FIG. 3 is a block diagram showing a configuration example of the relay device shown in FIG. 1.
FIG. 4 is a block diagram showing a configuration example of the user device shown in FIG. 1.
FIG. 5 is a diagram showing a display example of the screen display unit.
FIG. 6 is a diagram showing a display example of the screen display unit.
FIG. 7 is a diagram showing a display example of the screen display unit.
FIG. 8 is a flowchart showing an operation example of the relay device.
FIG. 9 is a flowchart showing an operation example of the user device.
FIG. 10 is a diagram showing a hardware configuration example of an information processing device according to a second embodiment of the present disclosure.
FIG. 11 is a block diagram showing a configuration example of the information processing device.
FIG. 12 is a block diagram showing a configuration example of the display device.
[First Embodiment]
A first embodiment of the present disclosure will be described with reference to FIGS. 1 to 9. FIG. 1 is a diagram showing a configuration example of an image sharing system 100. FIG. 2 is a block diagram showing a configuration example of the on-site device 200. FIG. 3 is a block diagram showing a configuration example of the relay device 300. FIG. 4 is a block diagram showing a configuration example of the user device 400. FIGS. 5 to 7 are diagrams showing display examples of the screen display unit. FIG. 8 is a flowchart showing an operation example of the relay device 300. FIG. 9 is a flowchart showing an operation example of the user device 400.
The first embodiment of the present disclosure describes an image sharing system 100 in which image data acquired by a 360-degree camera of an on-site device 200 is shared among a plurality of user devices 400. In the image sharing system 100 of this embodiment, each user device 400 displays at least a partial area of the entire image data acquired by the 360-degree camera according to the state of its user, such as an image of the area extracted according to the user's gaze direction. In addition, as described later, the image sharing system 100 of this embodiment detects a predetermined condition, such as a condition concerning sound data. Upon detection, the image sharing system 100 causes each user device 400 included in the image sharing system 100 to display the area of the image data being displayed by the user corresponding to the detected condition.
FIG. 1 shows a configuration example of the image sharing system 100. Referring to FIG. 1, the image sharing system 100 includes an on-site device 200, a relay device 300, and user devices 400. As shown in FIG. 1, the image sharing system 100 includes a plurality of user devices 400 (user device 400-1, user device 400-2, and so on; as FIG. 1 shows, the image sharing system 100 may include user devices 400 of different types). The image sharing system 100 may also include a plurality of on-site devices 200 and relay devices 300.
As shown in FIG. 1, the on-site device 200 and the relay device 300 are connected so as to be able to communicate with each other, for example by wireless communication. Likewise, the relay device 300 and the user devices 400 are connected so as to be able to communicate with each other, for example by wireless communication.
The on-site device 200 is an information processing device that acquires image data. For example, the on-site device 200 has an imaging device such as a 360-degree camera and acquires image data using the imaging device. The on-site device 200 also has a microphone, a speaker, and the like, and is configured to input and output sound data representing, for example, human voices. The on-site device 200 is placed in or around a factory, store, or other site to be imaged, and is operated by a person at the site.
FIG. 2 shows a configuration example of the on-site device 200. Referring to FIG. 2, the on-site device 200 has, for example, an image data acquisition unit 210, a sound data input/output unit 220, and a transmission/reception unit 230. The on-site device 200 has, for example, an arithmetic device such as a CPU and a storage device, and implements each of the above processing units by having the arithmetic device execute a program stored in the storage device.
The image data acquisition unit 210 acquires image data, for example by transmitting a predetermined instruction to the imaging device. For example, the image data acquisition unit 210 acquires omnidirectional (or hemispherical) image data from the 360-degree camera serving as the imaging device. The image data acquisition unit 210 can also store the acquired image data in the storage device of the on-site device 200.
The sound data input/output unit 220 acquires and outputs sound data using a microphone, a speaker, and the like. For example, the sound data input/output unit 220 can acquire sound data such as voices uttered by the person at the site. The sound data input/output unit 220 can also output sound data that the transmission/reception unit 230, described later, has received from the relay device 300.
The transmission/reception unit 230 transmits and receives various data to and from the relay device 300. For example, the transmission/reception unit 230 transmits the image data acquired by the image data acquisition unit 210 and the sound data acquired by the sound data input/output unit 220 to the relay device 300, which is an external device. The transmission/reception unit 230 also receives sound data and the like from the relay device 300.
The above is a configuration example of the on-site device 200. The on-site device 200 may consist of a single information processing device, or of a plurality of communicably connected devices, such as a 360-degree camera, a speaker, and an information processing device.
The on-site device 200 may be configured so that the conversation partner with which sound data is exchanged can be set, for example by an operation of the person at the site. In other words, the on-site device 200 may be configured to hold a conversation only with the user device 400 set by such an operation. Even when configured to converse only with the set user device 400, the system can, as described later, be configured to transmit the sound data sent by the on-site device 200 and by the user device 400 set as the conversation partner to user devices 400 that are not set as conversation partners.
The relay device 300 is an information processing device, such as a server device, that transmits and receives data between the on-site device 200 and the user devices 400. As described later, the relay device 300 also detects a predetermined condition; upon detection, the relay device 300 instructs each user device 400 to display the area of the image data being displayed by the user corresponding to the detected condition.
FIG. 3 shows a configuration example of the relay device 300. Referring to FIG. 3, the relay device 300 has, as main components, a communication I/F unit 310, a storage unit 320, and an arithmetic processing unit 330, for example.
FIG. 3 illustrates the case where the relay device 300 is realized by a single information processing device. However, the functions of the relay device 300 may instead be realized by a plurality of information processing devices communicably connected to each other, for example on a cloud.
The communication I/F unit 310 consists of a data communication circuit. The communication I/F unit 310 performs data communication with external devices, such as the on-site device 200 and the user devices 400, connected via a communication line.
The storage unit 320 is a storage device such as a hard disk or memory. The storage unit 320 stores the processing information and the program 324 required for the various processes in the arithmetic processing unit 330. The program 324 realizes the various processing units by being read into and executed by the arithmetic processing unit 330. The program 324 is read in advance from an external device or a recording medium via a data input/output function such as the communication I/F unit 310 and stored in the storage unit 320. The main information stored in the storage unit 320 includes, for example, image information 321, sound information 322, and display information 323.
The image information 321 is information including the image data received from the on-site device 200. The image information 321 is updated, for example, when the data reception unit 331 receives image data from the on-site device 200.
The sound information 322 is information including the sound data received from the on-site device 200 and the user devices 400. In the sound information 322, sound data is stored per on-site device 200 and user device 400; in other words, identification information for identifying the on-site device 200 or the user device 400 is associated with the sound data. The sound information 322 is updated, for example, when the data reception unit 331 receives sound data from the on-site device 200 or a user device 400.
The display information 323 holds display area information, which is information for specifying the area of the entire image data being displayed on each user device 400. The display information 323 includes display area information per user device 400; in other words, identification information for identifying each user device 400 is associated with its display area information. The display information 323 is updated, for example, when the data reception unit 331 acquires display area information from a user device 400.
The display area information is not specifically limited as long as it can specify the area being displayed on the user device 400. For example, the display area information may be information corresponding to the user's gaze direction or orientation, information such as coordinates indicating the area being displayed on the user device 400, or any other information for specifying the area.
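As a non-limiting illustration of one such representation, display area information given as a gaze direction (yaw and pitch angles) can be converted into a pixel rectangle within an equirectangular 360-degree frame. The function name, parameters, and mapping below are hypothetical and do not appear in the disclosure; they merely sketch one way a receiver could interpret gaze-based display area information:

```python
def gaze_to_region(yaw_deg, pitch_deg, fov_deg, frame_w, frame_h):
    """Map a gaze direction to a crop rectangle in an equirectangular frame.

    yaw_deg:   horizontal gaze angle, -180..180 (0 = frame centre)
    pitch_deg: vertical gaze angle, -90..90 (0 = horizon)
    fov_deg:   field of view of the user's display
    Returns (left, top, width, height) in pixels, clamped to the frame.
    """
    # Equirectangular mapping: 360 degrees of yaw span frame_w pixels,
    # 180 degrees of pitch span frame_h pixels.
    cx = (yaw_deg + 180.0) / 360.0 * frame_w
    cy = (90.0 - pitch_deg) / 180.0 * frame_h
    w = fov_deg / 360.0 * frame_w
    h = fov_deg / 180.0 * frame_h
    left = max(0.0, min(frame_w - w, cx - w / 2))
    top = max(0.0, min(frame_h - h, cy - h / 2))
    return (int(left), int(top), int(w), int(h))

# A user looking straight ahead (yaw 0, pitch 0) with a 90-degree field
# of view on a 4096x2048 frame sees the central portion of the panorama.
region = gaze_to_region(0, 0, 90, 4096, 2048)
```

Coordinate-based display area information could skip this conversion entirely and carry the rectangle directly, which is why the disclosure leaves the representation open.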
The arithmetic processing unit 330 has an arithmetic device such as a CPU and its peripheral circuits. The arithmetic processing unit 330 reads the program 324 from the storage unit 320 and executes it, thereby realizing the various processing units through cooperation between the hardware and the program 324. The main processing units realized by the arithmetic processing unit 330 include, for example, a data reception unit 331, a data transmission unit 332, a condition detection unit 333, and a display instruction unit 334.
The data reception unit 331 receives various data from the on-site device 200 and the user devices 400.
For example, the data reception unit 331 receives image data from the on-site device 200 and stores the received image data in the storage unit 320 as the image information 321. The data reception unit 331 also receives sound data from the on-site device 200 and stores the received sound data in the storage unit 320 as the sound information 322. In doing so, the data reception unit 331 can store the sound data so that it can be identified as sound data received from the on-site device 200, for example by associating the sound data with identification information identifying the on-site device 200.
Likewise, the data reception unit 331 receives sound data from the user devices 400 and stores the received sound data in the storage unit 320 as the sound information 322. In doing so, the data reception unit 331 can store the sound data per user device 400, for example by associating the sound data with identification information identifying the user device 400. The data reception unit 331 also receives display area information from the user devices 400 and stores the received display area information in the storage unit 320 as the display information 323, again per user device 400, for example by associating the display area information with identification information identifying the user device 400.
The data transmission unit 332 transmits various data to the on-site device 200 and the user devices 400. For example, each time the data transmission unit 332 receives data from the on-site device 200 or a user device 400, it transmits the received data to the user devices 400 or the on-site device 200.
For example, the data transmission unit 332 transmits the sound data received from the user devices 400 to the on-site device 200. The data transmission unit 332 may transmit the sound data in association with the identification information. When a conversation partner is set, the data transmission unit 332 may be configured to transmit only the sound data received from the preset user device 400 to the on-site device 200.
The data transmission unit 332 also transmits the image data and sound data received from the on-site device 200 to the user devices 400, and transmits sound data received from one user device 400 to the other user devices 400, optionally in association with the identification information. The data transmission unit 332 can likewise transmit display area information received from one user device 400 to the other user devices 400. Regardless of whether a conversation partner is set, the data transmission unit 332 may be configured to transmit sound data to all the user devices 400.
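The fan-out performed by the data reception unit 331 and data transmission unit 332 for display area information can be sketched as follows. The class and method names, and the callback-based delivery, are illustrative assumptions, not part of the disclosure; the sketch only shows the store-then-forward pattern described above:

```python
class Relay:
    """Minimal sketch of the relay's store-and-forward logic for
    display area information (hypothetical API)."""

    def __init__(self):
        self.devices = {}       # device_id -> callback that delivers messages
        self.display_info = {}  # device_id -> latest display area information

    def connect(self, device_id, callback):
        self.devices[device_id] = callback

    def receive_display_area(self, sender_id, area):
        # Store the sender's current display area (the display information) ...
        self.display_info[sender_id] = area
        # ... and forward it to every other connected user device.
        for device_id, deliver in self.devices.items():
            if device_id != sender_id:
                deliver({"type": "display_area", "from": sender_id, "area": area})

# Two connected user devices; user-1 reports its display area and
# user-2 receives the forwarded notification.
received = []
relay = Relay()
relay.connect("user-1", lambda msg: received.append(("user-1", msg)))
relay.connect("user-2", lambda msg: received.append(("user-2", msg)))
relay.receive_display_area("user-1", {"yaw": 30, "pitch": 0})
```

A real relay would deliver over the communication I/F unit 310 rather than in-process callbacks, and could filter recipients (for example to a preset group) as the disclosure permits.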
The condition detection unit 333 detects a predetermined condition. For example, the condition detection unit 333 detects a condition according to the transmission/reception status of sound data. The condition detection unit 333 also identifies the user device 400 corresponding to the detected condition.
For example, the condition detection unit 333 detects a condition according to the transmission/reception status of sound data based on the conversation-partner setting. Specifically, when a conversation between the site and a certain user device 400 is set via the on-site device 200, the condition detection unit 333 detects that the condition that sound data is exchanged between the on-site device 200 and that user device 400 is satisfied. The condition detection unit 333 then identifies the user device 400 set as the conversation partner as the user device 400 corresponding to the detected condition.
The condition detection unit 333 may also detect a condition according to the transmission/reception status of sound data based on the status of the sound data included in the sound information 322. Specifically, when the status of the sound data makes it possible to judge that users, or a user and the site, are conversing, the condition detection unit 333 detects that the condition that sound data is exchanged between the on-site device 200 and a user device 400, or between user devices 400, is satisfied. The condition detection unit 333 then identifies the user devices 400 judged to be conversing as the user devices 400 corresponding to the detected condition. The condition detection unit 333 may make this judgment by any means, such as the amount or ratio of sound data per predetermined time. For example, based on the ratio of the amount of sound data, the condition detection unit 333 may identify the user device 400 conversing with the on-site device 200 or the user devices 400 conversing with each other. In addition, when it is judged from the amount of sound data that no one is talking, the condition detection unit 333 may detect that the condition of not being in a conversation state is satisfied and identify a user device 400 predetermined as, for example, the moderator. That is, the condition detection unit 333 may treat the absence of sound (or sound at or below a predetermined level) as the condition to detect.
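The "amount or ratio of sound data per predetermined time" criterion above can be sketched concretely. The per-window level representation, the threshold values, and the function names are illustrative assumptions; the disclosure only requires that some such ratio-based judgment be possible:

```python
def in_conversation(frames, threshold=0.05, min_active_ratio=0.3):
    """Judge whether a device's recent audio suggests active speech.

    frames: audio levels (e.g. per-window RMS) for the last few seconds.
    A device is judged to be conversing when at least min_active_ratio
    of its recent windows exceed the level threshold.
    """
    if not frames:
        return False
    active = sum(1 for level in frames if level > threshold)
    return active / len(frames) >= min_active_ratio

def detect_speakers(levels_by_device):
    """Return the device ids judged to be in conversation."""
    return [d for d, frames in levels_by_device.items()
            if in_conversation(frames)]

speakers = detect_speakers({
    "user-1": [0.2, 0.3, 0.0, 0.25, 0.1],   # mostly talking
    "user-2": [0.0, 0.0, 0.01, 0.0, 0.0],   # silent
})
```

The complementary "no one is talking" condition is simply `detect_speakers(...)` returning an empty list, at which point a predetermined device (for example the moderator's) could be selected instead.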
As described above, for example, the condition detection unit 333 detects a condition according to the transmission/reception status of sound data.
The condition detection unit 333 may detect conditions other than those exemplified above. For example, the condition detection unit 333 may transcribe the sound data into text and, when a specific keyword is found, detect the condition that sound data containing the specific keyword is being exchanged. The condition detection unit 333 may also detect other conditions concerning sound data, or identify a user device by detecting a condition based on information other than sound data, such as PC button operations, mouse movements, or gaze.
In response to the detection of a condition by the condition detection unit 333, the display instruction unit 334 instructs each user device 400, including the user device 400 identified by the condition detection unit 333, to display the area being displayed on the identified user device 400. Alternatively, the display instruction unit 334 may issue this instruction to each user device 400 included in the image sharing system 100 except the identified user device 400.
For example, in response to the detection of a condition by the condition detection unit 333, the display instruction unit 334 transmits to each user device 400 a display instruction that includes identification information for identifying the user device 400 identified by the condition detection unit 333. By transmitting such a display instruction, the display instruction unit 334 instructs each user device 400 to display the area being displayed on the identified user device 400. Besides the identification information, the display instruction may include, for example, the display area information corresponding to the identified user device 400. Furthermore, when the condition detection unit 333 detects the condition that sound data is exchanged between user devices 400, it identifies the plurality of user devices 400 exchanging that sound data; the display instruction transmitted by the display instruction unit 334 may therefore include a plurality of pieces of identification information.
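One possible shape of such a display instruction, carrying one or more device identifiers and, optionally, the corresponding display area information, is sketched below. The field names and message layout are purely illustrative assumptions and are not specified by the disclosure:

```python
def build_display_instruction(target_ids, display_info):
    """Assemble a display instruction naming the device(s) whose
    displayed area the receiving devices should switch to.

    target_ids:   identification information of the identified device(s);
                  may contain several ids when multiple devices converse.
    display_info: mapping of device id -> display area information,
                  used to optionally embed the targets' current areas.
    """
    return {
        "type": "display_instruction",
        "device_ids": list(target_ids),
        # Embedding the areas lets receivers switch views without a
        # separate lookup of the display information.
        "areas": {d: display_info.get(d) for d in target_ids},
    }

instr = build_display_instruction(
    ["user-3"], {"user-3": {"yaw": 90, "pitch": -10}})
```

A receiving user device 400 would read `device_ids` (and `areas`, if present) and display the corresponding region of the image data, as described for the display unit later in the disclosure.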
The above is a configuration example of the relay device 300. Among the functions of the relay device 300, those of the data reception unit 331 and the data transmission unit 332 may be realized using known techniques. The display instruction unit 334 may also be configured to transmit the display instruction only to those user devices 400 in the image sharing system 100 that satisfy a predetermined condition; in other words, it may transmit the display instruction to the user devices 400, other than the identified user device 400, that satisfy a predetermined condition. That is, the display instruction unit 334 need not necessarily transmit the display instruction to all the user devices 400 included in the image sharing system 100. For example, the display instruction unit 334 may be configured to transmit the display instruction to the user devices 400, other than the identified user device 400, that belong to a preset group. Similarly, the data transmission unit 332 may be configured to transmit the display area information only to those user devices 400 in the image sharing system 100 that satisfy a predetermined condition.
The user device 400 is an information processing device that displays at least a partial area of the image data. For example, the user device 400 is an information processing device having a display device such as a head-mounted display, smartphone, or tablet. The user device 400 has, for example, sensors, and can be configured to detect the state of its user, such as the user's gaze direction. The user device 400 also has a microphone, a speaker, and the like, and is configured to input and output sound data representing, for example, human voices. The user devices 400 are typically owned and operated by users located away from the site, although at least some of the user devices 400 may be at the site.
FIG. 4 shows a configuration example of the user device 400. Referring to FIG. 4, the user device 400 has, as main components, a communication I/F unit 410, a screen display unit 420, an operation input unit 430, a sound input/output unit 440, a storage unit 450, and an arithmetic processing unit 460, for example.
The communication I/F unit 410 consists of a data communication circuit. The communication I/F unit 410 performs data communication with external devices, such as the relay device 300, connected via a communication line.
The screen display unit 420 consists of a screen display device such as an LCD (Liquid Crystal Display). In accordance with instructions from the arithmetic processing unit 460, the screen display unit 420 can display various information stored in the storage unit 450, such as at least a partial area of the image data. The screen display unit 420 may be, for example, a display with a touch panel that also functions as the operation input unit 430.
 The operation input unit 430 consists of operation input devices such as a keyboard, a mouse, and various sensors. The operation input unit 430 detects operations by the user of the user device 400 and outputs them to the arithmetic processing unit 460.
 The sound input/output unit 440 consists of sound input/output devices such as a microphone and a speaker. The sound input/output unit 440 acquires sound data and outputs sound data in response to instructions from the arithmetic processing unit 460. For example, the sound input/output unit 440 can acquire sound data such as speech uttered by the user. The sound input/output unit 440 can also output sound data included in the sound information 452 described later.
 The storage unit 450 is a storage device such as a hard disk or memory. The storage unit 450 stores the processing information and the program 455 required for the various processes performed by the arithmetic processing unit 460. The program 455 implements the various processing units by being loaded into and executed by the arithmetic processing unit 460. The program 455 is read in advance from an external device or a recording medium via a data input/output function such as the communication I/F unit 410, and stored in the storage unit 450. The main information stored in the storage unit 450 includes, for example, image information 451, sound information 452, display information 453, and instruction information 454.
 The image information 451 is information including the image data received from the field device 200 via the relay device 300. The image information 451 is updated, for example, when the data receiving unit 461 receives image data from the relay device 300.
 The sound information 452 is information including the sound data received from the field devices 200 and the user devices 400 via the relay device 300. In the sound information 452, sound data is stored for each field device 200 and user device 400. In other words, the sound information 452 associates identification information identifying a field device 200 or user device 400 with sound data. The sound information 452 is updated, for example, when the data receiving unit 461 receives sound data from the relay device 300.
 The display information 453 indicates display area information, which is information for identifying the area currently being displayed on each user device 400 included in the image sharing system 100. For example, the display information 453 contains display area information for each user device 400. In other words, the display information 453 associates identification information identifying a user device 400 with display area information. The display information 453 is updated, for example, when the data receiving unit 461 acquires display area information from the relay device 300.
 The instruction information 454 indicates a display instruction received from the relay device 300. The instruction information 454 is updated, for example, when a display instruction is received from the relay device 300.
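 As an illustrative sketch only (not part of the disclosed embodiment), the information held in the storage unit 450 can be modeled as a set of records keyed by device identification information. All field names and types below are assumptions introduced for illustration.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Storage450:
    """Illustrative stand-in for the storage unit 450."""
    image_info: Optional[bytes] = None                         # image information 451 (latest image data)
    sound_info: Dict[str, bytes] = field(default_factory=dict)  # sound information 452: device ID -> sound data
    display_info: Dict[str, dict] = field(default_factory=dict) # display information 453: device ID -> display area info
    instruction_info: Optional[dict] = None                    # instruction information 454 (latest display instruction)

store = Storage450()
# Received sound data and display area information update the entries keyed
# by the sending device's identification information.
store.sound_info["user-2"] = b"\x00\x01"
store.display_info["user-2"] = {"yaw": 30.0, "pitch": -5.0}
# A received display instruction overwrites the instruction information 454.
store.instruction_info = {"ids": ["user-2"]}
```

This mirrors the description above: the image data and the display instruction are single latest values, while sound data and display area information are stored per device.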
 The arithmetic processing unit 460 has an arithmetic device such as a CPU and its peripheral circuits. The arithmetic processing unit 460 reads the program 455 from the storage unit 450 and executes it, thereby implementing the various processing units through cooperation between the hardware and the program 455. The main processing units implemented by the arithmetic processing unit 460 include, for example, a data receiving unit 461, a data transmitting unit 462, an image display unit 463, a display area information acquisition unit 464, and a display area information transmitting unit 465.
 The data receiving unit 461 receives various data from the relay device 300.
 For example, the data receiving unit 461 receives image data from the relay device 300 and stores the received image data in the storage unit 450 as the image information 451. The data receiving unit 461 also receives sound data from the relay device 300 and stores the received sound data in the storage unit 450 as the sound information 452. At this time, the data receiving unit 461 may store the sound data in the storage unit 450 in association with the corresponding identification information.
 The data receiving unit 461 also receives display instructions from the relay device 300 and stores each received display instruction in the storage unit 450 as the instruction information 454.
 The data transmitting unit 462 transmits various data to the relay device 300. For example, each time sound data is acquired, the data transmitting unit 462 transmits the acquired sound data to the relay device 300. The data transmitting unit 462 may transmit the sound data in association with the device's identification information.
 The image display unit 463 causes the screen display unit 420 to display at least a partial area of the image data included in the image information 451. For example, the image display unit 463 can display a partial area corresponding to the state of the user operating the user device 400 and, based on the display information 453, can also display the area currently being displayed on another user device 400.
 For example, the image display unit 463 refers to the image information 451 and extracts, from the image data, an area corresponding to the user's state, such as the user's line-of-sight direction detected by the sensor. The image display unit 463 may extract the area by any method. The image display unit 463 then causes the screen display unit 420 to display the extracted area. In this way, the image display unit 463 can cause the screen display unit 420 to display an area corresponding to the user's state.
 Note that the image display unit 463 may extract the area according to the user's selection state, such as an operation on the operation input unit 430, instead of or in addition to line-of-sight detection. The image display unit 463 may identify the user's state by any means and extract an area according to the identified state.
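 As one hypothetical realization of such extraction (the embodiment deliberately leaves the method open), a gaze direction can be mapped to a pixel window on an equirectangular panorama such as that produced by a 360-degree camera. The window size, horizontal wrapping, and vertical clamping below are illustrative assumptions, not part of the disclosure.

```python
def extract_region(img_w, img_h, yaw_deg, pitch_deg, view_w, view_h):
    """Map a gaze direction (yaw in [-180, 180), pitch in [-90, 90]) to a
    (left, top, width, height) pixel window on an equirectangular panorama.
    A real viewer would perform a full perspective reprojection; this sketch
    only selects the crop rectangle."""
    cx = int((yaw_deg + 180.0) / 360.0 * img_w)           # yaw -> horizontal centre
    cy = int((90.0 - pitch_deg) / 180.0 * img_h)          # pitch -> vertical centre
    left = (cx - view_w // 2) % img_w                     # wrap around the seam
    top = min(max(cy - view_h // 2, 0), img_h - view_h)   # clamp at the poles
    return left, top, view_w, view_h

# Gaze straight ahead (yaw 0, pitch 0) on a 4000x2000 panorama: the centre is
# (2000, 1000), so an 800x600 window starts at (1600, 700).
```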
 The image display unit 463 can also refer to the image information 451 and the display information 453 and cause the screen display unit 420 to display the area currently being displayed on another user device 400. In other words, the image display unit 463 can cause the screen display unit 420 to display an other-user display area, which is the area being displayed on an arbitrary user device 400. For example, the image display unit 463 may be configured so that the user whose area is displayed on the screen display unit 420 can be switched in response to the user's operation on the operation input unit 430. The image display unit 463 may also be configured so that the target user can be switched in response to another user's operation on another user device 400. In other words, the image display unit 463 may be configured so that each user device 400 included in the image sharing system 100 displays the other-user display area corresponding to a common user, or so that each device displays the other-user display area corresponding to the user selected on that device. Switching of the target user may be performed by one or more predetermined user devices 400, or by one or more user devices 400 determined according to arbitrary conditions.
 The image display unit 463 can also, in response to a display instruction received by the data receiving unit 461, cause the screen display unit 420 to display the area being displayed on the user device 400 indicated by the display instruction. In other words, the image display unit 463 can cause the screen display unit 420 to display an instructed area, which is the area being displayed on the indicated user device 400. For example, when the data receiving unit 461 receives a display instruction, the image display unit 463 refers to the display information 453 and identifies the display area information corresponding to the identification information included in the display instruction. The image display unit 463 then causes the screen display unit 420 to display the area indicated by the identified display area information. Alternatively, when the display instruction itself contains display area information, the image display unit 463 causes the screen display unit 420 to display the area indicated by the display area information included in the display instruction.
 Note that, in response to a display instruction, the image display unit 463 can cause the screen display unit 420 to display the instructed area instead of the other-user display area. The image display unit 463 may also cause the screen display unit 420 to display the instructed area together with the other-user display area.
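 The two resolution paths described above (looking up identification information in the display information 453, or taking display area information embedded in the instruction itself) can be sketched as follows. The dictionary shapes used here are assumptions introduced for illustration only.

```python
def resolve_instructed_area(instruction, display_info):
    """Return the area(s) to show for a display instruction.

    `instruction` is assumed to carry either explicit display area
    information under 'area', or identification information under 'ids'
    to be looked up in the display information 453."""
    if "area" in instruction:                 # the instruction embeds the area
        return [instruction["area"]]
    # otherwise, look up each identified device's current display area
    return [display_info[i] for i in instruction.get("ids", [])
            if i in display_info]

# Display information 453: device ID -> display area information.
display_info = {"user-2": {"yaw": 30.0, "pitch": -5.0}}
```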
 In the present embodiment, the timing at which the image display unit 463 cancels the display performed in response to a display instruction is not particularly limited. The image display unit 463 may cancel such display at any timing.
 FIGS. 5 to 7 show display examples on the screen display unit 420 produced by the image display unit 463. For example, FIG. 5 shows a display example in a state in which no display instruction has been received. Referring to FIG. 5, the image display unit 463 displays the area extracted according to the user's state in the display section, and displays the other-user display area in the other-user display section. In this way, when no display instruction has been received, the image display unit 463 can display the other-user display area together with the area extracted according to the user's state. Whether or not to display the other-user display area may be made selectable by the user. When the user has selected not to display the other-user display area, the image display unit 463 may display the area extracted according to the user's state on the entire screen display unit 420.
 FIG. 6 shows a display example in a state in which a display instruction has been received. Referring to FIG. 6, the image display unit 463 displays the area extracted according to the user's state in the display section, and displays the instructed area in the other-user display section. In this way, upon receiving a display instruction, the image display unit 463 can display the instructed area in place of the other-user display area. Note that even when the user has selected not to display the other-user display area, upon receiving a display instruction the image display unit 463 displays the instructed area as shown in FIG. 6.
 FIG. 7 shows a display example for a case in which the display instruction contains multiple pieces of identification information. Referring to FIG. 7, when the display instruction contains multiple pieces of identification information, the image display unit 463 can divide the other-user display section into multiple parts and display an instructed area in each part. When performing split display as shown in FIG. 7, the device may be configured so that at least one of the divided areas can be selected according to a selection action by the user, such as an operation on the operation input unit 430 or the result of detecting the user's line-of-sight direction. When the user makes a selection, only the selected one of the instructed areas may be displayed in the other-user display section, as shown in FIG. 6, for example. In other words, the image display unit 463 may cause the screen display unit 420 to display only some of the multiple instructed areas according to the user's selection action.
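 The split display of FIG. 7 can be sketched as a simple tiling of the other-user display section. An even horizontal split is only one plausible layout choice; the disclosure does not fix any particular division.

```python
def split_layout(panel_w, panel_h, n):
    """Divide the other-user display section of size panel_w x panel_h into
    n side-by-side tiles, one per piece of identification information in the
    display instruction. Returns (left, top, width, height) per tile."""
    if n <= 0:
        return []
    tile_w = panel_w // n        # even horizontal split (illustrative choice)
    return [(i * tile_w, 0, tile_w, panel_h) for i in range(n)]

# Three instructed areas in a 900x300 section -> three 300x300 tiles.
```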
 Note that FIGS. 5 to 7 are display examples, and the display on the screen display unit 420 is not limited to the cases shown in FIGS. 5 to 7. For example, the positions and sizes of the display section and the other-user display section may be changed arbitrarily.
 The display area information acquisition unit 464 acquires display area information, which is information for identifying the area that the image display unit 463 is displaying according to the user's state. The display area information acquisition unit 464 then stores the acquired display area information in the storage unit 450 as the display information 453.
 For example, as display area information, the display area information acquisition unit 464 acquires at least one of: information corresponding to the user's line-of-sight direction or orientation; information, such as coordinates, indicating the area being displayed on the user device 400; or other information for identifying the area. The display area information acquisition unit 464 may acquire the display area information at any timing, such as at predetermined intervals.
 The display area information transmitting unit 465 transmits the display area information acquired by the display area information acquisition unit 464 to the relay device 300. For example, the display area information transmitting unit 465 can transmit the display area information to the relay device 300 in association with the device's own identification information. For example, the display area information transmitting unit 465 can transmit the display area information each time the display area information acquisition unit 464 acquires it.
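 As a minimal sketch of the association between the device's own identification information and the display area information described above, the transmitted message could take the following form. The message shape, the gaze-angle representation, and the timestamp field are all assumptions; the disclosure leaves the concrete format open.

```python
import time

def build_display_area_message(device_id, yaw_deg, pitch_deg):
    """Illustrative payload that the display area information transmitting
    unit 465 might send to the relay device 300."""
    return {
        "id": device_id,                               # own identification information
        "area": {"yaw": yaw_deg, "pitch": pitch_deg},  # display area information
        "ts": time.time(),                             # acquisition time (illustrative)
    }

msg = build_display_area_message("user-1", 12.5, -3.0)
```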
 The above is a configuration example of the user device 400. Next, operation examples of the relay device 300 and the user device 400 will be described with reference to FIGS. 8 and 9.
 FIG. 8 shows an operation example of the relay device 300. Referring to FIG. 8, the condition detection unit 333 checks whether a preset condition is satisfied (step S101). If the condition is not satisfied (step S101: No), the condition detection unit 333 continues checking for the condition. If the condition is satisfied (step S101: Yes), the condition detection unit 333 identifies the user device 400 corresponding to the detected condition (step S102).
 The display instruction unit 334 instructs the user devices 400 to display the area being displayed on the user device 400 identified by the condition detection unit 333 (step S103). That is, the display instruction unit 334 transmits a display instruction to the user devices 400. For example, the display instruction unit 334 transmits the display instruction to all user devices 400 included in the image sharing system 100. The display instruction unit 334 may instead be configured to transmit the display instruction only to those user devices 400, among all user devices 400 included in the image sharing system 100, that satisfy a predetermined condition.
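 One pass of the relay-device flow of FIG. 8 (steps S101 to S103) can be sketched as follows. The function signature and message shape are illustrative assumptions; `eligible` models the optional variant in which the instruction is sent only to devices satisfying a predetermined condition.

```python
def relay_step(condition_met, source_id, all_devices, eligible=None):
    """S101: if the preset condition is not met, do nothing and keep checking.
    S102-S103: otherwise, identify the source device and return one display
    instruction per recipient, telling it to show source_id's area."""
    if not condition_met:                                  # S101: No
        return []
    recipients = eligible if eligible is not None else all_devices
    return [{"to": d, "show": source_id} for d in recipients]  # S102-S103
```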
 The above is an operation example of the relay device 300. Next, an operation example of the user device 400 will be described with reference to FIG. 9.
 Referring to FIG. 9, when no display instruction has been received (step S201: No), the image display unit 463 displays the other-user display area, which is the area being displayed on an arbitrary user device 400 (step S203). The user device 400 to be displayed may be selectable according to the user's operation or the like.
 When a display instruction is received (step S201: Yes), the image display unit 463 displays the instructed area, which is the area being displayed on the user device 400 indicated by the display instruction (step S202). For example, the image display unit 463 displays the instructed area instead of the other-user display area. The image display unit 463 may also display the instructed area together with the other-user display area.
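 The branch of FIG. 9 (steps S201 to S203) can be sketched as the user device choosing what to show in its secondary (other-user) section. The dictionary shapes are assumptions carried over purely for illustration.

```python
def choose_secondary_view(instruction, selected_other, display_info):
    """S201: if a display instruction is present, show the instructed
    area(s) (S202); otherwise show the other-user display area of the
    device chosen locally by the user (S203)."""
    if instruction is not None:               # S201: Yes -> S202
        ids = instruction.get("ids", [])
        return [display_info[i] for i in ids if i in display_info]
    if selected_other in display_info:        # S201: No -> S203
        return [display_info[selected_other]]
    return []

# Display information 453: device ID -> display area information.
display_info = {"user-2": {"yaw": 30.0}, "user-3": {"yaw": -60.0}}
```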
 The above is an operation example of the user device 400.
 As described above, the relay device 300 has the condition detection unit 333 and the display instruction unit 334. With this configuration, in response to the detection of a condition by the condition detection unit 333, the display instruction unit 334 can instruct each predetermined user device 400 to display the area being displayed on the user device 400 identified by the condition detection unit 333. As a result, a user device 400 that receives the display instruction can display the area being displayed on the user device 400 indicated by the instruction. In an image sharing system that shares image data, this makes it possible for other user devices 400 to view, for example, the area being viewed by the user who initiated a conversation, enabling smooth sharing of information among the sharing parties.
 The image display unit 463 of the user device 400 is configured to display, in response to a display instruction, the area being displayed on the user device 400 indicated by the display instruction. This allows the user operating the user device 400 to quickly view, for example, the area being viewed by the user who initiated the conversation, enabling smooth information sharing with that user.
 As one example, the present invention can be used in the field of education, such as teaching work procedures in a manufacturing process, and in business settings, such as when multiple users remotely review and discuss an in-store display example. For example, according to the present invention, the area being viewed by a teacher, such as an expert, can be displayed on other user devices 400 in response to the detection of a condition. As a result, users operating other user devices 400 can easily see what the teacher is looking at when checking or speaking, making it easier to share know-how. Also, according to the present invention, the area being viewed by a user who is speaking can be displayed on other user devices 400 in response to the detection of a condition. As a result, users operating other user devices 400 can easily see from which viewpoint the conversation is taking place, enabling smooth information sharing. The present invention may also be used in situations other than those exemplified above.
 In the present embodiment, the case of sharing omnidirectional image data acquired by a 360-degree camera has been exemplified. However, the present invention may be applied to sharing systems other than those exemplified in this embodiment. For example, the present invention may be applied to an image sharing system in which each user device can display part of two-dimensional image data, for example by zooming into the image data. The present invention may also be applied to information sharing systems for information other than image data, in which a user device displays at least part of the information. For example, the present invention may be applied to a system for sharing information such as websites, in which display information indicating the website being browsed is transmitted to a relay device.
[Second Embodiment]
 Next, a second embodiment of the present invention will be described with reference to FIGS. 10 to 12. The second embodiment outlines the configurations of an information processing device 500 and a display device 600, each connected to multiple user devices that display at least a partial area of the overall data.
 FIG. 10 shows a hardware configuration example of the information processing device 500. Referring to FIG. 10, the information processing device 500 has, as an example, the following hardware configuration:
 - CPU (Central Processing Unit) 501 (arithmetic device)
 - ROM (Read Only Memory) 502 (storage device)
 - RAM (Random Access Memory) 503 (storage device)
 - Program group 504 loaded into the RAM 503
 - Storage device 505 storing the program group 504
 - Drive device 506 that reads from and writes to a recording medium 510 external to the information processing device
 - Communication interface 507 connecting to a communication network 511 external to the information processing device
 - Input/output interface 508 for inputting and outputting data
 - Bus 509 connecting the components
 The information processing device 500 can realize the functions of the detection unit 521 and the instruction unit 522 shown in FIG. 11 by having the CPU 501 acquire and execute the program group 504. The program group 504 is, for example, stored in advance in the storage device 505 or the ROM 502, and is loaded into the RAM 503 and executed by the CPU 501 as needed. The program group 504 may also be supplied to the CPU 501 via the communication network 511, or may be stored in advance on the recording medium 510 and read out and supplied to the CPU 501 by the drive device 506.
 Note that FIG. 10 shows a hardware configuration example of the information processing device 500; the hardware configuration of the information processing device 500 is not limited to this. For example, the information processing device 500 may consist of only part of the configuration described above, such as omitting the drive device 506.
 The detection unit 521 detects a predetermined condition. For example, the detection unit 521 detects a condition depending on the transmission and reception status of sound data, such as whether it is determined, based on the sound data, that a conversation is taking place.
 In accordance with the result detected by the detection unit 521, the instruction unit 522 instructs at least some of the connected user devices to display the area being displayed by the user device corresponding to the condition. For example, the instruction unit 522 can give this instruction by transmitting a display instruction designating the user device whose display is to be shared.
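 A minimal sketch of this detection-and-instruction pair, under the assumption that the condition is "a device's sound level indicates speech": the fixed energy threshold and per-device level representation are illustrative choices, since the embodiment leaves the detection method open.

```python
def detect_speaker(sound_levels, threshold=0.5):
    """Illustrative stand-in for the detection unit 521: from per-device
    sound levels (device ID -> RMS-like value), decide whether someone is
    speaking and, if so, which device. Returns None when the condition is
    not met."""
    speaker, level = None, threshold
    for device_id, rms in sound_levels.items():
        if rms > level:                       # loudest device above threshold
            speaker, level = device_id, rms
    return speaker

def instruct(speaker, connected):
    """Illustrative stand-in for the instruction unit 522: tell the
    connected devices (here, all of them) to show the speaker's area."""
    if speaker is None:
        return []
    return [{"to": d, "show": speaker} for d in connected]
```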
 As described above, the information processing device 500 has the detection unit 521 and the instruction unit 522. With this configuration, in accordance with the result detected by the detection unit 521, the instruction unit 522 can instruct at least some of the connected user devices to display the area being displayed by the user device corresponding to the condition. As a result, a user device that receives the instruction can display the area being displayed on the target user device. This enables smooth sharing of information in an image sharing system that shares image data.
 The information processing device 500 described above can be realized by incorporating a predetermined program into the information processing device 500. Specifically, a program according to another aspect of the present invention is a program for causing an information processing device 500, connected to multiple user devices that display at least a partial area of the overall data, to execute processing to detect a predetermined condition and, in accordance with the detection result, instruct at least some of the connected user devices to display the area being displayed by the user device corresponding to the condition.
 また、上述した情報処理装置500により実行される情報処理方法は、データ全体のうちの少なくとも一部の領域を表示するユーザ装置複数と接続された情報処理装置500が、予め定められた条件を検出し、検出した結果に応じて、条件に対応するユーザ装置が表示している領域を表示するよう、接続中のユーザ装置のうちの少なくとも一部に対して指示する、というものである。 Further, in the information processing method executed by the information processing device 500 described above, the information processing device 500, connected to a plurality of user devices each displaying at least a partial area of the entire data, detects a predetermined condition and, according to the detection result, instructs at least some of the connected user devices to display the area displayed by the user device corresponding to the condition.
 上述した構成を有する、プログラム(又は記録媒体)、又は、情報処理方法、の発明であっても、情報処理装置500と同様の作用・効果を有するために、上述した本発明の目的を達成することが出来る。 An invention of a program (or recording medium) or of an information processing method having the above configuration also achieves the above-described object of the present invention, because it has the same operations and effects as the information processing device 500.
 また、本発明は、図12で示すような構成を有する表示装置600であっても、同様の理由により本発明の目的を達成することが出来る。なお、表示装置600のハードウェア構成は、図10を参照して説明した情報処理装置500の場合と同様であってよい。そのため、表示装置600のハードウェア構成については説明を省略する。 The present invention also achieves its object, for the same reason, with a display device 600 having the configuration shown in FIG. 12. The hardware configuration of the display device 600 may be the same as that of the information processing device 500 described with reference to FIG. 10, so its description is omitted.
 図12を参照すると、表示装置600は、プログラム群をCPUが取得して当該CPUが実行することで、受信部621と表示部622としての機能を実現することが出来る。 Referring to FIG. 12, the display device 600 realizes the functions of the receiving unit 621 and the display unit 622 by having the CPU acquire and execute a group of programs.
 受信部621は、データ全体のうち他のユーザ装置で表示中の領域を特定するための情報である表示領域情報を受信する。 The receiving unit 621 receives display area information, which is information for specifying the area of the entire data that is being displayed on another user device.
 表示部622は、データ全体のうちユーザの状態に応じた領域を表示させるとともに、表示領域情報に基づいて他のユーザ装置で表示中の領域を表示させる。また、表示部622は、所定の条件の検出結果として外部装置から送信される表示指示を受信すると、表示指示に基づいて、データ全体のうち前記表示指示が示すユーザ装置で表示中の領域を表示させる。 The display unit 622 displays an area of the entire data corresponding to the state of the user, and also displays, based on the display area information, the area being displayed on another user device. Further, upon receiving a display instruction transmitted from an external device as a result of detecting a predetermined condition, the display unit 622 displays, based on the display instruction, the area of the entire data being displayed on the user device indicated by the display instruction.
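The behavior of the receiving unit 621 and the display unit 622 can be sketched as follows. This is an illustrative Python sketch only; the class name `DisplayDevice` and the tuple representation of an area are assumptions for illustration and are not taken from the specification.

```python
class DisplayDevice:
    """Hypothetical stand-in for display device 600 (receiving unit 621 + display unit 622)."""
    def __init__(self, own_area):
        self.current_area = own_area  # area chosen from the user's own state
        self.peer_areas = {}          # display area info received from other devices

    def receive_area_info(self, device_name, area):
        # Receiving unit 621: record which area another user device is showing.
        self.peer_areas[device_name] = area

    def on_display_instruction(self, device_name):
        # Display unit 622: on a display instruction naming a user device,
        # switch to the area that device is showing (if known).
        if device_name in self.peer_areas:
            self.current_area = self.peer_areas[device_name]
        return self.current_area


dev = DisplayDevice(own_area=(0, 0, 50, 50))
dev.receive_area_info("device-A", (100, 100, 50, 50))
print(dev.on_display_instruction("device-A"))  # (100, 100, 50, 50)
```

The sketch keeps the two roles separate: display area information merely records what peers are showing, while an explicit display instruction is what actually switches the displayed area.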
 このような構成を有する表示装置600、または、表示装置600が行う表示方法、表示装置600に表示方法を実現させるためのプログラム(または、記録媒体)であっても、上述したように、本発明の目的を達成することが出来る。 The display device 600 having such a configuration, the display method performed by the display device 600, or a program (or recording medium) for causing the display device 600 to realize the display method likewise achieves the object of the present invention, as described above.
 <付記>
 上記実施形態の一部又は全部は、以下の付記のようにも記載されうる。以下、本発明における情報処理装置などの概略を説明する。但し、本発明は、以下の構成に限定されない。
<Appendix>
Some or all of the above embodiments may also be described as the following appendices. An outline of an information processing apparatus and the like according to the present invention will be described below. However, the present invention is not limited to the following configurations.
(付記1)
 データ全体のうちの少なくとも一部の領域を表示するユーザ装置複数と接続された情報処理装置であって、
 予め定められた条件を検出する検出部と、
 前記検出部が検出した結果に応じて、条件に対応するユーザ装置が表示している領域を表示するよう、接続中のユーザ装置のうちの少なくとも一部に対して指示する指示部と、
 を有する
 情報処理装置。
(付記2)
 前記検出部は、前記音データの送受信状況に応じた条件を検出する
 付記1に記載の情報処理装置。
(付記3)
 前記検出部は、前記音データの送受信状況に応じて会話していると判断可能な場合に予め定められた条件を満たす旨を検出する
 付記1または付記2に記載の情報処理装置。
(付記4)
 データを取得する現場装置と接続されており、
 前記検出部は、現場装置とユーザ装置との間で会話していると判断可能な場合に予め定められた条件を満たす旨を検出し、
 前記指示部は、前記検出部が検出した結果に応じて、現場装置と会話していると判断可能なユーザ装置が表示している領域を表示するよう、接続中のユーザ装置のうちの少なくとも一部に対して指示する
 付記1から付記3までのうちのいずれか1項に記載の情報処理装置。
(付記5)
 ユーザ装置で表示中の領域を特定するための表示領域情報を受信する受信部と、
 前記受信部が受信した前記表示領域情報を接続中のユーザ装置のうちの少なくとも一部に対して送信する送信部と、
 を有する
 付記1から付記4までのうちのいずれか1項に記載の情報処理装置。
(付記6)
 前記指示部は、前記表示領域情報を含む指示を接続中のユーザ装置のうちの少なくとも一部に対して行う
 付記5に記載の情報処理装置。
(付記7)
 データ全体のうちの少なくとも一部の領域を表示するユーザ装置複数と接続された情報処理装置が、
 予め定められた条件を検出し、
 検出した結果に応じて、条件に対応するユーザ装置が表示している領域を表示するよう、接続中のユーザ装置のうちの少なくとも一部に対して指示する
 情報処理方法。
(付記8)
 データ全体のうちの少なくとも一部の領域を表示するユーザ装置複数と接続された情報処理装置が、
 予め定められた条件を検出し、
 検出した結果に応じて、条件に対応するユーザ装置が表示している領域を表示するよう、接続中のユーザ装置のうちの少なくとも一部に対して指示する
 処理を実現するためのプログラム。
(付記9)
 データ全体のうち他のユーザ装置で表示中の領域を特定するための情報である表示領域情報を受信する受信部と、
 データ全体のうちユーザの状態に応じた領域を表示させるとともに、前記表示領域情報に基づいて他のユーザ装置で表示中の領域を表示させる表示部と、
 を有し、
 前記表示部は、所定の条件の検出結果として外部装置から送信される表示指示に基づいて、データ全体のうち前記表示指示が示すユーザ装置で表示中の領域を表示させる
 表示装置。
(付記10)
 ユーザ装置が、
 データ全体のうち他のユーザ装置で表示中の領域を特定するための情報である表示領域情報を受信し、
 データ全体のうちユーザの状態に応じた領域を表示させるとともに、前記表示領域情報に基づいて他のユーザ装置で表示中の領域を表示させ、
 所定の条件の検出結果として外部装置から送信される表示指示を受信すると、受信した表示指示に基づいて、データ全体のうち前記表示指示が示すユーザ装置で表示中の領域を表示させる
 表示方法。
(Appendix 1)
An information processing device connected to a plurality of user devices each displaying at least a partial area of the entire data, the information processing device comprising:
a detection unit that detects a predetermined condition; and
an instruction unit that instructs, according to the result detected by the detection unit, at least some of the connected user devices to display the area displayed by the user device corresponding to the condition.
(Appendix 2)
The information processing apparatus according to appendix 1, wherein the detection unit detects a condition according to a transmission/reception state of the sound data.
(Appendix 3)
The information processing apparatus according to appendix 1 or appendix 2, wherein the detection unit detects that a predetermined condition is satisfied when it can be determined that a conversation is being held according to the transmission/reception status of the sound data.
(Appendix 4)
The information processing apparatus according to any one of appendices 1 to 3, which is connected to a field device that acquires the data, wherein:
the detection unit detects that a predetermined condition is satisfied when it can be determined that the field device and a user device are conversing; and
the instruction unit instructs, according to the result detected by the detection unit, at least some of the connected user devices to display the area displayed by the user device that can be determined to be conversing with the field device.
(Appendix 5)
The information processing apparatus according to any one of appendices 1 to 4, further comprising:
a receiving unit that receives display area information for specifying the area being displayed on a user device; and
a transmitting unit that transmits the display area information received by the receiving unit to at least some of the connected user devices.
(Appendix 6)
The information processing apparatus according to appendix 5, wherein the instruction unit issues an instruction including the display area information to at least some of the connected user devices.
(Appendix 7)
An information processing method in which an information processing device connected to a plurality of user devices each displaying at least a partial area of the entire data:
detects a predetermined condition; and
according to the detection result, instructs at least some of the connected user devices to display the area displayed by the user device corresponding to the condition.
(Appendix 8)
A program for causing an information processing device connected to a plurality of user devices each displaying at least a partial area of the entire data to execute a process of:
detecting a predetermined condition; and
according to the detection result, instructing at least some of the connected user devices to display the area displayed by the user device corresponding to the condition.
(Appendix 9)
A display device comprising:
a receiving unit that receives display area information, which is information for specifying the area of the entire data being displayed on another user device; and
a display unit that displays an area of the entire data corresponding to the state of the user and displays, based on the display area information, the area being displayed on another user device,
wherein the display unit displays, based on a display instruction transmitted from an external device as a result of detecting a predetermined condition, the area of the entire data being displayed on the user device indicated by the display instruction.
(Appendix 10)
A display method in which a user device:
receives display area information, which is information for specifying the area of the entire data being displayed on another user device;
displays an area of the entire data corresponding to the state of the user and displays, based on the display area information, the area being displayed on another user device; and
upon receiving a display instruction transmitted from an external device as a result of detecting a predetermined condition, displays, based on the received display instruction, the area of the entire data being displayed on the user device indicated by the display instruction.
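Appendices 2 to 4 leave the concrete criterion for "conversing" open. One possible stand-in, offered purely as an assumption and not taken from the specification, is to treat two endpoints as conversing when sound data flows in both directions within a short time window:

```python
def is_conversing(sent_times, received_times, window=5.0):
    """Stand-in criterion: sound data flowed both ways within `window` seconds."""
    return any(abs(s - r) <= window
               for s in sent_times
               for r in received_times)


print(is_conversing([10.0, 12.5], [11.0]))  # True: both directions within 5 s
print(is_conversing([10.0], [60.0]))        # False: no bidirectional overlap
```

Under this assumed criterion, the detection unit would flag the condition as satisfied whenever `is_conversing` returns `True` for the sound data exchanged between a field device and a user device; any other judgment based on the transmission/reception status of the sound data would fit the appendices equally well.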
 なお、上記各実施形態及び付記において記載したプログラムは、記憶装置に記憶されていたり、コンピュータが読み取り可能な記録媒体に記録されていたりする。例えば、記録媒体は、フレキシブルディスク、光ディスク、光磁気ディスク、及び、半導体メモリ等の可搬性を有する媒体である。 The programs described in the above embodiments and appendices are stored in a storage device or recorded on a computer-readable recording medium. For example, the recording medium is a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
 以上、上記各実施形態を参照して本願発明を説明したが、本願発明は、上述した実施形態に限定されるものではない。本願発明の構成や詳細には、本願発明の範囲内で当業者が理解しうる様々な変更をすることが出来る。 Although the present invention has been described with reference to the above-described embodiments, the present invention is not limited to the above-described embodiments. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.
 なお、本発明は、日本国にて2021年5月26日に特許出願された特願2021-088051の特許出願に基づく優先権主張の利益を享受するものであり、当該特許出願に記載された内容は、全て本明細書に含まれるものとする。 The present application claims the benefit of priority based on Japanese Patent Application No. 2021-088051 filed in Japan on May 26, 2021, the entire contents of which are incorporated herein by reference.
100 画像共有システム
200 現場装置
210 画像データ取得部
220 音データ入出力部
230 送受信部
300 中継装置
310 通信I/F部
320 記憶部
321 画像情報
322 音情報
323 表示情報
324 プログラム
330 演算処理部
331 データ受信部
332 データ送信部
333 条件検出部
334 表示指示部
400 ユーザ装置
410 通信I/F部
420 画面表示部
430 操作入力部
440 音入出力部
450 記憶部
451 画像情報
452 音情報
453 表示情報
454 指示情報
455 プログラム
460 演算処理部
461 データ受信部
462 データ送信部
463 画像表示部
464 表示領域情報取得部
465 表示領域情報送信部
500 情報処理装置
501 CPU
502 ROM
503 RAM
504 プログラム群
505 記憶装置
506 ドライブ装置
507 通信インタフェース
508 入出力インタフェース
509 バス
510 記録媒体
511 通信ネットワーク
521 検出部
522 指示部
600 表示装置
621 受信部
622 表示部

 
100 image sharing system
200 field device
210 image data acquisition unit
220 sound data input/output unit
230 transmission/reception unit
300 relay device
310 communication I/F unit
320 storage unit
321 image information
322 sound information
323 display information
324 program
330 arithmetic processing unit
331 data reception unit
332 data transmission unit
333 condition detection unit
334 display instruction unit
400 user device
410 communication I/F unit
420 screen display unit
430 operation input unit
440 sound input/output unit
450 storage unit
451 image information
452 sound information
453 display information
454 instruction information
455 program
460 arithmetic processing unit
461 data reception unit
462 data transmission unit
463 image display unit
464 display area information acquisition unit
465 display area information transmission unit
500 information processing device
501 CPU
502 ROM
503 RAM
504 program group
505 storage device
506 drive device
507 communication interface
508 input/output interface
509 bus
510 recording medium
511 communication network
521 detection unit
522 instruction unit
600 display device
621 receiving unit
622 display unit

Claims (10)

  1.  データ全体のうちの少なくとも一部の領域を表示するユーザ装置複数と接続された情報処理装置であって、
     予め定められた条件を検出する検出部と、
     前記検出部が検出した結果に応じて、条件に対応するユーザ装置が表示している領域を表示するよう、接続中のユーザ装置のうちの少なくとも一部に対して指示する指示部と、
     を有する
     情報処理装置。
    An information processing device connected to a plurality of user devices each displaying at least a partial area of the entire data, the information processing device comprising:
    a detection unit that detects a predetermined condition; and
    an instruction unit that instructs, according to the result detected by the detection unit, at least some of the connected user devices to display the area displayed by the user device corresponding to the condition.
  2.  前記検出部は、前記音データの送受信状況に応じた条件を検出する
     請求項1に記載の情報処理装置。
    The information processing apparatus according to claim 1, wherein the detection unit detects a condition according to a transmission/reception state of the sound data.
  3.  前記検出部は、前記音データの送受信状況に応じて会話していると判断可能な場合に予め定められた条件を満たす旨を検出する
     請求項1または請求項2に記載の情報処理装置。
    The information processing apparatus according to claim 1 or 2, wherein the detection unit detects that a predetermined condition is satisfied when it can be determined, from the transmission/reception status of the sound data, that a conversation is taking place.
  4.  データを取得する現場装置と接続されており、
     前記検出部は、現場装置とユーザ装置との間で会話していると判断可能な場合に予め定められた条件を満たす旨を検出し、
     前記指示部は、前記検出部が検出した結果に応じて、現場装置と会話していると判断可能なユーザ装置が表示している領域を表示するよう、接続中のユーザ装置のうちの少なくとも一部に対して指示する
     請求項1から請求項3までのうちのいずれか1項に記載の情報処理装置。
    The information processing apparatus according to any one of claims 1 to 3, which is connected to a field device that acquires the data, wherein:
    the detection unit detects that a predetermined condition is satisfied when it can be determined that the field device and a user device are conversing; and
    the instruction unit instructs, according to the result detected by the detection unit, at least some of the connected user devices to display the area displayed by the user device that can be determined to be conversing with the field device.
  5.  ユーザ装置で表示中の領域を特定するための表示領域情報を受信する受信部と、
     前記受信部が受信した前記表示領域情報を接続中のユーザ装置のうちの少なくとも一部に対して送信する送信部と、
     を有する
     請求項1から請求項4までのうちのいずれか1項に記載の情報処理装置。
    The information processing apparatus according to any one of claims 1 to 4, further comprising:
    a receiving unit that receives display area information for specifying the area being displayed on a user device; and
    a transmitting unit that transmits the display area information received by the receiving unit to at least some of the connected user devices.
  6.  前記指示部は、前記表示領域情報を含む指示を接続中のユーザ装置のうちの少なくとも一部に対して行う
     請求項5に記載の情報処理装置。
    The information processing apparatus according to claim 5, wherein the instruction unit issues an instruction including the display area information to at least some of the connected user devices.
  7.  データ全体のうちの少なくとも一部の領域を表示するユーザ装置複数と接続された情報処理装置が、
     予め定められた条件を検出し、
     検出した結果に応じて、条件に対応するユーザ装置が表示している領域を表示するよう、接続中のユーザ装置のうちの少なくとも一部に対して指示する
     情報処理方法。
    An information processing method in which an information processing device connected to a plurality of user devices each displaying at least a partial area of the entire data:
    detects a predetermined condition; and
    according to the detection result, instructs at least some of the connected user devices to display the area displayed by the user device corresponding to the condition.
  8.  データ全体のうちの少なくとも一部の領域を表示するユーザ装置複数と接続された情報処理装置が、
     予め定められた条件を検出し、
     検出した結果に応じて、条件に対応するユーザ装置が表示している領域を表示するよう、接続中のユーザ装置のうちの少なくとも一部に対して指示する
     処理を実現するためのプログラムを記録した、コンピュータが読み取り可能な記録媒体。
    A computer-readable recording medium recording a program for causing an information processing device connected to a plurality of user devices each displaying at least a partial area of the entire data to execute a process of:
    detecting a predetermined condition; and
    according to the detection result, instructing at least some of the connected user devices to display the area displayed by the user device corresponding to the condition.
  9.  データ全体のうち他のユーザ装置で表示中の領域を特定するための情報である表示領域情報を受信する受信部と、
     データ全体のうちユーザの状態に応じた領域を表示させるとともに、前記表示領域情報に基づいて他のユーザ装置で表示中の領域を表示させる表示部と、
     を有し、
     前記表示部は、所定の条件の検出結果として外部装置から送信される表示指示に基づいて、データ全体のうち前記表示指示が示すユーザ装置で表示中の領域を表示させる
     表示装置。
    A display device comprising:
    a receiving unit that receives display area information, which is information for specifying the area of the entire data being displayed on another user device; and
    a display unit that displays an area of the entire data corresponding to the state of the user and displays, based on the display area information, the area being displayed on another user device,
    wherein the display unit displays, based on a display instruction transmitted from an external device as a result of detecting a predetermined condition, the area of the entire data being displayed on the user device indicated by the display instruction.
  10.  ユーザ装置が、
     データ全体のうち他のユーザ装置で表示中の領域を特定するための情報である表示領域情報を受信し、
     データ全体のうちユーザの状態に応じた領域を表示させるとともに、前記表示領域情報に基づいて他のユーザ装置で表示中の領域を表示させ、
     所定の条件の検出結果として外部装置から送信される表示指示を受信すると、受信した表示指示に基づいて、データ全体のうち前記表示指示が示すユーザ装置で表示中の領域を表示させる
     表示方法。

     
    A display method in which a user device:
    receives display area information, which is information for specifying the area of the entire data being displayed on another user device;
    displays an area of the entire data corresponding to the state of the user, and displays, based on the display area information, the area being displayed on another user device; and
    upon receiving a display instruction transmitted from an external device as a result of detecting a predetermined condition, displays, based on the received display instruction, the area of the entire data being displayed on the user device indicated by the display instruction.

PCT/JP2022/020208 2021-05-26 2022-05-13 Information processing device WO2022249900A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023523411A JPWO2022249900A1 (en) 2021-05-26 2022-05-13

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021088051 2021-05-26
JP2021-088051 2021-05-26

Publications (1)

Publication Number Publication Date
WO2022249900A1 true WO2022249900A1 (en) 2022-12-01

Family

ID=84229969

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/020208 WO2022249900A1 (en) 2021-05-26 2022-05-13 Information processing device

Country Status (2)

Country Link
JP (1) JPWO2022249900A1 (en)
WO (1) WO2022249900A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08163122A (en) * 1994-12-08 1996-06-21 Hitachi Ltd Remote conference method
JP2013020520A (en) * 2011-07-13 2013-01-31 Sony Corp Information processing method and information processing system


Also Published As

Publication number Publication date
JPWO2022249900A1 (en) 2022-12-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22811173

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023523411

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE