WO2021200212A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program Download PDF

Info

Publication number
WO2021200212A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
important
information processing
area
processing device
Prior art date
Application number
PCT/JP2021/011067
Other languages
French (fr)
Japanese (ja)
Inventor
翼 乾
翔介 青合
浩太 堀内
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2021200212A1

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2385Channel allocation; Bandwidth allocation

Definitions

  • This disclosure relates to information processing devices, information processing methods and programs.
  • the present disclosure provides an information processing device, an information processing method, and a program capable of suppressing deterioration of usability.
  • an information processing device includes a control unit.
  • the control unit extracts important image data of the important region and non-important image data of the non-important region from the image data.
  • the control unit allocates the important image data and the non-important image data to different network slices.
  • a numerical value may be given as a specific example, but such a numerical value is an example, and another numerical value may be used.
  • FIG. 1 is a diagram showing an outline of an information processing system 1 to which the proposed technique of the present disclosure is applied.
  • FIG. 2 is a diagram for explaining an outline of the proposed technique of the present disclosure.
  • the proposed technique according to the present disclosure is implemented in the information processing system 1 shown in FIG. 1.
  • the information processing system 1 includes an information processing device 20 and a terminal device 10.
  • the information processing device 20 and the terminal device 10 communicate with each other via the base station device 30.
  • the information processing device 20 is, for example, a server device, and provides services such as game distribution and video distribution to the terminal device 10 via the base station device 30.
  • the information processing device 20 transmits a moving image to, for example, the terminal device 10.
  • the terminal device 10 displays a moving image received from the information processing device 20 to the user.
  • FIG. 1 shows a case where the terminal device 10 is configured as a so-called head-mounted device that is worn and used on at least a part of the user's head.
  • the terminal device 10 may be configured to be able to detect the line of sight of the user, for example.
  • the base station device 30 is a wireless communication device that wirelessly communicates with the terminal device 10.
  • the base station device 30 is, for example, a device corresponding to a radio base station (Node B, eNB, gNB, etc.), and provides the terminal device 10 with a cellular communication service such as NR (New Radio).
  • the base station device 30 may be a wireless relay station.
  • the base station device 30 may be a road base station device such as an RSU (Road Side Unit). Further, the base station device 30 may be an optical overhanging device called RRH (Remote Radio Head).
  • the base station device 30 connects to the information processing device 20 via, for example, a network, and transmits information about services provided by the information processing device 20, such as moving images, to the terminal device 10. Further, the base station device 30 provides the information from the terminal device 10 to the information processing device 20 via the network.
  • the information processing device 20 and the terminal device 10 transmit and receive information via the base station device 30, so that the terminal device 10 can receive services such as video distribution from the information processing device 20.
  • if the information processing device 20 transmits the moving image data to be distributed over a transmission line of uniform quality, data loss or errors may occur depending on the communication environment, and usability deteriorates.
  • acronyms used below: NR (New Radio), eMBB (Enhanced Mobile Broadband), mMTC (Massive Machine Type Communications), URLLC (Ultra Reliable and Low Latency Communications)
  • the information processing device 20 extracts an important region and a non-important region from image data which is one frame of a moving image to be distributed, for example.
  • the information processing device 20 allocates the image data of the important area and the image data of the non-important area to different network slices and transmits the image data.
  • one image data can be transmitted using transmission lines of different qualities depending on the area.
  • the information processing apparatus 20 extracts, for example, an area that the user is gazing at as an important area, and extracts other areas as non-important areas.
  • the information processing apparatus 20 allocates the image data of the important region to the slice S1 for transmitting the data with high reliability and low delay and transmits the image data, and allocates the non-important region to the slice S2 for transmitting the data with the best effort and transmits the data.
  • the information processing device 20 can transmit, for example, the image data of the important region that the user is gazing at with high reliability and low delay, suppressing data delay and image quality deterioration in the important region and thus suppressing deterioration of usability.
  • the information processing device 20 transmits, for example, image data of a non-important area that the user is not paying attention to with normal quality (for example, best effort).
  • the information processing device 20 allocates the important area and the non-important area of the image data to different network slices, so that the data delay and the image quality deterioration of the important area can be suppressed, and the deterioration of usability can be suppressed.
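For illustration, the split-and-allocate flow described above can be sketched as follows. This is a minimal sketch, not taken from the patent: the rectangle-based region model, the `TaggedImageData` container, and the representation of the non-important data as the frame with the important rectangle blanked out are all assumptions made for the example.

```python
from dataclasses import dataclass
from enum import Enum

import numpy as np


class Slice(Enum):
    """Logical network slices, following the patent's S1/S2 example."""
    S1_URLLC = "high-reliability, low-latency"   # important image data
    S2_BEST_EFFORT = "best effort"               # non-important image data


@dataclass
class TaggedImageData:
    pixels: np.ndarray   # pixel data of the extracted region
    region: tuple        # (x, y, w, h) within the original frame
    slice_id: Slice      # network slice this data is allocated to


def extract_and_allocate(frame: np.ndarray, important_rect: tuple):
    """Split one frame into important / non-important image data.

    The non-important data is modeled here as the full frame with the
    important rectangle zeroed out; the patent does not prescribe a
    representation, so this is only one possible encoding.
    """
    x, y, w, h = important_rect
    important = TaggedImageData(
        pixels=frame[y:y + h, x:x + w].copy(),
        region=important_rect,
        slice_id=Slice.S1_URLLC,
    )
    rest = frame.copy()
    rest[y:y + h, x:x + w] = 0
    non_important = TaggedImageData(
        pixels=rest,
        region=(0, 0, frame.shape[1], frame.shape[0]),
        slice_id=Slice.S2_BEST_EFFORT,
    )
    return important, non_important


# Example: a 1080p frame with a gaze-derived important rectangle.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
imp, non_imp = extract_and_allocate(frame, (800, 400, 320, 240))
print(imp.slice_id, imp.pixels.shape)         # Slice.S1_URLLC (240, 320, 3)
print(non_imp.slice_id, non_imp.pixels.shape)
```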
  • if the information processing device 20 were to allocate all of the image data to a high-reliability, low-delay network slice, the radio resources may be significantly occupied and their utilization efficiency may decrease.
  • the information processing apparatus 20 allocates the important region and the non-important region of the image data to different network slices.
  • the information processing system 1 can suppress the deterioration of the utilization efficiency of the wireless resource while suppressing the deterioration of the usability.
  • FIG. 3 is a diagram showing an example of the configuration of the information processing system 1 according to the first embodiment of the present disclosure.
  • the information processing system 1 includes a terminal device 10 and an information processing device 20.
  • the terminal device 10 is a so-called head-mounted device that is worn on the user's head, such as an AR glass or a head-mounted display.
  • the terminal device 10 includes a control unit 11, a storage unit 12, a communication unit 13, and a display unit 14.
  • the configuration of the terminal device 10 is not limited to the configuration shown in FIG. 3, and may be another configuration as long as it is a configuration for performing information processing described later.
  • the communication unit 13 is a communication interface (I / F) that communicates with an external device.
  • the communication unit 13 is realized by, for example, a NIC (Network Interface Card) or the like.
  • the communication unit 13 connects to the core network by performing wireless communication with the base station device 30.
  • the communication unit 13 receives image data from the information processing device 20 via the base station device 30.
  • the communication unit 13 transmits the line-of-sight information described later to the information processing device 20 via the base station device 30.
  • the display unit 14 displays the image data transmitted by the information processing device 20.
  • the display unit 14 corresponds to the display unit of these AR glasses or the head-mounted display.
  • here, the terminal device 10 is assumed to be AR glasses or a head-mounted display, but the present disclosure is not limited to this.
  • the display unit 14 may be a display device (such as the AR glass or head-mounted display described above) different from the terminal device 10, and the terminal device 10 may be an information processing device for displaying image data on the display device.
  • the terminal device 10 may be, for example, a smartphone or a PC.
  • the storage unit 12 is realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk.
  • the storage unit 12 stores programs, calculation parameters, and the like used for processing by the control unit 11.
  • the control unit 11 is, for example, a controller, and is realized by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing various programs stored in a storage device inside the terminal device 10, using a RAM (Random Access Memory) as a work area.
  • the various programs include programs of applications installed in the terminal device 10.
  • the control unit 11 is realized by, for example, an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the control unit 11 has a line-of-sight detection unit 111, and realizes or executes an information processing function or operation described below.
  • the internal configuration of the control unit 11 is not limited to the configuration shown in FIG. 3, and may be another configuration as long as it is a configuration for performing information processing described later.
  • the line-of-sight detection unit 111 detects the user's line-of-sight direction using eye tracking technology, based on the detection results of, for example, a camera, an optical sensor, and a motion sensor (none of which are shown) mounted on the terminal device 10.
  • the line-of-sight detection unit 111 determines the gaze area of the display screen that the user is gazing at based on the detected line-of-sight direction.
  • the line-of-sight detection unit 111 transmits the line-of-sight information including the determined gaze area to the information processing device 20.
  • the gaze area is an area used when the information processing apparatus 20 sets an important area, and thus can be said to be information about the important area.
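As a concrete illustration of turning a detected line-of-sight direction into a gaze area, the following is a minimal sketch. The tangent projection, the field-of-view parameters, and the fixed-radius rectangle are assumptions made for the example; the patent does not specify how the gaze area is computed.

```python
import math


def gaze_area(yaw_deg: float, pitch_deg: float,
              screen_w: int = 1920, screen_h: int = 1080,
              fov_h_deg: float = 90.0, fov_v_deg: float = 60.0,
              radius_px: int = 120):
    """Map a gaze direction (yaw/pitch relative to the screen center) to a
    rectangular gaze area in pixel coordinates.

    Uses a simple tangent projection and clamps the rectangle to the screen.
    """
    # Gaze point in pixels, measured from the screen center.
    cx = screen_w / 2 * (1 + math.tan(math.radians(yaw_deg))
                         / math.tan(math.radians(fov_h_deg / 2)))
    cy = screen_h / 2 * (1 + math.tan(math.radians(pitch_deg))
                         / math.tan(math.radians(fov_v_deg / 2)))
    x0 = max(0, int(cx) - radius_px)
    y0 = max(0, int(cy) - radius_px)
    x1 = min(screen_w, int(cx) + radius_px)
    y1 = min(screen_h, int(cy) + radius_px)
    return (x0, y0, x1 - x0, y1 - y0)  # (x, y, w, h)


# Example: the user looks slightly to the right of and above the center.
print(gaze_area(10.0, -5.0))
```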
  • the information processing device 20 is, for example, a server device, and delivers video and data (for example, moving images) to the terminal device 10.
  • the information processing device 20 includes a control unit 21, a storage unit 22, a communication unit 23, and a display unit 24.
  • the configuration of the information processing device 20 is not limited to the configuration shown in FIG. 3, and may be any other configuration as long as it performs information processing described later.
  • the communication unit 23 is a communication interface (I / F) that communicates with an external device.
  • the communication unit 23 is realized by, for example, a NIC (Network Interface Card) or the like.
  • the communication unit 23 connects to the base station device 30 via a network.
  • the communication unit 23 receives line-of-sight information from the terminal device 10 via the base station device 30.
  • the communication unit 23 transmits image data to the terminal device 10 via the base station device 30.
  • the display unit 24 is a display that displays various information of the information processing device 20.
  • the display unit 24 may be a display such as a liquid crystal display.
  • the storage unit 22 is realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk.
  • the storage unit 22 stores programs, calculation parameters, and the like used for processing by the control unit 21.
  • the storage unit 22 may hold the image data to be transmitted to the terminal device 10.
  • the control unit 21 is, for example, a controller, and is realized by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing various programs stored in a storage device inside the information processing device 20, using a RAM (Random Access Memory) as a work area.
  • the various programs include programs of applications installed in the information processing apparatus 20.
  • the control unit 21 is realized by, for example, an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the control unit 21 has an important area setting unit 211, and realizes or executes an information processing function or operation described below.
  • the internal configuration of the control unit 21 is not limited to the configuration shown in FIG. 3, and may be another configuration as long as it is a configuration for performing information processing described later.
  • the important area setting unit 211 sets the important area and the non-important area based on the line-of-sight information acquired from the terminal device 10.
  • the important area setting unit 211 determines the gaze area included in the line-of-sight information as an important area, and determines the other areas as non-important areas.
  • the important area setting unit 211 extracts the image data corresponding to the determined important area as important information (important image data) and assigns it to a network slice (slice S1 in FIG. 3) that transmits a signal with high reliability and low delay.
  • Slice S1 is, for example, a network slice for URLLC.
  • the important area setting unit 211 extracts the image data corresponding to the determined non-important area as non-important information (non-important image data) and allocates it to the network slice (slice S2 in FIG. 3) to be transmitted with best effort.
  • the important image data and the non-important image data assigned to the slices S1 and S2 by the important area setting unit 211 are transmitted to the terminal device 10 via the communication unit 23.
  • FIG. 4 is a diagram for explaining a gaze area according to the first embodiment of the present disclosure.
  • the terminal device 10 determines the area in which the user is gazing based on the line-of-sight direction of the user.
  • the user wears the terminal device 10, which is, for example, a head-mounted display, on the head and is watching the object Ob displayed on the screen of the terminal device 10.
  • the terminal device 10 detects the user's line-of-sight direction using eye tracking technology or the like, based on the detection results of sensors such as a camera, an optical sensor, or a motion sensor (none of which are shown) mounted on the device itself.
  • the terminal device 10 detects, as the eye tracking technique, at least one of the feature amount, position, and movement of the user's eyes.
  • for example, the terminal device 10 photographs the user's eyes with a camera, analyzes the captured image to detect at least one of the feature amount, position, and movement of the eyes, and thereby performs eye tracking (line-of-sight detection).
  • the terminal device 10 may perform line-of-sight detection by detecting at least one of the feature amount, position, and movement of the user's face or head in addition to, or instead of, the user's eyes.
  • the terminal device 10 determines the display area of the object Ob displayed in the detected line-of-sight direction as the gaze area. For example, when the cube object Ob is displayed as shown in FIG. 4, the display area of the cube object Ob is set as the gaze area.
  • the terminal device 10 may determine, as the gaze area, a predetermined area R1 including the object Ob displayed in the line-of-sight direction.
  • the terminal device 10 determines the area including the object Ob displayed in the line-of-sight direction as the gaze area, and transmits the line-of-sight information including the information regarding the determined gaze area to the information processing device 20.
  • when the information processing device 20 acquires the line-of-sight information from the terminal device 10, it extracts, from the image data, important image data and non-important image data corresponding to the important area and the non-important area, respectively, based on the line-of-sight information. The information processing device 20 adds, to each image data, slice-related information for using the corresponding slice, so that the extracted image data is transmitted under the desired transmission conditions (for example, low delay, best effort, etc.).
  • the slice-related information is information (metadata) for specifying a slice to be assigned according to the use and characteristics of the signal to be transmitted (important image data or non-important image data in this case).
  • the control unit 21 of the information processing device 20 controls the communication unit 23 so as to associate the image data with the slice-related information for specifying the slice to which the image data is to be assigned and transmit the image data to the base station device 30.
  • the base station apparatus 30 assigns each received image data to a slice specified by each slice-related information, and transmits each image data to the terminal apparatus 10 using the assigned slice.
  • the information processing device 20 determines that important image data is a signal having a high priority and needs to be transmitted with a low delay.
  • the information processing device 20 generates slice-related information for identifying a slice S1 having a high priority so as to ensure transmission with low delay for the important image data.
  • the information processing device 20 associates the generated slice-related information with important image data.
  • the information processing device 20 determines that the non-important image data is a signal having a lower priority than the important image data.
  • the information processing apparatus 20 generates slice-related information for identifying a slice S2 different from the slice S1 (for example, a slice S2 having a lower priority than the slice S1).
  • the information processing device 20 associates the generated slice-related information with the non-important image data.
  • the information processing apparatus 20 can thereby allocate and transmit slices according to the characteristics required for each image data; the user can view the high-priority important image data with low delay, while the non-important image data can also be viewed.
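A minimal sketch of the sender-side association between image data and slice-related information follows. The concrete metadata fields (`slice_id`, `priority`, `low_latency`) are assumptions made for the example; the patent only says that the slice-related information specifies the slice according to the use and characteristics of the signal.

```python
from dataclasses import dataclass


@dataclass
class SliceRelatedInfo:
    """Metadata identifying the slice the payload should be carried on."""
    slice_id: str        # e.g. "S1" or "S2"
    priority: int        # higher value = more urgent
    low_latency: bool    # whether low-delay transmission is required


@dataclass
class OutgoingImageData:
    payload: bytes
    slice_info: SliceRelatedInfo


def tag_for_transmission(important: bytes, non_important: bytes):
    """Associate each extracted image data with slice-related information."""
    return [
        OutgoingImageData(important,
                          SliceRelatedInfo("S1", priority=10, low_latency=True)),
        OutgoingImageData(non_important,
                          SliceRelatedInfo("S2", priority=1, low_latency=False)),
    ]


for item in tag_for_transmission(b"important-region", b"rest-of-frame"):
    print(item.slice_info.slice_id, item.slice_info.priority)
```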
  • although an example in which the control unit 21 of the information processing apparatus 20 generates slice-related information for specifying the slice to be assigned to the image data to be transmitted has been described, the present disclosure is not limited to this.
  • the control unit 21 of the information processing apparatus 20 may simply associate the identification information indicating the type (important, non-important, etc.) of the image data to be transmitted with the image data.
  • the base station apparatus 30 may allocate each image data to the corresponding slices for each type based on the identification information associated with each received image data.
  • the base station apparatus 30 holds in advance the relationship between the type of image data and the slices to be assigned for each type, and assigns slices to each image data by referring to the relationship.
  • the base station apparatus 30 can assign slices to the image data, so that each image data can be transmitted in different network slices.
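The alternative just described, in which only a type label travels with the image data and the base station holds the relationship between types and slices, might look like the following sketch. The static table and the label strings are assumptions for illustration.

```python
# Relationship between image-data type and slice, held in advance by the
# base station (the actual provisioning mechanism is not specified in the
# patent; a static table is assumed here for illustration).
TYPE_TO_SLICE = {
    "important": "S1",      # high-reliability, low-latency slice
    "non_important": "S2",  # best-effort slice
}


def assign_slice(image_type: str) -> str:
    """Base-station-side lookup: map the identification label carried with
    the image data to the slice it should be forwarded on."""
    return TYPE_TO_SLICE[image_type]


print(assign_slice("important"))      # S1
print(assign_slice("non_important"))  # S2
```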
  • FIG. 5 is a sequence diagram for explaining the communication process according to the first embodiment of the present disclosure.
  • the terminal device 10 detects the line of sight of the user (step S101). Next, the terminal device 10 determines the gaze area to be gazed by the user from the detected line of sight of the user, and transmits the line-of-sight information including the information about the determined gaze area to the information processing device 20 (step S102).
  • the information processing device 20 sets an important area based on the line-of-sight information (step S103).
  • the information processing device 20 extracts, from the image data, important image data in the important region and non-important image data in the region excluding the important region, and assigns each of the extracted image data to a different slice (step S104).
  • the information processing device 20 distributes the video to the user by transmitting each image data in each slice (step S105).
  • the information processing system 1 may execute the above-mentioned communication processing for each frame of the video to be distributed, for example.
  • the cycle of line-of-sight detection and the cycle of video distribution (frame cycle) may be different.
  • the information processing device 20 updates the important area at the timing when the line-of-sight information is acquired from the terminal device 10.
  • when reception of non-important image data fails, the terminal device 10 displays to the user the non-important image data that was successfully received at an earlier timing. That is, until new non-important image data is received, the terminal device 10 continues to display the non-important image data received before the data in which the loss or error occurred. Alternatively, the terminal device 10 may interpolate (correct) the lost or erroneous data using correctly received non-important image data and present the result to the user.
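A minimal sketch of this receiver-side fallback is shown below; the single-frame buffer is an assumption, and the interpolation variant mentioned above is omitted for brevity.

```python
class NonImportantFrameBuffer:
    """Receiver-side fallback: keep the last successfully received
    non-important image data and reuse it when reception fails."""

    def __init__(self):
        self._last_good = None

    def on_receive(self, data, ok: bool):
        """Return the non-important data to display for this frame."""
        if ok:
            self._last_good = data
            return data
        # Loss or error: fall back to the previous good frame, if any.
        return self._last_good


buf = NonImportantFrameBuffer()
print(buf.on_receive("frame-1", ok=True))   # frame-1
print(buf.on_receive("frame-2", ok=False))  # frame-1 (reused)
print(buf.on_receive("frame-3", ok=True))   # frame-3
```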
  • the information processing device 20 divides the image data into an important area and a non-important area according to the gaze area detected by the terminal device 10.
  • the information processing device 20 allocates the image data of the important region and the image data of the non-important region to different network slices.
  • the information processing system 1 can suppress the deterioration of usability and the deterioration of the utilization efficiency of wireless resources.
  • <<Second embodiment>> In the first embodiment described above, the information processing device 20 sets the important area based on the line-of-sight information detected by the terminal device 10, but the information processing device 20 may instead set the important area based on the image data. Therefore, as a second embodiment, a case where the information processing apparatus 20 sets the important region based on the image data will be described.
  • FIG. 6 is a block diagram showing the configuration of the information processing system 2 according to the second embodiment of the present disclosure.
  • the terminal device 10 of the information processing system 2 shown in FIG. 6 differs from that of the information processing system 1 shown in FIG. 3 in that it does not have the line-of-sight detection unit 111 and does not transmit line-of-sight information to the information processing device 20.
  • the important area setting unit 211 of the information processing device 20 shown in FIG. 6 sets an important area based on image data instead of line-of-sight information.
  • FIGS. 7 and 8 are diagrams for explaining important areas according to the second embodiment of the present disclosure.
  • the important area setting unit 211 identifies an important area, for example, by performing image recognition on the image data in advance. For example, as shown in FIG. 7, when the video of a soccer game is distributed, the important area setting unit 211 identifies the ball and the player holding the ball from the image data by image recognition, and sets the area including that player and its periphery as the important area R1. Further, the important area setting unit 211 sets the area other than the important area R1 as the non-important area.
  • the important area setting unit 211 recognizes the image data and sets the important area according to the type of the video to be distributed.
  • the important area setting unit 211 sets, as the important area, a predetermined area including, for example, a player in the case of sports, or a performer in the case of a drama or a movie.
  • the video creator may designate in advance what is used to set the important area (for example, an object such as a person or a ball). In this case, the important area setting unit 211 performs image recognition, identifies the designated object, and sets the important area.
  • the important area setting unit 211 may set a predetermined (set) area as the important area.
  • the important area is generally placed in the center of the screen. Therefore, as shown in FIG. 8, the important area setting unit 211 sets, for example, the central area of the screen as the important area and the other peripheral areas as the non-important area.
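A minimal sketch of this predetermined, center-of-screen important area follows; the 50% width/height fraction is an assumption, as the text only says that the central area is set as the important area.

```python
import numpy as np


def center_important_region(frame: np.ndarray, fraction: float = 0.5):
    """Fixed (predetermined) important area: the central `fraction` of the
    screen, with the surrounding pixels treated as non-important."""
    h, w = frame.shape[:2]
    iw, ih = int(w * fraction), int(h * fraction)
    x0, y0 = (w - iw) // 2, (h - ih) // 2
    return (x0, y0, iw, ih)  # (x, y, w, h) of the important area


frame = np.zeros((2160, 3840, 3), dtype=np.uint8)  # 4K frame
print(center_important_region(frame))  # (960, 540, 1920, 1080)
```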
  • FIG. 9 is a sequence diagram for explaining the communication process according to the second embodiment of the present disclosure.
  • the information processing device 20 performs image recognition of image data (step S201).
  • the information processing device 20 sets an important region based on the result of image recognition (step S202).
  • the subsequent processing is the same as the communication processing shown in FIG.
  • the information processing device 20 divides the image data into an important area and a non-important area according to the result of image recognition.
  • the information processing device 20 allocates the image data of the important region and the image data of the non-important region to different network slices.
  • the information processing system 2 can suppress the deterioration of usability and the deterioration of the utilization efficiency of wireless resources.
  • in the above embodiments, the information processing device 20 delivers the video to the terminal device 10, but the information processing device may instead receive the video. Therefore, as a third embodiment, a case where the information processing device 60 receives video from the image pickup device 50 will be described.
  • FIG. 10 is a block diagram showing the configuration of the information processing system 3 according to the third embodiment of the present disclosure.
  • the information processing system 3 shown in FIG. 10 has an imaging device 50 instead of the terminal device 10 shown in FIG. 3. Further, the information processing device 60 differs from the information processing device 20 shown in FIG. 3 in that it does not have the important area setting unit 211 and the display unit 24.
  • the imaging device 50 shown in FIG. 10 includes a control unit 51, a storage unit 52, a communication unit 53, and an imaging unit 54.
  • the image pickup unit 54 takes an image of the image pickup target and generates image data.
  • the imaging unit 54 may support high-resolution shooting such as 4K (3840 horizontal × 2160 vertical pixels) or 8K (7680 horizontal × 4320 vertical pixels).
  • the storage unit 52 is realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk.
  • the storage unit 52 stores programs, calculation parameters, and the like used for processing by the control unit 51.
  • the communication unit 53 is a communication interface (I / F) that communicates with an external device.
  • the communication unit 53 is realized by, for example, a NIC (Network Interface Card) or the like.
  • the communication unit 53 connects to the core network by performing wireless communication with the base station device 30.
  • the communication unit 53 transmits image data to the information processing device 60 via the base station device 30.
  • the control unit 51 is, for example, a controller, and is realized by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing various programs stored in a storage device inside the image pickup device 50, using a RAM (Random Access Memory) as a work area.
  • the various programs include programs of applications installed in the image pickup apparatus 50.
  • the control unit 51 is realized by, for example, an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the control unit 51 has an important area setting unit 511, and realizes or executes the functions and operations of information processing described below.
  • the internal configuration of the control unit 51 is not limited to the configuration shown in FIG. 10, and may be another configuration as long as it is a configuration for performing information processing described later.
  • the important area setting unit 511 performs image analysis (for example, image recognition) on the image data captured by the imaging unit 54 to set the important area. For example, when the image data captures a sport such as soccer, the important area setting unit 511 sets the area including the ball and the player holding the ball as the important area. Alternatively, when the image data captures a marathon, it sets the area including the runners as the important area and the other areas, including, for example, the spectators along the road, as non-important areas.
  • the important area setting unit 511 may identify a person on the image in doing so.
  • the image pickup device 50 may be provided with a sensor such as a microphone so that the image pickup device 50 detects the direction of sound using the sensor.
  • the important area setting unit 511 may set the area determined based on the direction of the sound as the important area, assuming that there is an important person (for example, a speaker) in the direction of the detected sound.
  • the important area setting unit 511 may set, as the important area, the area including, for example, the performer who was most popular in user voting.
  • the important area setting unit 511 may set the important area according to the scenario of the image to be captured. For example, in a live music program or the like, there is a case where it is decided in advance as a scenario which camera is used at what time and what kind of image is to be shot. In this case, the important area setting unit 511 sets the important area and the non-important area according to a predetermined scenario.
  • the important area setting unit 511 allocates the image data of the important area and the image data of the non-important area to different network slices among the image data of the captured video. Since such an allocation is the same as the allocation by the information processing apparatus 20 of the information processing system 1, detailed description thereof will be omitted.
  • FIG. 11 is a sequence diagram for explaining the communication process according to the third embodiment of the present disclosure.
  • the image pickup apparatus 50 performs image recognition of image data (step S301).
  • the image pickup apparatus 50 sets an important region based on the result of image recognition (step S302).
  • the imaging device 50 extracts, from the image data, important image data in the important region and non-important image data in the region excluding the important region, and assigns each extracted image data to a different slice (step S303).
  • the image pickup apparatus 50 transmits the video to the information processing apparatus 60 by transmitting each image data in each slice (step S304).
  • the image pickup apparatus 50 divides the image data into an important region and a non-important region according to, for example, the result of image recognition.
  • the image pickup apparatus 50 allocates the image data of the important region and the image data of the non-important region to different network slices. In this way, even when the device on the terminal side (here, the image pickup device 50) sends video to the information processing device 60, the information processing system 3 can suppress the deterioration of usability and the deterioration of the utilization efficiency of wireless resources.
  • in the above embodiments, the terminal device 10 is a head-mounted device, but the present disclosure is not limited to this.
  • the terminal device 10 may be a device for displaying an image to a user, and may be a device such as a TV, a personal computer, or a smartphone.
  • in the above embodiments, the information processing systems 1 to 3 transmit the important image data in the slice S1 having low delay or high reliability and transmit the non-important image data in the slice S2 with best effort, but the present disclosure is not limited to this.
  • the information processing device 20 or the image pickup device 50 may transmit the important image data as data having a higher resolution than the non-important image data.
  • for example, when the image pickup device 50 captures a 4K or 8K image, it transmits the important image data at the high 4K or 8K resolution.
  • the image pickup apparatus 50 performs processing such as down-conversion on the non-important image data to lower the resolution and transmit the data. In other words, it can be said that the image pickup apparatus 50 down-converts the image data in the non-important region to generate the non-important image data.
  • the information processing systems 1 to 3 may change the resolution according to the importance of the area.
  • the important image data may have a higher resolution than the non-important image data.
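A minimal sketch of down-converting the non-important image data before transmission is shown below. Plain pixel decimation is an assumption made for brevity; a real encoder would typically low-pass filter before subsampling.

```python
import numpy as np


def down_convert(region: np.ndarray, factor: int = 2) -> np.ndarray:
    """Lower the resolution of non-important image data before transmission
    by keeping every `factor`-th pixel in each spatial dimension."""
    return region[::factor, ::factor]


non_important = np.zeros((4320, 7680, 3), dtype=np.uint8)  # 8K region
print(down_convert(non_important, factor=4).shape)          # (1080, 1920, 3)
```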
  • in the above description, the important area setting units 211 and 511 set an important area and a non-important area; in other words, the image data is divided into two levels of importance. However, the present disclosure is not limited to this.
  • the important area setting units 211 and 511 may divide the image data into areas according to the importance of three or more levels. In this case, the important area setting units 211 and 511 allocate to different network slices according to the importance of the divided areas. For example, it is assumed that the important area setting units 211 and 511 divide the image data into three stages of the most important, important, and non-important areas.
  • the important area setting units 211 and 511 allocate, for example, the image data divided into the most important areas to slices that require low-delay and highly reliable communication.
  • the important area setting units 211 and 511 allocate the image data divided into important areas to slices that require low-delay communication, and allocate the image data divided into non-important areas to slices to be transmitted with best effort.
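A minimal sketch of this three-level mapping follows; the level names and slice descriptions are taken from the text above, while the table representation itself is an assumption.

```python
# Three-level example from the text: most-important / important /
# non-important areas mapped to progressively less demanding slices.
IMPORTANCE_TO_SLICE = {
    "most_important": "low-latency, high-reliability slice",
    "important": "low-latency slice",
    "non_important": "best-effort slice",
}

for level, slice_name in IMPORTANCE_TO_SLICE.items():
    print(f"{level:>15} -> {slice_name}")
```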
  • the number of areas divided by the important area setting units 211 and 511 and the number of slices to be allocated may be plural, and the number is not limited.
  • the control device for controlling the terminal device 10, the information processing devices 20 and 60, the base station device 30, or the image pickup device 50 of each of the above embodiments may be realized by a dedicated computer system or a general-purpose computer system.
  • a communication program for executing the above-mentioned operation (for example, communication processing) is stored and distributed in a computer-readable recording medium such as an optical disk, a semiconductor memory, a magnetic tape, or a flexible disk.
  • the control device is configured by installing the program on a computer and executing the above-mentioned processing.
  • the control device may be a terminal device 10, information processing devices 20, 60, a base station device 30, or an external device (for example, a personal computer) of the image pickup device 50.
  • the control device may also be a device inside the terminal device 10, the information processing devices 20 and 60, the base station device 30, or the image pickup device 50 (for example, the control unit 11, the control unit 21, or the control unit 51).
  • the above communication program may be stored in a disk device provided in a server device on a network such as the Internet so that it can be downloaded to a computer or the like.
  • the above-mentioned functions may be realized by collaboration between the OS (Operating System) and the application software.
  • the part other than the OS may be stored in a medium and distributed, or the part other than the OS may be stored in the server device so that it can be downloaded to a computer or the like.
  • each component of each device shown in the figures is a functional concept, and does not necessarily have to be physically configured as shown. That is, the specific form of distribution and integration of each device is not limited to the one shown in the figures, and all or part of each device can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
  • the present embodiment can also be implemented as any configuration constituting a device or system, for example, a processor as a system LSI (Large Scale Integration), a module using a plurality of processors, a unit using a plurality of modules, or a set in which other functions are further added to a unit (that is, a partial configuration of a device).
  • in the present disclosure, the system means a set of a plurality of components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • the present embodiment can have a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
  • (1) An information processing device comprising a control unit that extracts important image data of an important area and non-important image data of a non-important area from image data, and allocates the important image data and the non-important image data to different network slices.
  • (2) The information processing device according to (1), wherein the control unit allocates the important image data to a network slice having a high priority.
  • (3) The information processing device according to (1) or (2), wherein the control unit obtains information on the important area according to the line-of-sight direction of the user who views the image data, and extracts the important image data and the non-important image data from the image data according to the acquired information.
  • An information processing method comprising: extracting important image data of an important area and non-important image data of a non-important area from image data; and allocating the important image data and the non-important image data to different network slices.
  • A program that causes a computer to function as a control unit that extracts important image data of an important area and non-important image data of a non-important area from image data, and allocates the important image data and the non-important image data to different network slices.
  • 1, 2, 3 Information processing system; 10 Terminal device; 11, 21, 51 Control unit; 12, 22, 52 Storage unit; 13, 23, 53 Communication unit; 14, 24 Display unit; 20, 60 Information processing device; 30 Base station device; 50 Imaging device; 54 Imaging unit; 111 Line-of-sight detection unit; 211, 511 Important area setting unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

This information processing device (20) comprises a control unit (21). The control unit (21) extracts, from image data, important image data in an important area and unimportant image data in an unimportant area. The control unit (21) allocates the important image data and the unimportant image data to respective different network slices.

Description

Information processing device, information processing method, and program
The present disclosure relates to an information processing device, an information processing method, and a program.
Conventionally, there are wireless communication technologies for exchanging various kinds of data using wireless communication. For example, in recent years, an increasing number of applications access mobile network games from terminal devices such as smartphones via wireless networks. A technique is known in which, when such an application transmits an image to a terminal device, a three-dimensional model is rendered in two dimensions. In this technique, a high-quality image is transmitted to the terminal device while reducing its processing load, by preferentially allocating bit rate to the object the user is gazing at on the screen.
Japanese Unexamined Patent Publication No. 2007-79664
However, with the above technique, when the amount of image transmission increases, image transmission delay and image quality deterioration may occur, and usability may deteriorate.
Therefore, the present disclosure provides an information processing device, an information processing method, and a program capable of suppressing deterioration of usability.
According to the present disclosure, an information processing device is provided. The information processing device includes a control unit. The control unit extracts important image data of an important region and non-important image data of a non-important region from image data. The control unit allocates the important image data and the non-important image data to different network slices.
FIG. 1 is a diagram showing an outline of an information processing system to which the proposed technique of the present disclosure is applied.
FIG. 2 is a diagram for explaining an outline of the proposed technique of the present disclosure.
FIG. 3 is a diagram showing an example of the configuration of the information processing system according to the first embodiment of the present disclosure.
FIG. 4 is a diagram for explaining the gaze area according to the first embodiment of the present disclosure.
FIG. 5 is a sequence diagram for explaining the communication processing according to the first embodiment of the present disclosure.
FIG. 6 is a block diagram showing the configuration of the information processing system according to the second embodiment of the present disclosure.
FIG. 7 is a diagram for explaining important areas according to the second embodiment of the present disclosure.
FIG. 8 is a diagram for explaining important areas according to the second embodiment of the present disclosure.
FIG. 9 is a sequence diagram for explaining the communication processing according to the second embodiment of the present disclosure.
FIG. 10 is a block diagram showing the configuration of the information processing system according to the third embodiment of the present disclosure.
FIG. 11 is a sequence diagram for explaining the communication processing according to the third embodiment of the present disclosure.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the present specification and drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and duplicate description is omitted.
In the following description, numerical values may be given as specific examples, but such values are merely examples, and other values may be used.
The description will proceed in the following order.
1. Introduction
 1.1. Outline of the proposed technique
2. First embodiment
 2.1. Information processing system
 2.2. Setting of the important area
 2.3. Communication processing
3. Second embodiment
4. Third embodiment
5. Other embodiments
6. Supplement
<<1. Introduction>>
<1.1. Outline of the proposed technique>
First, an outline of the proposed technique according to the present disclosure will be described with reference to FIGS. 1 and 2. FIG. 1 is a diagram showing an outline of the information processing system 1 to which the proposed technique of the present disclosure is applied. FIG. 2 is a diagram for explaining an outline of the proposed technique of the present disclosure. The proposed technique according to the present disclosure is implemented in the information processing system 1 shown in FIG. 1. As shown in FIG. 1, the information processing system 1 includes an information processing device 20 and a terminal device 10. The information processing device 20 and the terminal device 10 communicate with each other via the base station device 30.
The information processing device 20 is, for example, a server device, and provides services such as game distribution and video distribution to the terminal device 10 via the base station device 30. The information processing device 20 transmits, for example, moving images to the terminal device 10.
The terminal device 10 displays the moving images received from the information processing device 20 to the user. FIG. 1 shows a case where the terminal device 10 is configured as a so-called head-mounted device that is worn on at least a part of the user's head. The terminal device 10 may be configured to be able to detect the line of sight of the user, for example.
The base station device 30 is a wireless communication device that wirelessly communicates with the terminal device 10. The base station device 30 is, for example, a device corresponding to a radio base station (Node B, eNB, gNB, etc.), and provides the terminal device 10 with a cellular communication service such as NR (New Radio). The base station device 30 may be a wireless relay station, or a roadside base station device such as an RSU (Road Side Unit). Further, the base station device 30 may be a remote radio unit connected over optical fiber, called an RRH (Remote Radio Head).
The base station device 30 connects to the information processing device 20 via, for example, a network, and transmits information about services provided by the information processing device 20, such as moving images, to the terminal device 10. The base station device 30 also provides information from the terminal device 10 to the information processing device 20 via the network. In this way, the information processing device 20 and the terminal device 10 transmit and receive information via the base station device 30, so that the terminal device 10 can receive services such as video distribution from the information processing device 20.
Here, if the information processing device 20 transmits the moving image data to be distributed over a transmission line of uniform quality, data loss or errors may occur depending on the communication environment, and usability deteriorates.
In recent years, NR (New Radio) has been studied as a next-generation radio access scheme following LTE. For NR and next-generation core networks, a slicing technology (network slicing) is being studied that accommodates, in a single network, multiple forms of communication assuming various use cases, including eMBB (Enhanced Mobile Broadband), mMTC (Massive Machine Type Communications), and URLLC (Ultra Reliable and Low Latency Communications). With slicing technology, logical networks called slices or network slices can coexist on one physical network. This makes it possible to provide the network efficiently according to the requirements of the service used by the user.
Therefore, as shown in FIG. 2, in the proposed technique, the information processing device 20 extracts an important region and a non-important region from image data, which is, for example, one frame of a moving image to be distributed. The information processing device 20 allocates the image data of the important region and the image data of the non-important region to different network slices and transmits them. As a result, one piece of image data can be transmitted over transmission lines of different qualities depending on the region. For example, in FIG. 2, the information processing device 20 extracts, as the important region, the region the user is gazing at, and extracts the other region as the non-important region. The information processing device 20 allocates the image data of the important region to the slice S1, which transmits data with high reliability and low delay, and allocates the non-important region to the slice S2, which transmits data with best effort.
As a result, the information processing device 20 can transmit, for example, the image data of the important region the user is gazing at with high reliability and low delay, suppressing data delay and image quality deterioration in the important region and thus suppressing deterioration of usability. The information processing device 20 also transmits, for example, the image data of the non-important region the user is not paying attention to with normal quality (for example, best effort). Data delay and image quality deterioration may occur in the non-important region, but since that region is not the one the user is gazing at, usability is not significantly affected. In this way, by allocating the important region and the non-important region of the image data to different network slices, the information processing device 20 can suppress data delay and image quality deterioration in the important region, and thus suppress deterioration of usability.
Further, if the information processing device 20 were to allocate all of the image data to a high-reliability, low-delay network slice, the radio resources may be significantly occupied and their utilization efficiency may decrease. Therefore, in the proposed technique of the present disclosure, as described above, the information processing device 20 allocates the important region and the non-important region of the image data to different network slices. As a result, the information processing system 1 can suppress the deterioration of radio resource utilization efficiency while suppressing the deterioration of usability.
<<2. First embodiment>>
<2.1. Information processing system>
FIG. 3 is a diagram showing an example of the configuration of the information processing system 1 according to the first embodiment of the present disclosure. The information processing system 1 includes a terminal device 10 and an information processing device 20.
 [端末装置10]
 端末装置10は、例えば、ARグラスやヘッドマウントディスプレイなど、ユーザの頭部に装着する、いわゆる頭部装着型デバイスである。
[Terminal device 10]
The terminal device 10 is a so-called head-mounted device that is worn on the user's head, such as an AR glass or a head-mounted display.
 図3に示すように、端末装置10は、制御部11と、記憶部12と、通信部13と、表示部14と、を有する。なお、端末装置10の構成は、図3に示す構成に限られず、後述する情報処理を行う構成であれば他の構成であってもよい。 As shown in FIG. 3, the terminal device 10 includes a control unit 11, a storage unit 12, a communication unit 13, and a display unit 14. The configuration of the terminal device 10 is not limited to the configuration shown in FIG. 3, and may be another configuration as long as it is a configuration for performing information processing described later.
 (Communication unit 13)
 The communication unit 13 is a communication interface (I/F) that communicates with external devices and is realized by, for example, a NIC (Network Interface Card). The communication unit 13 connects to the core network by performing wireless communication with the base station device 30. It receives image data from the information processing device 20 via the base station device 30 and transmits the line-of-sight information described later to the information processing device 20 via the base station device 30.
 (Display unit 14)
 The display unit 14 displays the image data transmitted by the information processing device 20. When the terminal device 10 according to the present embodiment is, for example, AR glasses or a head-mounted display, the display unit 14 corresponds to the display unit of that device. Although the terminal device 10 is described here as AR glasses or a head-mounted display, it is not limited to this. For example, the display unit 14 may be a display device separate from the terminal device 10 (such as the AR glasses or head-mounted display described above), and the terminal device 10 may be an information processing device that causes that display device to display the image data. In this case, the terminal device 10 may be, for example, a smartphone or a PC.
 (Storage unit 12)
 The storage unit 12 is realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk. The storage unit 12 stores the programs, calculation parameters, and the like used in the processing of the control unit 11.
 (Control unit 11)
 The control unit 11 is, for example, a controller, realized by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing various programs stored in a storage device inside the terminal device 10, using a RAM (Random Access Memory) as a work area. These programs include, for example, the programs of applications installed in the terminal device 10. The control unit 11 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
 The control unit 11 has a line-of-sight detection unit 111 and realizes or executes the information processing functions and operations described below. The internal configuration of the control unit 11 is not limited to that shown in FIG. 3 and may be any other configuration capable of performing the information processing described later.
 (Line-of-sight detection unit 111)
 The line-of-sight detection unit 111 detects the user's line-of-sight direction using eye-tracking technology, based on the detection results of, for example, a camera, an optical sensor, and a motion sensor (none shown) mounted on the terminal device 10. Based on the detected line-of-sight direction, the line-of-sight detection unit 111 determines the gaze area, that is, the part of the display screen the user is gazing at, and transmits line-of-sight information including the determined gaze area to the information processing device 20. As described later, the gaze area is the area used by the information processing device 20 when setting the important area, and can therefore also be regarded as information about the important area.
 [Information processing device 20]
 As shown in FIG. 3, the information processing device 20 is, for example, a server device and distributes video and data (for example, moving images) to the terminal device 10.
 The information processing device 20 includes a control unit 21, a storage unit 22, a communication unit 23, and a display unit 24. The configuration of the information processing device 20 is not limited to that shown in FIG. 3 and may be any other configuration capable of performing the information processing described later.
 (Communication unit 23)
 The communication unit 23 is a communication interface (I/F) that communicates with external devices and is realized by, for example, a NIC (Network Interface Card). The communication unit 23 connects to the base station device 30 via a network. It receives line-of-sight information from the terminal device 10 via the base station device 30 and transmits image data to the terminal device 10 via the base station device 30.
 (Display unit 24)
 The display unit 24 is a display that shows various information of the information processing device 20, and may be, for example, a liquid crystal display.
 (Storage unit 22)
 The storage unit 22 is realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk. The storage unit 22 stores the programs, calculation parameters, and the like used in the processing of the control unit 21, and may also hold the image data to be transmitted to the terminal device 10.
 (Control unit 21)
 The control unit 21 is, for example, a controller, realized by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing various programs stored in a storage device inside the information processing device 20, using a RAM (Random Access Memory) as a work area. These programs include, for example, the programs of applications installed in the information processing device 20. The control unit 21 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
 The control unit 21 has an important area setting unit 211 and realizes or executes the information processing functions and operations described below. The internal configuration of the control unit 21 is not limited to that shown in FIG. 3 and may be any other configuration capable of performing the information processing described later.
 (Important area setting unit 211)
 The important area setting unit 211 sets the important area and the non-important area based on the line-of-sight information acquired from the terminal device 10. It determines the gaze area included in the line-of-sight information as the important area and the remaining area as the non-important area.
 The important area setting unit 211 extracts the image data corresponding to the determined important area as important information (important image data) and assigns it to a network slice that transmits signals with high reliability and low latency (slice S1 in FIG. 3). Slice S1 is, for example, a network slice for URLLC. The important area setting unit 211 likewise extracts the image data corresponding to the determined non-important area as non-important information (non-important image data) and assigns it to a network slice that transmits on a best-effort basis (slice S2 in FIG. 3).
 The important image data and the non-important image data assigned by the important area setting unit 211 to slices S1 and S2 are transmitted to the terminal device 10 via the communication unit 23.
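 This extract-and-assign step can be pictured with a minimal sketch. The sketch below is an illustration only, not the disclosed implementation; the Region type, the crop-and-blank extraction, and the string slice identifiers are assumptions introduced here.

```python
from dataclasses import dataclass

import numpy as np

# Hypothetical slice identifiers; the disclosure only requires that the
# two slices differ (e.g., a URLLC slice and a best-effort slice).
SLICE_URLLC = "S1"
SLICE_BEST_EFFORT = "S2"

@dataclass
class Region:
    x: int
    y: int
    w: int
    h: int

def split_frame(frame: np.ndarray, important: Region):
    """Extract important image data (a crop of the important area) and
    non-important image data (the frame with that area blanked out)."""
    crop = frame[important.y:important.y + important.h,
                 important.x:important.x + important.w].copy()
    rest = frame.copy()
    rest[important.y:important.y + important.h,
         important.x:important.x + important.w] = 0
    return crop, rest

def assign_slices(frame: np.ndarray, important: Region):
    """Pair each extracted piece of image data with its network slice."""
    crop, rest = split_frame(frame, important)
    return [(SLICE_URLLC, crop), (SLICE_BEST_EFFORT, rest)]

# Example: a 1080p frame whose important area is a 400x300 rectangle.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
for slice_id, data in assign_slices(frame, Region(760, 390, 400, 300)):
    print(slice_id, data.shape)
```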
 <2.2. Setting the important area>
 Here, the setting of the important area in the information processing system 1 will be described. First, the gaze area according to the first embodiment of the present disclosure will be described with reference to FIG. 4. FIG. 4 is a diagram for explaining the gaze area according to the first embodiment of the present disclosure.
 As described above, the terminal device 10 determines the area the user is gazing at based on the user's line-of-sight direction.
 As shown in FIG. 4, assume that the user wears the terminal device 10, for example a head-mounted display, on the head and is gazing at an object Ob displayed on the screen of the terminal device 10.
 At this time, as described above, the terminal device 10 detects the user's line-of-sight direction using eye-tracking technology or the like, based on the detection results of sensors mounted on the device itself, such as a camera, an optical sensor, or a motion sensor (none shown).
 As eye tracking, the terminal device 10 detects, for example, at least one of the feature amount, position, and movement of the user's eyes. For example, the terminal device 10 photographs the user's eyes with a camera and analyzes the captured images to detect at least one of these, thereby performing eye tracking (line-of-sight detection). Alternatively, the terminal device 10 may perform line-of-sight detection by detecting at least one of the feature amount, position, and movement of the user's face or head, in addition to or instead of the user's eyes.
 The terminal device 10 determines the display area of the object Ob displayed in the detected line-of-sight direction as the gaze area. For example, when a cubic object Ob is displayed as shown in FIG. 4, the display area of the cubic object Ob is set as the gaze area.
 Alternatively, the terminal device 10 may determine, as the gaze area, a predetermined area R1 that includes the object Ob displayed in the line-of-sight direction.
 In this way, the terminal device 10 determines the area including the object Ob displayed in the line-of-sight direction as the gaze area, and transmits line-of-sight information including information about the determined gaze area to the information processing device 20.
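 One simple way to realize such a predetermined area R1 is a fixed-size rectangle centred on the detected gaze point and clamped to the screen. The sketch below assumes exactly that; the half-width and half-height values are illustrative, not taken from the disclosure.

```python
def gaze_area(gaze_x: float, gaze_y: float,
              screen_w: int, screen_h: int,
              half_w: int = 200, half_h: int = 150):
    """Return a rectangle (x, y, w, h) centred on the gaze point and
    clamped to the screen; this plays the role of the area R1."""
    x0 = max(0, int(gaze_x) - half_w)
    y0 = max(0, int(gaze_y) - half_h)
    x1 = min(screen_w, int(gaze_x) + half_w)
    y1 = min(screen_h, int(gaze_y) + half_h)
    return (x0, y0, x1 - x0, y1 - y0)

# A gaze point near the left edge yields a clipped area.
print(gaze_area(100, 540, 1920, 1080))  # -> (0, 390, 300, 300)
```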
 When the information processing device 20 acquires the line-of-sight information from the terminal device 10, it extracts from the image data, based on the line-of-sight information, the important image data and the non-important image data corresponding to the important area and the non-important area, respectively. The information processing device 20 attaches to each piece of image data slice-related information for using the corresponding slice, so that the extracted image data is transmitted under the desired transmission conditions (for example, low latency or best effort).
 Here, the slice-related information is information (metadata) for specifying the slice that should be assigned according to the use and characteristics of the signal to be transmitted (here, the important image data or the non-important image data). The control unit 21 of the information processing device 20 controls the communication unit 23 so as to associate each piece of image data with the slice-related information specifying the slice to which it should be assigned and to transmit them to the base station device 30. The base station device 30 assigns each received piece of image data to the slice specified by its slice-related information and transmits it to the terminal device 10 using that slice.
 For example, the information processing device 20 determines that the important image data is a high-priority signal that needs to be transmitted with low latency. It generates slice-related information specifying the high-priority slice S1, which ensures low-latency transmission, and associates the generated slice-related information with the important image data.
 The information processing device 20 also determines that the non-important image data is a signal of lower priority than the important image data. It generates slice-related information specifying a slice S2 different from slice S1 (for example, a slice S2 of lower priority than slice S1) and associates the generated slice-related information with the non-important image data.
 In this way, the information processing device 20 can assign and transmit a slice matched to the characteristics required of each piece of image data, and the user can view the high-priority important image data with low latency while still viewing the non-important image data.
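 Attaching such slice-related information can be pictured as tagging each payload with a small piece of metadata. In the sketch below the field names and values are illustrative assumptions; the disclosure does not define a concrete format.

```python
def slice_related_info(priority: str) -> dict:
    """Build metadata specifying the slice a payload should be assigned to
    (illustrative fields; the actual format is not defined here)."""
    if priority == "important":
        return {"slice": "S1", "latency": "low", "reliability": "high"}
    return {"slice": "S2", "latency": "best_effort", "reliability": "normal"}

def tag(payload: bytes, priority: str) -> dict:
    """Associate a piece of image data with its slice-related information."""
    return {"payload": payload, "meta": slice_related_info(priority)}

packets = [tag(b"...important crop...", "important"),
           tag(b"...rest of frame...", "non_important")]
for p in packets:
    print(p["meta"])
```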
 Although an example has been described here in which the control unit 21 of the information processing device 20 generates the slice-related information specifying the slice to be assigned to the transmitted image data, the present disclosure is not limited to this. The control unit 21 may simply associate each piece of image data with identification information indicating its type (important, non-important, and so on). In this case, the base station device 30 may assign each piece of image data to the slice corresponding to its type, based on the identification information associated with the received data. For example, the base station device 30 holds in advance the relationship between image-data types and the slices to be assigned to each type, and assigns a slice to each piece of image data by referring to this relationship.
 In this way, each piece of image data can also be transmitted on a different network slice by having the base station device 30 assign the slices.
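 Under that alternative, the base-station side reduces to a lookup from the type identifier to a pre-configured slice. The sketch below assumes a simple table; the identifiers and packet format are illustrative.

```python
# Pre-configured relationship between image-data type and slice,
# held in advance by the base station (identifiers are illustrative).
TYPE_TO_SLICE = {"important": "S1", "non_important": "S2"}

def base_station_assign(packets: list) -> list:
    """Assign each received piece of image data to the slice that
    corresponds to its type identifier."""
    return [(TYPE_TO_SLICE[p["type"]], p["payload"]) for p in packets]

received = [{"type": "important", "payload": b"crop"},
            {"type": "non_important", "payload": b"rest"}]
print(base_station_assign(received))
```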
 <2.3. Communication processing>
 FIG. 5 is a sequence diagram for explaining the communication processing according to the first embodiment of the present disclosure.
 As shown in FIG. 5, the terminal device 10 detects the user's line of sight (step S101). Next, the terminal device 10 determines, from the detected line of sight, the gaze area the user is gazing at, and transmits line-of-sight information including information about the determined gaze area to the information processing device 20 (step S102).
 The information processing device 20 sets the important area based on the line-of-sight information (step S103). The information processing device 20 extracts from the image data the important image data of the important area and the non-important image data of the area excluding the important area, and assigns each extracted piece of image data to a different slice (step S104).
 The information processing device 20 distributes the video to the user by transmitting each piece of image data on its slice (step S105).
 The information processing system 1 may execute the above communication processing, for example, for each frame of the distributed video. Alternatively, the line-of-sight detection cycle and the video distribution cycle (frame cycle) may differ. In this case, the information processing device 20 updates the important area at the timing when it acquires line-of-sight information from the terminal device 10.
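 When the two cycles differ, the sender can simply keep the last important area until new line-of-sight information arrives. The sketch below assumes a queue between the gaze-update path and the frame loop; this decoupling is one possible reading of the behaviour, not a design stated in the disclosure.

```python
import queue

def distribute(frames, gaze_updates: queue.Queue, initial_area):
    """Send every frame, refreshing the important area only when new
    line-of-sight information has arrived in the meantime."""
    area = initial_area
    for frame in frames:
        try:
            area = gaze_updates.get_nowait()  # update on arrival
        except queue.Empty:
            pass                              # otherwise keep the last area
        yield frame, area

updates = queue.Queue()
updates.put((760, 390, 400, 300))
for frame, area in distribute(["f0", "f1", "f2"], updates, (0, 0, 400, 300)):
    print(frame, area)  # f0 picks up the update; f1 and f2 reuse it
```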
 Here, the case where the terminal device 10 fails to receive non-important image data will be described. In this case, the terminal device 10 displays to the user the non-important image data that was successfully received before the reception failure. Alternatively, when the terminal device 10 receives the non-important image data later than the important image data, it displays the previously received non-important image data to the user until the new non-important image data arrives.
 Further, when a loss or error occurs in at least part of the non-important image data received by the terminal device 10, the terminal device 10 displays the non-important image data received before the loss or error occurred. Alternatively, the terminal device 10 may interpolate (correct) the affected portion using correctly received non-important image data and present the result to the user.
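 Both fallbacks amount to reusing the most recent correctly received non-important image data. A minimal sketch follows; the function shape is an assumption for illustration.

```python
def present_non_important(received_ok: bool, new_data, last_good):
    """Return the data to display and the data to cache: use the new
    non-important image data if it arrived intact, otherwise fall back
    to the most recent correctly received data."""
    if received_ok and new_data is not None:
        return new_data, new_data
    return last_good, last_good

shown, cache = present_non_important(False, None, "frame_41_background")
print(shown)  # -> frame_41_background
```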
 As described above, in the first embodiment, the information processing device 20 divides the image data into the important area and the non-important area according to the gaze area detected by the terminal device 10, and assigns the image data of the important area and the image data of the non-important area to different network slices.
 As a result, the information processing system 1 can suppress the decline in radio-resource utilization efficiency while suppressing deterioration of usability.
 << 3. Second Embodiment >>
 In the first embodiment described above, the information processing device 20 sets the important area based on the line-of-sight information detected by the terminal device 10, but the information processing device 20 may instead set the important area based on the image data itself. Therefore, as a second embodiment, the case where the information processing device 20 sets the important area based on the image data will be described.
 FIG. 6 is a block diagram showing the configuration of the information processing system 2 according to the second embodiment of the present disclosure. The terminal device 10 of the information processing system 2 shown in FIG. 6 differs from that of the information processing system 1 shown in FIG. 3 in that it does not have the line-of-sight detection unit 111 and does not transmit line-of-sight information to the information processing device 20.
 The important area setting unit 211 of the information processing device 20 shown in FIG. 6 sets the important area based on the image data rather than on line-of-sight information.
 Here, the setting of the important area by the important area setting unit 211 will be described with reference to FIGS. 7 and 8. FIGS. 7 and 8 are diagrams for explaining the important area according to the second embodiment of the present disclosure.
 The important area setting unit 211 identifies the important area by, for example, performing image recognition on the image data in advance. For example, as shown in FIG. 7, when distributing video of a soccer match, the important area setting unit 211 identifies the ball and the player holding it from the image data by image recognition, and sets the area including that player and the player's surroundings as the important area R1. The important area setting unit 211 sets the area other than the important area R1 as the non-important area.
 In this way, the important area setting unit 211 performs image recognition on the image data and sets the important area according to the type of video being distributed. For example, it sets as the important area a predetermined area including the players in the case of sports, or the performers in the case of dramas, movies, and the like. Alternatively, the video creator may specify what is used to set the important area (for example, an object such as a person or a ball). In this case, the important area setting unit 211 performs image recognition to identify the specified object and sets the important area.
 Alternatively, the important area setting unit 211 may set a predetermined (preset) area as the important area. The important part of a scene is generally placed at the center of the screen. Therefore, as shown in FIG. 8, the important area setting unit 211 sets, for example, the central area of the screen as the important area and the surrounding area as the non-important area.
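 A preset central area follows directly from the screen dimensions. The sketch below assumes the important area covers a fixed fraction of each dimension; the fraction of one half is illustrative.

```python
def center_area(screen_w: int, screen_h: int, frac: float = 0.5):
    """Return the centred rectangle (x, y, w, h) covering `frac` of each
    screen dimension; everything outside it is the non-important area."""
    w, h = int(screen_w * frac), int(screen_h * frac)
    return ((screen_w - w) // 2, (screen_h - h) // 2, w, h)

print(center_area(1920, 1080))  # -> (480, 270, 960, 540)
```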
 FIG. 9 is a sequence diagram for explaining the communication processing according to the second embodiment of the present disclosure.
 As shown in FIG. 9, the information processing device 20 performs image recognition on the image data (step S201). Next, the information processing device 20 sets the important area based on the result of the image recognition (step S202). The subsequent processing is the same as the communication processing shown in FIG. 5.
 As described above, in the second embodiment, the information processing device 20 divides the image data into the important area and the non-important area according to the result of image recognition, and assigns the image data of the important area and the image data of the non-important area to different network slices.
 As a result, the information processing system 2 can suppress the decline in radio-resource utilization efficiency while suppressing deterioration of usability.
 << 4. Third Embodiment >>
 In the first and second embodiments described above, the information processing device 20 distributes video to the terminal device 10, but an information processing device may instead receive video. Therefore, as a third embodiment, the case where an information processing device 60 receives video from an imaging device 50 will be described.
 FIG. 10 is a block diagram showing the configuration of the information processing system 3 according to the third embodiment of the present disclosure. The information processing system 3 shown in FIG. 10 has an imaging device 50 instead of the terminal device 10 shown in FIG. 3. The information processing device 60 differs from the information processing device 20 shown in FIG. 3 in that it does not have the important area setting unit 211 or the display unit 24.
 The imaging device 50 shown in FIG. 10 includes a control unit 51, a storage unit 52, a communication unit 53, and an imaging unit 54.
 The imaging unit 54 images a subject and generates image data. The imaging unit 54 may support high-resolution shooting such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels).
 The storage unit 52 is realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk. The storage unit 52 stores the programs, calculation parameters, and the like used in the processing of the control unit 51.
 The communication unit 53 is a communication interface (I/F) that communicates with external devices and is realized by, for example, a NIC (Network Interface Card). The communication unit 53 connects to the core network by performing wireless communication with the base station device 30, and transmits image data to the information processing device 60 via the base station device 30.
 The control unit 51 is, for example, a controller, realized by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing various programs stored in a storage device inside the imaging device 50, using a RAM (Random Access Memory) as a work area. These programs include, for example, the programs of applications installed in the imaging device 50. The control unit 51 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
 The control unit 51 has an important area setting unit 511 and realizes or executes the information processing functions and operations described below. The internal configuration of the control unit 51 is not limited to that shown in FIG. 10 and may be any other configuration capable of performing the information processing described later.
 The important area setting unit 511 performs image analysis (for example, image recognition) on the image data captured by the imaging unit 54 and sets the important area. For example, when the image data captures a sport such as soccer, the important area setting unit 511 sets the area including the ball and the player holding it as the important area. Alternatively, when the image data captures a marathon, it sets the area including the runners as the important area and the other areas, for example the area including the spectators along the route, as the non-important area.
 Besides identifying the persons to include in the important area through image analysis in this way, the important area setting unit 511 may identify a person in the image by, for example, attaching a sensor such as an infrared sensor or a position sensor to the person.
 Alternatively, the imaging device 50 may be provided with a sensor such as a microphone and use it to detect the direction of sound. The important area setting unit 511 may then set, as the important area, an area determined based on the sound direction, on the assumption that an important person (for example, a speaker) is in the direction of the detected sound.
 Further, when the imaging device 50 shoots a program distributed via TV or Internet broadcasting, the performer to include in the important area may be chosen from among the program's performers by, for example, a vote of the users watching the program. The important area setting unit 511 sets the area including the performer who received the most votes as the important area.
 The important area setting unit 511 may also set the important area according to the scenario of the video to be shot. For example, for a live music program or the like, which camera is used at which time and what kind of footage is shot may be determined in advance as a scenario. In this case, the important area setting unit 511 sets the important area and the non-important area according to the predetermined scenario.
 The important area setting unit 511 assigns the image data of the important area and the image data of the non-important area of the captured video to different network slices. Since this assignment is the same as the assignment performed by the information processing device 20 of the information processing system 1, a detailed description is omitted.
 FIG. 11 is a sequence diagram for explaining the communication processing according to the third embodiment of the present disclosure.
 As shown in FIG. 11, the imaging device 50 performs image recognition on the image data (step S301). Next, the imaging device 50 sets the important area based on the result of the image recognition (step S302).
 The imaging device 50 extracts from the image data the important image data of the important area and the non-important image data of the area excluding the important area, and assigns each extracted piece of image data to a different slice (step S303).
 The imaging device 50 transmits the video to the information processing device 60 by transmitting each piece of image data on its slice (step S304).
 As described above, in the third embodiment, the imaging device 50 divides the image data into the important area and the non-important area according to, for example, the result of image recognition, and assigns the image data of each area to a different network slice. In this way, even when a terminal-side device (here, the imaging device 50) sends video to the information processing device 60, the information processing system 3 can suppress the decline in radio-resource utilization efficiency while suppressing deterioration of usability.
 << 5. Other Embodiments >>
 Each of the above embodiments is an example, and various modifications and applications are possible.
 In each of the above embodiments the terminal device 10 is a head-mounted device, but it is not limited to this. The terminal device 10 may be any device that displays images to the user, such as a TV, a personal computer, or a smartphone.
 In each of the above embodiments, the information processing systems 1 to 3 transmit the important image data on slice S1, which offers low latency or high reliability, and transmit the non-important image data on slice S2, which transmits on a best-effort basis; however, the present disclosure is not limited to this.
 For example, the information processing device 20 or the imaging device 50 may transmit the important image data at a higher resolution than the non-important image data. For example, as described above, when the imaging device 50 shoots 4K or 8K video, it transmits the important image data at the full 4K or 8K resolution, while applying processing such as down-conversion to the non-important image data and transmitting it at a reduced resolution. In other words, the imaging device 50 can be said to generate the non-important image data by down-converting the image data of the non-important area. In this way, the information processing systems 1 to 3 may change the resolution according to the importance of the area; the important image data may thus have a higher resolution than the non-important image data.
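 Such down-conversion can be pictured with simple block averaging, as in the sketch below; a real system would more likely use a hardware scaler or a proper resampling filter.

```python
import numpy as np

def downconvert(region: np.ndarray, factor: int = 2) -> np.ndarray:
    """Reduce resolution by averaging factor x factor pixel blocks
    (a crude box filter, used here only to illustrate the idea)."""
    h, w = region.shape[:2]
    h, w = h - h % factor, w - w % factor
    blocks = region[:h, :w].reshape(h // factor, factor,
                                    w // factor, factor, -1)
    return blocks.mean(axis=(1, 3)).astype(region.dtype)

# An 8K non-important area transmitted at 4K resolution.
frame = np.zeros((4320, 7680, 3), dtype=np.uint8)
print(downconvert(frame).shape)  # -> (2160, 3840, 3)
```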
 In each of the above embodiments, the important area setting units 211 and 511 set an important area and a non-important area, in other words, divide the image data into two levels of importance, but the present disclosure is not limited to this. The important area setting units 211 and 511 may divide the image data into areas of three or more levels of importance. In this case, they assign the divided areas to different network slices according to their importance. For example, suppose the important area setting units 211 and 511 divide the image data into three levels of area: most important, important, and non-important. The image data of the most important area is then assigned to a slice that requires low-latency, high-reliability communication, the image data of the important area to a slice that requires low-latency communication, and the image data of the non-important area to a slice that transmits on a best-effort basis.
 In this way, the number of areas into which the important area setting units 211 and 511 divide the image data, and the number of slices assigned, need only be plural; the numbers are not otherwise limited.
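 Such a multi-level division reduces to a lookup from importance level to slice. In the sketch below the levels and slice names are illustrative assumptions; only the one-level-per-slice mapping reflects the text above.

```python
# Illustrative mapping from importance level to slice.
LEVEL_TO_SLICE = {
    "most_important": "S1",  # low latency and high reliability
    "important":      "S2",  # low latency
    "non_important":  "S3",  # best effort
}

def assign_by_level(regions: dict) -> list:
    """Map each area's image data to the slice for its importance level."""
    return [(LEVEL_TO_SLICE[level], data) for level, data in regions.items()]

print(assign_by_level({"most_important": b"ball", "important": b"players",
                       "non_important": b"crowd"}))
```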
 The control device that controls the terminal device 10, the information processing device 20 or 60, the base station device 30, or the imaging device 50 of each of the above embodiments may be realized by a dedicated computer system or by a general-purpose computer system.
 For example, a communication program for executing the above operations (for example, the communication processing) is stored and distributed on a computer-readable recording medium such as an optical disk, a semiconductor memory, a magnetic tape, or a flexible disk. The control device is then configured by, for example, installing the program on a computer and executing the above processing. The control device may be a device external to the terminal device 10, the information processing device 20 or 60, the base station device 30, or the imaging device 50 (for example, a personal computer), or a device inside one of them (for example, the control unit 11, the control unit 21, or the control unit 51).
 The communication program may also be stored in a disk device provided in a server device on a network such as the Internet so that it can be downloaded to a computer. The above functions may also be realized through cooperation between an OS (Operating System) and application software. In that case, the portion other than the OS may be stored and distributed on a medium, or stored in a server device and downloaded to a computer.
 Among the processes described in the above embodiments, all or part of the processes described as being performed automatically can also be performed manually, and all or part of the processes described as being performed manually can also be performed automatically by known methods. In addition, the processing procedures, specific names, and information including various data and parameters shown in the above description and drawings can be changed arbitrarily unless otherwise specified. For example, the various pieces of information shown in the figures are not limited to the illustrated information.
 Each component of each illustrated device is functional and conceptual and need not be physically configured as illustrated. That is, the specific form of distribution and integration of each device is not limited to that illustrated, and all or part of it can be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like.
 The above embodiments can be combined as appropriate in areas where the processing contents do not contradict each other. The order of the steps shown in the sequence diagrams of the above embodiments can also be changed as appropriate.
 Further, for example, the present embodiment can be implemented as any configuration constituting a device or a system, for example, a processor as a system LSI (Large Scale Integration), a module using a plurality of processors, a unit using a plurality of modules, or a set in which other functions are further added to a unit (that is, a configuration of part of a device).
 In the present embodiment, a system means a set of a plurality of components (devices, modules (parts), and so on), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
 Further, for example, the present embodiment can take a cloud computing configuration in which one function is shared and processed jointly by a plurality of devices via a network.
 << 6. Supplement >>
 Although the preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, the technical scope of the present disclosure is not limited to these examples. It is clear that a person with ordinary knowledge in the technical field of the present disclosure could conceive of various changes or modifications within the scope of the technical ideas set forth in the claims, and these are naturally understood to belong to the technical scope of the present disclosure.
 The embodiments described above can be combined as appropriate to the extent that the processing contents do not contradict each other.
 The effects described in this specification are merely explanatory or illustrative and are not limiting. That is, the technique according to the present disclosure may exhibit, together with or in place of the above effects, other effects apparent to those skilled in the art from the description herein.
 The following configurations also belong to the technical scope of the present disclosure.
 (1)
 An information processing device comprising a control unit that
 extracts important image data of an important area and non-important image data of a non-important area from image data, and
 assigns the important image data and the non-important image data to different network slices.
 (2)
 The information processing device according to (1), wherein the control unit assigns the important image data to a network slice having a high priority.
 (3)
 The information processing device according to (1) or (2), wherein the control unit acquires information on the important area according to the line-of-sight direction of a user viewing the image data, and extracts the important image data and the non-important image data from the image data according to the acquired information.
 (4)
 The information processing device according to (1) or (2), wherein the control unit extracts the important image data and the non-important image data from the image data based on an image processing result of the image data.
 (5)
 The information processing device according to (1) or (2), wherein the control unit extracts the important image data and the non-important image data from the image data based on the preset important area.
 (6)
 The information processing device according to any one of (1) to (5), wherein the important image data has a higher resolution than the non-important image data.
 (7)
 The information processing device according to (6), wherein the control unit down-converts the image data of the non-important area to generate the non-important image data.
 (8)
 An information processing method comprising:
 extracting important image data of an important area and non-important image data of a non-important area from image data; and
 assigning the important image data and the non-important image data to different network slices.
 (9)
 A program causing a computer to function as a control unit that
 extracts important image data of an important area and non-important image data of a non-important area from image data, and
 assigns the important image data and the non-important image data to different network slices.
 1 Information processing system
 10 Terminal device
 11, 21, 51 Control unit
 12, 22, 52 Storage unit
 13, 23, 53 Communication unit
 14, 24 Display unit
 20, 60 Information processing device
 30 Base station device
 50 Imaging device
 54 Imaging unit
 111 Line-of-sight detection unit
 211, 511 Important area setting unit

 Claims (9)

 1. An information processing device comprising:
 a control unit that extracts important image data of an important area and non-important image data of a non-important area from image data, and
 assigns the important image data and the non-important image data to different network slices.
 2. The information processing device according to claim 1, wherein the control unit assigns the important image data to a network slice having a high priority.
 3. The information processing device according to claim 1, wherein the control unit acquires information on the important area according to the line-of-sight direction of a user viewing the image data, and extracts the important image data and the non-important image data from the image data according to the acquired information.
 4. The information processing device according to claim 1, wherein the control unit extracts the important image data and the non-important image data from the image data based on an image recognition result of the image data.
 5. The information processing device according to claim 1, wherein the control unit extracts the important image data and the non-important image data from the image data based on the preset important area.
 6. The information processing device according to claim 1, wherein the important image data has a higher resolution than the non-important image data.
 7. The information processing device according to claim 6, wherein the control unit down-converts the image data of the non-important area to generate the non-important image data.
 8. An information processing method comprising:
 extracting important image data of an important area and non-important image data of a non-important area from image data; and
 assigning the important image data and the non-important image data to different network slices.
 9. A program causing a computer to function as a control unit that extracts important image data of an important area and non-important image data of a non-important area from image data, and assigns the important image data and the non-important image data to different network slices.
PCT/JP2021/011067 2020-03-30 2021-03-18 Information processing device, information processing method, and program WO2021200212A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-059796 2020-03-30
JP2020059796 2020-03-30

Publications (1)

Publication Number Publication Date
WO2021200212A1 true WO2021200212A1 (en) 2021-10-07

Family

ID=77928038

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/011067 WO2021200212A1 (en) 2020-03-30 2021-03-18 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2021200212A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012191519A (en) * 2011-03-11 2012-10-04 Panasonic Corp Wireless video transmission device, wireless video reception device, and wireless video transmission system having them
JP2014039201A (en) * 2012-08-17 2014-02-27 Nippon Telegr & Teleph Corp <Ntt> Method of remote control by using roi during use of a plurality of cameras
EP3013012A1 (en) * 2014-10-21 2016-04-27 Alcatel Lucent Networking device and method for adapting quality of video bitstreaming over a network
WO2018125579A1 (en) * 2016-12-29 2018-07-05 Sony Interactive Entertainment Inc. Foveated video link for vr, low latency wireless hmd video streaming with gaze tracking
JP2019054415A * 2017-09-15 2019-04-04 Sony Corp Information processing apparatus and method

Similar Documents

Publication Publication Date Title
US10805593B2 (en) Methods and apparatus for receiving and/or using reduced resolution images
US7850306B2 (en) Visual cognition aware display and visual data transmission architecture
WO2018219013A1 (en) Image processing method and device, computer readable storage medium and electronic device
CN110022373B (en) Service distribution method, device, server and storage medium
CN110856019B (en) Code rate allocation method, device, terminal and storage medium
JP6924901B2 (en) Photography method and electronic equipment
US11677925B2 (en) Information processing apparatus and control method therefor
CN106131700A (en) A kind of sharing files method and device during net cast
CN110537208B (en) Head-mounted display and method
US20200259880A1 (en) Data processing method and apparatus
CN114071197B (en) Screen projection data processing method and device
US11843755B2 (en) Cloud-based rendering of interactive augmented/virtual reality experiences
US20220312057A1 (en) Method and device for transmitting video content by using edge computing service
CN111510757A (en) Method, device and system for sharing media data stream
WO2016024546A1 (en) Image transmission device, image transmission method, and image transmission program
WO2021200212A1 (en) Information processing device, information processing method, and program
US11265356B2 Network assistance functions for virtual reality dynamic streaming
EP4363946A1 (en) Head motion dependent viewport region modification for omnidirectional conversational vdd
US20240048726A1 (en) Decoding and encoding based on adaptive intra refresh mechanism
KR20200028069A (en) Image processing method and apparatus of tile images
CN111028192B (en) Image synthesis method and electronic equipment
KR102019866B1 (en) Time slice image processing method and apparatus using mobile terminal
CN110941413A (en) Display screen generation method and related device
CN114143588B (en) Playing control method and electronic equipment
WO2021200226A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21780337

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21780337

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP