WO2021210652A1 - Information processing device, system, method, and program - Google Patents

Information processing device, system, method, and program

Info

Publication number
WO2021210652A1
Authority
WO
WIPO (PCT)
Prior art keywords
viewer
video content
degree
eyes
satisfaction
Prior art date
Application number
PCT/JP2021/015615
Other languages
English (en)
Japanese (ja)
Inventor
壮太郎 五十嵐
Original Assignee
株式会社Theater Guild
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Theater Guild
Publication of WO2021210652A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data

Definitions

  • This disclosure relates to information processing devices, systems, methods and programs.
  • Patent Document 1 discloses a social platform for evaluating videos in an Internet video-sharing service through so-called tipping ("throwing money"). A payment means for settling the amount of money corresponding to an item image is provided, implementing the tipping function.
  • An information processing device including a control unit is provided.
  • The control unit includes: an opening degree acquisition unit that detects the viewer's eyes from image data of a viewer viewing video content and acquires the degree of opening of the viewer's eyes; a blink count acquisition unit that detects the viewer's eyes from the image data and acquires the viewer's number of blinks; and a calculation unit that calculates the viewer's satisfaction with the video content using the degree of opening and the number of blinks as parameters.
  • A system including a camera and an information processing device is also provided.
  • The camera photographs a viewer viewing video content, generates image data, and causes the information processing device to acquire the image data.
  • The information processing device includes a control unit, which includes: an opening degree acquisition unit that detects the viewer's eyes from the image data and acquires the degree of opening of the viewer's eyes; a blink count acquisition unit that detects the viewer's eyes from the image data and acquires the viewer's number of blinks; and a calculation unit that calculates the viewer's satisfaction with the video content using the degree of opening and the number of blinks as parameters.
  • A method executed by a computer having a processor is also provided.
  • In the method, the processor executes: a step of detecting the viewer's eyes from image data of a viewer viewing video content and acquiring the degree of opening of the viewer's eyes; a step of detecting the viewer's eyes from the image data and acquiring the viewer's number of blinks; and a step of calculating the viewer's satisfaction with the video content using the degree of opening and the number of blinks as parameters.
  • A program executed by a computer having a processor is also provided.
  • The program causes the processor to execute: a step of detecting the viewer's eyes from image data of a viewer viewing video content and acquiring the degree of opening of the viewer's eyes; a step of detecting the viewer's eyes from the image data and acquiring the viewer's number of blinks; and a step of calculating the viewer's satisfaction with the video content using the degree of opening and the number of blinks as parameters.
  • According to the present disclosure, the degree of opening of the viewer's eyes and the number of blinks are acquired from image data of the viewer viewing the video content, and the viewer's satisfaction with the video content is calculated from them. Information indicating the viewer's reaction to the video content can therefore be acquired, for example for each scene, which makes it possible to analyze the evaluation of the video content.
  • FIG. 1 is a diagram showing the overall configuration of the satisfaction calculation system 1.
  • FIG. 2 is a bird's-eye view showing an example of the appearance of the facility in which the satisfaction calculation system 1 is installed.
  • FIG. 3 is a perspective view showing an example of the appearance of the audio output device 30.
  • FIG. 4 is a block diagram showing the functional configuration of the server 10 constituting the satisfaction calculation system 1 of Embodiment 1.
  • FIG. 5 is a schematic diagram showing a specific method of calculating the degree of eye opening in the opening degree acquisition module 1034 of FIG. 4.
  • FIG. 6 is a diagram showing the functional configuration of the audio output device 30 constituting the satisfaction calculation system 1 of Embodiment 1.
  • FIG. 7 is a diagram showing the data structures of the video content database 181 and the viewer database 184 stored in the server 10.
  • FIG. 8 is a flowchart showing an example of the flow of video content output processing by the satisfaction calculation system 1 of Embodiment 1.
  • FIG. 9 is a flowchart showing an example of the flow of satisfaction calculation processing by the satisfaction calculation system 1 of Embodiment 1.
  • FIG. 10 is a diagram showing a screen example of the server 10 displaying the transition of each parameter in the satisfaction calculation process.
  • The satisfaction calculation system provides viewers with an opportunity to view (screen) video content such as a movie; during viewing it photographs the viewer watching the video content, acquires the viewer's state with sensors, analyzes the viewer from the captured image data and the sensor detection signals, and calculates the degree of satisfaction.
  • The degree of satisfaction is a numerical expression of how satisfied the viewer is with the video content, estimated from the biological reactions of the viewer who viewed it.
  • It is empirically known that when a person concentrates, the eyes (eyelids) open wider, the number of blinks decreases, the movement of the face (the frequency and magnitude of movement) decreases, and the pulse rate and body temperature increase. Therefore, when a viewer watching video content shows these reactions, the viewer can be considered to be concentrating on the video content.
  • In the satisfaction calculation system of the present disclosure, the viewer watching the video content is photographed with a camera, the viewer's face is identified from the captured image data, and each viewer's eyes are detected. From the movement of the viewer's eyes in the image data, the degree of eye opening and the number of blinks are acquired. In addition, the movement of the viewer's face and the viewer's biological information are acquired from the detection signals of sensors that detect them. Based on the acquired degree of eye opening, number of blinks, facial movement, and biometric information, a satisfaction level indicating the degree of satisfaction with the video content is calculated. With this configuration, the viewer's satisfaction with the video content can be reliably acquired based on the viewer's biological reactions.
  • The viewer wears an audio output device such as headphones or earphones and listens to the audio of the video content through it.
  • The above-mentioned sensors are arranged in the audio output device, so the movement of the viewer's face and the viewer's biological information can be detected simply by the viewer wearing the audio output device. With this configuration, the facial movement and biological information of the viewer can be reliably acquired.
  • Each viewer watching the video content is identified, and satisfaction is calculated for each viewer. This makes it possible to acquire satisfaction by viewer attribute and to analyze the evaluation of the video content.
  • FIG. 1 is a diagram showing the overall configuration of the satisfaction calculation system 1.
  • The satisfaction calculation system is a system that displays (screens) video content such as movies, provides viewers with an opportunity to view the content, analyzes the viewers during viewing, and calculates their satisfaction.
  • The satisfaction calculation system 1 includes a server 10, a video output device 20, a plurality of audio output devices (FIG. 1 shows audio output devices 30A and 30B; hereinafter they may be collectively referred to as "audio output device 30"), and a camera 40.
  • The server 10, the video output device 20, the audio output device 30, and the camera 40 are connected so as to communicate with each other via the network 80.
  • The network 80 is composed of a wired or wireless network.
  • The server 10 is a device that manages the video content shown to viewers, the image data capturing the viewers, the viewer detection signals from the sensors, and per-viewer satisfaction information for the video content.
  • The server 10 transmits the video data of the video content to the video output device 20, and transmits the audio data to the audio output device 30.
  • The server 10 acquires the image data of the viewer and the detection signals of the sensors, and calculates the viewer's satisfaction.
  • The server 10 is communicably connected to the video output device 20, the audio output device 30, and the camera 40 via the network 80.
  • The server 10 connects to the network 80 by communicating with communication devices such as a wireless base station 81 supporting communication standards such as 4G, 5G, and LTE (Long Term Evolution), or a wireless LAN router 82 supporting wireless LAN (Local Area Network) standards such as IEEE (Institute of Electrical and Electronics Engineers) 802.11.
  • Each device may be connected wirelessly or by wire. For wireless connection, a communication protocol such as Z-Wave (registered trademark), ZigBee (registered trademark), or Bluetooth (registered trademark) may be used; for wired connection, a USB (Universal Serial Bus) cable or the like may be used.
  • The server 10 is a computer connected to the network 80.
  • The server 10 includes a communication IF (Interface) 12, an input/output IF 13, a memory 15, a storage 16, and a processor 19.
  • The communication IF 12 is an interface for inputting and outputting signals so that the server 10 can communicate with external devices.
  • The input/output IF 13 is an interface to input devices (for example, a keyboard or a pointing device such as a mouse) for receiving input operations from the user, and to output devices (a display, speakers, etc.) for presenting information to the user.
  • The memory 15 temporarily stores programs and the data processed by them, and is a volatile memory such as a DRAM (Dynamic Random Access Memory).
  • The storage 16 is a storage device for storing data, for example a flash memory, an HDD (Hard Disc Drive), or an SSD (Solid State Drive).
  • The processor 19 is hardware for executing the instruction set described in a program, and is composed of an arithmetic unit, registers, peripheral circuits, and the like.
  • The video output device 20 is a device that receives the video data of the video content and displays (screens) it as video according to instructions from the server 10.
  • It is composed of, for example, an LED panel in which LED elements are arrayed within a frame, an organic EL display, or a liquid crystal display.
  • The video output device 20 is provided in a screening facility such as a movie theater.
  • The video output device 20 may be configured with a plurality of LED panels arranged side by side to form a large-scale display of several hundred inches, so that many viewers can watch at once.
  • The number of video output devices 20 is not limited to one; a plurality of video output devices 20 may be provided, and a facility equipped with them may, for example, be controlled remotely by a single server 10.
  • The audio output device 30 is a device that outputs the audio data of the video content in response to instructions from the server 10, and the audio it outputs is synchronized with the video displayed on the video output device 20. When a plurality of video output devices 20 are provided and different video contents are screened as described above, the audio corresponding to each screened content is output, and the viewer may be able to select among the plurality of audio output channels.
  • The audio output device 30 is worn so as to cover the ears of the viewer watching the video content, and is realized by, for example, wireless headphones that receive the audio of the video content by wireless communication and output it under the control of the server 10.
  • The audio output device 30 is not limited to this; it may be headphones connected by wire, or it may be earphones, connected wirelessly or by wire, rather than a device that covers the viewer's ears.
  • The audio output device 30 includes a communication IF 32, a sensor unit 33, a speaker 34, a memory 35, a storage unit 36, and a processor 39.
  • The communication IF 32 is an interface for inputting and outputting signals so that the audio output device 30 can communicate with external devices.
  • The sensor unit 33 is a device that acquires the facial movement and biological information of the viewer watching the video content. It is composed of, for example, a gyro sensor that detects the movement of the viewer's face as an angular velocity, and sensors that detect pulse or body temperature, such as an electrocardiographic sensor, a temperature sensor, or an optical sensor.
  • The sensor unit 33 may be configured as MEMS (Micro Electro Mechanical Systems) in which the sensors and other parts are integrated. That is, the sensor unit 33 has functions such as vibration detection, a thermometer, an electroencephalograph, and a heart rate monitor.
  • The speaker 34 is an output device that converts audio signals into sound and presents information audibly to the user.
  • The memory 35 temporarily stores programs and the data processed by them, and is a volatile memory such as a DRAM (Dynamic Random Access Memory).
  • The storage unit 36 is a storage device for storing data, for example a flash memory, an HDD (Hard Disc Drive), or an SSD (Solid State Drive).
  • The processor 39 is hardware for executing the instruction set described in a program, and is composed of an arithmetic unit, registers, peripheral circuits, and the like.
  • The camera 40 is a device that photographs the viewer watching the video content according to instructions from the server 10.
  • The camera 40 is composed of, for example, a digital camera equipped with an image pickup device such as a CCD (Charge Coupled Device) image sensor and an A/D converter that converts the captured image into image data as a digital signal.
  • The camera 40 may be configured so that its shooting direction can be changed under the control of the server 10, or so that its shooting direction sweeps appropriately to cover a wide area.
  • The camera 40 shoots the viewer watching the video content as a moving image, or as still images at predetermined time intervals, converts the footage into image data, and transmits the captured image data to the server 10.
  • The number of cameras 40 is not limited to one; a plurality of cameras 40 may be provided in one facility. The camera 40 is also not limited to a dedicated device, and may be, for example, the camera function of a mobile terminal such as a smartphone or tablet compatible with a mobile communication system.
  • FIG. 2 is a bird's-eye view showing an example of the appearance of the facility in which the satisfaction calculation system 1 is installed.
  • FIG. 2 shows an example in which the satisfaction calculation system 1 is installed in a space for screening video content, such as a movie theater.
  • Because viewers wear audio output devices 30 such as headphones to hear the audio, the facility can serve as a multipurpose space combined with a so-called cafe space, where people can eat, drink, and do things other than watch the video content. The satisfaction calculation system 1 may also be set up, for example, in the viewer's home.
  • In that case, the server 10 may be configured as a cloud server or the like that outputs video via a service such as on-demand to a video output device 20 such as a television or display used by the viewer, and outputs audio through an audio output device 30 such as headphones used by the viewer, while the viewer is photographed by a camera 40 such as the viewer's smartphone, which transmits the image data to the server 10.
  • The satisfaction calculation system 1 shown in FIG. 2 is installed on a floor F containing both a screening space for the video content and a cafe space; a video output device 20 and a camera 40 are installed at a predetermined location on the floor F (toward the upper right in FIG. 2).
  • The viewer W wears the audio output device 30 when viewing the video content, as described above.
  • The viewer W can thus concentrate on the video content without being distracted by the surroundings, while users of the cafe space who are not watching can eat, drink, and do other things without being disturbed by the audio of the video content.
  • Since the video output device 20 is composed of an LED panel or the like, it illuminates the viewer W with light of constant brightness, so the camera 40 can photograph the viewer W at a constant resolution under that light.
  • The facility in which the satisfaction calculation system 1 is installed is not limited to a closed space such as a movie theater; it may be a space in a commercial facility, such as the eat-in corner of a convenience store or an event space. The satisfaction calculation system 1 may also be installed in spaces where a certain level of quietness is ensured, such as restaurants, hotel lounges, or the shared spaces of apartment buildings, or in facilities with a certain amount of noise, such as pachinko or pachislot parlors, game arcades, and amusement facilities.
  • FIG. 3 is a perspective view showing an example of the appearance of the audio output device 30.
  • The audio output device 30 is composed of headphones worn over the head so as to cover both ears, allowing the viewer to hear the audio of the video content. It comprises a pair of ear accommodating portions 37, each covering one ear, and an arm portion 38 that supports the pair of ear accommodating portions 37 and presses them against the viewer's ears with appropriate pressure on the viewer's head.
  • The audio output device 30 is not limited to this configuration; the ear accommodating portions 37 may be worn hooked onto the viewer's ears without the arm portion 38, or the device may be canal-type earphones inserted into the viewer's ear holes.
  • A sensor unit 33 is provided on the inside of each ear accommodating portion 37, at a position in contact with the skin around the viewer's ear.
  • The position of the sensor unit 33 within the ear accommodating portion 37 is not limited to a specific location, and a plurality of sensor units may be provided in the ear accommodating portion 37. Depending on the type of biological information acquired by the sensor unit 33, it may be placed in direct contact with the viewer's skin or at a distance from it. Given this configuration, it is preferable to use the audio output device 30 without ear pads, although attaching ear pads is not precluded.
  • FIG. 4 is a diagram showing a functional configuration of the server 10 constituting the satisfaction calculation system 1 of the first embodiment. As shown in FIG. 4, the server 10 functions as a communication unit 101, a storage unit 102, and a control unit 103.
  • The communication unit 101 performs processing for the server 10 to communicate with external devices.
  • The storage unit 102 stores the data and programs used by the server 10.
  • The storage unit 102 stores the video content database 181, the sensor detection signal database 182, the camera-captured image database 183, the viewer database 184, and the like.
  • The video content database 181 holds information on the video content provided by the satisfaction calculation system 1 and its screening times, that is, the screening schedule of the video output device 20. Details will be described later.
  • The sensor detection signal database 182 holds, in chronological order, the detection signals from the sensor unit 33 of the audio output device 30 worn by viewers using the satisfaction calculation system 1. The detection signals are transmitted from the audio output device 30 to the server 10 with date-and-time information attached, and are stored accordingly.
  • The camera-captured image database 183 holds, in chronological order, the image data from the camera 40 that photographs viewers using the satisfaction calculation system 1. The image data is transmitted from the camera 40 to the server 10 with date-and-time information attached, and is stored accordingly.
  • The viewer database 184 holds information on the viewers who use the satisfaction calculation system 1 and, for each viewer, information on their satisfaction with the video content they viewed. Details will be described later.
  • The viewer database 184 may store member information (including attribute information) registered with the facility.
  • The control unit 103 performs processing according to programs executed by the processor of the server 10, and exhibits the functions of various modules: a reception control module 1031, a transmission control module 1032, a video content output control module 1033, an opening degree acquisition module (opening degree acquisition unit) 1034, a blink count acquisition module (blink count acquisition unit) 1035, a face movement acquisition module (movement acquisition unit) 1036, a biometric information acquisition module (biometric information acquisition unit) 1037, and a satisfaction calculation module (calculation unit) 1038.
  • The reception control module 1031 controls the process by which the server 10 receives signals from external devices according to the communication protocol.
  • The transmission control module 1032 controls the process by which the server 10 transmits signals to external devices according to the communication protocol.
  • The video content output control module 1033 controls the process of transmitting the video data of the video content to the video output device 20 and the audio data to the audio output device 30 so that viewers can view the video content.
  • The video content output control module 1033 refers to the video content database 181 and, based on schedule information indicating the dates and times at which video content is screened, transmits the video data and audio data of the video content together with control signals for the screening, as well as information such as the data size, data format, and resolution of the video content.
  • The video content output control module 1033 may transmit the video data and audio data to the video output device 20 and the audio output device 30 all at once for screening, or may transmit them sequentially in a streaming format.
  • The video content output control module 1033 may also transmit credit information such as the screening time of the video content and the name of its creator.
  • The opening degree acquisition module 1034 controls the process of acquiring the image data captured by the camera 40, analyzing it, recognizing and detecting the viewer's eyes, and acquiring the degree of opening of the viewer's eyes.
  • The opening degree acquisition module 1034 uses a known image recognition technique, such as machine learning, to identify the viewer's eyes.
  • The opening degree acquisition module 1034 analyzes the image data to detect the viewer's eyes, calculates the area of the opened-eye portion in the image data, and acquires this area as the degree of eye opening. Alternatively, it detects the viewer's eyes in the image data and acquires the degree of eye opening based on position information for: the contact points between the upper and lower eyelids (the left and right end points of the eye); the upper end point of the upper eyelid, or the contact points between the iris (the so-called black of the eye) and the upper eyelid; and the lower end point of the lower eyelid, or the contact points between the iris and the lower eyelid.
  • FIG. 5 is a schematic diagram showing a specific method of calculating the degree of eye opening in the opening degree acquisition module 1034 of FIG. 4. The calculation method is described below with reference to FIG. 5.
  • FIG. 5 shows an eye E schematically representing the viewer's eye, with an upper eyelid L1, a lower eyelid L2, and an iris L3.
  • FIG. 5(a) shows an example in which the iris L3 is in contact with the upper eyelid L1 and the lower eyelid L2, and FIG. 5(b) shows an example in which it is not.
  • In the case of FIG. 5(a), analyzing the image data makes it possible to recognize the positions of the contact points P1 and P2 between the upper eyelid L1 and the lower eyelid L2, the contact points P3 and P4 between the iris L3 and the upper eyelid L1, and the contact points P5 and P6 between the iris L3 and the lower eyelid L2. The coordinates of the points P1 to P6 in the image data of the viewer's eye E are therefore calculated, and the degree of eye opening is acquired based on this position information.
  • In the case of FIG. 5(b), analyzing the image data makes it possible to recognize the positions of the contact points P1 and P2 between the upper eyelid L1 and the lower eyelid L2, the upper end point P7 of the upper eyelid L1, and the lower end point P8 of the lower eyelid L2. The coordinates of the points P1, P2, P7, and P8 in the image data of the viewer's eye E are calculated, and the degree of eye opening is acquired based on this position information.
  • The two methods may also be combined: for example, the contact points P3 and P4 on the upper eyelid L1 may be recognized by the method of FIG. 5(a) while the end point P8 is recognized by the method of FIG. 5(b), and vice versa. A minimal coordinate-based sketch of this calculation follows.
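  • The following is a minimal sketch of the FIG. 5 calculation, assuming the landmark coordinates (points P1 to P8) have already been extracted from the image data by a face-landmark detector. The function name and the width normalization are illustrative assumptions, not specified in the description.

```python
# A minimal sketch of the FIG. 5 opening-degree calculation. Landmark
# extraction is assumed to have been done already; the normalization by
# eye width is an illustrative choice, not taken from the description.
from typing import Tuple

Point = Tuple[float, float]  # (x, y) coordinates in the image data

def eye_opening_degree(p1: Point, p2: Point, top: Point, bottom: Point) -> float:
    """Estimate the degree of eye opening from landmark coordinates.

    p1, p2 -- contact points of the upper and lower eyelids (the left and
              right end points of the eye)
    top    -- upper end point P7 of the upper eyelid, or an iris/upper-eyelid
              contact point (P3 or P4) when the iris touches the eyelid
    bottom -- lower end point P8 of the lower eyelid, or an iris/lower-eyelid
              contact point (P5 or P6)
    """
    eye_width = abs(p2[0] - p1[0])        # horizontal extent of the eye
    eye_height = abs(bottom[1] - top[1])  # vertical extent of the opening
    if eye_width == 0:
        return 0.0
    # Normalizing by width keeps the measure comparable across viewers
    # sitting at different distances from the camera (an assumption).
    return eye_height / eye_width

# Example: a moderately open eye
print(eye_opening_degree((10, 50), (40, 50), (25, 42), (25, 58)))  # ~0.53
```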
  • The blink count acquisition module 1035 controls the process of acquiring the image data captured by the camera 40, analyzing it, recognizing and detecting the viewer's eyes, and acquiring the viewer's number of blinks. Like the opening degree acquisition module 1034, the blink count acquisition module 1035 uses a known image recognition technique, such as machine learning, to identify the viewer's eyes. Capturing a person's blinks as image data generally requires a frame rate of 50 fps (frames per second) or more, so the camera 40 is assumed to shoot at 50 fps or higher.
  • The blink count acquisition module 1035 analyzes the image data to detect the viewer's eyes and acquires the number of times the viewer blinks within a predetermined time (for example, 1 minute). A blink is counted when the viewer's eyes close and then reopen within a predetermined time (for example, 1 second) of closing. This rule distinguishes blinking from sleeping: if the eyes close and do not reopen for a certain period, the viewer cannot be said to be watching the video content during that period and is considered to be, for example, asleep (a counting sketch follows below).
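  • Below is a minimal sketch of this counting rule, assuming a per-frame eye-openness signal (such as the one computed above) is already available. The threshold values are illustrative assumptions; the description fixes only the 50 fps frame rate and the roughly 1-second reopening window.

```python
# A minimal sketch of the blink-counting rule: a closure counts as a blink
# only if the eyes reopen within max_closed_seconds; longer closures are
# treated as the viewer possibly sleeping and are not counted.
def count_blinks(openness_per_frame, fps=50,
                 closed_threshold=0.15, max_closed_seconds=1.0):
    blinks = 0
    closed_frames = 0
    max_closed_frames = int(max_closed_seconds * fps)
    for openness in openness_per_frame:
        if openness < closed_threshold:
            closed_frames += 1          # eyes currently closed
        else:
            # Eyes reopened; count a blink only for short closures.
            if 0 < closed_frames <= max_closed_frames:
                blinks += 1
            closed_frames = 0
    return blinks

# Example: two quick blinks and one long closure (ignored as possible sleep)
signal = ([0.5] * 10 + [0.1] * 3 + [0.5] * 10 + [0.1] * 3 +
          [0.5] * 10 + [0.1] * 120 + [0.5] * 10)
print(count_blinks(signal))  # 2
```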
  • The face movement acquisition module 1036 controls the process of acquiring, from the audio output device 30 worn by the viewer while watching the video content, the detection signal detected and output by the sensor unit 33, analyzing the signal, and acquiring the movement of the viewer's face.
  • The face movement acquisition module 1036 acquires the movement of the viewer's face from the angular velocity detected by the gyro sensor of the sensor unit 33 and the timing of the movement.
  • Based on the angular velocity detected by the gyro sensor of the sensor unit 33 and the timing of the movement, it acquires movements in which the viewer tilts the face back and forth or side to side, and movements in which the viewer turns the face (rotations in the left-right direction).
  • The movement in which the viewer tilts the face may be produced by moving the neck or by moving the entire upper body from the trunk.
  • As concrete examples, the acquired facial movement may be the number of times within a predetermined period (for example, 10 minutes) that the viewer moves the face beyond a predetermined range, or the angle or distance of the facial movement.
  • The biometric information acquisition module 1037 controls the process of acquiring, from the audio output device 30 worn by the viewer while watching the video content, the detection signal detected and output by the sensor unit 33, analyzing the signal, and acquiring the viewer's biometric information.
  • The biometric information acquisition module 1037 acquires the viewer's pulse, blood pressure, and body temperature detected by the electrocardiographic sensor of the sensor unit 33.
  • The satisfaction calculation module 1038 controls the process of calculating the viewer's satisfaction with the video content using the degree of eye opening acquired by the opening degree acquisition module 1034, the number of blinks acquired by the blink count acquisition module 1035, the facial movement acquired by the face movement acquisition module 1036, and the biometric information acquired by the biometric information acquisition module 1037. The satisfaction calculation module 1038 may calculate satisfaction using only the degree of eye opening and the number of blinks as parameters, or may add facial movement and/or biometric information as parameters.
  • Satisfaction is a time-series quantification, at predetermined intervals (for example, 1 minute), of the degree of satisfaction with the video content estimated from the viewer's biological reactions as described above. It is calculated per viewer, or the satisfaction of a plurality of viewers is aggregated as a total or an average and used as an evaluation index for the video content. This satisfaction may also be acquired as a time-series transition at predetermined intervals.
  • Using the transitions of individual viewers' satisfaction, viewers whose satisfaction with a given video content follows similar transitions can be grouped, and the attributes of the viewers in each group can be analyzed; this may serve as an index for marketing video content. Video content may also be recommended on the basis of such groupings.
  • The satisfaction calculation module 1038 may, for example, set the first acquired values for the viewer's eyes, face, and physical state as initial values (reference values) and calculate satisfaction from the subsequent change or rate of change of each value. Satisfaction may also be calculated from the change or rate of change of each value within a predetermined time, or from the difference from the eye, face, and physical state values of a typical viewer (for example, average values).
  • The satisfaction calculation module 1038 applies a predetermined function to each of the parameters quantified as in the examples above (the degree of eye opening, the number of blinks, the facial movement, and the biometric information value) and calculates satisfaction by summing the resulting values.
  • Let the function J for the degree of eye opening be a function of the acquired opening value j, the function K for the number of blinks a function of the acquired blink count k, the function L for the facial movement a function of the acquired movement value l, and the function M for the biometric information a function of the acquired biometric value m. Each function may also use its value as-is. Satisfaction is then calculated as Satisfaction = J(j) + K(k) + L(l) + M(m).
  • In addition, the satisfaction calculation module 1038 may weight the functions of the degree of eye opening, the number of blinks, the facial movement, and the biometric information by multiplying each by a different predetermined coefficient, and calculate satisfaction by summing the weighted values: Satisfaction = a·J(j) + b·K(k) + c·L(l) + d·M(m).
  • a, b, c, and d are predetermined coefficients, respectively.
  • Satisfaction may be calculated with the coefficients a and d positive and the coefficients b and c negative. The values and their rates of change are also expected to differ by genre of the video content being viewed, so the weights a, b, c, and d may be changed per genre. Furthermore, depending on the genre and the viewer's emotion in each scene, there may be a time lag before each value reflects the viewer's reaction, so each value may be acquired with the time lag corrected.
  • The satisfaction calculation module 1038 may also sum the functions of the degree of eye opening and the number of blinks to obtain a degree of concentration, and calculate satisfaction by applying a predetermined operation R to that degree of concentration.
  • R may be a function or value obtained by performing a predetermined operation on the functions of the facial movement and the biometric information, or a function or value calculated using other values as parameters. A sketch of the weighted calculation follows.
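  • Below is a minimal sketch of the weighted calculation, assuming per-interval parameter values have already been acquired. The identity functions and the specific coefficient values are illustrative assumptions; the description fixes only the signs (a and d positive, b and c negative).

```python
# A minimal sketch of Satisfaction = a*J(j) + b*K(k) + c*L(l) + d*M(m).
def satisfaction(j, k, l, m, a=1.0, b=-1.0, c=-1.0, d=1.0):
    """j: degree of eye opening     (wider  -> more satisfied, a > 0)
    k: number of blinks             (fewer  -> more satisfied, b < 0)
    l: facial movement              (less   -> more satisfied, c < 0)
    m: biometric value, e.g. pulse  (higher -> more satisfied, d > 0)
    """
    def identity(x):
        return x
    J = K = L = M = identity      # functions used as-is in this sketch
    return a * J(j) + b * K(k) + c * L(l) + d * M(m)

# Example with values expressed relative to a 100 baseline
print(satisfaction(j=110, k=85, l=90, m=105))  # 40.0
```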
  • FIG. 6 is a diagram showing a functional configuration of the audio output device 30 constituting the satisfaction calculation system 1 of the first embodiment.
  • The audio output device 30 exhibits the functions of a communication unit 301, a detection unit 302, a storage unit 303, and a control unit 304.
  • The communication unit 301 performs processing for the audio output device 30 to communicate with external devices.
  • The detection unit 302 performs processing for the audio output device 30 to acquire the viewer's facial movement and biometric information, and is composed of the above-mentioned gyro sensor and electrocardiographic sensor.
  • The storage unit 303 stores the data and programs used by the audio output device 30.
  • The storage unit 303 stores the gyro sensor detection signal database 381, the biosensor detection signal database 382, and the like.
  • The gyro sensor detection signal database 381 holds the detection signals of the gyro sensor constituting the detection unit 302 in chronological order.
  • The biosensor detection signal database 382 holds the detection signals of the electrocardiographic sensor constituting the detection unit 302 in chronological order.
  • The control unit 304 performs processing according to programs executed by the processor of the audio output device 30, and exhibits the functions of various modules: a reception control module 3031, a transmission control module 3032, an audio output module 3033, a gyro sensor control module 3034, and a biosensor control module 3035.
  • The reception control module 3031 controls the process by which the audio output device 30 receives signals from external devices according to the communication protocol.
  • The transmission control module 3032 controls the process by which the audio output device 30 transmits signals to external devices according to the communication protocol.
  • The audio output module 3033 controls the process of outputting the audio data transmitted from the server 10 as the audio of the video content from the speaker 34.
  • The gyro sensor control module 3034 controls the operation of the gyro sensor constituting the detection unit 302 and transmits its detection signal to the server 10.
  • The biosensor control module 3035 controls the operation of the electrocardiographic sensor constituting the detection unit 302 and transmits its detection signal to the server 10.
  • FIG. 7 is a diagram showing the data structures of the video content database 181 and the viewer database 184 stored in the server 10.
  • Each record in the video content database 181 includes an item "facility ID", an item "facility name", an item "content screening information", and the like.
  • The item "facility ID" identifies each facility in which the satisfaction calculation system 1 is installed. When a plurality of video output devices 20 are installed in one facility and different video contents are screened, items "facility ID" with different values are assigned and managed separately.
  • The item "facility name" indicates the name of the facility in which the satisfaction calculation system 1 is installed.
  • The item "content screening information" is information on the screening schedule of video content, such as movies, screened at a facility using the satisfaction calculation system 1. Specifically, it includes an item "start date and time", an item "end time", an item "content name", and the like. As shown in FIG. 7, one or more items "content screening information" are stored in chronological order for each item "facility ID".
  • The item "start date and time" indicates the start date and start time of the video content screened at the facility.
  • The item "end time" indicates the end time of the video content screened at the facility.
  • The item "content name" indicates the name of the video content screened at the facility.
  • The server 10 updates the video content database 181 upon receiving input of video content screening information from the facility manager.
  • Each record in the viewer database 184 includes an item "viewer ID", an item "viewer name", an item "viewing content", an item "satisfaction information", and the like.
  • The item "viewer ID" identifies each viewer who has viewed video content screened at a facility using the satisfaction calculation system 1.
  • The item "viewer name" indicates the name of the viewer who has viewed video content screened at a facility using the satisfaction calculation system 1. If the viewer's name information cannot be obtained at the facility, this item need not be stored, as in the record "#0202" shown in FIG. 7.
  • The item "viewing content" indicates the name of the video content viewed by the viewer, and corresponds to the item "content name" of the video content database 181.
  • The item "satisfaction information" is information on the satisfaction of viewers who viewed video content screened at a facility using the satisfaction calculation system 1; specifically, it includes an item "elapsed time", an item "satisfaction", and the like. As shown in FIG. 7, one or more items "satisfaction information" are stored in time series at predetermined intervals (for example, one minute) for each item "viewer ID".
  • The item "elapsed time" indicates the elapsed time, in predetermined increments, from the start of the video content.
  • The item "satisfaction" indicates the satisfaction of the viewer who viewed the video content at the time of the item "elapsed time", and is the satisfaction value calculated by the satisfaction calculation module 1038.
  • The satisfaction calculation module 1038 of the server 10 updates the viewer database 184 each time it calculates the satisfaction of a viewer who has viewed video content. A minimal sketch of these record structures follows.
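  • Below is a minimal sketch of the FIG. 7 record structures, with Python dataclasses standing in for the database tables. The field names mirror the item names; the sample values are hypothetical.

```python
# Hypothetical record structures mirroring FIG. 7; not the patent's schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContentScreeningInfo:
    start_datetime: str               # item "start date and time"
    end_time: str                     # item "end time"
    content_name: str                 # item "content name"

@dataclass
class VideoContentRecord:             # video content database 181
    facility_id: str                  # item "facility ID"
    facility_name: str                # item "facility name"
    screenings: List[ContentScreeningInfo] = field(default_factory=list)

@dataclass
class SatisfactionInfo:
    elapsed_time: str                 # item "elapsed time"
    satisfaction: float               # item "satisfaction"

@dataclass
class ViewerRecord:                   # viewer database 184
    viewer_id: str                    # item "viewer ID"
    viewer_name: Optional[str]        # may be absent, as in record "#0202"
    viewing_content: str              # item "viewing content"
    satisfaction_info: List[SatisfactionInfo] = field(default_factory=list)

# Example record, updated each time a satisfaction value is calculated
record = ViewerRecord("#0201", "Viewer A", "Movie X",
                      [SatisfactionInfo("0:01", 100.0)])
```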
  • FIG. 8 is a flowchart showing an example of a flow of performing video content output processing by the satisfaction calculation system 1 of the first embodiment.
  • In step S111, the video content output control module 1033 of the server 10 refers to the video content database 181 and manages the dates and times at which video content is screened.
  • When the time comes to transmit video content for screening, the video content output control module 1033 proceeds to step S112.
  • In step S112, the video content output control module 1033 of the server 10 transmits the video data of the video content to the video output device 20 and the audio data to the audio output device 30 so that the viewer can view the video content.
  • In step S122, the video output device 20 receives the video data of the video content transmitted by the server 10.
  • The audio output device 30 receives the audio data of the video content transmitted by the server 10.
  • In step S123, the video output device 20 screens the received video data as video.
  • The audio output device 30 outputs the received audio data as audio.
  • In this way, the satisfaction calculation system 1 transmits the video data of the video content to the video output device 20 and the audio data to the audio output device 30 at the predetermined time.
  • The video output device 20 screens the video data as video, and the audio output device 30 outputs the audio data as audio.
  • The video content is thus screened, and the viewer wears the audio output device 30 to view it.
  • FIG. 9 is a flowchart showing an example of a flow of performing satisfaction calculation processing by the satisfaction calculation system 1 of the first embodiment.
  • In step S211, the gyro sensor control module 3034 of the audio output device 30 controls the operation of the gyro sensor constituting the detection unit 302 and transmits the gyro sensor's detection signal to the server 10 via the communication unit 301. The biosensor control module 3035 likewise controls the operation of the biosensor constituting the detection unit 302 and transmits the biosensor's detection signal to the server 10 via the communication unit 301.
  • In step S231, the camera 40 transmits the captured image data to the server 10.
  • In step S221, the server 10 receives the detection signals transmitted from the audio output device 30 via the communication unit 101, and receives the image data transmitted from the camera 40 via the communication unit 101. The server 10 stores the received detection signals in the sensor detection signal database 182 and the received image data in the camera-captured image database 183.
  • In step S222, the opening degree acquisition module 1034 of the server 10 acquires the image data stored in the camera-captured image database 183.
  • The opening degree acquisition module 1034 analyzes the image data, recognizes and detects the viewer's eyes, and acquires the degree of opening of the viewer's eyes.
  • In step S223, the blink count acquisition module 1035 of the server 10 acquires the image data stored in the camera-captured image database 183.
  • The blink count acquisition module 1035 analyzes the image data, recognizes and detects the viewer's eyes, and acquires the viewer's number of blinks.
  • In step S224, the face movement acquisition module 1036 of the server 10 acquires the detection signal stored in the sensor detection signal database 182.
  • The face movement acquisition module 1036 analyzes the detection signal of the gyro sensor and acquires the movement of the viewer's face from the angular velocity in the signal and the timing of the movement.
  • In step S225, the biometric information acquisition module 1037 of the server 10 acquires the detection signal stored in the sensor detection signal database 182.
  • The biometric information acquisition module 1037 analyzes the detection signal of the electrocardiographic sensor and acquires the viewer's biometric information (the viewer's pulse and body temperature).
  • In step S226, the satisfaction calculation module 1038 of the server 10 calculates the viewer's satisfaction with the video content based on the degree of eye opening acquired in step S222, the number of blinks acquired in step S223, the facial movement acquired in step S224, and the biometric information acquired in step S225.
  • In step S227, the satisfaction calculation module 1038 of the server 10 stores the calculated satisfaction in the viewer database 184 and updates the database.
  • FIG. 10 is a diagram showing a screen example of the server 10 displaying the transition of each parameter in the satisfaction calculation process.
  • The screen example of FIG. 10 shows the time-series transitions of the degree of eye opening, the number of blinks, the facial movement, and the biometric information (pulse) displayed on the input/output IF 13 (display) of the server 10, and corresponds to the calculation result of step S226 in FIG. 9.
  • As an example, a graph 1031a is displayed showing the time-series transitions of the viewer's degree of eye opening, number of blinks, facial movement, and biometric information (pulse).
  • In the graph, the degree of eye opening, the number of blinks, the facial movement, and the pulse at the start of the video content are each set to 100, and the rate of change of each parameter over time is calculated and plotted.
  • The screening time of the video content is 2 hours.
  • The degree of eye opening and the pulse decrease after the start of the video content, reverse and increase from 30 minutes after the start, peak at 90 minutes after the start and then decrease, ending at a value around 100 at the end of the content. The number of blinks and the facial movement increase after the start, peak at 30 minutes after the start and reverse downward, then reverse upward again from 90 minutes after the start, ending at a value around 100.
  • In this example, the coefficients for calculating satisfaction are set so that satisfaction is high when the degree of eye opening and the pulse are large (positive coefficients), and high when the number of blinks and the facial movement are small (negative coefficients). A minimal sketch of the baseline-100 normalization used for the graph follows.
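  • Below is a minimal sketch of that normalization, assuming a list of raw per-interval measurements for one parameter; the series is rescaled so that its first value (at the start of the content) becomes 100.

```python
# Rescale a time series so that series[0] maps to the baseline value 100,
# turning raw measurements into the rates of change plotted in FIG. 10.
def normalize_to_baseline(series, baseline=100.0):
    first = series[0]
    if first == 0:
        raise ValueError("first sample must be nonzero to form a ratio")
    return [baseline * value / first for value in series]

# Example: pulse samples taken once per minute
pulse = [72, 68, 66, 75, 80, 74]
print(normalize_to_baseline(pulse))
# [100.0, 94.44..., 91.66..., 104.16..., 111.11..., 102.77...]
```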
  • As described above, the image data capturing the viewer is analyzed to acquire the degree of eye opening and the number of blinks, while the facial movement and biometric information are acquired from the sensor detection signals, and satisfaction with the video content is calculated from this information. Information indicating the viewer's reaction to the video content can therefore be acquired, which makes it possible to analyze the evaluation of the video content.
  • The degree of eye opening, the number of blinks, the facial movement, and the biometric information are acquired at predetermined time intervals, and satisfaction is calculated at predetermined time intervals.
  • (Appendix 1) An information processing device including a control unit, wherein the control unit includes: an opening degree acquisition unit that detects the viewer's eyes from image data capturing a viewer viewing video content and acquires the degree of opening of the viewer's eyes; a blink count acquisition unit that detects the viewer's eyes from the image data and acquires the viewer's number of blinks; and a calculation unit that calculates the viewer's satisfaction with the video content using the degree of opening and the number of blinks as parameters.
  • (Appendix 2) The information processing device according to (Appendix 1), wherein the control unit further includes a motion acquisition unit that acquires the movement of the viewer's face from the image data or from a detection signal output from a sensor provided in an audio output device worn by the viewer, and the calculation unit calculates the viewer's satisfaction with the video content using the facial movement as a parameter.
  • (Appendix 3) The information processing device according to (Appendix 2), wherein the motion acquisition unit acquires the movement of the viewer's face from the detection signal output from the sensor provided in the audio output device worn on the viewer's ears.
  • (Appendix 4) The information processing device according to (Appendix 2) or (Appendix 3), wherein the motion acquisition unit acquires, from the detection signal output from a gyro sensor provided in the audio output device, a motion in which the viewer's face tilts or a motion in which the viewer's face turns.
  • (Appendix 5) The information processing device according to any one of (Appendix 1) to (Appendix 4), wherein the control unit further includes a biometric information acquisition unit that acquires the viewer's biometric information from the detection signal output from the sensor provided in the audio output device worn by the viewer, and the calculation unit calculates the viewer's satisfaction with the video content using the biometric information as a parameter.
  • (Appendix 6) The information processing device according to (Appendix 5), wherein the biometric information acquisition unit acquires the viewer's pulse or blood pressure information as biometric information from the detection signal output from a pulse sensor provided in the audio output device worn by the viewer.
  • (Appendix 7) The information processing device according to (Appendix 5) or (Appendix 6), wherein the biometric information acquisition unit acquires the viewer's body temperature information as biometric information from the detection signal output from a temperature sensor provided in the audio output device worn by the viewer.
  • (Appendix 8) The information processing device according to any one of (Appendix 1) to (Appendix 7), wherein the calculation unit calculates satisfaction based on one or more of: the difference between each parameter value and a reference value at a predetermined time, the rate of change from the reference value, the amount of change within a predetermined time, or the rate of change within a predetermined time.
  • (Appendix 9) The information processing device according to any one of (Appendix 1) to (Appendix 8), wherein the calculation unit applies a predetermined function to each parameter value and calculates satisfaction by summing the resulting values.
  • (Appendix 10) The information processing device according to any one of (Appendix 1) to (Appendix 9), wherein the calculation unit calculates satisfaction by multiplying each parameter value, or each value after the calculation, by a predetermined coefficient and summing the multiplied values.
  • (Appendix 11) The information processing device according to (Appendix 10), wherein the calculation unit multiplies certain predetermined parameter values, or values after the calculation, by positive coefficients and others by negative coefficients.
  • (Appendix 12) The information processing device according to any one of (Appendix 1) to (Appendix 11), wherein the calculation unit identifies one or a plurality of viewers, and calculates satisfaction for each viewer.
  • (Appendix 13) The information processing device according to any one of (Appendix 1) to (Appendix 12), wherein the opening degree acquisition unit calculates the area of the opened portion of the viewer's eyes and acquires it as the viewer's degree of eye opening.
  • (Appendix 14) The information processing device according to any one of (Appendix 1) to (Appendix 12), wherein the opening degree acquisition unit calculates, in the opened portion of the viewer's eyes, position information for the contact points between the upper and lower eyelids, the upper end point of the upper eyelid or the contact points between the iris and the upper eyelid, and the lower end point of the lower eyelid or the contact points between the iris and the lower eyelid, and acquires the viewer's degree of eye opening based on the position information.
(Appendix 15) The information processing device according to any one of (Appendix 1) to (Appendix 14), wherein the blink count acquisition unit acquires the number of blinks of the viewer within a predetermined time from the image data obtained by photographing the viewer.
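A sketch of (Appendix 15): given per-frame opening degrees from either of the preceding measures, blinks within the predetermined time can be counted as open-to-closed transitions. The closed-eye threshold is an assumed value.

```python
def count_blinks(opening_series, closed_threshold=0.15):
    """Count blinks in a window of per-frame opening degrees.

    A blink is registered on each transition from 'open' to 'closed',
    where 'closed' means the opening degree falls below the threshold.
    """
    blinks, was_closed = 0, False
    for degree in opening_series:
        is_closed = degree < closed_threshold
        if is_closed and not was_closed:
            blinks += 1
        was_closed = is_closed
    return blinks
```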
A system including a camera and an information processing device, wherein the camera photographs a viewer viewing video content, generates image data, and causes the information processing device to acquire the image data, and wherein the control unit of the information processing device includes an opening degree acquisition unit that detects the viewer's eyes from the image data of the viewer viewing the video content and acquires the degree of opening of the viewer's eyes, a blink count acquisition unit that detects the viewer's eyes from the image data and acquires the number of blinks of the viewer, and a calculation unit that calculates the viewer's degree of satisfaction with the video content using the degree of opening and the number of blinks as parameters.
(Appendix 18) A method executed by a computer including a processor, wherein the processor detects the viewer's eyes from the image data of a viewer viewing video content and acquires the degree of opening of the viewer's eyes, detects the viewer's eyes from the image data and acquires the number of blinks of the viewer, and calculates the viewer's degree of satisfaction with the video content using the degree of opening and the number of blinks as parameters.
10 server, 20 video output device, 30 audio output device, 40 camera, 80 network, 12 communication IF, 13 input/output IF, 15 memory, 16 storage, 19 processor, 33 sensor unit, 101 communication unit, 102 storage unit, 181 video content database, 182 sensor detection signal database, 183 camera shot image database, 184 viewer database, 103 control unit, 1033 video content output control module, 1034 opening degree acquisition module (opening degree acquisition unit), 1035 blink count acquisition module (blink count acquisition unit), 1036 face movement acquisition module (movement acquisition unit), 1037 biometric information acquisition module (biometric information acquisition unit), 1038 satisfaction calculation module (calculation unit)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Accounting & Taxation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Game Theory and Decision Science (AREA)
  • Marketing (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Analysis (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A server (10) of a satisfaction level calculation system (1) includes: an opening degree acquisition module (1034) that acquires and analyzes image data captured by a camera and acquires the degree of opening of a viewer's eyes; a blink count acquisition module (1035) that acquires the number of blinks of the viewer's eyes by analyzing the image data; and a satisfaction level calculation module (1038) that calculates the viewer's level of satisfaction with video content using the degree of eye opening and the number of blinks as parameters.
PCT/JP2021/015615 2020-04-16 2021-04-15 Dispositif, système, procédé de traitement d'informations et programme WO2021210652A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020073382A JP6856959B1 (ja) 2020-04-16 2020-04-16 情報処理装置、システム、方法及びプログラム
JP2020-073382 2020-04-16

Publications (1)

Publication Number Publication Date
WO2021210652A1 true WO2021210652A1 (fr) 2021-10-21

Family

ID=75377961

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/015615 WO2021210652A1 (fr) 2020-04-16 2021-04-15 Dispositif, système, procédé de traitement d'informations et programme

Country Status (2)

Country Link
JP (2) JP6856959B1 (fr)
WO (1) WO2021210652A1 (fr)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006293979A (ja) * 2005-03-18 2006-10-26 Advanced Telecommunication Research Institute International コンテンツ提供システム
FR3083346B1 (fr) * 2018-07-02 2020-07-10 Airbus Operations Procede et dispositif de surveillance de la capacite d'un membre d'equipage d'un aeronef

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002057765A (ja) * 2000-08-07 2002-02-22 Matsushita Electric Ind Co Ltd 無線電話用ヘッドセット
JP2002344904A (ja) * 2001-02-06 2002-11-29 Sony Corp コンテンツ再生装置、コンテンツ受信装置、コンテンツ呈示制御方法、コンテンツ評価収集解析方法、コンテンツ評価収集解析装置、コンテンツ評価集計管理方法およびコンテンツ評価集計管理装置
JP2007003618A (ja) * 2005-06-21 2007-01-11 Sharp Corp 表示装置および携帯端末装置
JP2010520554A (ja) * 2007-03-06 2010-06-10 エムセンス コーポレイション 生理学的データを用いて時間により変化するメディアにおけるユーザ反応の集約されたビューを作成する方法及びシステム
JP2009159073A (ja) * 2007-12-25 2009-07-16 Panasonic Corp 音響再生装置および音響再生方法
JP2011524656A (ja) * 2008-04-30 2011-09-01 ディーピー テクノロジーズ インコーポレイテッド 改良型ヘッドセット
JP2013114595A (ja) * 2011-11-30 2013-06-10 Canon Inc 情報処理装置、情報処理方法及びプログラム
JP2014127896A (ja) * 2012-12-27 2014-07-07 Samsung R&D Institute Japan Co Ltd 信号処理装置及び信号処理方法
JP2017041673A (ja) * 2015-08-17 2017-02-23 パナソニックIpマネジメント株式会社 視聴状態検出装置、視聴状態検出システムおよび視聴状態検出方法

Also Published As

Publication number Publication date
JP2021170248A (ja) 2021-10-28
JP2021170315A (ja) 2021-10-28
JP6856959B1 (ja) 2021-04-14

Similar Documents

Publication Publication Date Title
US20220337693A1 (en) Audio/Video Wearable Computer System with Integrated Projector
US11418893B2 (en) Selective modification of background noises
KR101788485B1 (ko) 활동에 기초한 지능적 디바이스 모드 변화
RU2601287C1 (ru) Устройство установления заинтересованности зрителя при просмотре контента
JP2019522300A (ja) 精神障害の療法のためのモバイルおよびウェアラブルビデオ捕捉およびフィードバックプラットフォーム
US20130245396A1 (en) Mental state analysis using wearable-camera devices
US11626127B2 (en) Systems and methods for processing audio based on changes in active speaker
CN106104650A (zh) 经由凝视检测进行远程设备控制
TW201526667A (zh) 助聽系統與助聽系統之語音擷取方法
US20220312128A1 (en) Hearing aid system with differential gain
JP7009342B2 (ja) 咀嚼や笑みに係る量に基づき食事を評価可能な装置、プログラム及び方法
US20220232321A1 (en) Systems and methods for retroactive processing and transmission of words
WO2017064891A1 (fr) Système de traitement d'informations, procédé de traitement d'informations, et support de stockage
US20150170674A1 (en) Information processing apparatus, information processing method, and program
US20210350823A1 (en) Systems and methods for processing audio and video using a voice print
CN108572729A (zh) 使用机器学习的电子设备和方法
KR20180017821A (ko) 실시간 시청자 반응을 전달하는 방송 서비스 장치
EP2402839A2 (fr) Système et procédé d'indexation de contenu affiché sur un dispositif électronique
US11580727B2 (en) Systems and methods for matching audio and image information
CN109255314B (zh) 信息提示方法、装置、智能眼镜及存储介质
US20220284915A1 (en) Separation of signals based on direction of arrival
WO2018075523A1 (fr) Système informatique vestimentaire audio/vidéo à projecteur intégré
US20200301398A1 (en) Information processing device, information processing method, and program
KR20140115153A (ko) 관객 반응 분석 장치 및 방법과 이를 이용한 관객 반응 분석 시스템
WO2021210652A1 (fr) Dispositif, système, procédé de traitement d'informations et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21789166

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21789166

Country of ref document: EP

Kind code of ref document: A1