WO2013099629A1 - Information processing apparatus and method - Google Patents
Information processing apparatus and method
- Publication number
- WO2013099629A1 (PCT/JP2012/082375; JP2012082375W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- ring
- group
- information processing
- information
- unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/105—Controlling the light source in response to determined parameters
- H05B47/115—Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F21—LIGHTING
- F21V—FUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
- F21V33/00—Structural combinations of lighting devices with other articles, not otherwise provided for
- F21V33/0004—Personal or domestic articles
- F21V33/0008—Clothing or clothing accessories, e.g. scarfs, gloves or belts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B7/00—Radio transmission systems, i.e. using radiation field
- H04B7/24—Radio transmission systems, i.e. using radiation field for communication between two or more posts
- H04B7/26—Radio transmission systems, i.e. using radiation field for communication between two or more posts at least one of which is mobile
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/724094—Interfacing with a device worn on the user's body to provide access to telephonic functionalities, e.g. accepting a call, reading or composing a message
- H04M1/724095—Worn on the wrist, hand or arm
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72412—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/175—Controlling the light source by remote control
- H05B47/19—Controlling the light source by remote control via wireless transmission
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02B—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
- Y02B20/00—Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
- Y02B20/40—Control techniques providing energy savings, e.g. smart controller or presence detection
Definitions
- the present technology relates to an information processing apparatus and method, and more particularly, to an information processing apparatus and method that can realize a new form of communication tool.
- This technology has been made in view of such a situation, and makes it possible to realize a new form of communication tool.
- An information processing apparatus includes a light emitting unit that emits light according to a light emission pattern specified by a combination of light emission parameters indicating the characteristics of the light emission, and a control unit that controls the light emission pattern of the light emitting unit based on a relationship between the apparatus itself and a group to which the apparatus itself and one or more other information processing apparatuses each including such a light emitting unit belong, or a relationship between the apparatus itself and another information processing apparatus belonging to the group.
- the external object is another information processing apparatus that does not belong to the group and includes at least a light emitting unit and a sensor unit. The control unit determines, as a condition on the influence of the external object, whether the group to which the apparatus belongs is targeted by the sensor unit of the external object, determines whether both this condition and the condition on the relationship are satisfied, and controls the light emission pattern of the light emitting unit accordingly.
- the external object is an imaging device that photographs the plurality of users of the apparatus itself and of the one or more other information processing apparatuses belonging to the group. When the degree of excitement determined by the excitement determination unit is equal to or greater than a threshold value, the communication control unit can transmit a shooting start request to the imaging device so that the imaging device starts shooting with the user of the apparatus as the shooting target.
- the excitement determination unit can determine the degree of excitement based on information, generated from the detection results of the sensor units, that indicates the degree of correlation or the difference in synchrony of the movements of the bodies or body parts of the plurality of users.
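The determination above — scoring the group's degree of excitement (the "swell" determination) from how synchronously the users' bodies move — can be sketched concretely. The publication does not specify an algorithm; the sketch below assumes mean pairwise Pearson correlation of per-user acceleration magnitudes as the synchrony measure, and all function names are illustrative:

```python
import math

def magnitude_series(samples):
    """Collapse (x, y, z) accelerometer samples to per-sample magnitudes."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]

def pearson(a, b):
    """Pearson correlation of two equal-length series (0.0 if degenerate)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va and vb else 0.0

def excitement_degree(all_users_samples):
    """Mean pairwise synchrony of all users' movement-magnitude traces."""
    series = [magnitude_series(s) for s in all_users_samples]
    pairs = [(i, j) for i in range(len(series))
             for j in range(i + 1, len(series))]
    if not pairs:
        return 0.0
    return sum(pearson(series[i], series[j]) for i, j in pairs) / len(pairs)
```

Comparing this score against a threshold would then gate actions such as the shooting start request described above.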
- the external object is an external device that obtains, from an imaging apparatus that photographs the place where the group is present, a photographing result showing the state of a given time period for a user of the apparatus itself or of the one or more other information processing apparatuses belonging to the group. The communication control unit can transmit the time at which the degree of excitement determined by the excitement determination unit becomes equal to or greater than the threshold value to the external device, as acquisition information used when the user of the apparatus acquires a photographing result showing the state of a time period including that time.
- the communication control unit can also transmit the position of the device itself to the external device as the acquisition information.
- the communication control unit receives information capable of specifying the light emission pattern of the light emitting unit of another information processing apparatus belonging to the group, and the control unit can control the light emission pattern of its own light emitting unit based on the light emission pattern specified by the information received by the communication control unit.
- the connecting portion can connect both ends of the circumferential portion by elastic deformation of the material.
- the information processing method according to the first aspect of the present technology is a method corresponding to the information processing apparatus according to the first aspect of the present technology described above.
- FIG. 1 shows the schematic structure of a ring; FIG. 2 is a perspective view of the ring; FIG. 3 shows the schematic structure of a substrate.
- An example of the configuration of a list of management information is shown.
- A flowchart explaining the flow of processing that an SNS and a video management apparatus perform in cooperation. A diagram explaining a fifth example of group control. A flowchart explaining the flow of behavior control processing. A diagram showing a method of selecting a leader. A diagram showing how a group expands. A state transition diagram showing an example of the states that a ring can take. A diagram showing state transitions until a group disappears.
- An information processing apparatus to which the present technology is applied is a wristband- or bracelet-like device with a diameter suitable for being worn on a user's arm, in which various sensors, a wireless module, light emitting elements, a vibrator, and the like are incorporated.
- It is a ring-type device (hereinafter simply referred to as a ring) worn on the user's arm.
- the basic functions of the ring are light emission and vibration, and there are many (substantially infinite) patterns of light emission and vibration. The pattern can be switched selectively according to various kinds of information, such as information obtained by the built-in sensors and information obtained from outside via communication.
- when a group consisting of a plurality of rings worn on the arms of a plurality of users is formed, control using at least some of the plurality of rings is executed. Such control is hereinafter referred to as group control.
- For example, control of the light emission and vibration of each of the plurality of rings in the group, and control, based on the light emission and vibration of each of the plurality of rings, with respect to an object other than a ring (an information processing device such as a camera), are examples of group control.
- the light emission and vibration of each of the plurality of rings are controlled based on the relationship between the ring itself and the group to which the ring itself and one or more other rings belong, or the relationship between the ring itself and a predetermined other ring belonging to the group.
- the light emission and vibration of each of the plurality of rings are also controlled based on the result of communication with an object that does not belong to the group and is not a ring.
- FIG. 1 is a diagram illustrating a schematic configuration of a ring 1 as an embodiment of an information processing apparatus to which the present technology is applied.
- The central diagram in FIG. 1A is a front view of the ring 1.
- the front surface is a surface when the arm to which the ring 1 is attached is viewed from the fingertip side of the hand.
- the upper diagram in FIG. 1A is a top view of the ring 1 when the ring 1 is viewed in the direction of the arrow a.
- The diagram on the right side of FIG. 1A is a right side view of the ring 1 when the ring 1 is viewed in the direction of the arrow b.
- the lower diagram in FIG. 1A is a bottom view of the ring 1 when the ring 1 is viewed in the direction of the arrow c.
- The diagram on the left side of FIG. 1A is a left side view of the ring 1 when the ring 1 is viewed in the direction of the arrow d.
- FIG. 1B is a rear view of the ring 1.
- a substrate 21 is embedded in the inner peripheral portion 1a of the annular ring 1.
- various sensors, a wireless module, a light emitting element, a vibration mechanism, and the like are mounted on the substrate 21.
- a connecting portion 22 is provided at a position facing the substrate 21.
- the connecting portion 22 opens and closes both ends of the circumferential portion constituting the body of the ring 1. That is, by separating both ends at the connecting portion 22, the user can pass the arm through the ring 1 or remove the ring 1, and by joining both ends at the connecting portion 22, the user can wear the ring 1.
- When viewed from the front or the back, the outer peripheral portion 1b of the ring 1 is circular, whereas the inner peripheral portion 1a is elliptical.
- the reason why the outer peripheral portion 1b of the ring 1 is circular is to uniformly guide light from the light emitting element mounted on the substrate 21.
- the reason why the inner peripheral portion 1a of the ring 1 is elliptical is to improve the wearing feeling of the ring 1 by adopting a shape that matches the shape of the user's arm.
- FIG. 2 is a perspective view of the ring 1.
- FIG. 2A is a perspective view of the ring 1 from a direction in which the substrate 21 can be visually recognized.
- FIG. 2B is a perspective view of the ring 1 from a direction in which the connecting portion 22 can be visually recognized. As shown in FIG. 2, both ends of the circumferential portion of the ring 1 are connected at the connecting portion 22.
- FIG. 3 is a diagram showing a schematic configuration of the substrate 21 mounted on the ring 1.
- FIG. 3A is a front view of the substrate 21.
- the front surface is a surface on which various sensors, wireless modules, light emitting elements, vibration mechanisms, and the like are mounted.
- The upper diagram in FIG. 3A is a top view of the substrate 21 when the substrate 21 is viewed in the direction of the arrow a.
- The diagram on the right side of FIG. 3A is a right side view of the substrate 21 when the substrate 21 is viewed in the direction of the arrow b.
- the lower diagram in FIG. 3A is a bottom view of the substrate 21 when the substrate 21 is viewed in the direction of the arrow c.
- The diagram on the left side of FIG. 3A is a left side view of the substrate 21 when the substrate 21 is viewed in the direction of the arrow d.
- FIG. 3B is a rear view of the substrate 21.
- FIG. 3C is a perspective view of the substrate 21.
- a CPU (Central Processing Unit) 31, a wireless module 32, a triaxial acceleration sensor 33, and an LED (Light Emitting Diode) 34 are mounted on the front surface of the substrate 21.
- the wireless module 32 is disposed in a substantially central portion, and each of the four LEDs 34 is disposed near the four corners of the wireless module 32.
- the CPU 31 is disposed at one end of the both ends of the substrate 21 in the longitudinal direction, and the triaxial acceleration sensor 33 is disposed at the other end.
- the CPU 31 executes various processes according to a program or the like recorded in a built-in memory (the storage unit 88 in FIG. 6, described later).
- the internal memory appropriately stores data necessary for the CPU 31 to execute various processes.
- the wireless module 32 exchanges various types of information with other information processing apparatuses or other rings 1 through wireless communication.
- the triaxial acceleration sensor 33 detects acceleration in three substantially orthogonal axial directions and supplies sensor information indicating the detection result to the CPU 31. In other words, it detects a change in a physical quantity, namely the acceleration caused by the user's motion.
- the LED 34 emits light in a plurality of patterns according to the control of the CPU 31.
- the light emission pattern is specified by a combination of light emission parameters indicating the characteristics of the light emission, that is, by a combination of one or more of a plurality of elements including, for example, the light quantity, the emission color, the length of the emission interval, and the duration of emission.
- the substrate 21 may be provided with a sensor for acquiring biological information such as a heart rate sensor, a blood pressure sensor, and a body temperature sensor. Further, the substrate 21 may be provided with a sensor for acquiring environmental information, such as a geomagnetic sensor, a pressure sensor, an air temperature sensor, a humidity sensor, a sound sensor, a video sensor, an ultraviolet sensor, and a radioactivity sensor.
- a vibration mechanism 35 is mounted on the back surface of the substrate 21.
- the vibration mechanism 35 vibrates in a plurality of patterns according to the control of the CPU 31.
- the vibration pattern is specified by a combination of vibration parameters indicating the characteristics of the vibration, that is, by a combination of one or more of a plurality of elements including, for example, the vibration frequency, the interval between vibrations, and the duration of vibration.
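Both the light emission pattern described earlier and the vibration pattern just described are defined as combinations of parameter elements. A minimal way to model such parameter combinations — the field names and example values below are hypothetical, as the publication defines no data format — might look like:

```python
from dataclasses import dataclass

@dataclass
class EmissionPattern:
    # One combination of the light emission parameters named above.
    light_quantity: float  # relative output, 0.0-1.0
    color: tuple           # (R, G, B)
    interval_ms: int       # time between flashes
    duration_ms: int       # how long each flash lasts

@dataclass
class VibrationPattern:
    # One combination of the vibration parameters named above.
    frequency_hz: float    # vibration frequency
    interval_ms: int       # time between vibration bursts
    duration_ms: int       # how long each burst lasts

# Example: a gentle pattern and a stronger one, as in the venue scenario
# described later; the concrete numbers are illustrative only.
FIRST = (EmissionPattern(0.2, (255, 255, 255), 1000, 100),
         VibrationPattern(100.0, 1000, 100))
SECOND = (EmissionPattern(0.6, (255, 255, 255), 500, 200),
          VibrationPattern(150.0, 500, 200))
```

Because each pattern is just a tuple of element values, a ring can switch behavior by swapping which combination it feeds to the LED 34 and the vibration mechanism 35.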
- the substrate 21 has a size that can be hidden by a logo or mark drawn on the surface of the ring 1, so that even if the main body of the ring 1 is transparent or translucent, the appearance is not impaired.
- FIG. 4 is a cross-sectional view of the ring 1.
- FIG. 4A is a cross-sectional view of the ring 1 taken along the line m-m′ in the diagram on the left side of FIG. 1A.
- FIG. 4B is a cross-sectional view of the ring 1 taken along the line n-n′ in the diagram on the left side of FIG. 1A.
- the main body 20 of the ring 1 has a three-layer structure consisting of an inner light guide layer 41 and reflection/diffusion layers 42 sandwiching the light guide layer 41 from the inner peripheral side and the outer peripheral side.
- the light guide layer 41 guides the light emitted from the LEDs 34 of the substrate 21 to the entire ring 1, like an optical fiber cable.
- the reflection/diffusion layer 42 reflects and diffuses the light guided by the light guide layer 41 to the outside of the ring 1.
- the plurality of LEDs 34 are arranged at axially symmetric positions, with nothing shielding them.
- the light emitted from the LEDs 34 arranged in this way is guided to the entire body 20 of the ring 1 by the light guide layer 41 and diffused to the outside of the ring 1 by the reflection / diffusion layer 42.
- a groove 20a is formed on the inner side of the connecting portion 22 of the ring 1, and a plurality of attraction plates 43, made of iron or the like so as to be attracted to a magnet, are formed at the bottom of the groove 20a (see FIG. 4).
- on the outer side of the connecting portion 22 of the ring 1, magnets 44, fewer in number than the attraction plates 43 (two in the example of FIG. 4), are provided so as to fit into the groove 20a provided on the inner side of the connecting portion 22.
- both ends of the circumferential portion of the ring 1 are joined by the attractive force with which the magnets 44 attract the attraction plates 43, and are separated by applying a force larger than that attractive force.
- The connecting portion 22 allows the ring 1 to be worn on arms of various sizes, from thin arms to thick arms.
- Since the connecting portion 22 is not provided with a fastener or the like, the ring 1 is comfortable to wear and easy to attach and detach, and the light from the LEDs 34 is guided and diffused without being blocked.
- Although the light quantity at each of the two ends of the ring 1 is weak, the two ends overlap at the connecting portion 22, so their light quantities add together and the ring appears to the user's eyes to glow uniformly, which improves its appearance.
- the vibration mechanism 35 provided on the substrate 21 protrudes from the inner peripheral portion of the ring 1. The vibration mechanism 35 thereby contacts the user's arm, so its vibration is easily transmitted to the arm.
- As the material of the main body 20, a synthetic resin such as a polyurethane elastomer (RU-842A-CLR) can be employed.
- Both the inner side and the outer side of the connecting portion 22 are transparent or translucent, but the outer side contains, for example, 5% of a diffusing material that diffuses light.
- As the material of the attraction plates 43, for example, SUS430 can be adopted; as the material of the magnets 44, for example, neodymium can be adopted.
- the substrate 21 can be covered with a cover.
- As the cover material in this case, for example, aluminum can be adopted.
- a spacer can be mounted between the cover and the vibration mechanism 35, and for example, PET can be adopted as the material of the spacer.
- FIG. 5 is a diagram illustrating a schematic configuration of a ring 51 having a configuration different from that of the ring 1.
- the ring 1 has a three-layer structure of the light guide layer 41 and the reflection/diffusion layers 42, whereas the ring 51 has a single-layer structure consisting only of a light guide layer 61.
- the connecting portion 22 of the ring 1 is provided with the magnets 44 and the attraction plates 43, whereas the connecting portion 62 of the ring 51 is provided with neither.
- the connecting portion 62 is shaped so that the inner end portion 60a and the outer end portion 60b contact and overlap each other when no force is applied to the main body 60 of the ring 51.
- When force is applied, the main body 60 is elastically deformed so that the diameter of the circle formed by the main body 60 increases; when the force is released, the main body 60 returns to its original position by elastic force. That is, both ends of the connecting portion 62 are joined by elastic deformation of the material of the ring 51.
- As the material of the main body 60, the same polyurethane elastomer as that of the main body 20 can be employed.
- the ring 51 can be attached to arms of various sizes by such a structure of the connecting portion 62.
- Since the connecting portion 62 is not provided with a fastener or the like, the ring 51 is comfortable to wear and easy to attach and detach, and the light from the LED 34 is guided without being blocked.
- Although the light quantity at each of the two ends of the ring 51 is weak, the two ends overlap at the connecting portion 62, so their light quantities add together and the ring appears to the user's eyes to glow uniformly, which improves its appearance.
- Since the attraction plates 43, the magnets 44, fasteners, and the like are not provided in the connecting portion 62, the manufacturing cost can be kept down.
- the main body 60 of the ring 51 may have a three-layer structure like the ring 1.
- the main body 20 of the ring 1 described above may have a one-layer structure, like the main body 60 of the ring 51.
- In the following, the ring 1 will be described, but the present technology is also applicable to the ring 51.
- FIG. 6 is a functional block diagram showing the functional configuration of the substrate 21 of FIG. 3.
- the substrate 21 includes a CPU 31, a wireless module 32, a triaxial acceleration sensor 33, an LED 34, and a vibration mechanism 35.
- the substrate 21 may be provided with other sensors (not shown).
- The sensor information acquisition unit 81, described later, acquires the information detected by such other sensors (not shown).
- the CPU 31 includes a sensor information acquisition unit 81, an excitement determination unit 82, a behavior control unit 83, a communication control unit 84, a position information acquisition unit 85, a time information acquisition unit 86, a distance calculation unit 87, a storage unit 88, a target identification unit 89, a group formation unit 90, and a group information management unit 91.
- FIG. 7 is a diagram for explaining a first example of group control.
- In the group control of the first example in FIG. 7, control of the light emission and vibration of the rings worn on the arms of a plurality of users is executed until the users, sharing the common purpose of participating in a predetermined event held at a predetermined venue, gather at that venue. In the first example, the plurality of rings 1 heading for the predetermined venue are considered to form a group.
- the user Uk wearing the ring 1-k (k is an arbitrary integer value of 1 or more) is heading to a venue where a predetermined event is held.
- group control is performed to change the light emission and vibration patterns of the ring 1-k according to the number of other rings 1.
- It is assumed that the storage unit 88 of the ring 1-k has acquired, by some arbitrary means, and stored behavior information defining various patterns of light emission and vibration, together with information including the position information of the predetermined venue (hereinafter referred to as event information).
- the position information acquisition unit 85 of the ring 1-1 acquires the current position information of the ring 1-1 by using GPS (Global Positioning System).
- the distance calculation unit 87 calculates the distance to the venue from the current position information of the ring 1-1 acquired by the position information acquisition unit 85 and the position information of the predetermined venue stored in the storage unit 88. In this case, it is assumed that the distance calculation unit 87 calculates that the distance between the ring 1-1 and the venue is within a preset one kilometer.
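The distance calculation unit 87 compares the GPS coordinates of the ring against the stored venue position. A plausible sketch of this check follows; the publication does not specify a distance formula, so great-circle (haversine) distance and the function names are assumptions:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def within_range(ring_pos, venue_pos, threshold_m=1000.0):
    """True when the ring is within the preset range (1 km here) of the venue."""
    return haversine_m(*ring_pos, *venue_pos) <= threshold_m
```

For example, `within_range((35.68, 139.76), venue)` would become true as the user walks to within one kilometre of the stored venue coordinates.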
- the communication control unit 84 of the ring 1-1 starts acquiring the position information of the other ring 1 by the wireless module 32, and detects the number of the other rings 1 existing within a predetermined range.
- from the acquired position information of the other ring 1-2 existing within the predetermined range, the communication control unit 84 of the ring 1-1 detects that the number of other rings 1 existing within the predetermined range is one.
- according to the behavior information stored in the storage unit 88, the behavior control unit 83 of the ring 1-1 causes light emission and vibration in a first pattern, defined, for example, for the case where the number of other rings 1 existing within the predetermined range is 5 or less.
- when the number of other rings 1 existing within the predetermined range increases, the behavior control unit 83 of the ring 1-1 changes the light emission and vibration pattern from the first pattern, shown at the top of FIG. 7, to the second pattern defined for that case. As a result, the ring 1-1 emits light and vibrates in the second pattern. In this case, the second pattern of light emission and vibration is set so that the light emission amount and the vibration strength are greater than in the first pattern.
- when the number of other rings 1 existing within the predetermined range becomes 100 or more, the behavior control unit 83 of the ring 1-1 changes the pattern from the second pattern, shown in the second diagram from the top in FIG. 7, to the third pattern defined for that case.
- the ring 1-1 emits light and vibrates in the third pattern.
- the third pattern of light emission and vibration is set so that the amount of light emission and the strength of vibration increase compared to the second pattern.
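The escalation just described — first, second, then third pattern as more rings come within range — amounts to a threshold lookup. A sketch follows, using the two bounds the text states (5 or fewer for the first pattern, 100 or more for the third); the lower bound of the second pattern is an assumption for illustration:

```python
# (minimum neighbour count, pattern name), checked in descending order.
# Only the 5-or-fewer and 100-or-more bounds appear in the text;
# the middle bound of 6 is an assumption.
THRESHOLDS = [
    (100, "third pattern"),   # 100 or more nearby rings
    (6, "second pattern"),    # assumed lower bound for the second pattern
    (0, "first pattern"),     # 5 or fewer nearby rings
]

def select_pattern(nearby_ring_count):
    """Pick the behaviour pattern for the current neighbour count."""
    for minimum, name in THRESHOLDS:
        if nearby_ring_count >= minimum:
            return name
    return "first pattern"
```

Each named pattern would map to a stronger light emission amount and vibration strength, as the behavior information defines.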
- FIG. 8 is a flowchart for explaining the flow of behavior control processing.
- In step S11, the storage unit 88 stores the event information.
- the event information includes behavior information and position information of a predetermined venue.
- the event information may be stored in advance in the storage unit 88 of the ring 1 or may be provided from the event organizer by any providing means.
- In step S12, the distance calculation unit 87 calculates the distance to the predetermined venue. That is, the distance calculation unit 87 calculates the distance to the venue from the current position information of the ring 1 acquired by the position information acquisition unit 85 and the position information of the predetermined venue stored in the storage unit 88.
- step S13 the distance calculation unit 87 determines whether the distance from the current position of the ring 1 to the predetermined venue is within a predetermined range. For example, it determines whether the distance from the current position of the ring 1 to the predetermined venue is within a preset distance of one kilometer.
- step S13 If the distance is not within the predetermined range, it is determined as NO in step S13, the process returns to step S12, and the subsequent processes are repeated. That is, until the distance is within the predetermined range, the loop processing of step S12 and step S13 is repeated.
- Step S13 when the user wearing the ring 1 approaches the predetermined venue and the distance to the venue is within the predetermined range, it is determined as YES in Step S13, and the process proceeds to Step S14.
- step S14 the communication control unit 84 acquires position information of another ring 1 existing within a predetermined range. That is, the communication control unit 84 starts to acquire the position information of the other ring 1 by the wireless module 32, and detects the number of other rings 1 existing within a predetermined range.
- step S15 the behavior control unit 83 executes the behavior according to the behavior information stored in the storage unit 88.
- the behavior information defines a light emission and vibration pattern that changes in accordance with the number of other rings 1 existing within a predetermined range.
- the light emission amount and vibration intensity patterns defined in the behavior information are set so as to increase as the number of other rings 1 existing within a predetermined range increases.
- step S16 the behavior control unit 83 determines whether the end of the process is instructed.
- the method of instructing the end of the process is not particularly limited.
- a user who wears the ring 1 may perform a predetermined operation instructing the end of the process.
- predetermined information instructing the end of processing may be transmitted to the ring 1 by the event organizer.
- step S16 If the end of the process is not instructed, it is determined as NO in step S16, the process returns to step S14, and the subsequent processes are repeated. That is, the loop process of steps S14 to S16 is repeated until the end of the process is instructed.
- Step S16 Thereafter, when the end of the process is instructed, it is determined as YES in Step S16, and the behavior control process ends.
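The flow of steps S12 to S16 can be sketched as a small simulation (an illustrative sketch only: the planar distance function, the level mapping, and all parameter names are assumptions, not the patented implementation):

```python
import math

def distance_km(pos_a, pos_b):
    """Planar distance between two (x, y) positions in kilometres -- a
    stand-in for the geodesic computation the distance calculation unit 87
    would perform from latitude/longitude information."""
    return math.hypot(pos_a[0] - pos_b[0], pos_a[1] - pos_b[1])

def behavior_control(current_pos, venue_pos, nearby_counts, range_km=1.0):
    """Sketch of steps S12-S16: do nothing until the venue is within range
    (S12/S13), then execute one behavior per observed nearby-ring count
    (S14/S15), looping until the end is instructed (S16) -- modelled here
    by the list of observations running out.  Returns the executed
    behavior levels; the mapping from count to level is hypothetical."""
    if distance_km(current_pos, venue_pos) > range_km:  # S13: NO
        return []
    levels = []
    for count in nearby_counts:                          # S14-S16 loop
        levels.append(min(3, 1 + count // 50))           # S15 (assumed mapping)
    return levels
```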
- in this way, the light emission amount and vibration intensity pattern change using the distance to the venue and the number of rings 1 existing within the predetermined range as parameters. That is, the light emission and vibration of the ring 1 intensify as the user approaches the venue, so the user's sense of anticipation for the event and sense of excitement about participating in it gradually increase.
- the user wearing the ring 1 can obtain sympathy and a sense of unity with other users in the surrounding area by observing that the ring 1 of the other users in the surrounding area also emits light and vibrates.
- the user of the ring 1 can also obtain a sense of superiority with respect to other users who are not wearing the ring 1.
- FIG. 9 is a diagram illustrating a second example of group control.
- in the second example of group control in FIG. 9, for example in a venue where an event by a predetermined artist is held, a plurality of rings 1 worn on the arms of a plurality of users form a group, and a leader is selected from among the rings 1 in the group.
- in the second example of group control, when an artist wearing a ring 1 that does not belong to any group designates a group as a target, only the leader of that group reacts by emitting light and vibrating.
- the method for selecting the leader is not particularly limited, but in the present embodiment a method is adopted in which each of the plurality of rings 1 in the group detects an action that expresses the user's uplifted mood, that is, excitement about the event, and the leader is elected based on the detection results.
- the group formation unit 90 of the ring 1-k attached to the user Uk compares the degree of excitement of the user Uk with the degrees of excitement of other users within a certain range, and forms a group in which the ring 1 worn by the user with the highest degree of excitement becomes the leader.
- when the group control of the second example of FIG. 9 is executed, among the functional configuration of the CPU 31 of FIG. 6, the sensor information acquisition unit 81, the excitement determination unit 82, the behavior control unit 83, the communication control unit 84, the position information acquisition unit 85, the storage unit 88, the target identification unit 89, the group formation unit 90, and the group information management unit 91 function.
- the sensor information acquisition unit 81 of the ring 1-k detects information (hereinafter referred to as motion information) indicating a change in physical quantity caused by the movement of the arm swing by the triaxial acceleration sensor 33.
- the excitement determination unit 82 of the ring 1-k recognizes, from the motion information, the number of times the arm wearing the ring 1-k has been swung, and determines that the degree of excitement is greater as that number increases.
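Counting arm swings from the triaxial acceleration sensor could be sketched as a simple rising-edge detector on the acceleration magnitude (an illustrative assumption; the actual recognition method and the threshold value are not specified in the text):

```python
def count_swings(accel_magnitudes, threshold=15.0):
    """Count arm swings from a stream of acceleration magnitudes (m/s^2)
    such as those derived from the triaxial acceleration sensor 33: one
    swing is counted each time the magnitude rises above the threshold.
    The threshold of 15.0 m/s^2 is a hypothetical value."""
    swings = 0
    above = False
    for a in accel_magnitudes:
        if a >= threshold and not above:
            swings += 1       # rising edge: a new swing starts
            above = True
        elif a < threshold:
            above = False     # magnitude dropped; arm is between swings
    return swings
```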
- the group forming unit 90 forms a group gr in which a leader exists by repeatedly comparing its own degree of excitement with the degrees of excitement of other rings 1 received by the communication control unit 84. Details of the method for forming the group gr will be described later.
- in the example of FIG. 9, a group gr including the rings 1-1 to 1-N is formed. The user U4, who wears the ring 1-4 on his or her arm, has performed the action expressing the greatest excitement (for example, swinging the arm the most times), so the degree of excitement of the ring 1-4 becomes the largest and the ring 1-4 is elected as the leader.
- the artist UA wearing the ring 1-A on his arm designates a predetermined group gr in the venue as a target by a predetermined movement.
- as the predetermined movement, a movement in which the artist UA swings the arm wearing the ring 1-A and points at the predetermined group gr as a target is employed.
- the target specifying unit 89 of the ring 1-A specifies the coordinate value of the target group gr. Details of the specification of the coordinate value of the target by the target specifying unit 89 will be described later with reference to FIGS.
- the nomination information including the target coordinate value and the behavior information is transmitted to the plurality of rings 1 existing in the venue by the communication control unit 84 of the ring 1-A.
- the behavior control unit 83 of a ring 1 that has received the nomination information reacts only when that ring is the leader of the targeted group gr, and controls light emission and vibration in the pattern defined in the behavior information.
- FIG. 10 is a configuration diagram of an information processing system in the second example of group control.
- the information processing system 100 includes a ring 1-A and a plurality of rings belonging to each of the groups gr1 to grM (M is an integer value of 1 or more).
- the group grk (k is an arbitrary integer value from 1 to M) is composed of a plurality of rings 1-1 to 1-N existing within a certain range.
- Ring 1-A does not belong to any group gr.
- Ring 1-A and the plurality of rings 1 included in the groups gr are connected to each other by, for example, a multi-hop wireless LAN (Local Area Network) mesh network as defined in IEEE 802.11s.
- FIG. 11 is a diagram showing a fixed terminal device for calculating the relative position between the rings 1.
- fixed terminal apparatuses cp1 to cp3 are arranged apart from each other on the stage ST installed in the venue.
- when it is not necessary to distinguish the fixed terminal devices cp1 to cp3 individually, they are collectively referred to as fixed terminal devices cp.
- the fixed terminal device cp is fixed at a predetermined position of the stage ST, and its position information is known.
- the fixed terminal device cp is connected to a multi-hop wireless LAN mesh network in which the ring 1-A and a plurality of rings 1 are connected.
- the position information acquisition unit 85 of the ring 1-A calculates the coordinates of the relative positions of the ring 1-A and the plurality of rings 1, using information obtained through wireless communication by the communication control unit 84 with the plurality of rings 1 and the fixed terminal devices cp on the wireless LAN mesh network that includes the fixed terminal devices cp. That is, based on information such as electric field strength and delay time obtained through wireless communication with the fixed terminal devices cp, the position information acquisition unit 85 performs three-point surveying referenced to the positions of the fixed terminal devices cp, thereby calculating the coordinates of the relative positions of the ring 1-A and the plurality of rings 1. The calculated coordinates are stored in the storage unit 88 of the ring 1-A.
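The three-point surveying step can be sketched as a standard trilateration calculation (an illustrative sketch: in practice the ranges r1 to r3 would be estimated from field strength or delay time, and p1 to p3 are the known positions of the fixed terminal devices cp1 to cp3):

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for the (x, y) position whose distances to the three known
    terminal positions p1..p3 are r1..r3, by the usual linearisation:
    subtracting the first circle equation from the other two yields two
    linear equations in x and y."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1   # zero when the terminals are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

This assumes the three fixed terminals are not collinear; placing them apart on the stage, as in FIG. 11, satisfies that condition.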
- the position information acquisition units 85 of the plurality of rings 1 included in the group gr also calculate the coordinates of the relative positions of the ring 1-A and the plurality of other rings 1, and store them in the storage unit 88.
- the group forming unit 90 of the ring 1 forms a group gr composed of a plurality of rings 1 existing within a certain range.
- the group forming unit 90 of the ring 1 stores the coordinate value of the formed group gr in the group information management unit 91 as its own coordinate value.
- the coordinate values of the group gr consist of the coordinate value of the center position of the group gr (the position of the ring 1 located at the center among the plurality of rings 1 in the group gr) and the range of the group gr (the distance to the ring 1 in the group gr located farthest from the center position).
- the formation method of the group gr is not limited, for example, it is formed as follows.
- the group forming unit 90 of the ring 1 stores the coordinate value acquired by the position information acquiring unit 85 through communication using the fixed terminal devices cp in the group information managing unit 91 as the initial value of the coordinate value of the group gr.
- the group forming unit 90 stores a flag (hereinafter referred to as an affiliation flag) indicating whether or not it belongs to the group gr in the group information management unit 91.
- the initial value of the affiliation flag is “does not belong”.
- the group forming unit 90 also stores in the group information management unit 91 a flag indicating whether or not it is the leader of the group gr (hereinafter referred to as a leader flag).
- the initial value of the leader flag is “not a leader”.
- the communication control unit 84 of the ring 1 searches for another ring 1 that can be paired, that is, another ring 1 that can establish wireless communication. Then, the group forming unit 90 of the ring 1 executes a change process of the following flag and its own coordinate value (that is, the coordinate value of the group gr) with the other ring 1 that can be paired.
- when the affiliation flag of the ring 1 and the affiliation flag of the other ring 1 are both “does not belong”, the group forming unit 90 of the ring 1 compares their degrees of excitement using the determination results of the excitement determination units 82. In this case, the greater the number of times the arm wearing the ring 1 has been swung, the greater the degree of excitement is determined to be.
- the group forming unit 90 of the ring 1 with the higher degree of excitement changes its own leader flag to “is a leader”.
- the group forming unit 90 of the ring 1 with the lower degree of excitement changes its leader flag to “not a leader”.
- the group forming unit 90 of the ring 1 with the lower degree of excitement also changes its own coordinate value to the coordinate value of the group gr, that is, the coordinate value of the ring 1 with the higher degree of excitement. Then, the group forming units 90 of both rings 1 change their affiliation flags to “belongs”.
- Such a process is repeated between a plurality of rings 1 existing within a certain range, thereby forming a group gr where the ring 1 as a leader exists.
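The pairwise exchange described above might be sketched as follows (only the “does not belong”/“does not belong” case stated in the text is modelled; the dictionary field names are hypothetical):

```python
def pair_exchange(ring_a, ring_b):
    """One pairing step of the group formation.  Each ring is a dict with
    hypothetical fields 'excitement', 'belongs', 'is_leader', and 'coord'.
    When neither ring belongs to a group yet, the more excited ring
    becomes the leader, the other adopts the leader's coordinate as the
    group coordinate, and both are marked as belonging."""
    if ring_a["belongs"] or ring_b["belongs"]:
        return  # other flag combinations are not described for this example
    winner, loser = (
        (ring_a, ring_b)
        if ring_a["excitement"] >= ring_b["excitement"]
        else (ring_b, ring_a)
    )
    winner["is_leader"] = True
    loser["is_leader"] = False
    loser["coord"] = winner["coord"]   # group coordinate = leader's coordinate
    winner["belongs"] = loser["belongs"] = True
```

Repeating this exchange among all pairable rings within a certain range yields groups in which exactly one ring holds the leader flag.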
- the target specifying unit 89 specifies the approximate coordinate value of the group grk (hereinafter referred to as the target coordinate value) from the coordinate values of the relative positions of the rings 1 stored in the storage unit 88, the direction of the group grk, and the distance to the group grk.
- Information including target coordinate values and behavior information determined in this way (hereinafter referred to as nomination information) is transmitted to a plurality of rings 1 existing in the venue.
- FIG. 13 is a diagram for explaining the flow of the process of nominating the target (hereinafter referred to as the nomination process) among the processes executed by the ring 1-A attached to the arm of the artist UA.
- FIG. 14 is a diagram for explaining the flow of the process of receiving a nomination from ring 1-A (hereinafter referred to as the nomination receiving process) among the processes executed by ring 1 on the audience side.
- the target specifying unit 89 specifies a predetermined group gr as the target. That is, the target specifying unit 89 specifies the target coordinate value of the predetermined group gr from the direction information d and the motion information v acquired in steps S32 and S33, and from the coordinates of the relative positions of the plurality of rings 1 calculated in advance using the fixed terminal devices cp.
- step S35 the communication control unit 84 transmits the nomination information to the plurality of rings 1 in the venue.
- the nomination information includes the target coordinate value and the behavior information. The nomination process thereby ends.
- step S51 the group forming unit 90 forms a group gr in which one ring 1 serving as a leader exists by the above-described method of forming the group gr.
- step S53 the group information management unit 91 determines whether the target coordinate value included in the nomination information matches its own coordinate value. That is, the group information management unit 91 determines whether or not the target coordinate value and the own coordinate value (that is, the coordinate value of the group gr) stored when the group gr is formed in the process of step S51 match. Specifically, when the difference between the target coordinate value and the coordinate value of the ring 1 (that is, the coordinate value of the group gr) is within a predetermined range, it is determined that they match.
- Step S53 If the target coordinate value coincides with its own coordinate value, it is determined as YES in Step S53, and the process proceeds to Step S54. On the other hand, when the target coordinate value does not coincide with its own coordinate value, it is determined as NO in Step S53, and the process proceeds to Step S56. The processing after step S56 will be described later.
- step S54 the group information management unit 91 determines whether it is the leader. That is, the group information management unit 91 determines whether its leader flag is “is a leader”.
- step S54 If it is the leader, it is determined as YES in step S54, and the process proceeds to step S55. On the other hand, if it is not the leader, that is, if the leader flag is “not a leader”, it is determined as NO in step S54, and the process proceeds to step S56. The processing after step S56 will be described later.
- step S55 the behavior control unit 83 executes the behavior according to the behavior information. That is, the behavior control unit 83 controls the behavior of the LED 34 and the vibration mechanism 35 so that the ring 1 emits light and vibrates according to a predetermined pattern defined in the behavior information included in the nomination information received in step S52. To do.
- step S53 when it is determined NO in step S53 or step S54, the process proceeds to step S56.
- the shape of the ring 1-A attached by the artist UA may be the ring shape described above or a stick shape.
- when the ring 1-A has a stick shape, it is held in the hand of the artist UA.
- the operation of the ring 1-A of artist UA is basically the same as described above, but no position information is used.
- the operation of the plurality of rings 1 on the audience side is basically the same as described above, but due to the nature of a wave there is no concept of a group gr or a leader.
- the wave is executed specifically as follows.
- upon receiving this instruction, the ring 1 on the audience side emits light and vibrates in a predetermined pattern, decrements the hop count, and transmits an instruction including a unique ID and the hop count to the other rings 1 on the audience side.
- the other ring 1 on the audience side that has received this instruction emits light and vibrates in the predetermined pattern when the ID is new (for example, larger than the last received ID), and decrements the hop count.
- an instruction including the unique ID and the hop count is then transmitted to still other rings 1 on the audience side.
- the wave is realized by light emission and vibration being transmitted sequentially from ring 1 to ring 1 in this way until the hop count reaches zero.
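The hop-count propagation described above can be sketched as follows (an illustrative sketch; the instruction tuple layout and the state field name are assumptions, and the light emission and vibration themselves are not modelled):

```python
def handle_wave_instruction(ring_state, instruction):
    """Audience-side handling of one wave instruction.  `ring_state` holds
    the last wave ID this ring has reacted to; `instruction` is a
    (wave_id, hop_count) pair.  Returns the instruction to forward to
    neighbouring rings, or None when the ID is stale (already reacted)
    or the decremented hop count has reached zero (the wave ends here)."""
    wave_id, hops = instruction
    if wave_id <= ring_state["last_id"]:
        return None                      # stale ID: do not react twice
    ring_state["last_id"] = wave_id      # react: emit light / vibrate here
    hops -= 1                            # decrement the hop count
    if hops <= 0:
        return None                      # hop count exhausted
    return (wave_id, hops)               # forward to the next rings
```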
- FIG. 15 is a diagram illustrating a third example of group control.
- the degree of excitement is determined based on predetermined actions of a plurality of users wearing rings 1 on their arms.
- the predetermined action from which the degree of excitement is determined is not particularly limited, but in the present embodiment, an action in which a plurality of users wearing the same type of ring 1 on their arms within a certain range perform synchronized movements is adopted.
- a plurality of rings 1 existing within a certain range form a group gr.
- a shooting start request is transmitted from the ring 1 of the user who is determined to be excited in the group gr to the camera in the hall or the control device of the camera.
- the camera starts shooting with the user and its surroundings as shooting targets, and information indicating the start of the shooting (hereinafter referred to as shooting start information) is transmitted from the camera or the control device of the camera.
- the ring 1 that has received the shooting start information emits light and vibrates.
- when the group control of the third example of FIG. 15 is executed, among the functional configuration of the CPU 31 of FIG. 6, the sensor information acquisition unit 81, the excitement determination unit 82, the behavior control unit 83, the communication control unit 84, the position information acquisition unit 85, and the storage unit 88 function.
- the communication control unit 84 of the ring 1 receives, by the wireless module 32, the motion information detected by the sensor information acquisition unit 81 of one or more other rings 1 existing within a certain range.
- the excitement determination unit 82 of the ring 1 generates information indicating the degree of relevance or difference in the synchrony of movement among the plurality of users (hereinafter referred to as related information) based on the received motion information of the one or more other rings 1.
- the degree of relevance or difference in movement synchrony refers to how closely the movements of the bodies, or of body parts, of the plurality of users match. That is, the more the movements of the users' bodies or body parts match, the higher the relevance of motion synchrony (and, equivalently, the smaller the difference in motion synchrony).
- the related information is information indicating whether or not each user's operation is consistent. As the synchronized movement, for example, a dance with the same rhythm, tempo, and pattern by a plurality of users can be employed.
- from the related information generated for the plurality of users existing within a certain range, the excitement determination unit 82 of the ring 1 determines that the degree of excitement is higher as the degree of relevance of the movements among the users is higher.
- the calculation method of the related information is not particularly limited, but the calculation method described in JP 2011-87794 A is adopted in the present embodiment.
- the excitement determination unit 82 of the ring 1-1 worn by the user U1 generates related information based on the motion information from the other rings 1 within a certain range, and determines the degree of excitement of the plurality of users from that related information.
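As a stand-in for the calculation deferred to JP 2011-87794 A, the related information could be sketched as a simple correlation between motion series (an illustrative assumption, not the patented calculation method):

```python
def synchrony(series_a, series_b):
    """Pearson correlation of two equal-length motion series, used here as
    a hypothetical 'related information' measure.  Returns a value in
    [-1, 1]; higher means more synchronized movement."""
    n = len(series_a)
    mean_a = sum(series_a) / n
    mean_b = sum(series_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(series_a, series_b))
    var_a = sum((a - mean_a) ** 2 for a in series_a)
    var_b = sum((b - mean_b) ** 2 for b in series_b)
    if var_a == 0 or var_b == 0:
        return 0.0   # a constant series carries no synchrony information
    return cov / (var_a * var_b) ** 0.5

def excitement_degree(all_series):
    """Average pairwise synchrony across users: the more the movements
    match, the higher the degree of excitement."""
    n = len(all_series)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return sum(synchrony(all_series[i], all_series[j]) for i, j in pairs) / len(pairs)
```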
- the communication control unit 84 of the ring 1-1 transmits a photographing start request to a CCU (Camera Control Unit) that controls the camera 121.
- the shooting start request includes the position information of the ring 1-1 acquired in advance by the position information acquisition unit 85 of the ring 1-1 by an arbitrary method.
- FIG. 16 is a configuration diagram of an information processing system in the third example of group control.
- the plurality of rings 1 and the CCU 122 are connected to each other by, for example, a multi-hop wireless LAN mesh network.
- a communication method between the camera 121 and the CCU 122 is not particularly limited, and may be wired or wireless.
- the imaging unit 141 has a mechanism for changing the direction of the optical axis (imaging direction) in addition to an optical system such as a lens and an imaging element.
- for example, the imaging unit 141 starts shooting after changing the direction of the optical axis so that the shooting target is positioned at the center of the angle of view.
- the imaging unit 141 changes the direction of the optical axis in the direction in which the specified imaging target exists.
- the imaging unit 141 supplies shooting data obtained as a result of shooting to the conversion unit 142.
- when the communication control unit 143 instructs the imaging unit 141 to stop shooting, the imaging unit 141 stops shooting the shooting target.
- FIG. 18 is a diagram illustrating a flow of processing (hereinafter, referred to as a shooting request process) for transmitting a shooting request so as to be a shooting target of the camera 121 among the processes executed by the ring 1.
- FIG. 19 is a diagram for explaining the flow of processing for controlling the camera 121 (hereinafter referred to as camera control processing) among the processing executed by the CCU 122.
- FIG. 20 is a diagram for explaining a flow of processing (hereinafter referred to as shooting processing) for shooting a shooting target among the processing executed by the camera 121.
- FIG. 21 is a diagram for explaining a processing relationship among the ring 1, the camera 121, and the CCU 122 of the information processing system 110.
- the mutual processing relationship among the ring 1, the camera 121, and the CCU 122 in FIG. 21 can be easily understood by referring to the corresponding steps in FIGS.
- a flowchart (corresponding to FIG. 18) illustrating an example of the shooting request process of the ring 1 is shown on the left side of FIG. 21, and a flowchart (corresponding to FIG. 19) illustrating an example of the camera control process of the CCU 122 is shown at the center.
- On the right side a flowchart (corresponding to FIG. 20) for explaining an example of the photographing process of the camera 121 is shown.
- step S72 the communication control unit 84 of the ring 1 acquires and transmits sensor information. That is, the communication control unit 84 receives, by the wireless module 32, the motion information detected by the sensor information acquisition units 81 of one or more other rings 1 existing within a certain range. The communication control unit 84 then transmits the motion information acquired by its own sensor information acquisition unit 81 to the one or more other rings 1 existing within the certain range.
- step S73 the excitement determination unit 82 of the ring 1 performs the excitement determination. That is, based on its own motion information and that of the one or more other rings 1 acquired in the process of step S72, the excitement determination unit 82 generates related information indicating the degree of relevance of the synchrony of movement among the plurality of users existing within a certain range. The excitement determination unit 82 then determines the degree of excitement from the related information.
- step S74 the excitement determination unit 82 of the ring 1 determines whether the determination result is equal to or greater than a threshold value. That is, the excitement determination unit 82 determines whether the degree of excitement obtained from the excitement determination is equal to or greater than a preset threshold value.
- step S76 the communication control unit 84 transmits an imaging start request to the CCU 122.
- the imaging start request includes the position information of the ring 1 acquired by the position information acquisition unit 85 in the process of step S71.
- step S91 in FIG. 19 the CCU 122 determines whether the imaging start request transmitted from the ring 1 in the process in step S76 in FIG. 18 has been received.
- step S92 the CCU 122 transmits an instruction to change the shooting target to the camera 121. That is, the CCU 122 transmits, to the camera 121, an instruction to change the imaging target to the ring 1 indicated by the position information included in the imaging start request received in step S91.
- step S101 of FIG. 20 the communication control unit 143 of the camera 121 determines whether the instruction to change the imaging target transmitted from the CCU 122 in the process of step S92 has been received.
- Step S101 when an instruction to change the imaging target is received, it is determined as YES in Step S101, and the process proceeds to Step S102.
- step S102 the imaging unit 141 of the camera 121 changes the shooting target to the ring 1 indicated by the position information included in the instruction to change the shooting target.
- step S93 the CCU 122 transmits a shooting start instruction to the camera 121.
- step S103 If the shooting start instruction is not received, it is determined as NO in step S103, the process returns to step S103, and the determination process of step S103 is repeated until the shooting start instruction is received.
- after the process of step S93 in FIG. 19, the CCU 122 transmits shooting start information to the ring 1 in step S94.
- step S77 the communication control unit 84 of the ring 1 determines whether the shooting start information transmitted from the CCU 122 in the process of step S94 in FIG. 19 has been received.
- Step S77 when the imaging start information is received, it is determined as YES in Step S77, and the process proceeds to Step S78.
- step S78 the behavior control unit 83 of the ring 1 executes the behavior according to the behavior information.
- the behavior in this case is set so that the amount of emitted light and the strength of vibration are increased compared to the behavior executed in step S75. The user can thereby recognize the moment of greatest excitement, and can also know that he or she has been captured by the camera 121.
- step S79 the communication control unit 84 of the ring 1 transmits a photographing stop request to the CCU 122 at a predetermined timing after photographing is performed. Thereby, the photographing request processing for the ring 1 is completed.
- step S95 in FIG. 19 the CCU 122 determines whether or not the imaging stop request transmitted from the ring 1 in step S79 has been received.
- step S95 If the photographing stop request is not received, it is determined as NO in step S95, the process returns to step S95, and the determination process in step S95 is repeated until the photographing stop request is received.
- Step S95 After, when a photographing stop request is received, it is determined as YES in Step S95, and the process proceeds to Step S96.
- step S96 the CCU 122 transmits an instruction to stop shooting to the camera 121. The camera control process of the CCU 122 thereby ends.
- step S105 of FIG. 20 the communication control unit 143 of the camera 121 determines whether or not the photographing stop instruction transmitted from the CCU 122 in the process of step S96 of FIG. 19 has been received.
- step S105 If the shooting stop instruction is not received, it is determined as NO in step S105, the process returns to step S105, and the determination process of step S105 is repeated until the shooting stop instruction is received.
- Step S105 when an instruction to stop photographing is received, it is determined as YES in Step S105, and the process proceeds to Step S106.
- step S106 the camera 121 stops shooting with the ring 1 as the shooting target. The shooting process by the camera 121 thereby ends.
- the camera 121 can capture the moment when the user is most excited.
- the camera 121 can target a user with a high degree of excitement and surrounding users, the audience can be informed of the excitement of the venue through the captured video.
- since the ring 1 emitting light and vibrating tells the user wearing it that he or she is about to be photographed by the camera 121, the user can prepare an appropriate facial expression.
- since the user wearing the ring 1 can know that he or she has been photographed, the user can become even more excited, conscious of the camera 121.
- FIG. 22 is a diagram illustrating a fourth example of group control.
- the group control of the fourth example in FIG. 22 is roughly divided into control on the ring 1 side and control on the camera 121 and SNS (Social Networking Service) 171 side.
- on the ring 1 side, time information indicating the time at which the excitement occurred is acquired.
- the communication control unit 84 of the ring 1-k transmits the position information of the ring 1-k, the time information indicating the time of the excitement, and the ring ID of the ring 1-k (hereinafter collectively referred to as ring information) to the SNS 171.
- the SNS 171 stores the ring information as appropriate every time the ring information is transmitted from the ring 1.
- FIG. 23 is a configuration diagram of an information processing system in the fourth example of group control.
- the plurality of rings 1 and the SNS 171 are connected to each other by, for example, a multi-hop wireless LAN mesh network.
- a communication method between the SNS 171 and the video management device 172 is not particularly limited, and is connected through, for example, the Internet. Further, the communication method between the video management device 172 and the camera 121 is not particularly limited, and may be wired or wireless.
- SNS 171 manages user account information registered in advance. Details of the SNS 171 will be described later with reference to FIG.
- the group forming unit 90 of the ring 1 searches for another ring 1 that can be paired, and executes the following flag and group ID change process with the other ring 1.
- the group formation unit 90 of the ring 1 compares the group IDs of the two rings when the affiliation flags of the ring 1 and the other ring 1 are each either “does not belong” or “belongs”. As a result of the comparison, for example, when its own group ID is the larger, the group forming unit 90 of the ring 1 changes its own group ID to the group ID of the other ring 1 (that is, of the other ring 1 with the smaller group ID). Furthermore, when its affiliation flag is “does not belong”, the group forming unit 90 of the ring 1 changes it to “belongs”.
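This ID-merging step amounts to both rings adopting the smaller of their two group IDs, which could be sketched as follows (the dictionary field names are hypothetical):

```python
def merge_group_ids(ring_a, ring_b):
    """One pairing step of the fourth example's group formation: both rings
    adopt the smaller of their two group IDs, and any ring still marked as
    not belonging switches to belonging."""
    smaller = min(ring_a["group_id"], ring_b["group_id"])
    for ring in (ring_a, ring_b):
        ring["group_id"] = smaller
        ring["belongs"] = True
```

Repeating this over all pairable rings converges every ring in a connected neighbourhood to the minimum group ID, which is why it forms a single group.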
- FIG. 24 is a functional block diagram illustrating a functional configuration example of the SNS 171.
- the communication control unit 191 controls wireless communication performed between the communication unit 192 such as a modem, the ring 1, and the video management device 172.
- the shooting condition acquisition unit 146 acquires the shooting conditions detected by the detection unit 145, for example, for each frame or field.
- a UMID (Unique Material Identifier) is assigned in file units to the video data (for example, moving image data including a plurality of frames or fields) captured by the image capturing unit 141 of a single camera 121 during a predetermined unit (for example, from a start instruction to an end instruction of capturing).
- the metadata generation unit 147 generates metadata including the shooting conditions acquired by the shooting condition acquisition unit 146 and the UMID generated by the shooting condition acquisition unit 146 in a format suitable for communication.
- the communication control unit 211 controls communication performed between the communication unit 210 such as a modem, the camera 121, and the SNS 171. For example, when video data on which metadata is superimposed is transmitted from the camera 121, the communication control unit 211 causes the communication unit 210 to receive the video data.
- the metadata extraction unit 212 extracts metadata from the video data received by the communication unit 210.
- the metadata extraction unit 212 supplies the extracted metadata to the shooting target specifying unit 214 and supplies the video data to the read / write control unit 215.
- the shooting location information storage unit 213 stores shooting location information.
- the shooting location information is information indicating locations that the camera can shoot (for example, each location in the event venue). In the present embodiment, since the user in the venue where the predetermined event is held (or the ring 1 attached to the user's arm) can be an object to be photographed, position information (seat number and latitude/longitude information) of each seat on which each user is seated is employed. Details of the shooting location information will be described below with reference to FIG.
- the shooting location information includes the seat number of the venue seat and the latitude and longitude information of the seat.
- the imaging target specifying unit 214 identifies the imaging target for each piece of segment data based on the metadata supplied from the metadata extraction unit 212 and the shooting location information stored in the shooting location information storage unit 213. In addition, since it is difficult to identify the user or the ring 1 itself appearing in the segment data using only the metadata and the shooting location information, the seat number at which the user is predicted to be seated is specified as the shooting target.
- the read / write control unit 215 writes (that is, stores) the video data supplied from the metadata extraction unit 212 into the video data storage unit 216 in file units (units that can be specified by a UMID), in association with the shooting target specified by the shooting target specifying unit 214. Further, in response to a segment data acquisition request from the SNS 171, the read / write control unit 215 retrieves and reads the corresponding segment data from the video data storage unit 216 and transmits it to the SNS 171 via the communication unit 210.
- the video data storage unit 216 stores video data associated with a shooting target in units of files. The management information of the stored video data will be described with reference to FIG.
- FIG. 27 is a diagram showing management information of video data stored in the video data storage unit 216.
- the video data stored in the video data storage unit 216 is managed in file units uniquely specified by the UMID.
- a file with a predetermined UMID includes one or more pieces of segment data. That is, the file specified by a UMID contains only one piece of actual video data, and this actual data is divided into one or more pieces of segment data and managed so that it can be read out in segment data units. The management information for this is stored separately in the video data storage unit 216.
- the management information in FIG. 27 includes a UMID, a time code, and a shooting target.
- the time code is temporal information attached to each of the plurality of frames or fields constituting the actual video data included in the file specified by the UMID, and indicates the relative time with respect to a reference time, where the first frame or field of the file is taken as the reference time.
- for example, it can be seen that the shooting target “around seat A15” is shot at the position where the time code is “00:01:00:05”, the shooting target “around seat B37” is shot at the position where the time code is “00:20:30:20”, and the shooting target “around seat C22” is shot at the position where the time code is “01:10:00:17”.
- the first segment data is video data indicating that “seat A15 and its surroundings” is captured at the time “00:01:00:05”.
- the second segment data is video data indicating that “the vicinity of seat B37” is captured at the time “00:20:30:20”.
- the third segment data is video data indicating that “the vicinity of seat C22” is captured at the time “01:10:00:17”.
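The management information of FIG. 27 can be modeled, purely for illustration, as rows of (UMID, time code, shooting target); the list literal and function below are assumptions of this sketch, not the actual storage format of the video data storage unit 216.

```python
# Each row mirrors one line of the management information:
# (UMID, time code, shooting target).
management_info = [
    ("X", "00:01:00:05", "around seat A15"),
    ("X", "00:20:30:20", "around seat B37"),
    ("X", "01:10:00:17", "around seat C22"),
]

def find_segments(target):
    """Return the (UMID, time code) pairs whose shooting target matches,
    i.e. the segment data a seat-number search key would retrieve."""
    return [(umid, tc) for umid, tc, t in management_info if t == target]

print(find_segments("around seat B37"))  # -> [('X', '00:20:30:20')]
```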
- similarly, it can be seen that the shooting target “around seat Z33” is shot at the position where the time code is “01:01:01:15”, the shooting target “around seat Y21” is shot at the position where the time code is “01:02:20:21”, and the shooting target “around seat X56” is shot at the position where the time code is “01:10:01:12”.
- the following first to third pieces of segment data are included in the file with the UMID “Y”.
- the first segment data is video data indicating that “the vicinity of seat Z33” is captured at the time “01:01:01:15”.
- the second segment data is video data indicating that “the vicinity of seat Y21” is captured at the time “01:02:20:21”.
- the third segment data is video data indicating that “the vicinity of seat X56” is captured at the time “01:10:01:12”.
- the SNS 171 requests the video management device 172, which stores such management information, to acquire the predetermined segment data containing the video of the scene the user was excited about. That is, the SNS 171 requests the video management device 172 to acquire the segment data using the date and time when the shooting target was shot and the shooting location (seat number) as search keys.
- the read / write control unit 215 of the video management device 172 extracts the corresponding segment data from the video data stored in the video data storage unit 216 using the date and time and the shooting location as search keys.
- the date and time when the shooting target was shot can be calculated from, for example, an 8-byte time stamp recorded in the UMID and the time code or the number of frames from the beginning of the file.
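As a simplified, non-limiting sketch of this calculation, the file start time is taken here as a Unix epoch value and the frame rate is assumed to be 30 fps; an actual UMID encodes its 8-byte time stamp differently, so both choices are assumptions of this illustration only.

```python
from datetime import datetime, timedelta, timezone

FPS = 30  # assumed frame rate for the HH:MM:SS:FF time code

def shooting_datetime(file_start_epoch, timecode):
    """Add a time code (relative time from the first frame of the file)
    to the file's start time to obtain the absolute shooting time."""
    hh, mm, ss, ff = (int(x) for x in timecode.split(":"))
    offset = timedelta(hours=hh, minutes=mm, seconds=ss + ff / FPS)
    return datetime.fromtimestamp(file_start_epoch, tz=timezone.utc) + offset

# Hypothetical file start time: 2012-12-31 00:00:00 UTC as a Unix epoch.
start = 1356912000
print(shooting_datetime(start, "00:20:30:20"))
```

Adding the time code offset to the file start time yields the absolute date and time at which the segment was shot, which can then be used as a search key together with the seat number.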
- FIG. 28 is a diagram for explaining the flow of processing (hereinafter referred to as ring information transmission processing) for transmitting ring information to SNS 171 among the processing executed by ring 1.
- FIG. 29 is a diagram for explaining the flow of processing for providing video data (hereinafter referred to as video data provision processing) among the processing executed by the SNS 171.
- FIG. 30 is a diagram for explaining the flow of the process (hereinafter referred to as the shooting process) for shooting the shooting target among the processes executed by the camera 121.
- FIG. 31 is a diagram for explaining the flow of processing (hereinafter referred to as video data management processing) in which video management device 172 manages video data.
- FIG. 32 is a diagram for explaining a processing relationship among the ring 1, the SNS 171, the camera 121, and the video management apparatus 172 of the information processing system 160.
- the mutual processing relationship among the ring 1, the SNS 171, the camera 121, and the video management apparatus 172 in FIG. 32 can be easily understood by referring to the corresponding steps in FIGS.
- a flowchart illustrating an example of the ring information transmission processing of the ring 1 (corresponding to FIG. 28) is shown on the left side of FIG. 32, and a flowchart illustrating an example of the video data providing processing of the SNS 171 (corresponding to FIG. 29) is shown second from the left. A flowchart illustrating an example of the shooting processing of the camera 121 (corresponding to FIG. 30) is shown third from the left, and a flowchart illustrating an example of the video data management processing of the video management device 172 (corresponding to FIG. 31) is shown on the right side.
- step S121 the communication control unit 84 of the ring 1 requests authentication from the SNS 171.
- step S141 of FIG. 29 the communication control unit 191 of the SNS 171 determines whether an authentication request is made from the ring 1 in the process of step S121 of FIG.
- step S141 If no authentication request is made, it is determined as NO in step S141, the process returns to step S141, and the determination process of step S141 is repeated until an authentication request is made.
- step S141 if an authentication request is made from ring 1, it is determined as YES in step S141, and the process proceeds to step S142.
- step S142 the authentication unit 193 executes ring 1 authentication processing.
- step S143 the authentication unit 193 determines whether the ring 1 has been successfully authenticated.
- step S143 If the authentication is not successful, it is determined as NO in step S143, the process returns to step S141, and the subsequent processes are repeated. That is, the loop processing of steps S141 to S143 is repeated until authentication is successful.
- step S143 when the authentication is successful, it is determined as YES in step S143, and the process proceeds to step S144. In that case, information indicating that the authentication was successful is transmitted to the ring 1.
- step S122 the communication control unit 84 of the ring 1 determines whether the authentication by the SNS 171 was successful.
- Step S122 If the authentication is not successful, that is, if the information indicating that the authentication is successful from the SNS 171 is not received, it is determined as NO in Step S122, the process returns to Step S121, and the subsequent processes are repeated. That is, the loop process of steps S121 and S122 is repeated until the authentication is successful.
- step S122 when the authentication is successful, that is, when the information indicating successful authentication transmitted from the SNS 171 in the process of step S143 is received, it is determined as YES in step S122, and the process proceeds to step S123.
- step S123 the group forming unit 90 of the ring 1 forms the group gr by the group gr forming method described above.
- step S124 the sensor information acquisition unit 81 of the ring 1 exchanges sensor information with the other rings 1 belonging to the same group. That is, the communication control unit 84 receives, via the wireless module 32, the motion information detected by the sensor information acquisition units 81 of one or more other rings 1 belonging to the same group, and transmits the motion information acquired by its own sensor information acquisition unit 81 to those one or more other rings 1.
- step S125 the excitement determination unit 82 of the ring 1 performs the excitement determination. That is, based on its own motion information and the motion information of the one or more other rings 1 acquired in the process of step S124, the excitement determination unit 82 generates related information indicating the degree of synchronism of motion among the plurality of users belonging to the same group. Then, the excitement determination unit 82 determines the degree of excitement from the related information.
- step S126 the excitement determination unit 82 of the ring 1 determines whether the determination result is equal to or greater than a threshold value. That is, it determines whether the degree of excitement, which is the result of the excitement determination, is equal to or greater than a preset threshold value.
- step S126 If the determination result is not equal to or greater than the threshold value, it is determined as NO in step S126, the process returns to step S124, and the loop process of steps S124 to S126 is repeated until the determination result is equal to or greater than the threshold value.
- step S126 if the determination result is equal to or greater than the threshold value, it is determined as YES in step S126, and the process proceeds to step S127.
- step S127 the behavior control unit 83 executes the behavior defined by the behavior information stored in the storage unit 88 in advance.
- step S128 the position information acquisition unit 85 acquires the current position information of the ring 1, and the time information acquisition unit 86 acquires information indicating the time at which the determination result became equal to or greater than the threshold (that is, the time of the excitement).
- step S129 the communication control unit 84 transmits the ring information to the SNS 171. That is, the position information, time information, and ring ID acquired in step S128 are transmitted to SNS 171. Thereby, the ring information transmission process of ring 1 is completed.
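Steps S124 to S129 amount to a sense, determine, and report loop. The following non-limiting sketch illustrates only the control flow; the synchrony measure, the threshold value, and all names are assumptions of this illustration, not the actual excitement determination of the embodiment.

```python
THRESHOLD = 0.8  # assumed excitement threshold (step S126)

def excitement_degree(own_motion, others_motion):
    """Toy stand-in for the related information: the fraction of group
    members whose motion magnitude is close to our own."""
    close = sum(1 for m in others_motion if abs(m - own_motion) < 0.1)
    return close / len(others_motion)

def ring_loop(samples, send):
    """For each (own, others) motion sample, report ring information once
    the excitement degree reaches the threshold (steps S124 to S129)."""
    for own, others in samples:
        degree = excitement_degree(own, others)
        if degree >= THRESHOLD:
            send({"degree": degree, "own_motion": own})  # stand-in for ring info
            return degree
    return None

sent = []
ring_loop([(0.9, [0.2, 0.3, 0.4]), (1.0, [0.95, 1.05, 1.02])], sent.append)
# only the second sample is synchronized, so exactly one message is sent
```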
- step S144 of FIG. 29 the communication control unit 191 of the SNS 171 controls the communication unit 192 to receive the ring information transmitted from the ring 1 in the process of step S129 of FIG.
- step S145 the storage control unit 194 of the SNS 171 controls the storage unit 195 to store the ring information received in the process of step S144.
- step S161 in FIG. 30 the imaging unit 141 of the camera 121 photographs the inside of the venue during the event.
- step S162 the conversion unit 142 of the camera 121 converts the shooting data obtained as a result of shooting in the process of step S161 into video data in a format suitable for communication.
- step S163 the shooting condition acquisition unit 146 generates a UMID for the video data file.
- step S164 the shooting condition acquisition unit 146 of the camera 121 acquires the shooting conditions detected by the detection unit 145.
- step S165 the metadata generation unit 147 of the camera 121 generates metadata including the UMID generated in the process of step S163 and the shooting conditions acquired in the process of step S164.
- step S166 the communication control unit 143 of the camera 121 superimposes the metadata generated in the process of step S165 on the video data converted in the process of step S162, and executes control to cause the communication unit 144 to transmit the result to the video management device 172.
- step S181 in FIG. 31 the communication control unit 211 of the video management device 172 causes the communication unit 210 to receive the video data, with the metadata superimposed, transmitted from the camera 121 in the process of step S166 in FIG. 30.
- step S182 the metadata extraction unit 212 of the video management device 172 extracts metadata from the video data received in the process of step S181.
- the metadata extraction unit 212 supplies the extracted metadata to the shooting target specifying unit 214 and supplies the video data to the read / write control unit 215.
- step S183 the shooting target specifying unit 214 of the video management device 172 specifies the shooting target for each piece of segment data based on the metadata supplied from the metadata extraction unit 212 and the shooting location information stored in the shooting location information storage unit 213. That is, a seat number is specified as the shooting target.
- step S184 the read / write control unit 215 of the video management device 172 stores the video data supplied from the metadata extraction unit 212 in the video data storage unit 216 in association with the shooting target specified in the process of step S183.
- step S146 of FIG. 29 the communication control unit 191 of the SNS 171 determines whether a predetermined video data acquisition request has been received from a terminal (not shown) such as a member belonging to the SNS 171.
- step S146 If the predetermined video data acquisition request is not received, it is determined NO in step S146, the process returns to step S146, and the determination in step S146 is performed until the predetermined video data acquisition request is received. The process is repeated.
- step S146 when a predetermined video data acquisition request is received thereafter, it is determined as YES in step S146, and the process proceeds to step S147.
- step S147 the communication control unit 191 of the SNS 171 transfers a video data acquisition request to the video management device 172.
- step S185 of FIG. 31 the communication control unit 211 of the video management device 172 determines whether the video data acquisition request transmitted from the SNS 171 in the process of step S147 of FIG. 29 has been received.
- step S185 If the video data acquisition request has not been received, it is determined as NO in step S185, and the process returns to step S185. Until the video data acquisition request is received, the determination process in step S185 is performed. Repeated.
- step S185 when the video data acquisition request is received, it is determined as YES in step S185, and the process proceeds to step S186.
- step S186 the read / write control unit 215 of the video management device 172 searches for and reads the corresponding video data (that is, segment data) according to the acquisition request received in the process of step S185, and transmits it to the SNS 171 via the communication unit 210.
- step S148 the communication control unit 191 of the SNS 171 receives the video data transmitted by the video management apparatus 172 in the process of step S186 in FIG.
- in the above description, the cooperation between the SNS 171 and the video management device 172 is performed using the time and place that identify a scene of excitement during the event, but the cooperation is not particularly limited to this.
- for example, a keyword or the like indicating a scene of excitement during an event may be used as a tag, and the SNS 171 and the video management device 172 may cooperate using the tag.
- the management information shown in FIG. 33 is held on the SNS 171 side, and the management information shown in FIG. 34 is held on the video management device 172 side.
- FIG. 33 shows an example of the configuration of a list of management information held by the SNS 171.
- each line includes management information indicating one scene of excitement during the event. That is, each time one piece of ring information is supplied from a ring 1, one line is added, and management information generated based on that ring information is stored in the added line.
- in the added line, the time specified from the time information included in the ring information is stored in the “time stamp” item.
- the position (latitude and longitude) specified from the position information included in the ring information is stored in the item of “GPS information” in the added one line.
- the item “SNS account” stores the SNS account of each user in the group to which the ring 1 that transmitted the ring information belongs. The storage method itself is not particularly limited; a manual input method may be used, or a method of automatically inputting the accounts based on the ring IDs of the rings 1 belonging to the group may be used.
- a keyword that allows an SNS member to recall the exciting scene corresponding to a line is added as a tag by an SNS administrator or user. Such a tag is stored in the “TAG information” item of that line.
- FIG. 34 shows an example of the configuration of a list of management information held by the video management apparatus 172.
- the management information list in FIG. 34 is a list in which only the item “TAG information” is added to each item in the management information list in FIG. For this reason, items other than “TAG information” have already been described with reference to FIG. 27, and thus description thereof is omitted here.
- the item “TAG information” in FIG. 34 corresponds to the “TAG information” included in the management information in FIG. 33 managed by the SNS 171. Since the stored contents of the “TAG information” are notified from the SNS 171 at appropriate timing, the item “TAG information” in FIG. 34 is updated manually or automatically according to the notified contents. In this way, cooperation between the SNS 171 and the video management device 172 is achieved through the mutual “TAG information”.
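The tag-based cooperation can be pictured as two lists sharing a “TAG information” column. In the following non-limiting sketch, the dictionary keys and sample rows are assumptions of this illustration: the SNS-side list resolves a tag, and the video-management-side list answers with the matching segments.

```python
# SNS-side management information (FIG. 33 style): each row pairs an
# excitement scene with a recallable tag.
sns_list = [
    {"tag": "goal", "timestamp": "00:20:30", "gps": (35.6, 139.7)},
]

# Video-management-side information (FIG. 34 style): the same tag column
# added to the rows of FIG. 27.
video_list = [
    {"umid": "X", "timecode": "00:20:30:20", "target": "around seat B37", "tag": "goal"},
    {"umid": "X", "timecode": "01:10:00:17", "target": "around seat C22", "tag": ""},
]

def segments_for_tag(tag):
    """Resolve a tag shared between both lists to its video segments."""
    return [row for row in video_list if row["tag"] == tag]

matches = segments_for_tag("goal")  # one matching segment in this sample
```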
- FIG. 35 shows a flowchart of processing performed in cooperation between the SNS 171 and the video management apparatus 172 using the management information shown in FIGS.
- a flowchart for explaining an example of the process of the SNS 171 is shown on the left side of FIG. 35, and a flowchart for explaining an example of the process of the video management apparatus 172 is shown on the right side of FIG.
- step S201 the communication control unit 191 of the SNS 171 transmits a request for adding TAG information to the video management device 172.
- step S221 the communication control unit 211 of the video management apparatus 172 receives the TAG information addition request transmitted from the SNS 171 in the process of step S201.
- step S222 the communication control unit 211 of the video management device 172 determines whether the request for adding TAG information is valid.
- step S222 If the request for adding TAG information is not valid, it is determined as NO in step S222, the process returns to step S221, and the subsequent processes are repeated. That is, the loop processing of steps S221 and S222 is repeated until a request for adding valid TAG information is made.
- Step S222 After, when a request for adding valid TAG information is made, it is determined as YES in Step S222, and the process proceeds to Step S223.
- step S223 the read / write control unit 215 adds TAG information to the management information.
- step S202 when a request for acquisition of predetermined video data is made by a terminal such as a member belonging to the SNS 171, the communication control unit 191 of the SNS 171 transfers the request to the video management device 172.
- step S224 the communication control unit 211 of the video management apparatus 172 receives the predetermined video data acquisition request transmitted from the SNS 171 in the process of step S202.
- step S225 the read / write control unit 215 of the video management device 172 extracts predetermined video data corresponding to the request.
- step S226 the communication control unit 211 of the video management device 172 transmits the predetermined video data extracted in the process of step S225 to the SNS 171.
- step S203 the communication control unit 191 of the SNS 171 receives the predetermined video data transmitted by the video management device 172 in the process of step S226.
- the communication control unit 191 of the SNS 171 transmits the predetermined video data to a terminal such as a member belonging to the SNS 171 that has requested acquisition. Thereby, the process performed in cooperation between the SNS 171 and the video management apparatus 172 ends.
- since the information held by the SNS and the video data stored in the video management device are associated with each other in this way, it is possible to easily search for the video data of an exciting scene.
- the group gr is formed by only the plurality of rings 1.
- the group gr may be formed by the SNS 171.
- a method for forming the group gr using the SNS 171 will be described.
- the communication control unit 84 of the ring 1 transmits its own location information acquired by the location information acquisition unit 85 to the SNS 171.
- the SNS 171 recognizes adjacent rings 1, that is, a plurality of rings 1 existing within a certain range, from the position information received from the plurality of rings 1, and forms the plurality of rings 1 existing within the certain range into one group gr. Then, the SNS 171 generates a group ID for the group gr and transmits the group ID to the plurality of rings 1 included in the group gr.
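A non-limiting sketch of this server-side grouping is shown below; the distance threshold and the use of plain Euclidean distance on latitude/longitude (instead of a proper geodesic distance) are simplifying assumptions of this illustration.

```python
import math

RANGE = 0.001  # assumed "certain range" in degrees, illustration only

def form_groups(positions):
    """Greedily cluster rings whose positions lie within RANGE of a
    cluster's first member; returns ring-ID lists, one per group gr."""
    groups = []
    for ring_id, (lat, lon) in positions.items():
        for group in groups:
            seed_lat, seed_lon = positions[group[0]]
            if math.hypot(lat - seed_lat, lon - seed_lon) <= RANGE:
                group.append(ring_id)
                break
        else:
            groups.append([ring_id])
    return groups

positions = {"r1": (35.0000, 139.0000), "r2": (35.0002, 139.0001), "r3": (36.0, 140.0)}
print(form_groups(positions))  # -> [['r1', 'r2'], ['r3']]
```

After forming each group gr this way, the SNS 171 would generate one group ID per cluster and distribute it to the rings in that cluster.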
- in the above description, the adjacent rings 1 are formed into one group gr by the SNS 171, but the grouping is not particularly limited to this.
- for example, a group gr may be formed from the rings 1 attached to users who support a predetermined team. That is, the rings 1 attached to users who support the same team may form one group gr. Further, for example, the rings 1 attached to users who like a predetermined player or a predetermined singer may form one group gr.
- FIG. 36 is a diagram for explaining a fifth example of group control.
- a virtual group is formed with a plurality of other rings 1 attached to each of a plurality of other users included in the video of the event displayed on the television receiver.
- as an example of group control, the ring 1 belonging to such a virtual group (that is, existing outside the event venue) executes control such as light emission and vibration in synchronization with the other rings 1 existing in the event venue displayed on the television receiver.
- the event image includes an image in which the ring 1 attached to each of a plurality of users emits light and vibrates in a predetermined pattern.
- the ring 1-k of the user Uk also emits light and vibrates in a predetermined pattern similar to the ring 1 of the user included in the event video.
- the communication control unit 84 of the ring 1-k can acquire, by communicating with the television receiver TV, information for specifying the light emission and vibration patterns of the other rings 1 of the other users shown on the television receiver TV (such information is hereinafter referred to as other ring pattern specifying information). That is, in this case, the other ring pattern specifying information is broadcast.
- the behavior control unit 83 of the ring 1-k specifies the light emission and vibration patterns of the other ring 1 by analyzing the other ring pattern specifying information, and controls the ring 1-k to emit light and vibrate in the same pattern as the specified pattern.
- the form of other ring pattern specifying information (including transmission form) and its analysis method are not particularly limited.
- the other ring pattern specifying information may be transmitted through a transmission path (not shown) different from the broadcast. In this case, a mechanism that can synchronize the broadcast with the communication on the other transmission path is required.
- alternatively, the communication control unit 84 may communicate with the television receiver TV and receive the broadcast signal itself as the other ring pattern specifying information.
- in this case, the behavior control unit 83 performs various image processing on the video signal, which is the other ring pattern specifying information, to analyze it, specifies the light emission and vibration patterns of the other rings 1 based on the analysis result, and may control the ring 1-k to emit light and vibrate in the same pattern as the specified pattern.
- FIG. 37 is a flowchart for explaining the flow of behavior control processing.
- step S242 the behavior control unit 83 of the ring 1 executes the specified behavior. That is, the behavior control unit 83 analyzes the other ring pattern specifying information received in the process of step S241, specifies the light emission and vibration patterns of the other ring 1 based on the analysis result, and executes behavior in the same pattern as the specified pattern.
- by viewing the video of the event displayed on the television receiver, the user can watch the ring 1 attached to his or her own arm execute the same behavior as the rings of the users at the venue. Therefore, the user can obtain a sense of unity with the users at the venue without going to the venue where the event is being held. In addition, even when cheering or watching a game at a place other than the venue, the user can obtain a sense of realism similar to that of the users at the venue.
- the rings 1 attached to the arms of the users Uk in front of the television receiver may form a group and influence, via the television receiver and the network, the venue where the event is being held. In this case, for example, when a user Uk who is watching the video of the event displayed on the television receiver cheers by shaking the arm wearing the ring 1, the result influences the display at the venue where the event is taking place.
- each user Uk in front of the television receiver forms a group and can receive signals from the event venue.
- for example, each user Uk in front of the television receiver can receive the nomination by the artist UA described in the second example of group control.
- a leader is selected from a group formed of a plurality of rings 1.
- as the method for selecting the leader, a method is adopted in which an action that can express excitement is detected by each of the plurality of rings 1 in the group, and the leader is selected based on the detection results.
- the leader selection method is not particularly limited, however, and the method described with reference to FIGS. 38 to 40 may be employed.
- FIG. 38 is a diagram showing a leader selection method.
- one ring 1 is represented by a single circle, and a symbol in the circle represents the state of the ring 1.
- the following first to third states are provided. That is, the first state is an initial state and is represented by the symbol i.
- the second state is a reader state and is represented by a symbol L. It is assumed that there is only one ring 1 that can be in the leader state in the group gr.
- the third state is a slave state and is indicated by a symbol S. Rings 1 that have left the initial state are all in the slave state if they cannot enter the leader state.
- a square mark superimposed on a circle indicating the ring 1 indicates a value (hereinafter referred to as a sense value) indicated by sensor information of a sensor built in the ring 1.
- the sense value is not particularly limited, it is assumed here that it is motion information detected by the triaxial acceleration sensor 33.
- the magnitude of the sense value corresponds to the magnitude of the motion information described above.
- the magnitude of the motion information corresponds to the degree of excitement; therefore, the greater the sense value, the greater the degree of excitement. If a user with a higher degree of excitement is qualified to be the leader, it can be said that the greater the sense value, the better qualified that ring's user is to be the leader.
- the ring 1-a in the initial state i is compared with the ring 1-b in the initial state i.
- the sense value of ring 1-a is the value Y
- the sense value of ring 1-b is the value X.
- the value X is assumed to be larger than the value Y.
- the state of the ring 1-b having the larger sense value X transitions from the initial state i to the leader state L, and the state of the ring 1-a having the smaller sense value Y transitions from the initial state i to the slave state S.
- the sense value of the ring 1-a that has transitioned to the slave state S is changed to a value X that is the sense value of the ring 1-b that has become a leader candidate.
- that is, when two rings 1 in the initial state i meet, both sense values are compared. The state of the ring 1 with the larger sense value transitions to the leader state L, and the state of the ring 1 with the smaller sense value transitions to the slave state S. Then, the sense value of the ring 1 now in the slave state S is changed to the sense value of the ring 1 in the leader state L.
- the ring 1-c in the initial state i is compared with the ring 1-b in the leader state L, which is a leader candidate.
- the sense value of the ring 1-c is the value B and the sense value of the ring 1-b is the value X.
- the state of the ring 1-c in the initial state i transitions from the initial state i to the slave state S.
- the sense value of the ring 1-c in the slave state S is changed to the value X, which is the sense value of the ring 1-b, a leader candidate.
- the comparison of the two sense values is not executed, regardless of the state of the other ring (that is, whether the other ring is in the leader state L or the slave state S). Then, the state of the ring 1 in the initial state i transitions to the slave state S, and its sense value is changed to the sense value of the other ring 1.
- the ring 1-d in the slave state S and the ring 1-c in the slave state S are compared.
- the sense value of the ring 1-d is the value A
- the sense value of the ring 1-c is the value X.
- the value X is assumed to be larger than the value A.
- the sense value of the ring 1-d having the small sense value A is changed to the value X, which is the sense value of the ring 1-c having the large sense value X.
- both sense values are compared, and as a result of the comparison, the sense value of the ring 1 with the smaller sense value is changed to the sense value of the ring 1 with the larger sense value. Further, the state of the ring 1 with the smaller sense value transitions to the slave state S.
- the ring 1-e in the leader state L, which is a leader candidate, is compared with the ring 1-b in the leader state L, which is also a leader candidate.
- the sense value of ring 1-e is the value Z
- the sense value of ring 1-b is the value X.
- the value X is assumed to be larger than the value Z.
- the state of the ring 1-e having the small sense value Z transitions from the leader state L to the slave state S. Then, the sense value of the ring 1-e that has transitioned to the slave state S is changed to the value X, which is the sense value of the ring 1-b in the leader state L.
- both sense values are compared, and as a result of the comparison, the sense value of the ring 1 with the smaller sense value is changed to the sense value of the ring 1 with the larger sense value. Further, the state of the ring 1 with the smaller sense value transitions to the slave state S.
- Each ring 1 periodically repeats such updates of its state and sense value with the rings 1 within a certain range.
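The pairwise rules above can be summarized in a short sketch. This is an illustrative reconstruction of the comparison described in the text, not the patent's implementation; the function and field names are assumptions. A ring still in the initial state i that meets a ring already in a group joins it without comparison; otherwise the sense values are compared, the loser transitions to the slave state S and adopts the winner's sense value and group, and a winner that was in the initial state i becomes a leader candidate.

```python
INITIAL, LEADER, SLAVE = "i", "L", "S"  # the three states of a ring 1

def exchange(a, b):
    """One periodic exchange between two connected rings (dicts with
    'state', 'sense', and 'group' keys; an illustrative model)."""
    # A ring in the initial state i meeting a ring already in a group
    # joins it without any comparison of sense values.
    if a["state"] == INITIAL and b["state"] != INITIAL:
        a.update(state=SLAVE, sense=b["sense"], group=b["group"])
        return
    if b["state"] == INITIAL and a["state"] != INITIAL:
        b.update(state=SLAVE, sense=a["sense"], group=a["group"])
        return
    both_initial = a["state"] == INITIAL and b["state"] == INITIAL
    # Otherwise the two sense values are compared.
    winner, loser = (a, b) if a["sense"] >= b["sense"] else (b, a)
    if both_initial:
        winner["state"] = LEADER  # a leader candidate appears
    loser.update(state=SLAVE, sense=winner["sense"], group=winner["group"])

# Rings 1-a (value 10) and 1-b (value 15), both in the initial state i:
ring_a = {"state": INITIAL, "sense": 10, "group": "gr-a"}
ring_b = {"state": INITIAL, "sense": 15, "group": "gr-b"}
exchange(ring_a, ring_b)
# ring_b becomes the leader; ring_a becomes a slave with sense value 15.
```

Note that when two slaves meet, the loser simply adopts the larger sense value without anyone being promoted, matching the slave-versus-slave case in the text.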
- the ring 1 detects another ring 1 in the vicinity and starts establishing a connection with the detected one or more other rings 1.
- the ring 1 exchanges predetermined information, including its state and sense value, with the one or more other rings 1 with which the connection has been successfully established, and updates its state and sense value as described above.
- the ring 1 detects another ring 1 that is close again, and starts an attempt to establish a connection with one or more detected rings 1.
- the ring 1 exchanges predetermined information with one or more other rings 1 as described above, and updates its own state and sense value as described above.
- the ring 1 discards the communication connection attempt.
- a plurality of groups gr, each composed of one ring 1 in the leader state L and a plurality of rings 1 in the slave state S, are formed.
- the sense values of the plurality of rings 1 in the same group gr are all the same value.
- the sense value shared by the plurality of rings 1 in the same group gr is called a group sense value.
- each ring 1 can be connected simultaneously with a plurality of rings 1 existing within a communicable range.
- the number of rings 1 that can be connected depends on system constraints such as the storage capacity of the ring 1.
- FIG. 39 is a diagram showing how the group expands.
- the group gr1 and the group gr2 are formed in close proximity by repeating the comparison of the sense values between the plurality of rings 1.
- the group gr1 includes a ring 1-f in the leader state L and a plurality of rings 1 in the slave state S.
- the group gr2 includes a ring 1-g in the leader state L and a plurality of rings 1 in the slave state S.
- the group sense value of the group gr1 is the value Y, and the group sense value of the group gr2 is the value X.
- the value Y is assumed to be larger than the value X.
- the group sense values of the group gr1 and the group gr2 are compared. Specifically, for example, as shown in FIG. 39B, the sense values of the ring 1-h in the slave state S included in the group gr1 and the ring 1-g in the leader state L included in the group gr2 are compared.
- the value Y that is the sense value of the ring 1-h is the group sense value of the group gr1
- the value X that is the sense value of the ring 1-g is the group sense value of the group gr2.
- since the value Y, which is the sense value of the ring 1-h (that is, the group sense value of the group gr1), is larger than the value X, which is the sense value of the ring 1-g (that is, the group sense value of the group gr2), the state of the ring 1-g transitions from the leader state L to the slave state S. Then, the sense value of the ring 1-g that has transitioned to the slave state S is changed from the value X to the value Y.
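The group merge shown in FIG. 39 can be sketched in the same style. This is an illustrative model under the assumption that every member ring carries its group's sense value, so comparing any slave of one group with the leader of another effectively compares the two group sense values; all names are assumptions.

```python
def merge_step(member, other_leader):
    """Compare a ring carrying one group's sense value against the leader
    of another group; if the leader loses, it is demoted to the slave
    state S and its group is absorbed (illustrative sketch)."""
    if member["sense"] > other_leader["sense"]:
        other_leader.update(state="S",
                            sense=member["sense"],
                            group=member["group"])

# Ring 1-h (slave in gr1, group sense value 20) meets
# ring 1-g (leader of gr2, group sense value 12):
ring_h = {"state": "S", "sense": 20, "group": "gr1"}
ring_g = {"state": "L", "sense": 12, "group": "gr2"}
merge_step(ring_h, ring_g)
# ring 1-g transitions from the leader state L to the slave state S and
# its sense value is changed from 12 to 20; gr2 is absorbed into gr1.
```

Once the former leader of gr2 carries gr1's sense value, its own slaves lose subsequent comparisons the same way, so the whole of gr2 drains into gr1 over the following rounds.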
- FIG. 40 is a state transition diagram illustrating an example of a state that the ring 1 can take in the above-described comparison of score values.
- each state is represented in one rounded rectangular frame, and is identified by a code including “SI” drawn in the frame.
- a state transition from one state to another state (including a case where the state remains the same) is executed when a predetermined condition (hereinafter referred to as a state transition condition) is satisfied.
- a state transition condition is represented by adding a symbol including “C” to an arrow representing a transition from one state to another state in FIG.
- Each ring 1 holds the following information during the comparison of score values. That is, the ring 1 holds the ring IDs of the other rings 1 (up to N) with which it currently has established connections, the group ID of the group gr to which it belongs, and information on its own state (that is, the initial state i, the slave state S, or the leader state L). Further, when the state of the ring 1 transitions to the leader state L, the ring 1 holds information indicating the elapsed time since the transition to the leader state L. Further, the ring 1 holds a first sensor value and a second sensor value as its own sensor value information. The first sensor value is the sensor value disclosed to the other rings 1, and the second sensor value is the sensor value updated internally. That is, the second sensor value is updated when a certain time elapses, and in the initial state it is used as the first sensor value.
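The per-ring bookkeeping just described can be sketched as a data structure. This is a hypothetical reconstruction; the field names, types, and the `refresh` helper are assumptions for illustration, not the patent's implementation.

```python
from dataclasses import dataclass, field

INITIAL, LEADER, SLAVE = "i", "L", "S"

@dataclass
class RingInfo:
    ring_id: int
    connected_ids: list = field(default_factory=list)  # ring IDs of up to N connected rings
    group_id: int = 0                 # group ID of the group gr the ring belongs to
    state: str = INITIAL              # initial state i, slave state S, or leader state L
    leader_since: float = 0.0         # time of the transition to the leader state L
    first_sensor_value: float = 0.0   # sensor value disclosed to the other rings
    second_sensor_value: float = 0.0  # sensor value updated internally

    def refresh(self):
        # When a certain time elapses, the internally updated second sensor
        # value is published as the first sensor value.
        self.first_sensor_value = self.second_sensor_value

ring = RingInfo(ring_id=1)
ring.second_sensor_value = 12.5
ring.refresh()
# ring.first_sensor_value is now 12.5
```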
- the state SI1 is a state in which the ring 1 has not established a connection with another ring 1. If the connection with the other ring 1 is successfully established, the state of the ring 1 changes to the state SI2.
- State SI2 is a state in which a connection between ring 1 and another ring 1 is established.
- when the state of the ring 1 is the initial state i, the state transitions to the state SI11. In this case, the ring 1 does not belong to any group gr.
- both sense values are compared.
- the state of the ring 1 having the larger sense value transitions from the initial state i to the leader state L.
- the state of the ring 1 with the smaller sense value transitions from the initial state i to the slave state S.
- the sense value and group ID of the ring 1 with the smaller sense value are changed to the sense value and group ID of the ring 1 with the larger sense value.
- the ring 1 in the leader state L is generated.
- the state transition condition C1 includes two conditions.
- State SI12 is a state in which ring 1 is in the slave state S.
- when the state of the other ring 1 with which the connection is established is the initial state i, the comparison of the two sense values is not performed, and the sense value and group ID of the other ring 1 are changed to the sense value and group ID of the ring 1.
- the state of the ring 1 remains in the state SI12.
- State SI13 is a state in which ring 1 is in the leader state L.
- when the state of the other ring 1 with which the connection is established is the initial state i, the comparison of the two sense values is not performed, and the sense value and group ID of the other ring 1 are changed to the sense value and group ID of the ring 1.
- the state of the ring 1 remains in the state SI13.
- in the state SI13, when a certain time has elapsed since the ring 1 transitioned to the leader state L, or when the power of the ring 1 is turned off, the state of the ring 1 transitions to the initial state i. In this case, the disappearance of the group gr is notified to all the other rings 1 having the same group ID (that is, the other rings 1 in the slave state S). As a result, the group gr that had the leader disappears.
- when the state transition condition C4, namely that a fixed time has elapsed or the power is turned off after the ring 1 transitioned to the leader state L, is satisfied, the state of the ring 1 transitions from the state SI13, which is the leader state L, to the state SI11, which is the initial state i.
- the state transition condition C5, namely that the disappearance of the group gr is notified, is satisfied, and the state of the ring 1 transitions from the state SI12, which is the slave state S, to the state SI11, which is the initial state i.
- the sense value and group ID of ring 1 included in the group gr are initialized, and the group gr disappears.
- the state SI2 transits to the state SI1.
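The dissolution of a group after its leader times out can be sketched as follows. The lifetime constant, the callback, and all other names are assumptions for illustration; the patent only says the fixed time is set, for example, in minutes.

```python
LEADER_LIFETIME = 180.0  # seconds; illustrative, the text sets it "in minutes"

def check_leader_timeout(ring, now, notify_group_disappeared):
    """State SI13: if a fixed time has elapsed since the transition to the
    leader state L, the ring returns to the initial state i and the
    disappearance of its group gr is notified to the slaves with the same
    group ID (illustrative sketch; names are assumptions)."""
    if ring["state"] == "L" and now - ring["leader_since"] >= LEADER_LIFETIME:
        notify_group_disappeared(ring["group"])  # slaves receiving this also reset to i
        ring.update(state="i", group=None, sense=0)

disappeared = []
leader = {"state": "L", "group": "gr21", "sense": 15, "leader_since": 0.0}
check_leader_timeout(leader, now=200.0,
                     notify_group_disappeared=disappeared.append)
# The leader reverts to the initial state i and "gr21" is announced as gone.
```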
- FIG. 41 is a diagram illustrating state transitions until the group gr disappears.
- state SI21 a plurality of rings 1 are assembled.
- the states of the plurality of rings 1 are all in the initial state i.
- each ring 1 starts establishing a connection with another ring 1 and repeats comparison of sense values.
- the ring 1-i and the ring 1-j that are in the leader state L are generated.
- the sense value of ring 1-i is 15, and the sense value of ring 1-j is 10.
- the sense value of the ring 1-i is larger, so the state of the ring 1-j transitions to the slave state S. Then, the sense value of the ring 1-j transitioned to the slave state S is changed to 15.
- a group gr21 having ring 1-i as a leader is formed.
- a group gr22 in which the ring 1-k is in the leader state L is formed at a position close to the group gr21.
- the group sense value of the group gr21 is 15, and the group sense value of the group gr22 is 9.
- the group sense values of the group gr21 and the group gr22 are compared. As a result of the comparison, since the group sense value of the group gr22 is smaller than the group sense value of the group gr21, the leader of the group gr22 disappears, and the group sense value of the group gr22 is changed to the group sense value of the group gr21.
- an enlarged group gr31 is formed while taking in another group.
- the leader of group gr31 is ring 1-i, and the group sense value is 15.
- the fixed time until the group disappears is not particularly limited, and is set, for example, in minutes.
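Putting the rules together, repeated pairwise exchanges drive nearby rings toward a single group whose group sense value is the largest initial sense value, as in the expansion from group gr21 to group gr31. The following is a hypothetical simulation of that convergence; the exchange model and all names are assumptions, as before.

```python
import itertools

INITIAL, LEADER, SLAVE = "i", "L", "S"

def exchange(a, b):
    # The same pairwise rule as described above for the states i, L, and S.
    if a["state"] == INITIAL and b["state"] != INITIAL:
        a.update(state=SLAVE, sense=b["sense"], group=b["group"]); return
    if b["state"] == INITIAL and a["state"] != INITIAL:
        b.update(state=SLAVE, sense=a["sense"], group=a["group"]); return
    both_initial = a["state"] == INITIAL and b["state"] == INITIAL
    winner, loser = (a, b) if a["sense"] >= b["sense"] else (b, a)
    if both_initial:
        winner["state"] = LEADER
    loser.update(state=SLAVE, sense=winner["sense"], group=winner["group"])

# Rings with the sense values 15, 10, and 9 from FIG. 41, plus two
# extra illustrative values:
rings = [{"state": INITIAL, "sense": s, "group": i}
         for i, s in enumerate([15, 10, 9, 7, 12])]
for _ in range(3):  # a few periodic rounds of exchanges within range
    for a, b in itertools.combinations(rings, 2):
        exchange(a, b)

leaders = [r for r in rings if r["state"] == LEADER]
# Exactly one leader remains, and every ring carries the maximum sense
# value, 15, as the group sense value.
```

In this model the ring with the largest sense value wins every comparison it takes part in, so after a few rounds all rings share its group ID and sense value, mirroring the enlarged group gr31.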
- the series of processes described above can be executed by hardware or can be executed by software.
- a program constituting the software is installed in the computer.
- the computer includes, for example, a computer incorporated in dedicated hardware, and a general-purpose personal computer capable of executing various functions by installing various programs.
- the program executed by the computer can be provided by being recorded on a removable medium (not shown) such as a package medium.
- the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can also be provided from another information processing apparatus or another ring 1.
- the program can be installed in the storage unit 88 by attaching the removable medium to a drive of another information processing apparatus and receiving the program, via the wireless transmission medium, with the wireless module 32.
- the program can be installed in the storage unit 88 in advance.
- the program executed by the computer may be a program whose processes are performed in time series in the order described in this specification, or a program whose processes are performed in parallel or at necessary timing, such as when a call is made.
- Embodiments of the present technology are not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present technology.
- the present technology can take a cloud computing configuration in which one function is shared by a plurality of devices via a network and is jointly processed.
- in the above examples, the ring 1 itself behaves, for example by emitting light and vibrating, but the manner of the behavior is not particularly limited to these examples. That is, the subject that behaves may be the ring 1 itself worn by the user, as in the above examples, or a communication partner of the ring 1, such as a television receiver, a video deck, a game device, electrical equipment such as lighting and electric shades, or the facilities of venues such as private homes, stores, halls, and stadiums. Further, the type of behavior may be light emission or vibration, as in the above examples, or another type such as video or sound. Further, in the case of a ring 1 including a soft actuator, a change in the strength with which the arm is tightened can also be adopted as a type of behavior. Any plurality of these types may be combined.
- Excitement information is not limited to this.
- any detection result from which the user's physical excitement about the event can be determined, such as the user's heartbeat, blood pressure, or body temperature, can be adopted as the excitement information.
- the predetermined action performed by the user of the ring 1 may be notified to the user in advance by the event organizer using an arbitrary transmission method, or may be stored in the ring 1 in advance.
- the position information of the ring 1 itself is acquired by the position information acquisition unit 85 of the ring 1 by an arbitrary method.
- the acquired position information is included in the shooting start request and transmitted to the CCU 122, and is used for specifying the shooting target of the camera 121.
- in the above examples, the position information of the ring 1 was acquired using GPS.
- the position may be acquired using, for example, Wi-Fi (wireless fidelity).
- a method for acquiring position information using Wi-Fi is not particularly limited.
- the position information of the ring 1 using GPS or Wi-Fi may be acquired by a terminal device such as a smartphone capable of wireless communication with the ring 1. It is assumed that the terminal device is held by the user wearing the ring 1. The position information acquired by the terminal device may be transmitted to the ring 1 or other information processing device.
- At least a part of the processing executed in the ring 1 may be executed in another information processing apparatus capable of wireless communication with the ring 1. This reduces the processing load on the ring 1. Further, all the information related to the processing in the above examples may be set for the ring 1 in advance. In this case, the ring 1 may execute the processing by appropriately selecting from the preset information.
- a mobile terminal capable of executing the processing described in the present embodiment can be employed as an information processing apparatus to which the present technology is applied. In this case, it is desirable that the user holds the mobile terminal in hand.
- each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
- the plurality of processes included in the one step can be executed by being shared by a plurality of apparatuses in addition to being executed by one apparatus.
- this technique can also take the following structures.
- an information processing apparatus including: a light emitting unit that emits light according to a light emission pattern specified by a combination of light emission parameters indicating characteristics of the light emission; and a control unit that controls the light emission pattern of the light emitting unit based on a relationship between the own apparatus and a group to which one or more other information processing apparatuses having the light emitting unit and the own apparatus belong, or a relationship between the own apparatus and another information processing apparatus belonging to the group.
- a communication control unit for controlling wireless communication with the one or more other information processing apparatuses; The information processing apparatus according to (1), wherein the control unit controls a light emission pattern of the light emitting unit based on the relationship specified from a control result of the communication control unit.
- a vibration unit that vibrates according to a vibration pattern specified by a combination of vibration parameters indicating characteristics of vibration The information processing apparatus according to (1) or (2), wherein the control unit controls a vibration pattern of the vibration unit based on the relationship.
- the control unit controls the light emission pattern of the light emitting unit in further consideration of an influence of an external target other than the group. The information processing apparatus according to any one of (1) to (3).
- further including a distance calculation unit that calculates, as the external target, a distance between a predetermined position and the position of the information processing apparatus in a process in which a plurality of users, each wearing the information processing apparatus or one of the one or more other information processing apparatuses belonging to the group, gather at the predetermined position, wherein the communication control unit identifies, as the relationship, the number of the information processing apparatuses existing within a certain range based on position information of the other information processing apparatuses, and the control unit performs control so as to change the light emission pattern of the light emitting unit using, as parameters, the distance calculated by the distance calculation unit and the number identified by the communication control unit. The information processing apparatus according to any one of (1) to (4).
- the communication control unit treats another apparatus that does not belong to the group as the external target and controls wireless communication with the external target, and the control unit controls the light emission pattern of the light emitting unit based on a result of the wireless communication with the external target under the control of the communication control unit. The information processing apparatus according to any one of (1) to (5).
- further including: a sensor unit that detects a change in a physical quantity caused by a user's motion; and a swell determination unit that determines the degree of the user's swell based on a detection result of the sensor unit, wherein the communication control unit performs control so as to transmit the degree of the swell to the one or more other information processing apparatuses and to receive the degrees of the swell of at least some users of the one or more other information processing apparatuses, and the control unit controls the light emission pattern of the light emitting unit based on the relationship indicated by the degrees of the swell of the own apparatus and at least some of the one or more other information processing apparatuses. The information processing apparatus according to any one of (1) to (6).
- the external target is another information processing apparatus that does not belong to the group and has at least the light emitting unit and the sensor unit, and the control unit determines whether a condition of the influence of the external target, namely that the group to which the own apparatus belongs has been targeted by the external target based on a detection result of the sensor unit of the external target, and a condition of the relationship are satisfied, and controls the light emission pattern of the light emitting unit. The information processing apparatus according to any one of (1) to (7).
- the condition of the relationship is a condition that the degree of the swell of the own apparatus is the largest in the group. The information processing apparatus according to any one of (1) to (8).
- the external target is an imaging apparatus that photographs the plurality of users of the own apparatus and the one or more other information processing apparatuses belonging to the group, and, when the degree of the swell determined by the swell determination unit is equal to or greater than a threshold, the communication control unit transmits a shooting start request to the imaging apparatus so that shooting by the imaging apparatus targeting the user of the own apparatus is started. The information processing apparatus according to any one of (1) to (9).
- the communication control unit receives information indicating that shooting is started by the imaging device according to the shooting start request from the own device, The information processing apparatus according to any one of (1) to (10), wherein the control unit controls a light emission pattern of the light emitting unit based on the information received by the communication control unit.
- the swell determination unit determines the degree of the swell based on information, generated based on the detection result of the sensor unit, indicating the degree of relevance or difference in the synchrony of the motions of the bodies or body parts of a plurality of users. The information processing apparatus according to any one of (1) to (11).
- the external target is an imaging apparatus that photographs a location where the group exists, and an external apparatus that acquires, from the shooting results of the imaging apparatus, a shooting result showing the state, in a predetermined time zone, of a user of the information processing apparatus or the one or more other information processing apparatuses belonging to the group.
- the communication control unit transmits, to the external apparatus, the time at which the degree of the swell determined by the swell determination unit became equal to or greater than a threshold, as acquisition information used when acquiring a shooting result showing the state of a time zone including that time for the user of the own apparatus. The information processing apparatus according to any one of (1) to (12).
- the information processing apparatus according to (13), wherein the communication control unit also transmits the position of the own apparatus to the external apparatus as the acquisition information.
- the information processing apparatus according to (13) or (14), wherein the communication control unit also transmits an ID of the own apparatus to the external apparatus as the acquisition information.
- the communication control unit receives information capable of specifying the light emission pattern of the light emitting unit of another information processing apparatus belonging to the group, and the control unit controls the light emission pattern of the light emitting unit based on the light emission pattern specified by the information received by the communication control unit. The information processing apparatus according to any one of (1) to (15).
- the information processing apparatus has a ring-shaped part attached to the user's arm, The information processing apparatus according to any one of (1) to (16), wherein the part is provided with a connecting portion that opens or connects a part thereof.
- the coupling unit includes a magnet and a suction plate for connecting a part of the portion.
- the connecting portion connects a part of the part by elastic deformation of a material.
- This technology can be applied to information processing devices used as communication tools.
Description
First, in order to facilitate understanding of the present technology, an outline of the present technology will be described.
FIG. 1 is a diagram showing a schematic configuration of a ring 1 as one embodiment of an information processing apparatus to which the present technology is applied. The center view of FIG. 1A is a front view of the ring 1. Here, the front is the surface seen when the arm wearing the ring 1 is viewed from the fingertip side of the hand. The upper view of FIG. 1A is a top view of the ring 1 viewed in the direction of arrow a. The right view of FIG. 1A is a right side view of the ring 1 viewed in the direction of arrow b. The lower view of FIG. 1A is a bottom view of the ring 1 viewed in the direction of arrow c. The left view of FIG. 1A is a left side view of the ring 1 viewed in the direction of arrow d. FIG. 1B is a rear view of the ring 1.
FIG. 2 is a perspective view of the ring 1. FIG. 2A is a perspective view of the ring 1 from a direction in which the substrate 21 is visible. FIG. 2B is a perspective view of the ring 1 from a direction in which the connecting portion 22 is visible. As shown in FIG. 2, both ends of the circumferential portion of the ring 1 are connected at the connecting portion 22.
FIG. 3 is a diagram showing a schematic configuration of the substrate 21 mounted on the ring 1. The center view of FIG. 3A is a front view of the substrate 21. Here, the front is the surface on which the various sensors, the wireless module, the light emitting elements, the vibration mechanism, and the like are mounted. The upper view of FIG. 3A is a top view of the substrate 21 viewed in the direction of arrow a. The right view of FIG. 3A is a right side view of the substrate 21 viewed in the direction of arrow b. The lower view of FIG. 3A is a bottom view of the substrate 21 viewed in the direction of arrow c. The left view of FIG. 3A is a left side view of the substrate 21 viewed in the direction of arrow d. FIG. 3B is a rear view of the substrate 21. FIG. 3C is a perspective view of the substrate 21.
FIG. 4 is a cross-sectional view of the ring 1. FIG. 4A is a cross-sectional view of the ring 1 taken along the line m-m' in the left view of FIG. 1A. FIG. 4B is a cross-sectional view of the ring 1 taken along the line n-n' in the left view of FIG. 1A.
FIG. 5 is a diagram showing a schematic configuration of a ring 51 having a configuration different from that of the ring 1.
FIG. 6 is a functional block diagram showing a functional configuration of the substrate 21 of FIG. 3.
First, a first example of group control using the ring 1 to which the present technology is applied will be described with reference to FIGS. 7 and 8.
Next, among the processes executed by the ring 1 in the first example of group control, the flow of the process of controlling the behavior that changes according to the distance to a predetermined venue and the number of other rings 1 (hereinafter referred to as behavior control processing) will be described.
Next, a second example of group control using the ring 1 to which the present technology is applied will be described with reference to FIGS. 9 to 14.
FIG. 10 is a configuration diagram of the information processing system in the second example of group control.
The ring 1-A and the plurality of rings 1 included in the group gr calculate their mutual relative positions in advance, for example using the fixed terminal device shown in FIG. 11.
The artist UA designates a predetermined group gr as a target among the plurality of groups gr formed in the venue in this way. In this case, the target identification unit 89 of the ring 1-A worn on the arm of the artist UA calculates the direction of the targeted group gr and the distance from the ring 1-A to the group gr, and identifies the coordinate values of the targeted group gr based on the calculation results.
Next, the processing flow of the information processing system 100 having such a configuration will be described with reference to FIGS. 13 and 14. FIG. 13 is a diagram explaining the flow of the process of designating a target (hereinafter referred to as designation processing), among the processes executed by the ring 1-A worn on the arm of the artist UA. FIG. 14 is a diagram explaining the flow of the process of receiving a designation from the ring 1-A (hereinafter referred to as designation reception processing), among the processes executed by the rings 1 on the audience side.
FIG. 13 is a flowchart explaining the flow of the designation processing executed by the ring 1-A.
FIG. 14 is a flowchart explaining the flow of the designation reception processing executed by the ring 1.
Next, a third example of group control using the ring 1 to which the present technology is applied will be described with reference to FIGS. 15 to 21.
FIG. 16 is a configuration diagram of the information processing system in the third example of group control.
FIG. 17 is a functional block diagram showing a functional configuration of the camera 121.
Next, the processing flow of the information processing system 110 will be described with reference to FIGS. 18 to 21. FIG. 18 is a diagram explaining the flow of the process of transmitting a shooting request so that the ring 1 becomes a shooting target of the camera 121 (hereinafter referred to as shooting request processing), among the processes executed by the ring 1. FIG. 19 is a diagram explaining the flow of the process of controlling the camera 121 (hereinafter referred to as camera control processing), among the processes executed by the CCU 122. FIG. 20 is a diagram explaining the flow of the process of shooting a shooting target (hereinafter referred to as shooting processing), among the processes executed by the camera 121.
Next, a fourth example of group control using the ring 1 to which the present technology is applied will be described with reference to FIGS. 22 to 35.
FIG. 23 is a configuration diagram of the information processing system in the fourth example of group control.
FIG. 24 is a functional block diagram showing a functional configuration example of the SNS 171.
When the group control of the fourth example is executed, all of the functional configuration of FIG. 17 functions. That is, the imaging unit 141, the conversion unit 142, the communication control unit 143, the communication unit 144, the detection unit 145, the shooting condition acquisition unit 146, and the metadata generation unit 147 function. Since the imaging unit 141, the conversion unit 142, the communication control unit 143, and the communication unit 144 have already been described with reference to FIG. 17, only the differences will be described.
FIG. 25 is a functional block diagram showing a functional configuration example of the video management apparatus 172.
FIG. 26 is a diagram showing the structure of a list of shooting location information.
FIG. 27 is a diagram showing management information of the video data stored in the video data storage unit 216.
Next, the processing flow of the information processing system 160 will be described with reference to FIGS. 28 to 32. FIG. 28 is a diagram explaining the flow of the process of transmitting ring information to the SNS 171 (hereinafter referred to as ring information transmission processing), among the processes executed by the ring 1. FIG. 29 is a diagram explaining the flow of the process of providing video data (hereinafter referred to as video data provision processing), among the processes executed by the SNS 171. FIG. 30 is a diagram explaining the flow of the process of shooting a shooting target (hereinafter referred to as shooting processing), among the processes executed by the camera 121. FIG. 31 is a diagram explaining the flow of the process in which the video management apparatus 172 manages video data (hereinafter referred to as video data management processing).
FIG. 35 shows a flowchart of the processing performed by the SNS 171 and the video management apparatus 172 in cooperation, using the management information of FIGS. 33 and 34. The left side of FIG. 35 shows a flowchart explaining an example of the processing of the SNS 171, and the right side of FIG. 35 shows a flowchart explaining an example of the processing of the video management apparatus 172.
Next, a fifth example of group control using the ring 1 to which the present technology is applied will be described with reference to FIGS. 36 and 37.
Next, among the processes executed by the ring 1 in the fifth example of group control, the flow of the process of controlling the behavior in synchronization with the other rings 1 belonging to a virtual group (hereinafter referred to as behavior control processing) will be described.
In the second example of group control described above, a leader was elected from a group formed by a plurality of rings 1. In this case, as the leader election method, a method was adopted in which an action capable of expressing excitement is detected by each of the plurality of rings 1 in the group, and the leader is elected based on the detection results. However, the leader election method is not particularly limited, and the method described with reference to FIGS. 38 to 40 may be adopted.
FIG. 40 is a state transition diagram showing an example of the states that the ring 1 can take in the comparison of score values described above.
FIG. 41 is a diagram showing the state transitions until the group gr disappears.
The series of processes described above can be executed by hardware or can be executed by software. When the series of processes is executed by software, a program constituting the software is installed in a computer. Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
(1)
An information processing apparatus including: a light emitting unit that emits light according to a light emission pattern specified by a combination of light emission parameters indicating characteristics of the light emission; and a control unit that controls the light emission pattern of the light emitting unit based on a relationship between the own apparatus and a group to which one or more other information processing apparatuses having the light emitting unit and the own apparatus belong, or a relationship between the own apparatus and another information processing apparatus belonging to the group.
(2)
The information processing apparatus according to (1), further including a communication control unit that controls wireless communication with the one or more other information processing apparatuses,
wherein the control unit controls the light emission pattern of the light emitting unit based on the relationship specified from a control result of the communication control unit.
(3)
The information processing apparatus according to (1) or (2), further including a vibration unit that vibrates according to a vibration pattern specified by a combination of vibration parameters indicating characteristics of the vibration,
wherein the control unit controls the vibration pattern of the vibration unit based on the relationship.
(4)
The information processing apparatus according to any one of (1) to (3), wherein the control unit controls the light emission pattern of the light emitting unit in further consideration of an influence of an external target other than the group.
(5)
The information processing apparatus according to any one of (1) to (4), further including a distance calculation unit that calculates, as the external target, a distance between a predetermined position and the position of the information processing apparatus in a process in which a plurality of users, each wearing the information processing apparatus or one of the one or more other information processing apparatuses belonging to the group, gather at the predetermined position,
wherein the communication control unit identifies, as the relationship, the number of the information processing apparatuses existing within a certain range based on position information of the other information processing apparatuses, and
the control unit performs control so as to change the light emission pattern of the light emitting unit using, as parameters, the distance calculated by the distance calculation unit and the number identified by the communication control unit.
(6)
The information processing apparatus according to any one of (1) to (5), wherein the communication control unit treats another apparatus that does not belong to the group as the external target and controls wireless communication with the external target, and
the control unit controls the light emission pattern of the light emitting unit based on a result of the wireless communication with the external target under the control of the communication control unit.
(7)
The information processing apparatus according to any one of (1) to (6), further including:
a sensor unit that detects a change in a physical quantity caused by a user's motion; and
a swell determination unit that determines the degree of the user's swell based on a detection result of the sensor unit,
wherein the communication control unit performs control so as to transmit the degree of the swell to the one or more other information processing apparatuses and to receive the degrees of the swell of at least some users of the one or more other information processing apparatuses, and
the control unit controls the light emission pattern of the light emitting unit based on the relationship indicated by the degrees of the swell of the own apparatus and at least some of the one or more other information processing apparatuses.
(8)
The information processing apparatus according to any one of (1) to (7), wherein the external target is another information processing apparatus that does not belong to the group and has at least the light emitting unit and the sensor unit, and
the control unit determines whether a condition of the influence of the external target, namely that the group to which the own apparatus belongs has been targeted by the external target based on a detection result of the sensor unit of the external target, and a condition of the relationship are satisfied, and controls the light emission pattern of the light emitting unit.
(9)
The information processing apparatus according to any one of (1) to (8), wherein the condition of the relationship is a condition that the degree of the swell of the own apparatus is the largest in the group.
(10)
The information processing apparatus according to any one of (1) to (9), wherein the external target is an imaging apparatus that photographs the plurality of users of the own apparatus and the one or more other information processing apparatuses belonging to the group, and
when the degree of the swell determined by the swell determination unit is equal to or greater than a threshold, the communication control unit transmits a shooting start request to the imaging apparatus so that shooting by the imaging apparatus targeting the user of the own apparatus is started.
(11)
The information processing apparatus according to any one of (1) to (10), wherein the communication control unit receives information indicating that shooting has been started by the imaging apparatus in accordance with the shooting start request from the own apparatus, and
the control unit controls the light emission pattern of the light emitting unit based on the information received by the communication control unit.
(12)
The information processing apparatus according to any one of (1) to (11), wherein the swell determination unit determines the degree of the swell based on information, generated based on the detection result of the sensor unit, indicating the degree of relevance or difference in the synchrony of the motions of the bodies or body parts of a plurality of users.
(13)
The information processing apparatus according to any one of (1) to (12), wherein the external target is an imaging apparatus that photographs a location where the group exists, and an external apparatus that acquires, from the shooting results of the imaging apparatus, a shooting result showing the state, in a predetermined time zone, of a user of the information processing apparatus or the one or more other information processing apparatuses belonging to the group, and
the communication control unit transmits, to the external apparatus, the time at which the degree of the swell determined by the swell determination unit became equal to or greater than a threshold, as acquisition information used when acquiring a shooting result showing the state of a time zone including that time for the user of the own apparatus.
(14)
The information processing apparatus according to (13), wherein the communication control unit also transmits the position of the own apparatus to the external apparatus as the acquisition information.
(15)
The information processing apparatus according to (13) or (14), wherein the communication control unit also transmits the ID of the own apparatus to the external apparatus as the acquisition information.
(16)
The information processing apparatus according to any one of (1) to (15), wherein the communication control unit receives information capable of specifying the light emission pattern of the light emitting unit of another information processing apparatus belonging to the group, and
the control unit controls the light emission pattern of the light emitting unit based on the light emission pattern specified by the information received by the communication control unit.
(17)
The information processing apparatus according to any one of (1) to (16), which has a ring-shaped part attached to the user's arm, the part being provided with a connecting portion that opens or connects a part thereof.
(18)
The information processing apparatus according to any one of (1) to (17), wherein the connecting portion has a magnet and an attracting plate for connecting the part of the ring-shaped part.
(19)
The information processing apparatus according to any one of (1) to (18), wherein the connecting portion connects the part of the ring-shaped part by elastic deformation of a material.
Claims (20)
- An information processing apparatus comprising: a light emitting unit that emits light according to a light emission pattern specified by a combination of light emission parameters indicating characteristics of the light emission; and a control unit that controls the light emission pattern of the light emitting unit based on a relationship between the apparatus itself and a group to which the apparatus itself and one or more other information processing apparatuses having the light emitting unit belong, or a relationship between the apparatus itself and another information processing apparatus belonging to the group.
- The information processing apparatus according to claim 1, further comprising a communication control unit that controls wireless communication with the one or more other information processing apparatuses, wherein the control unit controls the light emission pattern of the light emitting unit based on the relationship specified from a control result of the communication control unit.
- The information processing apparatus according to claim 1, further comprising a vibration unit that vibrates according to a vibration pattern specified by a combination of vibration parameters indicating characteristics of the vibration, wherein the control unit controls the vibration pattern of the vibration unit based on the relationship.
- The information processing apparatus according to claim 2, wherein the control unit controls the light emission pattern of the light emitting unit by further taking into account an influence of an external target other than the group.
- The information processing apparatus according to claim 4, further comprising a distance computation unit that, in a process in which a plurality of users each wearing the information processing apparatus or one of the one or more other information processing apparatuses belonging to the group gather at a predetermined position, computes, as the external target, the distance between the predetermined position and the current position of the information processing apparatus, wherein the communication control unit specifies, as the relationship, the number of information processing apparatuses present within a certain range based on position information of the other information processing apparatuses, and the control unit performs control so as to vary the light emission pattern of the light emitting unit using the distance computed by the distance computation unit and the number specified by the communication control unit as parameters.
- The information processing apparatus according to claim 4, wherein the communication control unit controls wireless communication with another apparatus not belonging to the group as the external target, and the control unit controls the light emission pattern of the light emitting unit based on a result of the wireless communication with the external target under the control of the communication control unit.
- The information processing apparatus according to claim 6, further comprising: a sensor unit that detects a change in a physical quantity caused by a motion of the user; and an excitement determination unit that determines a degree of excitement of the user based on a detection result of the sensor unit, wherein the communication control unit performs control so as to transmit the degree of excitement to the one or more other information processing apparatuses and to receive the degrees of excitement of the users of at least some of the one or more other information processing apparatuses, and the control unit controls the light emission pattern of the light emitting unit based on the relationship indicated by the degrees of excitement of the apparatus itself and of at least some of the one or more other information processing apparatuses.
- The information processing apparatus according to claim 7, wherein the external target is another information processing apparatus not belonging to the group that has at least the light emitting unit and the sensor unit, and the control unit controls the light emission pattern of the light emitting unit by determining whether a condition of the influence of the external target, namely that the group to which the apparatus itself belongs has been made a target by the external target based on a detection result of the sensor unit of the external target, and a condition of the relationship are both satisfied.
- The information processing apparatus according to claim 8, wherein the condition of the relationship is that the degree of excitement of the apparatus itself is the largest within the group.
- The information processing apparatus according to claim 7, wherein the external target is an imaging apparatus that photographs the plurality of users of the apparatus itself and of the one or more other information processing apparatuses belonging to the group, and, when the degree of excitement determined by the excitement determination unit is equal to or greater than a threshold, the communication control unit transmits a shooting start request to the imaging apparatus so that the imaging apparatus starts shooting with the user of the apparatus itself as its subject.
- The information processing apparatus according to claim 10, wherein the communication control unit receives information indicating that shooting has been started by the imaging apparatus in response to the shooting start request from the apparatus itself, and the control unit controls the light emission pattern of the light emitting unit based on the information received by the communication control unit.
- The information processing apparatus according to claim 11, wherein the excitement determination unit determines the degree of excitement based on information, generated based on the detection result of the sensor unit, that indicates a degree of association or a degree of difference in the synchrony of motions of the bodies, or parts of the bodies, of a plurality of users.
- The information processing apparatus according to claim 7, wherein the external target comprises an imaging apparatus that photographs a location where the group is present, and an external apparatus that acquires, from shooting results of the imaging apparatus, a shooting result showing the state, during a predetermined time period, of a user of the information processing apparatus or of the one or more other information processing apparatuses belonging to the group, and the communication control unit transmits, to the external apparatus, the time at which the degree of excitement determined by the excitement determination unit became equal to or greater than a threshold, as acquisition information to be used when acquiring a shooting result showing the state of the user of the apparatus itself during a time period including that time.
- The information processing apparatus according to claim 13, wherein the communication control unit also transmits the position of the apparatus itself to the external apparatus as the acquisition information.
- The information processing apparatus according to claim 14, wherein the communication control unit also transmits the ID of the apparatus itself to the external apparatus as the acquisition information.
- The information processing apparatus according to claim 1, wherein the communication control unit receives information capable of specifying the light emission pattern of the light emitting unit of another information processing apparatus belonging to the group, and the control unit controls the light emission pattern of the light emitting unit based on the light emission pattern specified by that information received by the communication control unit.
- The information processing apparatus according to claim 1, wherein the information processing apparatus has a ring-shaped portion to be attached to the user's arm, and the portion is provided with a coupling part that opens or connects a part of the portion.
- The information processing apparatus according to claim 17, wherein the coupling part has a magnet and an attraction plate for connecting the part of the portion.
- The information processing apparatus according to claim 18, wherein the coupling part connects the part of the portion by elastic deformation of its material.
- An information processing method of an information processing apparatus having a light emitting unit that emits light according to a light emission pattern specified by a combination of light emission parameters indicating characteristics of the light emission, the method comprising the step of the information processing apparatus: emitting light according to the light emission pattern specified by the combination of light emission parameters; and controlling the light emission pattern of the light emitting unit based on a relationship between the apparatus itself and a group to which the apparatus itself and one or more other information processing apparatuses having the light emitting unit belong, or a relationship between the apparatus itself and another information processing apparatus belonging to the group.
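Claims 7 through 9 describe each device sharing a measured "degree of excitement" with its group over wireless communication and switching to a distinctive emission pattern when two conditions hold: the group has been targeted by an external device, and the own device's degree is the group maximum. The selection logic can be sketched as follows; the pattern names, numeric scale, and the 0.5 follow-threshold are illustrative assumptions, not taken from the publication:

```python
# Sketch of the pattern-selection logic of claims 7-9: the group member with
# the maximum excitement degree gets a special pattern when the group is
# targeted; otherwise devices follow a shared group pattern or stay idle.
def choose_pattern(own_degree, group_degrees, targeted_by_external):
    """Return a light emission pattern label for the own device.

    own_degree           -- this device's excitement degree
    group_degrees        -- degrees reported by the other group members
    targeted_by_external -- whether the group was targeted (claim 8 condition)
    """
    group_max = max(group_degrees, default=own_degree)
    if targeted_by_external and own_degree >= group_max:
        return "solo-highlight"   # claim 9: own degree is largest in group
    if own_degree >= 0.5 * group_max:
        return "group-sync"       # follow the shared group pattern
    return "idle"

print(choose_pattern(0.9, [0.4, 0.7], True))   # -> solo-highlight
print(choose_pattern(0.3, [0.4, 0.7], True))   # -> idle
```

In the claimed apparatus this decision would run on the control unit, with `group_degrees` arriving via the communication control unit; the thresholds here merely stand in for the "condition of the relationship".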
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013551599A JP5884831B2 (ja) | 2011-12-27 | 2012-12-13 | Information processing apparatus and method |
US14/364,311 US9992851B2 (en) | 2011-12-27 | 2012-12-13 | Information processing apparatus, and method |
CN201280063730.5A CN104169836B (zh) | 2011-12-27 | 2012-12-13 | Information processing device and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011286520 | 2011-12-27 | ||
JP2011-286520 | 2011-12-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013099629A1 true WO2013099629A1 (ja) | 2013-07-04 |
Family
ID=48697121
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/082375 WO2013099629A1 (ja) | 2011-12-27 | 2012-12-13 | Information processing apparatus and method |
Country Status (4)
Country | Link |
---|---|
US (1) | US9992851B2 (ja) |
JP (1) | JP5884831B2 (ja) |
CN (1) | CN104169836B (ja) |
WO (1) | WO2013099629A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3171578A1 (en) * | 2015-11-18 | 2017-05-24 | Siemens Aktiengesellschaft | Method of improved finding someone or something within a larger group of people or things |
EP3171246A1 (en) * | 2015-11-18 | 2017-05-24 | Siemens Aktiengesellschaft | Method of improved finding someone or something within a larger group of people or things |
DE112017001918T5 (de) * | 2016-04-07 | 2018-12-20 | Sony Corporation | System, terminal device, method, and recording medium |
KR101939627B1 (ko) * | 2018-06-25 | 2019-01-17 | 주식회사 팬라이트 | Performance directing method in a venue using cheering light-emitting devices, and performance directing system using the same |
JP7313189B2 (ja) * | 2019-05-20 | 2023-07-24 | Sharp Corporation | Image forming apparatus and control method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002280189A (ja) * | 2002-04-22 | 2002-09-27 | Ingusu:Kk | Light-emitting ornament |
JP2004039415A (ja) * | 2002-07-03 | 2004-02-05 | K-Tech Devices Corp | Parade remote control system |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07244557A (ja) * | 1994-03-04 | 1995-09-19 | Matsushita Electric Ind Co Ltd | Input device |
US7080322B2 (en) * | 1998-12-18 | 2006-07-18 | Tangis Corporation | Thematic response to a computer user's context, such as by a wearable personal computer |
JP2003060745A (ja) * | 2001-08-22 | 2003-02-28 | Sony Corp | Information transmission device, information transmission method, and monitor device |
WO2005084534A1 (en) * | 2003-09-03 | 2005-09-15 | Life Patch International, Inc. | Personal diagnostic devices and related methods |
EP1524586A1 (en) * | 2003-10-17 | 2005-04-20 | Sony International (Europe) GmbH | Transmitting information to a user's body |
JP3954584B2 (ja) * | 2004-03-02 | 2007-08-08 | 日本無線株式会社 | Light emission control system |
JP2005351994A (ja) * | 2004-06-08 | 2005-12-22 | Sony Corp | Content distribution server, content distribution method, and program |
JP2007115412A (ja) * | 2005-10-18 | 2007-05-10 | Totoku Electric Co Ltd | Presentation method, presentation system, and portable light-emitting device |
JP2007221355A (ja) | 2006-02-15 | 2007-08-30 | Fujitsu Ltd | Mobile phone having infrared communication function |
JP5045983B2 (ja) * | 2006-06-30 | 2012-10-10 | Sony Corp | Information processing device, information processing method, and program |
JP4609527B2 (ja) * | 2008-06-03 | 2011-01-12 | 株式会社デンソー | Information providing system for automobiles |
CN101410004A (zh) * | 2008-11-25 | 2009-04-15 | 史晓东 | Novel jewelry with electronic information functions |
JP3168514U (ja) * | 2011-04-01 | 2011-06-16 | 鋒成科技股份有限公司 | Spectroscope that changes in response to external stimuli |
KR101982645B1 (ko) * | 2011-07-22 | 2019-08-28 | 가부시키가이샤 라판 크리에이트 | Light emitting device |
CN103930851B (zh) | 2011-11-15 | 2016-09-21 | Sony Corporation | Information processing device and method |
WO2013084742A1 (ja) * | 2011-12-06 | 2013-06-13 | Sony Corporation | Information processing apparatus and method |
US9205277B2 (en) * | 2012-02-21 | 2015-12-08 | Sharp Laboratories Of America, Inc. | Color adaptive therapeutic light control system |
- 2012
- 2012-12-13 WO PCT/JP2012/082375 patent/WO2013099629A1/ja active Application Filing
- 2012-12-13 US US14/364,311 patent/US9992851B2/en not_active Expired - Fee Related
- 2012-12-13 CN CN201280063730.5A patent/CN104169836B/zh not_active Expired - Fee Related
- 2012-12-13 JP JP2013551599A patent/JP5884831B2/ja not_active Expired - Fee Related
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014135000A (ja) * | 2013-01-11 | 2014-07-24 | Sony Computer Entertainment Inc | Information processing device, information processing method, portable terminal, and server |
US10291727B2 (en) | 2013-01-11 | 2019-05-14 | Sony Interactive Entertainment Inc. | Information processing device, information processing method, portable terminal, and server |
EP2947968A3 (en) * | 2014-05-02 | 2015-12-09 | LG Electronics Inc. | Lighting system and control method thereof |
US9655212B2 (en) | 2014-05-02 | 2017-05-16 | Lg Electronics Inc. | Lighting system having a plurality of lighting devices and an integrated control module |
JP5883960B1 (ja) * | 2015-02-13 | 2016-03-15 | Nippon Telegraph and Telephone Corporation | Event determination device, event determination method, and event determination program |
JP2019004927A (ja) * | 2017-06-20 | 2019-01-17 | Casio Computer Co., Ltd. | Electronic device, rhythm information notification method, and program |
US10511398B2 (en) | 2017-06-20 | 2019-12-17 | Casio Computer Co., Ltd. | Electronic device for improving cooperation among a plurality of members |
JP2020077094A (ja) * | 2018-11-06 | 2020-05-21 | Japan Broadcasting Corporation (NHK) | Tactile information presentation system |
JP7245028B2 (ja) | 2018-11-06 | 2023-03-23 | Japan Broadcasting Corporation (NHK) | Tactile information presentation system |
Also Published As
Publication number | Publication date |
---|---|
US9992851B2 (en) | 2018-06-05 |
JP5884831B2 (ja) | 2016-03-15 |
CN104169836B (zh) | 2017-05-10 |
CN104169836A (zh) | 2014-11-26 |
JPWO2013099629A1 (ja) | 2015-04-30 |
US20140333211A1 (en) | 2014-11-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5884831B2 (ja) | Information processing apparatus and method | |
US10925355B2 (en) | Functional, socially-enabled jewelry and systems for multi-device interaction | |
JP6431233B1 (ja) | Video distribution system for distributing video including messages from viewing users | |
JP7142113B2 (ja) | Virtual pet display method and apparatus, terminal, and program | |
JP5898378B2 (ja) | Information processing device and application execution method | |
CN110147231A (zh) | Combined special-effect generation method, apparatus, and storage medium | |
CN106507207B (zh) | Method and apparatus for interaction in a live-streaming application | |
CN110495819A (zh) | Robot control method, robot, terminal, server, and control system | |
JP2018506397A (ja) | Functional, socially-enabled jewelry and systems for multi-device interaction | |
TW201802663A (zh) | Image display device, topic selection method, and program | |
JP2019198053A (ja) | Video distribution system, video distribution method, and video distribution program for distributing video including animation of a character object generated based on an actor's movements | |
JP6728863B2 (ja) | Information processing system | |
CN110506249A (zh) | Information processing device, information processing method, and recording medium | |
JP5915457B2 (ja) | Control system and program | |
US11223717B2 (en) | Audience interaction system and method | |
US11972059B2 (en) | Gesture-centric user interface | |
CN110860087B (zh) | Virtual object control method, apparatus, and storage medium | |
JP2016051675A (ja) | Presentation control system, communication terminal, and presentation control device | |
US9990049B2 (en) | Information presentation apparatus and information processing system | |
CN111544897B (zh) | Virtual-scene-based video clip display method, apparatus, device, and medium | |
JP2019198060A (ja) | Video distribution system, video distribution method, and video distribution program for distributing video including animation of a character object generated based on an actor's movements | |
KR102661381B1 (ko) | Method and apparatus for controlling motion of a robot capable of mounting an accessory | |
US11216233B2 (en) | Methods and systems for replicating content and graphical user interfaces on external electronic devices | |
JP2020043578A (ja) | Video distribution system, video distribution method, and video distribution program for distributing video including animation of a character object generated based on an actor's movements | |
JP6592214B1 (ja) | Video distribution system for distributing video including messages from viewing users | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12862320; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2013551599; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 14364311; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 12862320; Country of ref document: EP; Kind code of ref document: A1 |