WO2021177321A1 - Information processing system - Google Patents

Information processing system Download PDF

Info

Publication number
WO2021177321A1
WO2021177321A1 (PCT/JP2021/008013)
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
unit
feature information
processing system
Prior art date
Application number
PCT/JP2021/008013
Other languages
French (fr)
Japanese (ja)
Inventor
仁 野々上
Original Assignee
株式会社ヴェルト
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ヴェルト
Publication of WO2021177321A1 publication Critical patent/WO2021177321A1/en

Links

Images

Classifications

    • GPHYSICS
    • G04HOROLOGY
    • G04GELECTRONIC TIME-PIECES
    • G04G9/00Visual time or date indication means
    • GPHYSICS
    • G04HOROLOGY
    • G04GELECTRONIC TIME-PIECES
    • G04G9/00Visual time or date indication means
    • G04G9/08Visual time or date indication means by building-up characters using a combination of indicating elements, e.g. by using multiplexing techniques
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers

Definitions

  • the present invention relates to an information processing system.
  • The present invention has been made in view of such a situation, and an object of the present invention is to enable a wearable terminal to easily express the features of a digital image, or the impression or image that the digital image gives to a viewer.
  • The information processing system of one aspect of the present invention is an information processing system including a first device that displays an image and a second device in which a display unit composed of a plurality of pixels is arranged in an information display unit.
  • The system comprises: an extraction means for extracting feature information indicating the features of the image; and a display control means that, based on the feature information extracted by the extraction means, executes control regarding a pattern of a display form for each unit, where a unit is one or more pixels among the plurality of pixels.
  • According to the present invention, the characteristics of a digital image, or the impression or image given to a viewer of the digital image, can be easily expressed on a wearable terminal.
  • FIG. 1 is a diagram showing an outline of the service to which the information processing system of the present invention is applied.
  • FIG. 2 is a diagram showing an example of the configuration of the information processing system of the present invention.
  • FIG. 3 is a block diagram showing an example of the hardware configuration of an image display terminal in the information processing system shown in FIG. 2.
  • FIG. 4 is a block diagram showing an example of a functional configuration for executing a color pick display process among the functional configurations of an information processing system to which the image display terminal of FIG. 3 is applied.
  • The term "this service" refers to the service to which the information processing system of the present invention (see FIG. 2 described later) is applied.
  • FIG. 1 is a diagram showing an outline of the service to which the information processing system of the present invention is applied.
  • the image display terminal 1 shown in FIG. 1 is a smartphone that is usually carried by a user (not shown).
  • the wristwatch device 2 is a wearable terminal that the user usually wears.
  • the image display terminal 1 displays data P (hereinafter, referred to as “image P”) of captured images of the sea and the sky selected by the user.
  • a ring-shaped display unit 211 is arranged on the outer edge of the information display unit 201 of the wristwatch device 2.
  • the display unit 211 is formed by n display control units L composed of a plurality of pixels (n is an integer value of 1 or more).
  • the "pixels" constituting the display control unit L are not particularly limited.
  • Examples include multicolor light emitting elements; more specific examples include an LED (Light Emitting Diode), an organic EL (Organic Electro-Luminescence) element, a liquid crystal element, and the like.
  • 24 display control units L1 to L24 are arranged.
  • The wristwatch device 2 controls the pattern (emission color, shading, intensity, and the like) of the display form of each of the display control units L1 to L24. By controlling these patterns, the wristwatch device 2 expresses, through the balance of the display control units L1 to L24 as a whole, the features of the image P displayed on the image display terminal 1, or the impression or image that a viewer of the image P may perceive. As a result, the user can intuitively enjoy the atmosphere of the image P without displaying the image P directly on the wristwatch device 2.
  • The characteristics, as a subject, of the entire landscape formed by the combination of the sea and the sky included in the image P, and the impressions and images that a viewer of the image P may perceive, are expressed by the overall balance of the display control units L1 to L24. Specifically, suppose, for example, that the image P in FIG. 1 was captured in the midsummer daytime. In this case, the characteristics of the entire landscape, a combination of the azure sea and pure white clouds, and the refreshing and open image that a viewer may perceive, are expressed by the balance of the display control units L1 to L24 as a whole.
  • For example, each of the display control units L1 to L24 emits light in colors centered on blue or white, thereby expressing the characteristics of the image P or the impression or image it gives to a viewer. Suppose instead that the image P was captured on a midsummer evening. In this case, the characteristics of the entire landscape, a combination of the sunset-colored sea and clouds, and the calm and gentle image that a viewer may perceive, are expressed by the balance of the display control units L1 to L24 as a whole. For example, each of the display control units L1 to L24 emits light in colors centered on orange or brown, thereby expressing the characteristics of the image P or the impression or image it gives to a viewer.
  • The method used by the wristwatch device 2 to express the features and images of the image P with the balance of the display control units L as a whole is not particularly limited, but the following method is adopted in the present embodiment. First, using an image analysis technique such as cluster analysis, information indicating the features of the image P (hereinafter referred to as "feature information") is extracted.
  • the feature information includes, for example, information indicating color hue, lightness, saturation, and the like as information on color.
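  • To make the color feature information concrete, the following minimal sketch derives hue, lightness, and saturation from a single RGB pixel value. This is an illustration only: the patent does not specify a color model or any function names, and the HLS model used here is an assumption.

```python
import colorsys

def color_features(rgb):
    """Derive hue (in degrees), lightness, and saturation feature
    information from an (R, G, B) value in the 0-255 range.
    Illustrative only: the HLS color model is an assumption,
    not taken from the patent text."""
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return {"hue": h * 360.0, "lightness": l, "saturation": s}
```

For a pure blue pixel (0, 0, 255), this yields a hue of 240 degrees, a lightness of 0.5, and a saturation of 1.0.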
  • The feature information is extracted for each of m areas B (m is an integer value of 1 or more) into which the image P is divided (each such area is hereinafter referred to as a "division area B").
  • In the example of FIG. 1, the image P is divided into the division areas B1 to B9, and the feature information of each of the nine division areas B1 to B9 is extracted. Note that the various operation buttons and the like arranged in the areas surrounded by broken lines at the top and bottom of the image P shown in FIG. 1 are excluded from feature information extraction.
  • the pattern of each display form of the n display control units L is controlled based on the feature information extracted for each division area B.
  • the control of the pattern of the display form is performed so as to correspond to each of the m division areas B.
  • the image P is divided into the division areas B1 to B9
  • the feature information of each of the division areas B1 to B9 is extracted.
  • the pattern of each display form of the display control units L1 to L24 is controlled. That is, the feature information of the image P displayed on the image display terminal 1 is reconstructed and displayed on the wristwatch device 2.
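  • As a concrete illustration of the division and per-area extraction described above, the sketch below splits a pixel grid into 3 × 3 division areas and computes a mean color per area as a simple stand-in for feature information. Names and data structures are illustrative; the patent describes cluster analysis or similar, not this exact computation.

```python
def divide_into_areas(pixels, rows=3, cols=3):
    """Split a 2D grid of (R, G, B) pixels into rows x cols division
    areas B1..Bm (row-major order), returning one flat pixel list
    per area."""
    h, w = len(pixels), len(pixels[0])
    areas = []
    for i in range(rows):
        for j in range(cols):
            block = [pixels[y][x]
                     for y in range(i * h // rows, (i + 1) * h // rows)
                     for x in range(j * w // cols, (j + 1) * w // cols)]
            areas.append(block)
    return areas

def mean_color(block):
    """Average color of a division area: one simple stand-in for
    the per-area feature information."""
    n = len(block)
    return tuple(sum(p[c] for p in block) // n for c in range(3))
```

A 6 × 6 image divided this way yields nine division areas of four pixels each, matching the B1 to B9 example.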
  • the method for extracting the feature information for each of the m division regions B is not particularly limited.
  • feature information is extracted in units of a region A (hereinafter, referred to as “extraction region A”) composed of one or more pixels among a plurality of pixels constituting each of the division regions B1 to B9.
  • the feature information of the division area B1 is extracted by extracting the feature information of the extraction area A1 from the division area B1.
  • the feature information of the division area B3 is extracted by extracting the feature information of the extraction area A2 from the division area B3.
  • Within each individual division region B, which region is designated as the extraction region A is not particularly limited.
  • For example, the region in which the most frequently occurring colors are concentrated in each division region B may be designated as the extraction region A, or the extraction region A may be determined randomly.
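  • The "most frequently used colors" rule mentioned above could be sketched, for illustration, as picking the modal color of a division region; the patent does not fix any particular algorithm, so this is only one possible reading.

```python
from collections import Counter

def modal_color(block):
    """Return the most frequently occurring color in a division
    region B, one possible basis for placing the extraction
    region A. `block` is a flat list of (R, G, B) tuples."""
    return Counter(block).most_common(1)[0][0]
```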
  • How each of the n display control units L is associated with each of the m division areas B is not particularly limited.
  • the display control unit L1 may be fixed so as to always correspond to the division area B1.
  • Alternatively, without fixing the correspondence, the display control unit L2 or the display control unit L3 may correspond to the division area B1.
  • the correspondence between each of the m division areas B and the n display control units L does not have to be one-to-one.
  • the display control units L1 and L12 correspond to the division area B1
  • the display control unit L10 corresponds to the division area B3
  • the display control units L3 and L18 correspond to the division area B5.
  • each of the display control units L2, L6, and L7 corresponds to each of the division areas B6, B7, and B9.
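  • The correspondence in the FIG. 1 example, where one division area may drive several display control units, might be represented as a simple lookup table. This is a hypothetical encoding; the patent does not specify a data structure.

```python
# Mapping taken from the FIG. 1 example described in the text:
# a division area may correspond to zero, one, or several units.
AREA_TO_UNITS = {
    "B1": ["L1", "L12"],
    "B3": ["L10"],
    "B5": ["L3", "L18"],
    "B6": ["L2"],
    "B7": ["L6"],
    "B9": ["L7"],
}

def units_for_area(area):
    """Return the display control units driven by a division area
    (empty when, as for B2, B4, and B8, none is drawn in FIG. 1)."""
    return AREA_TO_UNITS.get(area, [])
```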
  • the feature information of the extraction region A1 of the division region B1 is extracted, and the pattern of the display form of the display control units L1 and L12 is controlled based on the feature information.
  • the feature information of the extraction region A2 of the division region B3 is extracted, and the pattern of the display form of the display control unit L10 is controlled based on the feature information.
  • the feature information of the extraction region A3 in the division region B5 is extracted, and the pattern of the display form of the display control units L3 and L18 is controlled based on the feature information.
  • the feature information of the extraction region A4 of the division region B6 is extracted, and the pattern of the display form of the display control unit L2 is controlled based on the feature information.
  • the feature information of the extraction region A5 of the division region B7 is extracted, and the pattern of the display form of the display control unit L6 is controlled based on the feature information. Further, the feature information of the extraction region A6 of the division region B9 is extracted, and the pattern of the display form of the display control unit L7 is controlled based on the feature information.
  • The extraction regions A1 to A6 shown in FIG. 1 are merely examples used to simplify the explanation. The feature information of an extraction region A (not shown) can likewise be extracted for each of the division regions B2, B4, and B8, in which no extraction region A is drawn.
  • Here, "can be extracted" means that the feature information does not necessarily have to be extracted from all of the m division regions B.
  • When using this service, the user installs dedicated application software that enables this service on his or her smartphone. As a result, the user can use the service by making the smartphone function as the image display terminal 1. The user can also use the service by accessing a dedicated website that enables the service via the smartphone's browser.
  • FIG. 2 is a diagram showing an example of the configuration of the information processing system of the present invention.
  • the information processing system shown in FIG. 2 is configured to include an image display terminal 1 and a wristwatch device 2.
  • the image display terminal 1 and the wristwatch device 2 are connected to each other via a predetermined network N.
  • the form of the network N is not particularly limited, and for example, Bluetooth (registered trademark), Wi-Fi, LAN (Local Area Network), the Internet, and the like can be adopted.
  • the image display terminal 1 and the wristwatch device 2 may be connected by a short-range wireless communication technology such as NFC (Near Field Communication) (registered trademark). That is, when the user uses this service, the image display terminal 1 and the wristwatch device 2 may be in the same place or in a distant place from each other.
  • the image display terminal 1 is an information processing device operated by a user.
  • the image display terminal 1 is composed of, for example, a personal computer, a smartphone, a tablet, or the like. That is, the image display terminal 1 does not have to be a dedicated machine for using this service, and any device capable of displaying the image P can be adopted.
  • The wristwatch device 2 is a wristwatch-type information processing device operated by a user. Although it is a wristwatch type in this embodiment, it is not limited to this and may be another wearable terminal. Furthermore, any device on which the display control units L can be provided may be used; it may be, for example, an IoT (Internet of Things) device.
  • FIG. 3 is a block diagram showing an example of the hardware configuration of the image display terminal among the information processing systems shown in FIG.
  • The image display terminal 1 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a bus 14, an input / output interface 15, an input unit 16, an output unit 17, a storage unit 18, a communication unit 19, and a drive 20.
  • the CPU 11 executes various processes according to the program recorded in the ROM 12 or the program loaded from the storage unit 18 into the RAM 13. Data and the like necessary for the CPU 11 to execute various processes are also appropriately stored in the RAM 13.
  • the CPU 11, ROM 12 and RAM 13 are connected to each other via the bus 14.
  • An input / output interface 15 is also connected to the bus 14.
  • An input unit 16, an output unit 17, a storage unit 18, a communication unit 19, and a drive 20 are connected to the input / output interface 15.
  • the input unit 16 is composed of, for example, a keyboard or the like, and inputs various information.
  • the output unit 17 is composed of a display such as a liquid crystal display, a speaker, or the like, and outputs various information as images or sounds.
  • the storage unit 18 is composed of a DRAM (Dynamic Random Access Memory) or the like, and stores various data.
  • the communication unit 19 communicates with another device (for example, the wristwatch device 2 in FIG. 2) via the network N including the Internet.
  • A removable medium 40, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 20 as appropriate.
  • The program read from the removable medium 40 by the drive 20 is installed in the storage unit 18 as needed. The removable medium 40 can also store the various data stored in the storage unit 18, in the same manner as the storage unit 18.
  • the wristwatch device 2 of FIG. 2 can have basically the same configuration as the hardware configuration shown in FIG. Therefore, the description of the hardware configuration of the wristwatch device 2 will be omitted.
  • color pick display process refers to a process executed by the information processing system of FIG. 2 in order to provide this service.
  • FIG. 4 a functional configuration for executing the color pick display process executed in the image display terminal 1 according to the present embodiment will be described.
  • FIG. 4 is a block diagram showing an example of the functional configuration for executing the color pick display processing among the functional configurations of the information processing system to which the image display terminal of FIG. 3 is applied.
  • an image DB 181 is provided in one area of the storage unit 18 of the image display terminal 1.
  • One or more images P are stored and managed in the image DB 181.
  • The extraction unit 101 extracts feature information indicating the features of the image P displayed on the image display terminal 1. Specifically, the extraction unit 101 divides the image P into m division regions B and performs image analysis such as cluster analysis to acquire the feature information of each of the m division regions B. For example, in the example of FIG. 1, the extraction unit 101 acquires the feature information of each of the division regions B1 to B9.
  • The light emission control unit 102 executes control regarding the pattern of the display form of each of the n display control units L, each unit consisting of one or more pixels among the plurality of pixels, based on the feature information of the image P extracted by the extraction unit 101. Specifically, based on the extracted feature information of the image P, the light emission control unit 102 controls the pattern of the display form of each of the display control units L1 to Ln constituting the display unit 211 of the information display unit 201 of the wristwatch device 2. For example, in the example of FIG. 1, the light emission control unit 102 controls the pattern of the display form of each of the display control units L1 to L24 of the wristwatch device 2 based on the feature information of each of the division areas B1 to B9 of the image P.
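  • Combining the two functional blocks, a minimal end-to-end sketch of the color pick pipeline might look as follows. All names are illustrative, and the mean color again stands in for whatever feature information the extraction unit 101 actually computes.

```python
def color_pick(pixels, area_to_units, rows=3, cols=3):
    """Extraction step: compute a per-area mean color for each of
    rows x cols division areas B1..Bm (row-major order).
    Control step: assign that color to every display control unit
    mapped to the area.  Returns {unit_name: (R, G, B)}."""
    commands = {}
    h, w = len(pixels), len(pixels[0])
    for i in range(rows):
        for j in range(cols):
            block = [pixels[y][x]
                     for y in range(i * h // rows, (i + 1) * h // rows)
                     for x in range(j * w // cols, (j + 1) * w // cols)]
            color = tuple(sum(p[c] for p in block) // len(block)
                          for c in range(3))
            area = "B%d" % (i * cols + j + 1)
            for unit in area_to_units.get(area, []):
                commands[unit] = color
    return commands
```

With an all-blue image and a mapping that sends B1 to L1 and B5 to L3 and L18, those three units would all receive the blue mean color.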
  • the wristwatch device 2 includes an information display unit 201 and a communication unit 202.
  • In the information display unit 201, the display unit 211 functions.
  • the display unit 211 is composed of n display control units L.
  • the communication unit 202 communicates with another device (for example, the image display terminal 1 in FIG. 2) via the network N including the Internet.
  • The above description assumes that the image display terminal 1 and the wristwatch device 2 are managed by the same user, but the present invention is not limited to this. That is, the administrator of the image display terminal 1 and the administrator of the wristwatch device 2 may be different. In this case, for example, the user who manages the image display terminal 1 may display a favorite image P of another user who manages the wristwatch device 2 on the image display terminal 1 and use the above-mentioned service.
  • the image P displayed on the image display terminal 1 is divided into nine division areas, but this is only an example.
  • the image P displayed on the image display terminal 1 can be divided into m division areas B.
  • In the present embodiment, the display unit 211 is formed by the 24 display control units L1 to L24, but this is only an example. As described above, the display unit 211 can be divided into n display control units L; that is, the number of display control units L may be less than 24 (for example, 12) or more than 24 (for example, 36).
  • the image P is stored and managed in the image DB 181 provided in one area of the storage unit 18 of the image display terminal 1, but this is only an example.
  • For example, the image P may be temporarily downloaded from a predetermined cloud without being stored in the image DB 181, or a link to the image P stored in the cloud may be used.
  • The system configuration shown in FIG. 2 and the hardware configuration of the image display terminal 1 shown in FIG. 3 are merely examples for achieving the object of the present invention and are not particularly limited.
  • Similarly, the functional block diagram shown in FIG. 4 is merely an example and is not particularly limited. That is, it suffices that the information processing system as a whole has a function capable of executing the above-mentioned series of processes; what kinds of functional blocks are used to realize this function is not limited to the example of FIG. 4.
  • In the above description, the image DB 181 is provided in one area of the storage unit 18, but when a method of temporarily downloading an image P stored in a predetermined cloud, or of linking to it, is adopted, the image DB 181 need not be provided.
  • the location of the functional block is not limited to FIG. 4, and may be arbitrary.
  • The above-mentioned color pick display processing is configured to be performed on the image display terminal 1 side, but the processing is not limited to this; at least a part of the color pick display process may be performed on the wristwatch device 2 side, or on a server (not shown) separately managed by the provider of this service.
  • That is, the functional blocks required for executing the color pick display process are configured to be provided in the image display terminal 1, but this is only an example; at least a part of the functional blocks arranged in the image display terminal 1 may be provided by the wristwatch device 2 or by a server (not shown) separately managed by the provider of this service.
  • one functional block may be configured by a single piece of hardware, a single piece of software, or a combination thereof.
  • The computer may be a computer embedded in dedicated hardware, or a computer capable of executing various functions by installing various programs, such as a general-purpose smartphone or a personal computer, in addition to a server.
  • The recording medium containing such a program is composed not only of a removable medium (not shown) distributed separately from the device main body in order to provide the program to the user, but also of a recording medium provided to the user in a state of being incorporated in the device main body in advance.
  • In this specification, the steps describing the program recorded on a recording medium include not only processing performed in chronological order, but also processing that is not necessarily performed in chronological order and is instead executed in parallel or individually.
  • In this specification, the term "system" means an overall device composed of a plurality of devices, a plurality of means, and the like.
  • Summarizing the above, it suffices that the information processing system to which the present invention is applied has the following configuration, and various embodiments can be taken. That is, the information processing system to which the present invention is applied is an information processing system including a first device (for example, the image display terminal 1 of FIG. 4) for displaying an image (for example, the image P of FIG. 1), and a second device (for example, the wristwatch device 2 of FIG. 4) in which a display unit (for example, the display unit 211 of FIG. 4) composed of a plurality of pixels is arranged in an information display unit (for example, the information display unit 201 of FIG. 4).
  • The system comprises: an extraction means (for example, the extraction unit 101 of FIG. 4) for extracting feature information indicating the features of the image; and a display control means (for example, the light emission control unit 102 of FIG. 4) that executes control regarding the pattern of the display form for each unit, where a unit (for example, the display control unit L of FIG. 1) is one or more pixels among the plurality of pixels.
  • With this configuration, the feature information of the image displayed on the first device is extracted, and based on that feature information, control regarding the pattern of the display form is executed for each unit of one or more pixels among the plurality of pixels constituting the display unit of the second device.
  • As a result, the features of the image displayed on the first device, or the impression or image that a viewer may perceive from it, are expressed on the display unit of the second device.
  • the user who manages the second device can intuitively enjoy the atmosphere of the image displayed on the first device.
  • The display control means can execute, based on color features as the feature information, control regarding the emission color pattern as the pattern of the display form for each unit.
  • With this configuration, control regarding the emission color pattern is executed for each unit of one or more pixels among the plurality of pixels constituting the display unit of the second device.
  • the color characteristics of the image displayed on the first device are expressed by the display unit of the second device.
  • the user who manages the second device can intuitively enjoy the atmosphere of the image displayed on the first device through the emission color displayed on the second device.

Abstract

The present invention addresses the problem of making it possible for features of a digital image, or for the impression or the image imparted to a person viewing a digital image, to easily be expressed on a wearable terminal. This information processing system includes an image display terminal (1), and a wrist watch device (2) in which disposed is a display unit (211) comprising n display control units L in an information display unit (201), wherein an extraction unit (101) extracts feature information indicating features of an image P displayed on the image display terminal (1). A light emission control unit (102) executes control relating to a pattern in a display mode for each display control unit L, on the basis of the feature information extracted by the extraction unit (101). Due to this configuration, the above problem is solved.

Description

Information processing system
The present invention relates to an information processing system.
Conventionally, there are techniques for displaying character information or the like on a part of a wearable terminal worn by a user (see, for example, Patent Document 1).
Japanese Patent No. 5486087
However, some users want to express the characteristics of a digital image, or the impression or image it gives to viewers, on a wearable terminal.
The present invention has been made in view of such a situation, and an object of the present invention is to enable a wearable terminal to easily express the features of a digital image, or the impression or image that the digital image gives to a viewer.
In order to achieve the above object, the information processing system of one aspect of the present invention is
an information processing system including a first device that displays an image and a second device in which a display unit composed of a plurality of pixels is arranged in an information display unit, the system comprising:
an extraction means for extracting feature information indicating the features of the image; and
a display control means that, based on the feature information extracted by the extraction means, executes control regarding a pattern of a display form for each unit, where a unit is one or more pixels among the plurality of pixels.
According to the present invention, the characteristics of a digital image, or the impression or image given to a viewer of the digital image, can be easily expressed on a wearable terminal.
FIG. 1 is a diagram showing an outline of the service to which the information processing system of the present invention is applied.
FIG. 2 is a diagram showing an example of the configuration of the information processing system of the present invention.
FIG. 3 is a block diagram showing an example of the hardware configuration of an image display terminal in the information processing system shown in FIG. 2.
FIG. 4 is a block diagram showing an example of a functional configuration for executing a color pick display process among the functional configurations of an information processing system to which the image display terminal of FIG. 3 is applied.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
First, with reference to FIG. 1, an outline of the service (hereinafter referred to as "this service") to which the information processing system of the present invention (see FIG. 2 described later) is applied will be described.
FIG. 1 is a diagram showing an outline of the service to which the information processing system of the present invention is applied.
The image display terminal 1 shown in FIG. 1 is a smartphone that is usually carried by a user (not shown). The wristwatch device 2 is a wearable terminal that the user usually wears.
The image display terminal 1 displays data P of captured images of the sea and the sky selected by the user (hereinafter referred to as "image P").
A ring-shaped display unit 211 is arranged on the outer edge of the information display unit 201 of the wristwatch device 2. The display unit 211 is formed by n display control units L composed of a plurality of pixels (n is an integer value of 1 or more). The "pixels" constituting the display control units L are not particularly limited; examples include multicolor light emitting elements, and more specifically an LED (Light Emitting Diode), an organic EL (Organic Electro-Luminescence) element, a liquid crystal element, and the like.
In the example of FIG. 1, 24 display control units L1 to L24 are arranged. The wristwatch device 2 controls the pattern (emission color, shading, intensity, and the like) of the display form of each of the display control units L1 to L24. By controlling these patterns, the wristwatch device 2 expresses, through the balance of the display control units L1 to L24 as a whole, the features of the image P displayed on the image display terminal 1, or the impression or image that a viewer of the image P may perceive. As a result, the user can intuitively enjoy the atmosphere of the image P without displaying the image P directly on the wristwatch device 2.
In the example of FIG. 1, the features, as a subject, of the entire landscape consisting of the combination of the sea and the sky included in the image P, and the impressions and images that a viewer of the image P may perceive, are expressed by the overall balance of the display control units L1 to L24.
Specifically, suppose for example that the image P of FIG. 1 was captured in the daytime in midsummer. In this case, the features, as a subject, of the entire landscape consisting of the combination of the azure sea and pure white clouds, and the refreshing, open impression a viewer may perceive, are expressed by the overall balance of the display control units L1 to L24. For example, each of the display control units L1 to L24 emits light in colors centered on blue and white, thereby expressing the features of the image P, or the impression or image it gives to a viewer.
Suppose instead that the image P was captured in the evening in midsummer. In this case, the features, as a subject, of the entire landscape consisting of the combination of the sunset-colored sea and clouds, and the calm, gentle impression a viewer may perceive, are expressed by the overall balance of the display control units L1 to L24. For example, each of the display control units L1 to L24 emits light in colors centered on orange and brown, thereby expressing the features of the image P, or the impression or image it gives to a viewer.
Here, the method used by the wristwatch device 2 to express the features and images of the image P with the balance of the entire display control unit L is not particularly limited, but the following method is adopted in the present embodiment.
That is, first, information indicating the features of the image P (hereinafter referred to as "feature information") is extracted by using an image analysis technique such as cluster analysis. The feature information includes, as information on color, information indicating, for example, hue, lightness, and saturation. The feature information is extracted by dividing the image P into m areas B (m is an integer value of 1 or more) and performing the extraction for each divided area B (hereinafter referred to as "division area B"). In the example of FIG. 1, the image P is divided into division areas B1 to B9, and the feature information of each of the nine division areas B1 to B9 is extracted.
It should be noted that various operation buttons and the like arranged in the areas surrounded by broken lines in the upper part and the lower part of the image P shown in FIG. 1 are excluded from the target of extracting the feature information.
Next, the pattern of each display form of the n display control units L is controlled based on the feature information extracted for each division area B. The control of the pattern of the display form is performed so as to correspond to each of the m division areas B. For example, as shown in FIG. 1, when the image P is divided into the division areas B1 to B9, the feature information of each of the division areas B1 to B9 is extracted. Then, based on the extracted feature information, the pattern of each display form of the display control units L1 to L24 is controlled. That is, the feature information of the image P displayed on the image display terminal 1 is reconstructed and displayed on the wristwatch device 2.
Here, the method for extracting the feature information for each of the m division regions B is not particularly limited. In the present embodiment, the feature information is extracted in units of a region A composed of one or more pixels (hereinafter referred to as "extraction region A") among the plurality of pixels constituting each of the division regions B1 to B9. For example, the feature information of the division region B1 is extracted by extracting the feature information of the extraction region A1 within the division region B1. Similarly, the feature information of the division region B3 is extracted by extracting the feature information of the extraction region A2 within the division region B3. Which part of each division region B is designated as the extraction region A is not particularly limited; for example, the region in which the color used most frequently in each division region B is concentrated may be designated as the extraction region A, or the extraction region A may be determined at random.
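As a concrete illustration of the per-region extraction described above, the following is a minimal Python sketch, not the implementation disclosed here: it divides a small pixel grid into nine division regions and takes each region's most frequent color as its feature information, a crude stand-in for cluster analysis over an extraction region A. All function names, and the use of a plain list of (R, G, B) tuples instead of an image library, are illustrative assumptions.

```python
from collections import Counter

def split_into_regions(pixels, rows=3, cols=3):
    """Divide a pixel grid into rows*cols rectangular division regions B."""
    h, w = len(pixels), len(pixels[0])
    regions = []
    for r in range(rows):
        for c in range(cols):
            block = [row[c * w // cols:(c + 1) * w // cols]
                     for row in pixels[r * h // rows:(r + 1) * h // rows]]
            regions.append(block)
    return regions

def dominant_color(region):
    """Feature information of one region: its most frequent color
    (a simple proxy for clustering over an extraction region A)."""
    counts = Counter(px for row in region for px in row)
    return counts.most_common(1)[0][0]

# 6x6 toy image P: top half sky-blue, bottom half dark sea-blue.
sky, sea = (135, 206, 235), (0, 60, 120)
image = [[sky] * 6 for _ in range(3)] + [[sea] * 6 for _ in range(3)]
features = [dominant_color(b) for b in split_into_regions(image)]
# features[0] is the top-left region's color (sky); features[8] the bottom-right (sea)
```

In a real implementation the extraction region A could be selected inside each block before counting, or k-means clustering could replace the frequency count; the per-region structure stays the same.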
Further, which of the n display control units L is associated with each of the m division areas B is not particularly limited. For example, in the example of FIG. 1, the display control unit L1 may be fixed so as to always correspond to the division area B1. Further, for example, the display control unit L2 may correspond to the division area B1 or the display control unit L3 may correspond to the division area B1 without fixing the correspondence.
Further, the correspondence between each of the m division areas B and the n display control units L does not have to be one-to-one. For example, in the example of FIG. 1, the display control units L1 and L12 correspond to the division area B1, the display control unit L10 corresponds to the division area B3, and the display control units L3 and L18 correspond to the division area B5. Further, each of the display control units L2, L6, and L7 corresponds to each of the division areas B6, B7, and B9.
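The correspondence described above, where one division region may drive several display control units and some regions may drive none, can be sketched as a simple mapping. The concrete pairs below mirror the FIG. 1 example; the dictionary layout and names are assumptions for illustration, not the disclosed data structure.

```python
# Region B -> display control units L, mirroring the FIG. 1 example.
region_to_units = {
    "B1": ["L1", "L12"],   # one region may drive several units
    "B3": ["L10"],
    "B5": ["L3", "L18"],
    "B6": ["L2"],
    "B7": ["L6"],
    "B9": ["L7"],          # regions B2, B4, B8 drive no unit in this example
}

def unit_colors(region_features, mapping):
    """Propagate each region's feature color to all of its display units."""
    out = {}
    for region, units in mapping.items():
        for unit in units:
            out[unit] = region_features[region]
    return out

features = {"B1": "azure", "B3": "white", "B5": "azure",
            "B6": "navy", "B7": "navy", "B9": "navy"}
pattern = unit_colors(features, region_to_units)
# pattern maps each of the 8 driven units (L1, L12, L10, L3, L18, L2, L6, L7) to a color
```

Making the mapping a plain dictionary also accommodates the variable correspondence the text mentions: the pairs can be reassigned at any time rather than fixed.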
Therefore, in the example of FIG. 1, the feature information of the extraction region A1 in the division region B1 is extracted, and the display form patterns of the display control units L1 and L12 are controlled based on that feature information. Similarly, the feature information of the extraction region A2 in the division region B3 is extracted, and the display form pattern of the display control unit L10 is controlled based on that feature information. The feature information of the extraction region A3 in the division region B5 is extracted, and the display form patterns of the display control units L3 and L18 are controlled based on it; the feature information of the extraction region A4 in the division region B6 controls the display form pattern of the display control unit L2; the feature information of the extraction region A5 in the division region B7 controls the display form pattern of the display control unit L6; and the feature information of the extraction region A6 in the division region B9 controls the display form pattern of the display control unit L7.
Here, the extraction regions A1 to A6 shown in FIG. 1 are merely examples for simplifying the explanation. Therefore, feature information of extraction regions A (not shown) can also be extracted for each of the division regions B2, B4, and B8, in which no extraction region A is drawn. The phrase "can be extracted" means that feature information does not necessarily have to be extracted from all of the m division regions B.
When using this service, the user installs dedicated application software that enables the service on his or her smartphone. As a result, the user can use the service by making the smartphone function as the image display terminal 1. The user can also use the service by using the browser function of the smartphone to access a dedicated website that makes the service available.
Next, with reference to FIG. 2, the configuration of the information processing system that realizes the provision of the above-mentioned service will be described.
FIG. 2 is a diagram showing an example of the configuration of the information processing system of the present invention.
The information processing system shown in FIG. 2 is configured to include an image display terminal 1 and a wristwatch device 2.
The image display terminal 1 and the wristwatch device 2 are connected to each other via a predetermined network N. The form of the network N is not particularly limited, and for example, Bluetooth (registered trademark), Wi-Fi, LAN (Local Area Network), the Internet, and the like can be adopted. Further, the image display terminal 1 and the wristwatch device 2 may be connected by a short-range wireless communication technology such as NFC (Near Field Communication) (registered trademark).
That is, when the user uses this service, the image display terminal 1 and the wristwatch device 2 may be in the same place or in a distant place from each other.
The image display terminal 1 is an information processing device operated by the user, and is composed of, for example, a personal computer, a smartphone, or a tablet. That is, the image display terminal 1 does not have to be a machine dedicated to this service, and any device capable of displaying the image P can be adopted.
The wristwatch device 2 is a wristwatch-type information processing device operated by the user. Although a wristwatch type is used in the present embodiment, the device is not particularly limited and may be another wearable terminal. Furthermore, any apparatus that can be provided with the display control units L may be used, for example, an IoT (Internet of Things) device.
FIG. 3 is a block diagram showing an example of the hardware configuration of the image display terminal in the information processing system shown in FIG. 2.
The image display terminal 1 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a bus 14, an input/output interface 15, an input unit 16, an output unit 17, a storage unit 18, a communication unit 19, and a drive 20.
The CPU 11 executes various processes according to the program recorded in the ROM 12 or the program loaded from the storage unit 18 into the RAM 13.
Data and the like necessary for the CPU 11 to execute various processes are also appropriately stored in the RAM 13.
The CPU 11, the ROM 12, and the RAM 13 are connected to each other via the bus 14. The input/output interface 15 is also connected to the bus 14, and the input unit 16, the output unit 17, the storage unit 18, the communication unit 19, and the drive 20 are connected to the input/output interface 15.
The input unit 16 is composed of, for example, a keyboard or the like, and inputs various information.
The output unit 17 is composed of a display such as a liquid crystal display, a speaker, or the like, and outputs various information as images or sounds.
The storage unit 18 is composed of a DRAM (Dynamic Random Access Memory) or the like, and stores various data.
The communication unit 19 communicates with another device (for example, the wristwatch device 2 in FIG. 2) via the network N including the Internet.
A removable media 40 made of a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is appropriately mounted on the drive 20. The program read from the removable media 40 by the drive 20 is installed in the storage unit 18 as needed.
Further, the removable media 40 can also store various data stored in the storage unit 18 in the same manner as the storage unit 18.
Although not shown, the wristwatch device 2 of FIG. 2 can have basically the same configuration as the hardware configuration shown in FIG. 3. Therefore, a description of the hardware configuration of the wristwatch device 2 is omitted.
Through the cooperation of the various hardware and various software of the information processing system of FIG. 2, including the image display terminal 1 of FIG. 3, various kinds of processing including the color pick display processing can be executed in the information processing system. As a result, the user can use this service.
The "color pick display process" refers to a process executed by the information processing system of FIG. 2 in order to provide this service.
Hereinafter, with reference to FIG. 4, a functional configuration for executing the color pick display process executed in the image display terminal 1 according to the present embodiment will be described.
FIG. 4 is a block diagram showing an example of a functional configuration for executing the color pick display processing, among the functional configurations of the information processing system to which the image display terminal of FIG. 3 is applied.
As shown in FIG. 4, in the CPU 11 of the image display terminal 1, when the execution of the color pick display process is controlled, the extraction unit 101 and the light emission control unit 102 function.
Further, an image DB 181 is provided in one area of the storage unit 18 of the image display terminal 1. One or more images P are stored and managed in the image DB 181.
The extraction unit 101 extracts feature information indicating the features of the image P displayed on the image display terminal 1. Specifically, the extraction unit 101 divides the image P into m division areas B and performs image analysis such as cluster analysis, thereby acquiring the feature information of each of the m division areas B.
For example, in the example of FIG. 1, the extraction unit 101 acquires the feature information of each of the division regions B1 to B9.
The light emission control unit 102 executes, based on the feature information of the image P extracted by the extraction unit 101, control regarding the display form pattern of each of the n display control units L, each of which is a unit of one or more of the plurality of pixels.
Specifically, the light emission control unit 102 executes, based on the extracted feature information of the image P, control regarding the display form pattern of each of the display control units L1 to Ln constituting the display unit 211 of the information display unit 201 of the wristwatch device 2.
For example, in the example of FIG. 1, the light emission control unit 102 executes control regarding the display form pattern of each of the display control units L1 to L24 of the wristwatch device 2, based on the feature information of each of the division areas B1 to B9 of the image P.
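The control step can be illustrated by converting an extracted color feature into an emission-color pattern for a display control unit. The sketch below uses the standard `colorsys` module to derive an intensity from the feature's lightness, one of the color attributes the text names; the specific policy of driving emission strength from lightness is an assumption for illustration, not the disclosed control method.

```python
import colorsys

def emission_pattern(rgb):
    """Derive a (color, intensity) emission pattern for a display control
    unit L from an RGB feature of its division area B."""
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)  # hue, lightness, saturation
    intensity = round(l, 2)                 # assumed: lightness drives strength
    return {"rgb": rgb, "intensity": intensity}

daytime = emission_pattern((135, 206, 235))  # azure sky feature -> brighter
evening = emission_pattern((200, 90, 30))    # sunset orange feature -> dimmer
```

Under this assumed policy, a midsummer-daytime feature yields a brighter emission than a sunset feature, matching the qualitative behavior described for FIG. 1.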
The wristwatch device 2 includes an information display unit 201 and a communication unit 202.
In the information display unit 201, the display unit 211 functions. The display unit 211 is composed of n display control units L.
The communication unit 202 communicates with another device (for example, the image display terminal 1 in FIG. 2) via the network N including the Internet.
Although one embodiment of the present invention has been described above, the present invention is not limited to the above-described embodiment, and modifications, improvements, and the like within a range in which the object of the present invention can be achieved are included in the present invention.
For example, in the above-described embodiment, the description assumes that the image display terminal 1 and the wristwatch device 2 are managed by the same user, but the present invention is not limited to this. That is, the administrator of the image display terminal 1 and the administrator of the wristwatch device 2 may differ. In this case, for example, the user who manages the image display terminal 1 may display, on the image display terminal 1, an image P preferred by another user who manages the wristwatch device 2, and use the above-described service.
Further, in the example of FIG. 1, the image P displayed on the image display terminal 1 is divided into nine division areas, but this is merely an example. As described above, the image P displayed on the image display terminal 1 can be divided into m division areas B.
Likewise, in the example of FIG. 1, the display unit 211 is formed by 24 display control units L1 to L24, but this too is merely an example. As described above, the display unit 211 can be divided into n display control units L; that is, the number of display control units L may be less than 24 (for example, 12) or more than 24 (for example, 36).
Further, in the example of FIG. 3, the image P is stored and managed in the image DB 181 provided in one area of the storage unit 18 of the image display terminal 1, but this is also merely an example. For example, instead of being stored in the image DB 181, an image P stored in a predetermined cloud may be temporarily downloaded, or a link to the image P stored in the predetermined cloud may be used.
Further, the system configuration shown in FIG. 2 and the hardware configuration of the image display terminal 1 shown in FIG. 3 are merely examples for achieving the object of the present invention and are not particularly limited.
Likewise, the functional block diagram shown in FIG. 4 is merely an example and is not particularly limited. That is, it suffices if the information processing system as a whole is provided with a function capable of executing the above-described series of processes; which functional blocks are used to realize this function is not limited to the example of FIG. 4. For example, in FIG. 4 the image DB 181 is provided in one area of the storage unit 18, but when a method of temporarily downloading, or linking to, an image P stored in a predetermined cloud is adopted, the image DB 181 need not be provided.
Further, the location of the functional block is not limited to FIG. 4, and may be arbitrary.
For example, in the example of FIG. 4, the above-described color pick display processing is performed on the image display terminal 1 side, but the processing is not limited to this; at least a part of the color pick display processing may be performed on the wristwatch device 2 side, or on a server (not shown) separately managed by the provider of this service.
That is, the functional block required for executing the color pick display process is configured to be provided in the image display terminal 1, but this is only an example. At least a part of the functional blocks arranged in the image display terminal 1 may be provided by the wristwatch device 2 or a server (not shown) separately managed by a person who provides this service.
Further, the series of processes described above can be executed by hardware or software.
Further, one functional block may be configured by a single piece of hardware, a single piece of software, or a combination thereof.
When a series of processes are executed by software, the programs constituting the software are installed on a computer or the like from a network or a recording medium.
The computer may be a computer embedded in dedicated hardware.
Further, the computer may be a computer capable of executing various functions by installing various programs, for example, a general-purpose smartphone or a personal computer in addition to a server.
A recording medium containing such a program is composed not only of removable media (not shown) distributed separately from the device main body in order to provide the program to the user, but also of a recording medium or the like provided to the user in a state of being incorporated in the device main body in advance.
In the present specification, the steps describing the program recorded on the recording medium include not only processing performed chronologically in the described order but also processing executed in parallel or individually, not necessarily chronologically.
Further, in the present specification, the term of the system means an overall device composed of a plurality of devices, a plurality of means, and the like.
Summarizing the above, the information processing system to which the present invention is applied suffices to have the following configuration, and various various embodiments can be taken.
That is, the information processing system to which the present invention is applied is
an information processing system including a first device (for example, the image display terminal 1 of FIG. 4) that displays an image (for example, the image P of FIG. 1) and a second device (for example, the wristwatch device 2 of FIG. 4) in which a display unit (for example, the display unit 211 of FIG. 4) composed of a plurality of pixels is arranged in an information display unit (for example, the information display unit 201 of FIG. 4), the system comprising:
an extraction means (for example, the extraction unit 101 of FIG. 4) for extracting feature information indicating the features of the image; and
a display control means (for example, the light emission control unit 102 of FIG. 4) that, based on the feature information extracted by the extraction means, executes control regarding the pattern of the display form for each unit (for example, the display control unit L of FIG. 1) of one or more of the plurality of pixels.
As a result, the feature information of the image displayed on the first device is extracted, and based on the feature information, the display form is set for each unit of one or more pixels among the plurality of pixels constituting the display unit of the second device. Control over the pattern is performed.
As a result, the features of the image displayed on the first device, or the impression or image that a viewer of the image may perceive, are expressed on the display unit of the second device. Thus, the user who manages the second device can intuitively enjoy the atmosphere of the image displayed on the first device.
Further, the display control means is
Based on the color feature as the feature information, control regarding the emission color pattern as the display form pattern for each unit can be executed.
As a result, control regarding the emission color pattern is executed for each unit of one or more pixels among the plurality of pixels constituting the display unit of the second device.
As a result, the color characteristics of the image displayed on the first device are expressed by the display unit of the second device. As a result, the user who manages the second device can intuitively enjoy the atmosphere of the image displayed on the first device through the emission color displayed on the second device.
1...image display terminal, 2...wristwatch device, 11...CPU, 12...ROM, 13...RAM, 14...bus, 15...input/output interface, 16...input unit, 17...output unit, 18...storage unit, 19...communication unit, 20...drive, 40...removable media, 101...extraction unit, 102...light emission control unit, 181...image DB, 201...information display unit, 202...communication unit, 211...display unit, B...division area, L...display control unit, A...extraction area, U...user, N...network

Claims (2)

  1.  An information processing system including a first device that displays an image and a second device in which a display unit composed of a plurality of pixels is arranged in an information display unit, the information processing system comprising:
      an extraction means for extracting feature information indicating features of the image; and
      a display control means for executing, based on the feature information extracted by the extraction means, control regarding a pattern of a display form for each unit of one or more pixels among the plurality of pixels.
  2.  The information processing system according to claim 1, wherein
     the display control means executes, based on a color feature serving as the feature information, control of an emission color pattern as the pattern of the display form for each of the units.
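As a complement to claim 2, the "color feature serving as the feature information" can be as simple as the image's dominant color. The following sketch derives it by coarse RGB quantization; this is an illustrative assumption, not a method recited in the claims, and all names are hypothetical.

```python
# Hypothetical dominant-color extraction for the color feature of claim 2:
# quantize each channel to multiples of `step`, then count occurrences.
from collections import Counter

def dominant_color(pixels, step=64):
    """Return the most frequent quantized RGB color among the pixels."""
    quantized = [tuple((c // step) * step for c in p) for p in pixels]
    color, _count = Counter(quantized).most_common(1)[0]
    return color

pixels = [(250, 10, 5), (240, 20, 15), (10, 10, 200)]
print(dominant_color(pixels))  # both red-ish pixels quantize alike -> (192, 0, 0)
```

The resulting color would then drive the emission color pattern of every display-control unit, or serve as a base hue that each unit varies.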
PCT/JP2021/008013 2020-03-03 2021-03-02 Information processing system WO2021177321A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-035891 2020-03-03
JP2020035891A JP2023056055A (en) 2020-03-03 2020-03-03 Information processing system

Publications (1)

Publication Number Publication Date
WO2021177321A1 (en)

Family

ID=77612693

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/008013 WO2021177321A1 (en) 2020-03-03 2021-03-02 Information processing system

Country Status (2)

Country Link
JP (1) JP2023056055A (en)
WO (1) WO2021177321A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100066765A1 (en) * 2006-04-11 2010-03-18 Soo Man Lee Display System and Power Control Method Thereof
JP2010102097A (en) * 2008-10-23 2010-05-06 Sharp Corp Mobile communication device, display control method, and display control program
JP2015152401A (en) * 2014-02-13 2015-08-24 株式会社ヴェルト Portable watch


Also Published As

Publication number Publication date
JP2023056055A (en) 2023-04-19


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 21765045; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 21765045; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP