WO2018220694A1 - Dispositif de traitement vidéo, procédé de traitement vidéo et programme - Google Patents

Dispositif de traitement vidéo, procédé de traitement vidéo et programme Download PDF

Info

Publication number
WO2018220694A1
WO2018220694A1 (PCT/JP2017/019991)
Authority
WO
WIPO (PCT)
Prior art keywords
information
soma
unit
color
image
Prior art date
Application number
PCT/JP2017/019991
Other languages
English (en)
Japanese (ja)
Inventor
裕子 石若
Original Assignee
SoftBank Corp. (ソフトバンク株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SoftBank Corp. (ソフトバンク株式会社)
Priority to PCT/JP2017/019991 priority Critical patent/WO2018220694A1/fr
Priority to JP2019521556A priority patent/JP6924829B2/ja
Publication of WO2018220694A1 publication Critical patent/WO2018220694A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof

Definitions

  • the present invention relates to a video processing device that processes and outputs received video.
  • Such a video processing device includes an input unit that receives a frame including a left-eye image and a right-eye image, and display control means for displaying, on the display device, a second left-eye image, which includes the input left-eye image and excludes at least part of the image surrounding the left-eye image, and a second right-eye image, which includes the input right-eye image and excludes at least part of the image surrounding the right-eye image, at a resolution corresponding to the screen resolution of the display device.
  • In order to solve the above problem, the video processing apparatus of the present invention includes: an input information receiving unit that receives a video having two or more color images; a changing unit that performs, for each of the two or more color images of the video, one or more of a first process of converting a partial area of the color image to monochrome, a second process of acquiring red color information of at least a partial area of the temporally preceding color image and adding that color information to the color image, and a third process of setting a partial area of the color image to black or white, and thereby acquires two or more changed images; and an output unit that outputs the two or more changed images.
  • That is, the video processing apparatus according to the first invention includes the input information receiving unit, the changing unit that performs one or more of the first to third processes to acquire the two or more changed images, and the output unit that outputs the two or more changed images.
  • According to the second invention, with respect to the first invention, the changing unit performs at least the first process, and in the first process converts a different partial area of each of the two or more color images of the video to monochrome.
  • According to the third invention, with respect to the first or second invention, the video processing apparatus further includes: a soma related information storage unit in which two or more pieces of soma related information are stored, each having a soma identifier for identifying a soma and firing condition information relating to a condition for firing that soma; a combined information storage unit in which one or more pieces of combined information specifying connections between the two or more soma are stored; and an output management information storage unit that stores one or more pieces of output management information, each having an output condition, which is a condition concerning a firing pattern of one or more soma identifiers, and output information, which is the information to be output.
  • In this case, the changing unit includes: a feature information acquisition means that, for each of the one or more color images, acquires one or more pieces of feature information, namely one or more of the values R, G, B, and Y, from each of the one or more partial images constituting the color image; an information transmission means that, for each of the two or more color images and for each of the one or more partial images, acquires the one or more pieces of feature information acquired by the feature information acquisition means together with one or more soma identifiers identifying the first soma to be fired, and acquires, for each soma whose firing is to be determined, the one or more pieces of feature information, or one or more pieces of information obtained from one or more other soma, together with the one or more soma identifiers of that soma; a determination means that, for each of the two or more color images and for each of the one or more partial images, judges whether the soma identified by each soma identifier fires, using the one or more pieces of feature information acquired by the information transmission means and the firing condition information paired with that soma identifier; and a firing pattern acquisition means that, for each of the two or more color images and for each of the one or more partial images, acquires a firing pattern consisting of the one or more soma identifiers identifying the soma determined by the determination means to fire.
  • According to the fourth invention, with respect to any one of the first to third inventions, the video processing apparatus further includes a color information acquisition unit that acquires color information relating to the colors of the two or more partial images constituting the two or more color images.
  • According to the fifth invention, the input information receiving unit receives a first video having first color images for the right eye and a second video having second color images for the left eye.
  • The changing unit performs one or more of the first process, the second process, and the third process on each of the first video and the second video, and acquires a first modified image for the right eye and a second modified image for the left eye.
  • The video processing apparatus further includes a synthesizing unit that synthesizes the first modified image and the second modified image, and the output unit outputs the image synthesized by the synthesizing unit.
  • With the video processing apparatus according to the present invention, a video that imitates the function of the human eye can be obtained.
  • Block diagram of the video processing apparatus A in the first embodiment
  • Flowchart explaining the operation of the video processing apparatus A
  • Flowchart explaining the first process
  • Flowchart explaining the second process
  • Flowchart explaining the third process
  • Flowchart explaining the attention area determination process
  • Diagram showing an example of a color image
  • Diagrams explaining the color image change processing of the changing unit 31
  • Diagrams showing output examples
  • Block diagram of the video processing apparatus B in the second embodiment
  • Block diagrams of the processing unit constituting the video processing apparatus B
  • Flowchart explaining the operation of the video processing apparatus B
  • a video processing apparatus that performs image processing that imitates the function of the eyes on a received video, acquires a changed image, and outputs the changed image will be described.
  • Note that the result of image processing that imitates the function of the eye usually corresponds to the result of the brain processing an image input through the eye.
  • For example, through information processing in the brain, the color image input through the eyes becomes an image in which a partial area has been converted to black and white, or an image into which red color information has been fed back.
  • The image processing according to the present embodiment is assumed to be one or more of a first process of converting a partial area of a color image to monochrome, a second process of feeding back red (R) information and adding it to a subsequent image, and a third process of providing a blind spot.
  • a video processing apparatus that detects the motion of an object based on a change in luminance of each pixel constituting an output image will be described.
  • FIG. 1 is a block diagram of the video processing apparatus A in the present embodiment.
  • the video processing apparatus A includes a storage unit 1, a reception unit 2, a processing unit 3, and an output unit 4.
  • the receiving unit 2 includes an input information receiving unit 21.
  • the processing unit 3 includes a change unit 31, a color information acquisition unit 32, an attention area determination unit 33, and a synthesis unit 34.
  • the storage unit 1 stores various types of information.
  • the various types of information are, for example, videos received by the receiving unit 2.
  • the accepting unit 2 accepts various information and instructions.
  • The various information and instructions are, for example, a video and a saccade position.
  • The saccade position is a position in the image, namely the center point of a saccade. A saccade can be said to be a mechanism that automatically moves the eyeball so as to compensate for the blind spot.
  • the saccade position may be considered as the center point of the blind spot described later.
  • the saccade position is, for example, a coordinate value (x, y) in the image.
  • the accepting unit 2 may accept a saccade position from an eye tracking device (not shown).
  • the accepting unit 2 may acquire the saccade position from information on a position accepted from an eye tracking device (not shown).
  • the method of accepting the saccade position does not matter.
  • Here, reception is a concept that includes reception of information input from an input device such as a keyboard, mouse, or touch panel; reception of information transmitted via a wired or wireless communication line; and reception of information read from a recording medium such as an optical disk, magnetic disk, or semiconductor memory.
  • various information and instruction input means are, for example, a touch panel, a keyboard, a mouse, a menu screen, and the like, and may be anything.
  • the accepting unit 2 can be realized by a device driver for input means such as a touch panel and a keyboard, control software for a menu screen, and the like.
  • the input information receiving unit 21 receives input information.
  • the input information is a video having two or more color images. That is, the video received by the input information receiving unit 21 is usually a so-called moving image. However, the input information receiving unit 21 may receive a still image.
  • the input information may be two or more videos.
  • the input information may be two videos.
  • The two videos are, for example, a video captured corresponding to the right eye and a video captured corresponding to the left eye.
  • The color image is, for example, a set of pixels having RGB color information; the term as used here also covers monochrome images.
  • a color image is usually information having a luminance value.
  • The pixels constituting the color image are expressed by (R, G, B, Y), for example, where R is red color information, G is green color information, B is blue color information, and Y is luminance information (a minimal illustrative sketch of this representation follows).
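  • Purely as an illustration of this (R, G, B, Y) pixel representation, here is a minimal sketch in Python/NumPy; the BT.601 luma weights are an assumption, since the patent only states that each pixel carries the four values:

```python
import numpy as np

def add_luminance(rgb: np.ndarray) -> np.ndarray:
    """Append a luminance channel Y to an H x W x 3 RGB image,
    yielding the (R, G, B, Y) representation described above.

    The BT.601 weights are an illustrative assumption; the patent
    does not specify how Y is derived.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return np.dstack([rgb, y])
```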
  • the input information may include the saccade position.
  • the processing unit 3 performs various processes.
  • the various processes are processes performed by the changing unit 31, the color information acquiring unit 32, and the attention area determining unit 33, for example.
  • the processing unit 3 normally performs a first process described later, a second process described later, and a third process described later.
  • the processing unit 3 may perform only one or two processes among the first process, the second process, and the third process.
  • the changing unit 31 performs one or more processes among the first process, the second process, and the third process, and acquires two or more changed images.
  • The first process is a process that imitates the fact that part of an image viewed by a human, in particular part of the peripheral area excluding the central part, is seen in monochrome.
  • the first process is a process that imitates that the monochrome area dynamically changes.
  • The second process is a process that imitates the function of the eye whereby red color information in a previously viewed image is fed back into a later image.
  • the third process is a process that imitates a blind spot.
  • the first process is a process of converting a partial area of the color image into monochrome for each of two or more color images included in the video received by the input information receiving unit 21.
  • the changing unit 31 first determines one or more areas to be monochromeized for each color image.
  • the changing unit 31 converts the determined one or more areas into monochrome.
  • Since converting a region from color to monochrome is a well-known technique, a detailed description is omitted.
  • For example, the partial area to be converted to monochrome lies in the area excluding the area within a threshold distance from the center point of the color image (referred to as the candidate area); the changing unit 31 divides the candidate area into predetermined subareas and, for each subarea, determines the partial areas to be converted to monochrome by, for example, deciding at random whether to convert it.
  • the candidate area is a candidate area to be monochromeized.
  • the area within the threshold from the center point may be a circle, a rectangle, or any other random shape.
  • the random shape means that, for example, the threshold value is randomly changed depending on the direction (angle) from the center point.
  • the changing unit 31 may first determine whether or not each pixel constituting each color image is to be converted into monochrome according to a predetermined algorithm.
  • The predetermined algorithm is, for example, to determine, for each pixel in the candidate area excluding the area within the threshold distance from the center point of the color image, whether or not that pixel is to be converted to monochrome, and then to convert the pixels so determined.
  • It is preferable that the probability of monochrome conversion increases as the distance from the center point of the color image increases.
  • The process for monochrome conversion may be, for example, a process of reducing saturation.
  • the process for monochrome conversion may be, for example, a process for making the target area gray scale.
  • the process for converting to monochrome may be a process for changing the pixel of the target region to one of binary values of black or white.
  • Various algorithms may be used by the changing unit 31 to determine the candidate area and to determine the part of the image to be converted to monochrome.
  • It is preferable that the changing unit 31 performs at least the first process and, in the first process, converts a different partial area of each of the two or more color images of the video to monochrome. That is, it is preferable that the area converted to monochrome changes dynamically.
  • The changing unit 31 may change the area to be converted to monochrome every predetermined number of consecutive fields (color images), or may convert, in every field, an area different from that of the previous field (a minimal sketch of the first process follows).
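  • Purely as an illustration, the following is a minimal sketch of the first process, assuming grayscale conversion by channel averaging and a monochrome probability that grows linearly with distance from the image center; the block size, guard radius, and probability law are illustrative choices, not values fixed by the patent:

```python
import numpy as np

def first_process(img: np.ndarray, guard_radius: float, block: int = 16,
                  rng: np.random.Generator | None = None) -> np.ndarray:
    """Convert random peripheral blocks of an H x W x 3 image to monochrome.

    Blocks within `guard_radius` of the center (the color-guaranteed
    area) are never touched; the probability of converting a block
    grows with its distance from the center.
    """
    rng = rng or np.random.default_rng()
    out = img.astype(float).copy()
    h, w = img.shape[:2]
    cy, cx = h / 2, w / 2
    max_d = np.hypot(cy, cx)
    for y in range(0, h, block):
        for x in range(0, w, block):
            d = np.hypot(y + block / 2 - cy, x + block / 2 - cx)
            if d <= guard_radius:
                continue  # inside the color guarantee area
            # farther blocks are more likely to be converted
            if rng.random() < (d - guard_radius) / max(max_d - guard_radius, 1e-9):
                tile = out[y:y + block, x:x + block]
                out[y:y + block, x:x + block] = tile.mean(axis=2, keepdims=True)
    return out.astype(img.dtype)
```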
  • The second process is a process of acquiring red color information of at least a partial area of the color image temporally preceding one color image, and adding that red color information to the one color image.
  • Preferably, the red color information acquired is that of at least a partial area of the image obtained as a result of performing the first process on the immediately preceding image, and it is added to the image obtained as a result of performing the first process on the current image.
  • The process of acquiring red color information may be, for example, a process of acquiring all the color information (R, G, B) of a pixel when the value of the red component of the pixel is greater than or equal to a threshold, or a process of acquiring only the value of the red component of the pixel.
  • Alternatively, when the value of the red component of the pixel is greater than or equal to a first threshold and the values of the other colors (G, B) are less than or equal to a second threshold (that is, when the pixel can be said to be red), it may be a process of acquiring all the color information (R, G, B) of the pixel, or a process of acquiring only the value of its red component.
  • The process of adding red color information to a color image is, for example, a process of adding the red color information to the subsequent color image at the position corresponding to the red position in the preceding color image.
  • The process of adding red color information is, for example, as follows: acquire the red value (X) of a pixel A of the temporally preceding color image and the red value (Y) of the corresponding pixel B of the temporally subsequent color image, add to Y either the red value X itself or a reduced value of it (for example, αX with 0 < α < 1), and take the result (“Y + X” or “Y + αX”) as the processed red value of the pixel B of the temporally subsequent color image.
  • Here, the pixel B of the temporally subsequent color image is the pixel corresponding to the pixel A of the temporally preceding color image.
  • The corresponding pixels are usually pixels having the same relative position from the gaze position. That is, if the gaze position of the temporally preceding color image is (x1₀, y1₀), the position of the pixel A is (x1₁, y1₁), and the gaze position of the temporally subsequent color image is (x2₀, y2₀), then the pixel B is the pixel at the position (x2₀ + (x1₁ − x1₀), y2₀ + (y1₁ − y1₀)).
  • the gaze position is a position in Fovea.
  • Fovea is an area with high color resolution.
  • the gaze position may be the center (center of gravity) of the Fovea region or the point with the highest color resolution.
  • In the above description, the changing unit 31 acquires all of the red color information and adds it to the temporally subsequent color image; however, it may acquire only part of the red color information and add that to the temporally subsequent color image.
  • the partial red color information may be determined at random by the changing unit 31 among all the red color information, or may be determined by a predetermined algorithm.
  • It is preferable that the changing unit 31 acquires the red color information of at least a partial area of the color image immediately preceding the one color image, and adds that red color information to the one color image.
  • However, the red color information of at least a partial area of a color image two or more images earlier may be acquired, and that red color information may likewise be added to the one color image.
  • It is preferable that the changing unit 31 makes the influence of red color information acquired from a temporally earlier color image smaller than the influence of red color information acquired from a temporally closer color image, such as the immediately preceding one.
  • For example, the changing unit 31 stores, at least temporarily, from which color image the red of each pixel was originally acquired; the further back the source color image lies, the lower the probability that the red pixel is added, and whether the red color information of a preceding color image is added to the subsequent color image is determined according to that probability.
  • In that case, the storage unit 1 stores two or more pairs of information, each pairing an indication of how many images back the source color image lies (for example, a natural number such as 1, 2, ..., N) with probability information indicating the corresponding probability (a minimal sketch of the second process follows).
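  • Purely as an illustration, the following is a minimal sketch of the second process, assuming a single feedback step from the immediately preceding changed image, one threshold doubling as the first and second thresholds of the description, and the attenuation factor α; all three values are illustrative:

```python
import numpy as np

def second_process(prev: np.ndarray, curr: np.ndarray,
                   red_thresh: float = 180.0, alpha: float = 0.5) -> np.ndarray:
    """Feed the red color information of the previous frame into the
    current one: pixels of `prev` that "can be said to be red"
    (R >= red_thresh, G and B <= red_thresh) contribute alpha * X to
    the red channel at the same position, i.e. Y + alpha * X.

    Frames are H x W x 3 arrays assumed already aligned on the gaze
    position.
    """
    out = curr.astype(float).copy()
    r, g, b = prev[..., 0], prev[..., 1], prev[..., 2]
    red_mask = (r >= red_thresh) & (g <= red_thresh) & (b <= red_thresh)
    out[..., 0] += np.where(red_mask, alpha * r, 0.0)
    return np.clip(out, 0, 255).astype(curr.dtype)
```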
  • the third process is a process for setting a partial area of the color image to a predetermined specific value.
  • the specific value is, for example, NULL.
  • the specific value is preferably a value that the color information cannot take.
  • the partial area is referred to as a blind spot area.
  • The blind spot region is preferably outside the central color guarantee area.
  • the blind spot region includes the saccade position of the color image.
  • the blind spot region is preferably a region centered on the saccade position of the color image.
  • the blind spot region dynamically varies.
  • the size of the blind spot region may be fixed or dynamically changed.
  • the shape of the blind spot region may be a fixed shape such as a circle or a rectangle, or may be dynamically changed.
  • the blind spot region dynamically changes, for example, when the saccade position that dynamically changes is received by the receiving unit 2.
  • The changing unit 31 first determines the blind spot region using a predetermined algorithm. For example, the changing unit 31 determines the coordinates of the center point of the blind spot region by varying them at random within a predetermined range around the center point of the color image. Next, the changing unit 31 determines the size of the blind spot region, for example at random within a predetermined range. Next, the changing unit 31 changes to black, for example, each pixel in the circular area centered on the coordinates of the center point of the blind spot region and having the size of the blind spot region as its radius.
  • The changing unit 31 may change the blind spot region at random timing, may change it at a predetermined cycle (for example, every 10 fields), or may change it for every received image (a minimal sketch of the third process follows).
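  • Purely as an illustration, the following is a minimal sketch of the third process, assuming a circular blind spot centered on the saccade position; the radius range is an illustrative parameter:

```python
import numpy as np

def third_process(img: np.ndarray, saccade_xy: tuple[float, float],
                  r_min: float = 10.0, r_max: float = 30.0,
                  rng: np.random.Generator | None = None) -> np.ndarray:
    """Black out a circular blind spot region around the saccade position.

    The radius is drawn at random from [r_min, r_max]. Using NULL or -1
    instead of black, as the text also allows, would mark the pixels as
    carrying no color information at all.
    """
    rng = rng or np.random.default_rng()
    out = img.copy()
    h, w = img.shape[:2]
    cx, cy = saccade_xy
    radius = rng.uniform(r_min, r_max)
    yy, xx = np.mgrid[0:h, 0:w]
    out[(xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2] = 0
    return out
```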
  • When the changing unit 31 performs two or more of the first process, the second process, and the third process, the order of the processes does not matter. However, it is preferable that the changing unit 31 performs the first process first. That is, when the changing unit 31 performs all three processes, it is preferable to perform them in the order first process, second process, third process, or first process, third process, second process.
  • The changing unit 31 may perform one or more of the first process, the second process, and the third process on each of the first video and the second video, and acquire the first modified image of the right eye and the second modified image of the left eye.
  • The first video is a video having one or more first color images for the right eye.
  • The second video is a video having one or more second color images for the left eye.
  • It is preferable that the processing performed by the changing unit 31 on the first video and on the second video is performed in parallel.
  • the color information acquisition unit 32 acquires color information of two or more partial images constituting two or more color images.
  • The color information acquisition unit 32 may acquire, for each of the two or more partial images, representative color information that represents the color information of the pixels constituting the partial image.
  • The representative color information is a representative value of the color information.
  • The representative value of the color information is, for example, the average, over the pixels constituting the partial image, of each color value and the luminance value constituting the pixels' color information.
  • The representative value of the color information may instead be, for example, the corresponding median over those pixels.
  • the attention area determination unit 33 determines the attention area.
  • the attention area determination unit 33 determines an area centered on the saccade position, for example.
  • the attention area determination unit 33 determines, for example, an area having a predetermined shape (for example, a circle area having a radius N) around the saccade position.
  • the attention area determination unit 33 determines, for example, an area that dynamically changes around the saccade position.
  • the attention area determination unit 33 for example, generates a random number with a range (x1 to xn), and determines an area that dynamically changes using the random number.
  • the attention area determination unit 33 determines, for example, a circle area centered on the saccade position and having the generated random number as a radius.
  • The attention area determination unit 33 determines, for example, an n-sided polygonal area whose n vertices change dynamically around the saccade position.
  • The attention area determination unit 33 determines the area, for example, so that its color information satisfies a predetermined condition. For example, the attention area determination unit 33 determines an area whose color information values are large enough to satisfy the predetermined condition.
  • This region is a region of interest.
  • the region of interest is Fovea (where color resolution is high).
  • For example, the attention area determination unit 33 acquires a statistic of the color information values acquired by the color information acquisition unit 32 (for example, the total of RGB, the total of RGBY, the average of RGB, the average of RGBY, the median of RGB, the median of RGBY, the maximum of RGB, or the maximum of RGBY), and determines a region whose statistic of the color information values is large enough to satisfy a predetermined condition.
  • the attention area determination unit 33 may determine a continuous area that is centered on the gaze position and that is large enough that the statistic of the value of the color information satisfies a predetermined condition.
  • Determining the region means, for example, obtaining two or more coordinate values that specify the region, or obtaining two or more values that specify the region (for example, the coordinates of the center point and a radius).
  • The attention area is usually a single continuous region, but may be two or more discrete regions (a minimal sketch of this determination follows).
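  • Purely as an illustration, the following is a minimal sketch of one such determination: square blocks are scored by their mean RGB sum, and the attention area is grown as the connected group of high-scoring blocks containing the qualifying block nearest the center; the block size and the relative threshold are illustrative assumptions:

```python
import numpy as np

def attention_blocks(img: np.ndarray, block: int = 16,
                     thresh: float = 0.6) -> list[tuple[int, int]]:
    """Return the block coordinates (row, col) forming the attention area.

    A block qualifies when its mean RGB sum reaches `thresh` times the
    maximum over all blocks; the attention area is the 4-connected
    group of qualifying blocks containing the one nearest the center.
    """
    h, w = img.shape[:2]
    ny, nx = h // block, w // block
    score = np.zeros((ny, nx))
    for by in range(ny):
        for bx in range(nx):
            tile = img[by*block:(by+1)*block, bx*block:(bx+1)*block, :3]
            score[by, bx] = tile.sum(dtype=float) / tile.size
    qualify = score >= thresh * score.max()
    # start from the qualifying block nearest the image center
    center = np.array([ny / 2, nx / 2])
    cand = np.argwhere(qualify)
    start = tuple(cand[np.argmin(np.linalg.norm(cand - center, axis=1))])
    region, stack = {start}, [start]
    while stack:  # flood fill over 4-connected qualifying blocks
        y, x = stack.pop()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            n = (y + dy, x + dx)
            if 0 <= n[0] < ny and 0 <= n[1] < nx and qualify[n] and n not in region:
                region.add(n)
                stack.append(n)
    return sorted(region)
```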
  • The synthesizing unit 34 synthesizes the first changed image and the second changed image acquired by the changing unit 31. That is, the synthesizing unit 34 synthesizes the image obtained with the left eye and the image obtained with the right eye. For example, when the image obtained with the left eye is the image a and the image obtained with the right eye is the image b, the synthesizing unit 34 aligns the images a and b and synthesizes them.
  • Since combining an image obtained with the left eye and an image obtained with the right eye is a well-known technique, a detailed description is omitted.
  • For example, the synthesizing unit 34 detects the correspondence between a pixel a1 of the image a and the pixel b1 of the image b existing at the position corresponding to the pixel a1, and translates the image a and/or the image b so that the pixel a1 and the pixel b1 come to the same position. Next, the synthesizing unit 34 integrates, for each pair of pixels at the same position in the two overlapped images, the pixel values of the two corresponding pixels, and calculates the pixel value of each pixel after synthesis. The synthesizing unit 34 thereby obtains an image in which the image obtained with the left eye and the image obtained with the right eye are synthesized. It is preferable that the two images synthesized by the synthesizing unit 34 are changed images that have been processed by the changing unit 31 (a minimal sketch of such synthesis follows).
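  • Purely as an illustration, the following is a minimal sketch of such synthesis, assuming the two changed images are already aligned and that NULL (blind spot) pixels are marked by negative values, so that each output pixel averages whatever valid information the two eyes provide:

```python
import numpy as np

def synthesize(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Merge aligned left-eye and right-eye changed images.

    Pixels marked NULL (any negative channel) contribute nothing, so a
    blind spot present in only one eye is filled in by the other.
    """
    lf, rf = left.astype(float), right.astype(float)
    l_valid = (lf >= 0).all(axis=2, keepdims=True)
    r_valid = (rf >= 0).all(axis=2, keepdims=True)
    total = np.where(l_valid, lf, 0.0) + np.where(r_valid, rf, 0.0)
    count = l_valid.astype(float) + r_valid.astype(float)
    return (total / np.maximum(count, 1.0)).astype(left.dtype)
```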
  • the output unit 4 outputs two or more changed images acquired by the changing unit 31.
  • the output unit 4 sequentially outputs two or more changed images acquired by the changing unit 31. That is, the two or more changed images output from the output unit 4 are usually so-called moving images. However, the output unit 4 may output one changed image (still image).
  • the output unit 4 outputs an attention region transition that is a change in one or more attention regions determined by the attention region determination unit 33 so as to be visually recognizable.
  • The output unit 4 normally outputs the modified image in a manner in which the region determined by the attention region determination unit 33 stands out in comparison with other regions.
  • the area determined by the attention area determination unit 33 is referred to as an attention area.
  • A conspicuous aspect in comparison with other areas is, for example, surrounding the attention area with a red frame, or highlighting the attention area.
  • the aspect that stands out compared to other areas may be an aspect that makes other areas inconspicuous compared to the attention area, such as graying out the other areas.
  • The manner of making the attention area stand out does not matter (a minimal sketch of one such mode follows).
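  • Purely as an illustration, a minimal sketch of one conspicuous output mode, drawing a red frame around a rectangular attention area; the bounding-box interface is an assumption:

```python
import numpy as np

def draw_red_frame(img: np.ndarray, top: int, left: int,
                   bottom: int, right: int, width: int = 3) -> np.ndarray:
    """Surround the attention area with a red frame, one of the
    conspicuous modes mentioned above."""
    out = img.copy()
    red = np.array([255, 0, 0], dtype=out.dtype)
    out[top:top + width, left:right] = red
    out[bottom - width:bottom, left:right] = red
    out[top:bottom, left:left + width] = red
    out[top:bottom, right - width:right] = red
    return out
```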
  • the output unit 4 may output the image synthesized by the synthesis unit 34.
  • Output is a concept that includes display on a display, projection using a projector, printing on a printer, transmission to an external device, accumulation in a recording medium, and delivery of a processing result to another processing device or another program.
  • the image output by the output unit 4 may be considered as a result of being input to the eye and processed using the cells of the eye.
  • the image output by the output unit 4 may be passed to a brain information processing module that processes an image or video (not shown).
  • the storage unit 1 is preferably a non-volatile recording medium, but can also be realized by a volatile recording medium.
  • The process by which information is stored in the storage unit 1 does not matter. For example, information may be stored in the storage unit 1 via a recording medium, information transmitted via a communication line or the like may be stored in the storage unit 1, or information input via an input device may be stored in the storage unit 1.
  • the processing unit 3, the changing unit 31, the color information acquiring unit 32, the attention area determining unit 33, and the synthesizing unit 34 can be usually realized by an MPU, a memory, or the like.
  • the processing procedure of the processing unit 3 or the like is usually realized by software, and the software is recorded on a recording medium such as a ROM. However, it may be realized by hardware (dedicated circuit).
  • the output unit 4 may or may not include an output device such as a display or a speaker.
  • the output unit 4 can be realized by driver software for an output device or driver software for an output device and an output device.
  • Step S201 The input information receiving unit 21 determines whether a video has been received. If a video is accepted, the process goes to step S202. If no video is accepted, the process returns to step S201.
  • Step S202 The changing unit 31 assigns 1 to the counter i.
  • Step S203 The changing unit 31 determines whether or not the i-th color image is present in the video received in step S201. If the i-th color image exists, the process goes to step S204. If the i-th color image does not exist, the process returns to step S201.
  • Step S204 The changing unit 31 performs the first process.
  • the first process will be described with reference to the flowchart of FIG.
  • Step S205 The changing unit 31 performs the second process.
  • the second process will be described with reference to the flowchart of FIG. It is preferable that the second processing target is an image obtained as a result of the first processing.
  • Step S206 The changing unit 31 performs a third process.
  • the third process will be described with reference to the flowchart of FIG. Note that the target of the third process is preferably an image obtained as a result of performing the first process and the second process.
  • Step S207 The output unit 4 outputs a modified image that is a result of the processing in step S206.
  • Step S208 The processing unit 3 performs a process of determining a region of interest.
  • the attention area determination process will be described with reference to the flowchart of FIG.
  • Step S209 The output unit 4 outputs the attention area determined in step S208 in a conspicuous manner as compared with other areas. More specifically, the output unit 4 outputs the partial area identified by the partial area identifier accumulated in the buffer in step S208 in a conspicuous manner as compared with other areas.
  • Step S210 The changing unit 31 increments the counter i by one. The process returns to step S203.
  • Note that the process is terminated by power-off or by an interrupt for terminating the process (a minimal sketch of this per-frame loop follows).
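  • Steps S201 to S210 amount to a per-frame driver loop. Purely as an illustration, the following minimal sketch ties together the process sketches given earlier (first_process, second_process, and third_process are the hypothetical helpers introduced above, not names from the patent):

```python
def process_video(frames, saccade_positions, guard_radius=60.0):
    """Apply the first, second, and third processes to each frame, in the
    preferred order (first -> second -> third), yielding changed images.

    `frames` is an iterable of H x W x 3 arrays; feeding back the image
    taken before the blind spot is applied is an assumption.
    """
    prev = None
    for frame, saccade in zip(frames, saccade_positions):
        img = first_process(frame, guard_radius)
        if prev is not None:
            img = second_process(prev, img)
        prev = img  # red feedback source: the pre-blind-spot image
        yield third_process(img, saccade)
```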
  • step S204 Next, the first process of step S204 will be described using the flowchart of FIG.
  • Step S301 The changing unit 31 determines whether or not to change the area to be converted to monochrome to an area different from the area converted to monochrome in the immediately preceding color image. If a different area is to be used, the process goes to step S302; if not, the process goes to step S309. Note that for the first color image processed, the determination is always "different area".
  • Step S302 The changing unit 31 acquires a candidate area for a monochrome area.
  • the candidate area is an area excluding the center point of the color image and its peripheral area.
  • Step S303 The changing unit 31 assigns 1 to the counter i.
  • Step S304 The changing unit 31 determines whether or not the i-th partial region in the candidate region exists.
  • When the i-th partial area exists, the process goes to step S305; when the i-th partial area does not exist, the process returns to the upper process.
  • the partial area may be a pixel, or may be a divided area when the candidate area is divided by a predetermined algorithm.
  • The predetermined algorithm is, for example, to divide the candidate area into a predetermined number of sections vertically and horizontally.
  • The method of determining the partial areas does not matter.
  • Step S305 The changing unit 31 determines whether or not the i-th partial area is to be converted to monochrome. If so, the process goes to step S306; if not, the process goes to step S308. For example, the changing unit 31 decides whether to convert the i-th partial area to monochrome such that the greater the distance between the center point of the i-th partial area and the center point of the image being processed, the higher the probability that monochrome conversion is chosen.
  • Step S306 The changing unit 31 monochromeizes the i-th partial area.
  • Step S307 The changing unit 31 temporarily stores the area identifier of the i-th partial area in a buffer (not shown).
  • the area identifier may be anything as long as it is information that can identify the partial area.
  • the area identifier is, for example, a set of two or more coordinate values that can specify a partial area.
  • Step S308 The changing unit 31 increments the counter i by one.
  • the process returns to step S304.
  • Step S309 The changing unit 31 substitutes 1 for the counter j.
  • Step S310 The changing unit 31 examines information in a buffer (not shown) and determines whether or not the j-th monochrome partial area exists. If the j-th monochrome partial area exists, the process proceeds to step S311. If the j-th monochrome partial area does not exist, the process returns to the upper process.
  • Step S311 The changing unit 31 acquires the j-th monochrome partial region identifier from a buffer (not shown). Then, the changing unit 31 monochromeizes the area corresponding to the jth monochrome partial area identifier.
  • Step S312 The changing unit 31 increments the counter j by 1. The process returns to step S310.
  • step S205 Next, the second process of step S205 will be described using the flowchart of FIG.
  • Step S401 The changing unit 31 assigns 1 to the counter i.
  • Step S402 The changing unit 31 determines whether or not the i-th pixel exists in the image data to be processed. When the i-th pixel exists, the process goes to step S403, and when the i-th pixel does not exist, the process returns to the upper process.
  • Step S403 The changing unit 31 determines whether the red color information of the i-th pixel is large enough to satisfy a predetermined condition. If it is large enough to satisfy a predetermined condition, go to step S404, and if not large enough to satisfy a predetermined condition, go to step S406.
  • “larger so as to satisfy a predetermined condition” means, for example, that the red value included in the color information of the pixel is larger than the first threshold and other colors (G, B) included in the color information of the pixel. The value of may be smaller than the second threshold.
  • Step S404 The changing unit 31 pairs the red color information, a pixel identifier, and a color image identifier, and accumulates them in a buffer (not shown).
  • the pixel identifier is information indicating the position of the i-th pixel, for example, and is, for example, a coordinate value (x, y) of a space in the color image to be processed.
  • the color image identifier is information for identifying the color image to be processed, and is, for example, a frame ID in the received video.
  • Step S405 The changing unit 31 determines whether or not the red color information of the i-th pixel of the immediately preceding color image is stored in a buffer (not shown). If it is stored, the process goes to step S406; if it is not stored, the process goes to step S408.
  • Step S406 The changing unit 31 acquires the red color information of the i-th pixel of the immediately preceding color image from a buffer (not shown).
  • Step S407 The changing unit 31 adds, to the red color information (for example, the red value Y) of the i-th pixel of the image data being processed, either the red color information acquired in step S406 (for example, the red value X) or reduced red color information obtained by an operation on it (for example, multiplying by the above α, giving αX). As a result, the red color of the i-th pixel is emphasized (for example, "Y + αX"). For example, an originally blue pixel becomes a purple pixel with the red enhanced (a one-pixel worked example follows after this flowchart).
  • Step S408 The changing unit 31 increments the counter i by one.
  • the process returns to step S402.
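  • As a one-pixel worked example of step S407 (all values illustrative): with Y = 40, X = 180, and α = 0.5, the processed red value is Y + αX = 130, turning a blue pixel purple:

```python
alpha = 0.5
prev_red = 180           # X: red value of the corresponding pixel in the previous frame
pixel = [40, 40, 200]    # (R, G, B) of the i-th pixel: blue
pixel[0] = int(pixel[0] + alpha * prev_red)  # Y + alpha * X = 130
print(pixel)             # [130, 40, 200] -> purple-ish, red emphasized
```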
  • step S206 Next, the third process of step S206 will be described using the flowchart of FIG.
  • Step S501 The changing unit 31 determines whether or not to change the blind spot region. If so, go to step S502; otherwise, go to step S510. In the case of the first image, it is determined that “the blind spot region is changed”.
  • Step S502 The changing unit 31 acquires the center coordinates of a new blind spot region.
  • The algorithm for acquiring the center coordinates does not matter.
  • For example, the changing unit 31 acquires center coordinates shifted at random, by up to a predetermined distance, from the center coordinates of the immediately preceding blind spot region.
  • Alternatively, the changing unit 31 acquires at random, for example, coordinates within a predetermined range from the center coordinates.
  • Step S503 The changing unit 31 acquires the size of a new blind spot region.
  • the algorithm for obtaining the size of the blind spot region is not limited.
  • For example, the changing unit 31 acquires a size changed at random, within a predetermined range, from the size of the immediately preceding blind spot region.
  • Step S504 The changing unit 31 acquires blind spot information using the center coordinates acquired in step S502 and the size acquired in step S503, and accumulates the information in a buffer (not shown).
  • the information on the blind spot area may be only the center coordinates and size, or may be information indicating the boundary of the blind spot area.
  • Step S505 The changing unit 31 assigns 1 to the counter i.
  • Step S506 The changing unit 31 determines whether or not the i-th pixel exists in the image data to be processed. When the i-th pixel exists, the process goes to step S507, and when the i-th pixel does not exist, the process returns to the upper process.
  • the image data to be processed is, for example, an image obtained by performing the first process and the second process on the received color image.
  • Step S507 The changing unit 31 determines whether or not the i-th pixel is a pixel in the blind spot region, using the information on the blind spot region. If the i-th pixel is a pixel in the blind spot region, the process proceeds to step S508. If the i-th pixel is not a pixel in the blind spot area, the process proceeds to step S509.
  • Step S508 The changing unit 31 changes the color of the i-th pixel to black.
  • Step S509 The changing unit 31 increments the counter i by one.
  • The process returns to step S506.
  • Step S510 The changing unit 31 reads the information on the blind spot region from a buffer (not shown).
  • The blind spot region information is the information on the blind spot region most recently accumulated in the buffer.
  • step S208 the attention area determination processing in step S208 will be described with reference to the flowchart of FIG.
  • Step S601 The attention area determination unit 33 substitutes 1 for the counter i.
  • Step S602 The attention area determination unit 33 determines whether or not the i-th partial area exists in the image data to be processed. When the i-th partial area exists, the process goes to step S603, and when the i-th partial area does not exist, the process goes to step S608.
  • the image data to be processed is usually an image obtained by performing the first process, the second process, and the third process on the received color image.
  • Step S603 The color information acquisition unit 32 acquires the color information of each pixel constituting the i-th partial region.
  • the method for determining the partial area does not matter.
  • Step S604 The color information acquisition unit 32 acquires a representative value of one or more pieces of color information acquired in step S603.
  • The representative value of the color information is, for example, the average, over the one or more pixels, of the values constituting each pixel's color information.
  • Step S605 The attention area determination unit 33 determines whether or not the representative value acquired in step S604 satisfies a predetermined condition. If the predetermined condition is satisfied, the process goes to step S606; if not, the process goes to step S607. Note that satisfying the predetermined condition means, for example, that the representative value of the color information is large enough (for example, at or above a threshold).
  • Step S606 The attention area determination unit 33 accumulates the partial area identifier of the i-th partial area in a buffer (not shown).
  • Step S607 The attention area determination unit 33 increments the counter i by one. The process returns to step S602.
  • Step S608 The attention area determination unit 33 acquires, from among the partial area identifiers stored in the buffer (not shown), the partial area identifier of the partial area closest to the center point, and accumulates it in a buffer (not shown).
  • the partial area of this partial area identifier is an area that is the center of the area of interest.
  • the partial region closest to the center point may be a partial region including the center point.
  • Step S609 The attention area determination unit 33 acquires the identifier of the partial area directly or indirectly connected to the partial area identified by the partial area identifier acquired in step S608, and stores it in a buffer (not shown). Return to upper process.
  • the attention area to be determined is an area that is a combination of the area focused by the right eye and the area focused by the left eye, and is one connected area.
  • Note that the attention area determined by the attention area determination unit 33 may simply be an area centered on the received gaze position.
  • Assume that the image A is (A) of FIG. 7 and the image B is (B) of FIG. 7.
  • the changing unit 31 first performs a first process on the image A as follows, for example, and acquires the image A (1).
  • First, the changing unit 31 acquires the coordinate value (x₀, y₀) of the center point of the color image. Then, the changing unit 31 determines Xₘ at random from a predetermined range X₁ to Xₙ for the radius of the color guarantee area. For example, the changing unit 31 generates a random number in the range X₁ to Xₙ and acquires Xₘ. Note that X₁ and Xₙ are stored in the storage unit 1.
  • the changing unit 31 acquires information for specifying a candidate area (R 0 ) that is an area excluding the color guarantee area (see FIG. 8).
  • Such information is, for example, the coordinate value (x 0 , y 0 ) of the center point and the radius X m .
  • the changing unit 31 divides the candidate region (R 0 ) into two or more partial regions.
  • the partial area is an area divided by a predetermined width W by vertical and horizontal lines (see FIG. 9).
  • the width W is usually constant. However, the width W may change as the distance from the center of the image increases.
  • the partial area may be a single pixel.
  • For example, the changing unit 31 determines probabilistically whether or not to convert each partial area in FIG. 9 to monochrome.
  • Here, "probabilistically" means, for example, obtaining, with a predetermined probability, the value "1" for monochrome conversion or "0" for no conversion. The predetermined probability is assumed to be stored in the storage unit 1. Then, the changing unit 31 converts partial areas in the candidate area (R₀) to monochrome and acquires the image A(1). Such an image A(1) is shown in FIG. 10, where the shaded partial areas are the monochrome areas. It is preferable that partial areas farther from the center point are more likely to be converted to monochrome.
  • Here, the changing unit 31 does not perform the second process, since there is no temporally preceding image for the first color image.
  • Next, the changing unit 31 performs the third process on the image A(1) as follows, for example, to obtain the image A(2). That is, first, the changing unit 31 acquires the center coordinates (x₀, y₀) of the blind spot region. Next, the changing unit 31 determines Yₘ at random from a predetermined range Y₁ to Yₙ for the radius of the blind spot region.
  • Next, the changing unit 31 determines, as the blind spot region, the circular area of radius Yₘ centered on the point (x₀, y₀).
  • Next, the changing unit 31 changes the color of each pixel in the blind spot region to black, and obtains the image A(2) shown in FIG. 11. In FIG. 11, the black area 1101 is the blind spot region.
  • the output unit 4 outputs the image A (2) that is the changed image acquired by the changing unit 31.
  • An actual output example is as shown in the figure. Note that the image A(2) can be said to be close to the image A of FIG. 7 as changed by the brain.
  • Next, the processing unit 3 performs the process of determining the attention area as follows. That is, first, the attention area determination unit 33 divides, for example, the image A(2) into two or more partial areas (see FIG. 13). Here, it is preferable that the attention area determination unit 33 divides into two or more partial areas only the area of the image A(2) excluding the candidate area (referred to as the "central area" as appropriate). In addition, although each partial area here is a rectangle, its shape does not matter.
  • the color information acquisition unit 32 acquires the color information of each pixel in each partial area divided by the attention area determination unit 33.
  • the color information acquisition unit 32 acquires the representative value of the color information of each partial region for each partial region, using the color information of each pixel of each partial region divided by the attention region determination unit 33.
  • the attention area determination unit 33 acquires partial area identifiers of one or more partial areas in which the acquired representative value satisfies a predetermined condition.
  • the attention area determination unit 33 normally acquires partial area identifiers of one or more partial areas that are so large that the acquired representative value satisfies a predetermined condition.
  • the attention area determination unit 33 acquires the partial area identifier of the partial area closest to the center point among the acquired identifiers of one or more partial areas. Then, the attention area determination unit 33 acquires the identifier of the partial area that is directly or indirectly connected to the partial area closest to the center point. One or more partial areas identified by the one or more partial area identifiers are attention areas.
  • the output unit 4 makes the determined attention area stand out in comparison with other areas.
  • a conspicuous aspect is, for example, surrounding the region with a thick line.
  • An example of such output is shown in FIG. 14. Needless to say, the conspicuous mode may be, for example, highlighting.
  • the changing unit 31 or the like processes the image B in FIG. 7 ((B) in FIG. 7). That is, first, the changing unit 31 performs the first process as described above on the image B to obtain the image B (1) (see FIG. 15). Note that the monochrome partial area obtained as a result of the first process for the image B is usually different from the monochrome partial area obtained as a result of the first process for the image A. This is because whether or not each partial region is converted to monochrome is determined stochastically here.
  • the changing unit 31 performs the second process on the image B (1) in FIG. 15 to obtain the image B (2).
  • Specifically, for each pixel in the image B(1), when the color information of the pixel at the corresponding position (for example, the same position) in the previously output image (here, the image A(2) of FIG. 14) includes a red component (a red value, for example X), the changing unit 31 adds that red component, or a reduced value of it (for example, αX), to the red component (for example, Y) constituting the color information of the pixel in the image B(1).
  • The changing unit 31 thereby obtains the image B(2). In some pixels of the image B(2), the red of the image A(2) of FIG. 14 is reflected.
  • the changing unit 31 performs a third process on the image B (2) to obtain the image B (3). That is, first, the changing unit 31 determines the blind spot region and changes the value of each pixel in the blind spot region to NULL. Then, the changing unit 31 obtains an image B (3).
  • The value of each pixel in the blind spot region may also be −1 or the like. That is, it is preferable that the value of each pixel in the blind spot region is a value that RGBY cannot take, so that it can be distinguished from pixels having color information.
  • the output unit 4 outputs the image B (3) that is the changed image acquired by the changing unit 31.
  • An actual output example is shown in the figure. Note that the image B(3) can be said to be close to the image B of FIG. 7 as changed by the brain.
  • the processing unit 3 performs a process of determining the attention area of the image B (3).
  • Then, the output unit 4 outputs the image in a manner in which the determined attention area stands out in comparison with other areas.
  • Note that the following processing may also be performed. That is, for example, assume that the input information receiving unit 21 receives FIG. 7(A), which is an image for the right eye, and FIG. 7(B), which is an image for the left eye. Then, the changing unit 31 processes the two images (FIG. 7(A) and FIG. 7(B)) and obtains two changed images. Here, it is preferable that the changing unit 31 processes the two images (FIG. 7(A) and FIG. 7(B)) in parallel. Next, the synthesizing unit 34 aligns and synthesizes the changed right-eye image and the changed left-eye image processed by the changing unit 31.
  • the output unit 4 outputs the image synthesized by the synthesis unit 34.
  • Such an output example is shown in FIG. 31. In FIG. 31, no blind spot region remains.
  • This is because, when the synthesizing unit 34 synthesizes the changed right-eye image and the changed left-eye image, the blind spot area in the changed right-eye image is filled in by the changed left-eye image, and the blind spot area in the changed left-eye image is filled in by the changed right-eye image.
  • the processing in the present embodiment may be realized by software. Then, this software may be distributed by software download or the like. Further, this software may be recorded on a recording medium such as a CD-ROM and distributed. This also applies to other embodiments in this specification.
  • The software that implements the video processing apparatus A in the present embodiment is the following program. That is, this program causes a computer to function as: an input information receiving unit that receives a video having two or more color images; a changing unit that performs, for each of the two or more color images of the video, one or more of the first process of converting a partial area of the color image to monochrome, the second process of acquiring red color information of a temporally preceding color image and adding it to the color image, and the third process of setting a partial area of the image to black or white, and thereby acquires two or more changed images; and an output unit that outputs the two or more changed images.
  • The present embodiment differs from Embodiment 1 in that structures and functions in the brain are used.
  • the functions to be realized are the same as those in the first embodiment. That is, also in the present embodiment, a video processing apparatus that performs image processing that imitates the function of the eyes on the received video, acquires a changed image, and outputs the changed image will be described.
  • the image processing is a first process for making a partial area of a color image monochrome, a second process for feeding back red (R) information and adding it to a subsequent image, and a third process for providing a blind spot.
  • In the present embodiment, the firing conditions of one or more soma (which may be called nerve cell bodies) are stored, and a video processing apparatus will be described that determines, using one or more pieces of information obtained from the input, whether or not each soma fires, acquires a firing pattern from the determination results, determines output information using the firing pattern, and outputs the output information.
  • a video processing apparatus that performs a feature information transmission process using firing start point information that manages information of a soma that fires first corresponding to the feature information will be described.
  • a video processing apparatus that has two or more soma groups (may be referred to as nerve cell body groups) and transfers information between the soma groups that are connected to each other will be described.
  • In the present embodiment, information is transmitted, or not transmitted, through links having AXON (which may be called axons) and Dendrites (which may be called dendrites).
  • the element is, for example, soma, AXON, or Dendrites.
  • the external information includes, for example, weather, temperature, external scenery, smell, sound, light, and the like.
  • The growth trigger is, for example, firing or the reception of information.
  • a video processing apparatus that has glial cell information and that glial cell information affects growth will be described.
  • In the present embodiment, a video processing apparatus that imitates the apoptosis process of cells such as soma will also be described.
  • For example, when the number of soma is large enough to satisfy a predetermined condition, soma are extinguished; soma with fewer firings are extinguished with higher priority as the predetermined condition is satisfied; soma not connected to Dendrites are extinguished; or soma whose connected AXON has not reached a predetermined goal are extinguished.
  • FIG. 19 is a block diagram of the video processing apparatus B in the present embodiment.
  • the video processing apparatus B includes a storage unit 5, a reception unit 2, a processing unit 6, and an output unit 4.
  • the receiving unit 2 includes an input information receiving unit 21.
  • the output unit 4 includes an information output unit 41.
  • the storage unit 5 includes a soma related information storage unit 51, a soma group information storage unit 52, a combined information storage unit 53, a glial cell information storage unit 54, an ignition start point information storage unit 55, an output management information storage unit 56, a learning condition storage unit 57, a learning information storage unit 58, an ignition information storage unit 59, and a usage combination information storage unit 60.
  • FIG. 20 is a block diagram of the processing unit 6 constituting the video processing apparatus B.
  • the processing unit 6 includes a change unit 61, a learning detection unit 62, a learning information accumulation unit 63, a growth unit 64, an apoptosis processing unit 65, an ignition information accumulation unit 66, a color information acquisition unit 32, an attention area determination unit 33, and a synthesis unit 34.
  • the change unit 61 includes a feature information acquisition unit 611, an information transmission unit 612, a soma calculation unit 613, a determination unit 614, an ignition probability change unit 615, an ignition pattern acquisition unit 616, an output information acquisition unit 617, and a control unit 618.
  • the information transmission unit 612 includes an ignition start point soma determination unit 6121, a combination detection unit 6122, and a transmission information acquisition unit 6123.
  • the storage unit 5 stores various types of information.
  • the various types of information include, for example, an image received by the receiving unit 2, soma related information described later, soma group information described later, combination information described later, glial cell information described later, ignition start point information described later, output management information described later, learning conditions described later, learning information described later, ignition information described later, and use combination information described later.
  • the soma related information storage unit 51 stores two or more pieces of soma related information.
  • the soma related information is information related to soma.
  • the soma related information includes a soma identifier and firing condition information.
  • the soma-related information usually includes one or more Dendrites information and one or more AXON information.
  • the Dendrites information is information about Dendrites that realizes input of information to soma.
  • the Dendrites information has a Dendrites identifier.
  • the Dendrites identifier is information for identifying Dendrites, such as an ID and a name. It is preferable that the Dendrites information includes the Dendrites position information.
  • the AXON information is information related to AXON that realizes output of information from soma.
  • the AXON information has an AXON identifier.
  • the AXON identifier is information for identifying the AXON, such as an ID and a name. It is preferable that the AXON information includes AXON position information.
  • the soma related information may include goal information.
  • the goal information is information for specifying a goal.
  • a goal is a destination to which AXON or Dendrites connected to soma extends.
  • the goal information is information indicating a position.
  • the goal information is, for example, position information.
  • the goal information is, for example, a three-dimensional coordinate value (x, y, z), a two-dimensional coordinate value (x, y), or a four-dimensional quaternion (x, y, z, w).
  • soma having ignition condition information regarding color is called a cone.
  • the soma having the firing condition information relating to the luminance is called a rod.
  • the soma related information does not need to have Dendrites information or AXON information.
  • the soma related information may include a synapse identifier for identifying a synapse or a spine identifier for identifying a spine in association with AXON information or Dendrites information.
  • a synapse identifier corresponds to AXON information.
  • the spine identifier usually corresponds to the Dendrites information.
  • the soma related information may include a soma group identifier for identifying a soma group to which the soma belongs.
  • the soma related information may be associated with the soma group identifier.
  • the soma related information has soma position information indicating the position of soma.
  • the soma position information is, for example, a three-dimensional coordinate value (x, y, z), a two-dimensional coordinate value (x, y), or a four-dimensional quaternion (x, y, z, w).
  • the Dendrites position information is information for specifying the position of the Dendrites, for example, one or more three-dimensional coordinate values (x, y, z) or one or more two-dimensional coordinate values (x, y).
  • When the Dendrites position information has two or more coordinate values, the Dendrites is a connecting line connecting the points of those coordinate values.
  • the AXON position information is information for specifying the position of the AXON, for example, one or more three-dimensional coordinate values (x, y, z) or one or more two-dimensional coordinate values (x, y).
  • When the AXON position information has two or more coordinate values, the AXON is a connecting line connecting the points of those coordinate values.
  • Dendrites and AXON may be branched. Each position information when Dendrites and AXON branch can be expressed by three or four or more coordinate values. However, the method for expressing Dendrites position information and AXON position information is not limited.
  • the soma identifier is information for identifying the soma.
  • the soma identifier is, for example, an ID, a name, or the like.
  • the firing condition information is information relating to conditions for firing soma.
  • the firing condition information usually has one or more pieces of feature information.
  • the feature information may be information having an information identifier for identifying information and an information amount indicating the size of information, or may be information having only an information amount indicating the size of information.
  • the amount of information is a numerical value larger than 0, for example.
  • the feature information constituting the ignition condition information is an information amount.
  • the feature information is, for example, a feature amount, but may be input information itself.
  • the feature amount is, for example, a feature amount of an image obtained as a result of image analysis or a feature amount of a sound obtained as a result of voice analysis. It is preferable that the firing condition information includes firing probability information.
  • the ignition probability information is information related to the probability of ignition.
  • the ignition probability information may be the ignition probability itself, or a value obtained by converting the ignition probability with a function or the like. It is preferable that, by referring to the ignition probability information, a soma stochastically fires or does not fire even when given the same feature information.
  • Two or more pieces of soma related information may be grouped. "Grouped" means, for example, that a soma group identifier is associated with the soma related information; being associated requires only that the correspondence can be determined.
  • the soma group identifier is information for identifying the soma group, which is a group to which soma belong. "Grouped" means, for example, that the pieces of soma related information have the same goal information, that they have the same soma group identifier, or that they are connected by links. Needless to say, the method and data structure for grouping soma related information are not limited. Note that the soma related information may include a soma group identifier.
  • the soma related information includes retained energy amount information indicating the amount of energy held by soma. Further, it is preferable that the soma related information includes necessary energy amount information indicating an energy amount necessary for ignition. In addition, it is preferable that the Dendrites information includes retained energy amount information indicating the amount of energy held by the Dendrites. In addition, it is preferable that the Dendrites information includes necessary energy amount information necessary for performing information transmission using the Dendrites. Further, it is preferable that the AXON information includes retained energy amount information indicating the energy amount held by the AXON. Furthermore, it is preferable that the AXON information includes necessary energy amount information necessary for information transmission using AXON.
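  • The stored records described above can be pictured with the following hedged Python sketch; the field names and types are illustrative guesses at one possible data structure, not a format required by this disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class DendritesInfo:
    dendrites_id: str
    positions: List[Tuple[float, float, float]] = field(default_factory=list)
    held_energy: float = 0.0      # retained energy amount information
    needed_energy: float = 0.0    # energy needed for information transmission

@dataclass
class AxonInfo:
    axon_id: str
    positions: List[Tuple[float, float, float]] = field(default_factory=list)
    held_energy: float = 0.0
    needed_energy: float = 0.0

@dataclass
class SomaRelatedInfo:
    soma_id: str
    firing_condition: dict            # e.g. {"R": 128}: feature info amounts (assumed shape)
    firing_probability: float = 1.0   # firing probability information
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    dendrites: List[DendritesInfo] = field(default_factory=list)
    axons: List[AxonInfo] = field(default_factory=list)
    goal: Optional[Tuple[float, float, float]] = None
    group_id: Optional[str] = None    # soma group identifier
    held_energy: float = 0.0
    needed_energy: float = 0.0
```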
  • the soma group information storage unit 52 stores two or more soma group information.
  • the soma group information includes a soma group identifier and goal information.
  • the goal information is information that identifies a destination to which AXON or Dendrites connected to the soma belonging to the soma group extends.
  • the goal information is, for example, one or two or more three-dimensional coordinate values (x, y, z), or one or two or more two-dimensional coordinate values (x, y).
  • the goal information may be information indicating a direction, for example.
  • the combined information storage unit 53 stores one or more pieces of combined information.
  • the combination information is information that specifies a connection between two or more soma.
  • the combination information may be information specifying the combination of AXON of one soma and Dendrites of another soma. Such information is also information for specifying a bond between soma.
  • the binding information may be information that specifies the binding between one synapse and one spine. Such information is also information for specifying a bond between soma.
  • the combination information includes, for example, two soma identifiers to be combined.
  • the combination information includes, for example, an AXON identifier of AXON and a Dendrites identifier of Dendrites to be combined with the AXON.
  • the combined information includes, for example, a synapse identifier of a synapse and a spine identifier of a spine that can transmit information to and from the synapse.
  • the combined information may include information transmission probability information.
  • the information transmission probability information is information relating to the probability of performing information transmission between one soma and another soma.
  • the information transmission probability information may be information regarding the probability of performing information transmission between an AXON and a Dendrites, or between a synapse and a spine; such information is also information regarding the probability of performing information transmission between one soma and another soma.
  • the coupling direction between soma is usually one direction.
  • the combination information may be information indicating the connection between soma and AXON. In such a case, the combined information has a soma identifier and an AXON identifier. Further, the combination information may be information indicating a combination of soma and Dendrites. In such a case, the combined information has a soma identifier and a Dendrites identifier.
  • the binding information may be information that identifies the binding between glial cells and AXON or Dendrites.
  • the binding information includes, for example, a glial cell identifier that identifies glial cell information and an AXON identifier.
  • the binding information may include a glial cell identifier and a Dendrites identifier.
  • the glial cell information storage unit 54 stores one or more glial cell information.
  • the glial cell information is information on glial cells.
  • the glial cell information preferably has a glial cell identifier that identifies the glial cell.
  • the glial cell information includes, for example, a soma identifier that identifies a soma to be coupled, or a binding information identifier that identifies binding information to be coupled.
  • the glial cell information includes, for example, an AXON identifier of AXON that binds to the glial cell or a Dendrites identifier of Dendrites that binds to the glial cell.
  • the glial cell information has a glial cell type identifier for identifying the type of glial cell.
  • the types of glial cells are, for example, oligodendrocytes (hereinafter referred to as "oligo" as appropriate) and astrocytes.
  • An oligo is a cell that can be connected to an axon.
  • Astrocytes are cells that can connect to soma or Dendrites.
  • the glial cell information has glial cell position information that specifies the position of the glial cell.
  • the oligo glial cell information preferably has glial cell location information.
  • glial cell information may include hand length information indicating the length of one or more hands.
  • the glial cell information may include number information indicating the number of hands coming out of the glial cell. Usually, when the total length of glial cells calculated from the hand length information of each hand reaches the threshold value, it is preferable that the glial cells do not grow any more.
  • the ignition start point information storage unit 55 stores one or more ignition start point information.
  • the firing start point information includes an information identifier that identifies feature information and one or more soma identifiers that identify the soma that fires when the feature information is accepted.
  • the information identifier is, for example, information that identifies the type of the feature amount of the image, and is, for example, red “R” that constitutes a color, green “G” that constitutes a color, and blue “B” that constitutes a color.
  • the soma identified by the soma identifier paired with the information identifier indicating red “R” constituting the color is the soma that is fired when the red color information is accepted.
  • the case where the red color information is received is, for example, a case where the value of the red component among the pixel values of each pixel constituting the color image received by the input information receiving unit 21 is greater than or equal to the threshold value.
  • The case where the red color information is received may also be, for example, a case where the value of the red component of the pixel values of each pixel constituting the color image received by the input information receiving unit 21 is greater than or equal to a threshold value, and the values of the components other than red (G, B) are less than (or less than or equal to) a threshold value.
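  • The following minimal sketch restates this acceptance condition for a single RGB pixel; the threshold values are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

R_MIN, GB_MAX = 128, 64  # illustrative thresholds, not values from this disclosure

def accepts_red(pixel: np.ndarray) -> bool:
    """True when the R component is at/above R_MIN and G and B are below GB_MAX."""
    r, g, b = int(pixel[0]), int(pixel[1]), int(pixel[2])
    return r >= R_MIN and g < GB_MAX and b < GB_MAX
```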
  • the output management information storage unit 56 stores one or more output management information.
  • the output management information is information having output conditions and output information.
  • the output management information may be information on a pair of output conditions and output information.
  • the output condition is a condition used for determining output information.
  • the output condition is a condition for output using an ignition pattern.
  • the output condition may be the firing pattern itself, or information having the firing pattern and output probability information.
  • the output probability information is information relating to the probability for acquiring the output information.
  • the output condition may be information on the lower limit of the number of soma identifiers included in the firing pattern and the applied firing pattern, information on the lower limit of the ratio of soma identifiers included in the firing pattern and the applied firing pattern, and the like.
  • the firing pattern has one or more soma identifiers.
  • the firing pattern is a firing pattern of one or more soma.
  • the output condition may be a condition using an ignition pattern and information on one or more external information.
  • External information is information obtained from outside the apparatus.
  • External information may be referred to as a user context.
  • the external information is, for example, temperature, weather, smell, and the like.
  • the external information is information that is normally accepted in addition to the input information when the input information is accepted.
  • Output information is information corresponding to the firing pattern.
  • the output information is, for example, a pixel value.
  • the output information is information indicating a color, for example.
  • the output information includes, for example, emotion information related to human emotions, intention information indicating human intentions, and behavior information related to human body movements.
  • the emotion information is, for example, happy, sad, frightened, surprised, or the like.
  • Emotion information is, for example, an ID that identifies an emotion.
  • the intention information is information for identifying the intention, for example.
  • the behavior information is, for example, information reflected in the movement of the avatar (character). Since the technique for operating the avatar is a known technique, a detailed description thereof is omitted.
  • the learning condition storage unit 57 stores one or more learning conditions.
  • the learning condition is a condition for learning.
  • the learning condition is a condition using an ignition pattern.
  • the learning condition may be the firing pattern itself.
  • the learning conditions include, for example, an ignition pattern having one or more soma identifiers, the number of soma that need to fire for learning, or the ratio of soma that need to fire for learning (the number of fired soma among the soma identifiers included in the firing pattern of the learning condition, divided by the number of soma identifiers included in that firing pattern).
  • the learning condition may include an ignition pattern and learning probability information.
  • the learning probability information is information regarding the probability of determining that learning occurs. If the learning condition has learning probability information, even when the determination using the firing pattern alone would be "learn", the final determination is made probabilistically using the learning probability information, so the result may still be "do not learn". A sketch of this check follows.
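  • This hedged sketch treats firing patterns as sets of soma identifiers and applies the ratio test followed by the stochastic gate; the function and parameter names are illustrative.

```python
import random

def matches_learning_condition(fired: set, condition_pattern: set,
                               required_ratio: float,
                               learning_probability: float = 1.0) -> bool:
    """Pattern match first, then a stochastic gate using the learning probability."""
    if not condition_pattern:
        return False
    ratio = len(fired & condition_pattern) / len(condition_pattern)
    if ratio < required_ratio:
        return False          # the firing pattern alone does not warrant learning
    return random.random() < learning_probability  # may still decide "do not learn"
```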
  • the learning information storage unit 58 stores one or more learning information.
  • the learning information is learned information.
  • the learning information is information used after learning.
  • the learning information includes input information or one or more feature information acquired from the input information, and an ignition pattern.
  • the feature information here may be an information identifier for identifying the feature information.
  • the feature information here may be an information identifier and an information amount.
  • the learning information may include holding time information indicating the holding time of the firing pattern. The holding time is the time after which the firing pattern is erased if it has not been used.
  • The process of deleting learning information that has not been used for the time indicated by the holding time information is performed by, for example, the processing unit 6.
  • the ignition information storage unit 59 stores one or more ignition information.
  • the ignition information here is information related to the result of ignition.
  • the firing information has a soma identifier for identifying the fired soma.
  • the ignition information usually further includes timer information indicating when the firing occurred.
  • the timer information may be information indicating relative time or time information indicating absolute time. Note that the ignition information stored in the ignition information storage unit 59 may be automatically deleted by the processing unit 6 after a predetermined time has elapsed since the accumulation.
  • the usage combination information storage unit 60 stores one or more usage combination information.
  • the use combination information is information indicating a history of use of the combination information for information transmission.
  • the usage combination information may be information indicating a history of use of AXON or Dendrites for information transmission.
  • the usage combination information includes, for example, a combination information identifier.
  • the usage combination information includes, for example, an AXON identifier and/or a Dendrites identifier.
  • the usage combination information includes, for example, a synapse identifier and/or a spine identifier.
  • the usage combination information may include timer information indicating when it is used, for example.
  • the accepting unit 2 accepts various information.
  • the various types of information are, for example, input information such as video, external information, and the like.
  • the various information input means may be anything such as a camera, a microphone, a numeric keypad, a keyboard, a mouse, a menu screen, and various sensors including a motion sensor and a temperature sensor.
  • the accepting unit 2 can be realized by a device driver for input means such as a camera, a microphone, a numeric keypad, a keyboard, or control software for a menu screen.
  • Reception is a concept including reception of information input from an input device such as a camera, microphone, keyboard, mouse, or touch panel; reception of information transmitted via a wired or wireless communication line; and reception of information read from a recording medium such as an optical disk, a magnetic disk, or a semiconductor memory.
  • the input information receiving unit 21 receives input information.
  • Input information is information that is input to the apparatus.
  • the input information is, for example, a moving image or a still image.
  • the data type and data structure of input information are not limited. It is preferable that the input information receiving unit 21 also receives one or more external information.
  • the processing unit 6 performs various processes.
  • the various processes are, for example, the processes performed by the change unit 61, the learning detection unit 62, the learning information accumulation unit 63, the growth unit 64, the apoptosis processing unit 65, the ignition information accumulation unit 66, the color information acquisition unit 32, the attention area determination unit 33, the synthesis unit 34, and the like.
  • the changing unit 61 performs, for each of two or more color images included in the video, one or more of a first process for making a partial area of the color image monochrome, a second process for acquiring red color information of at least a partial area of the color image temporally preceding that color image and adding the color information to the color image, and a third process for making a partial area of the color image black or white, and thereby acquires two or more changed images.
  • the changing unit 61 performs at least the first process, and in the first process, a part of different areas of each of the two or more color images included in the video is monochromeized. Note that the processing result performed by the changing unit 61 may be considered to be the same as the processing result performed by the changing unit 31.
  • the feature information acquisition means 611 constituting the changing unit 61 acquires one or more feature information from the input information.
  • the feature information is, for example, a feature amount, input information itself, or the like.
  • the feature information includes, for example, an information identifier and an information amount.
  • the feature information acquisition unit 611 acquires, for each of the two or more color images included in the video received by the input information receiving unit 21, one or more pieces of feature information, which are one or more pieces of information among R, G, B, and Y, from one or more partial images constituting each color image. It is preferable that the feature information acquisition unit 611 acquires four pieces of feature information, R, G, B, and Y, from each of the one or more partial images. Note that the size of the partial image here does not matter.
  • the partial image may be one pixel or may have two or more pixels. That is, it does not matter how each color image included in the video is divided.
  • the processing for acquiring the feature amount of the image (which may be referred to as feature information) is a known technique, and thus detailed description thereof is omitted.
  • the information transmission unit 612 acquires one or more feature information acquired by the feature information acquisition unit 611 and one or more soma identifiers for identifying the first fired soma.
  • the information transmission unit 612 acquires, from the ignition start point information storage unit 55, one or more soma identifiers paired with the information identifier included in the feature information acquired by the feature information acquisition unit 611, and pairs each of the one or more soma identifiers with the information amount of the feature information acquired by the feature information acquisition unit 611. Normally, an information amount is given to each soma identified by the one or more soma identifiers. The soma identified by the one or more soma identifiers may be the soma that fires first.
  • Further, the information transmission unit 612 acquires one or more pieces of feature information passed from one or more other soma (or one or more pieces of feature information acquired from that feature information) and the soma identifier of each of the one or more soma whose firing is to be determined.
  • For example, the information transmission unit 612 acquires, for each of the two or more color images and for each of the one or more partial images, the one or more pieces of feature information acquired by the feature information acquisition unit 611 and one or more soma identifiers that identify the soma that fires first.
  • the information transmission unit 612 includes one or more feature information applied to the soma related information of the soma determined by the determination unit 614 to be ignited, or one or more feature information acquired from the one or more feature information, and the determination unit 614 ignites Then, the soma identifier of each of the one or more soma combined with the soma determined is acquired.
  • the information transmission unit 612 uses the combination information in the combination information storage unit 53 to acquire the soma identifiers of one or more pieces of soma that are combined with the soma determined by the determining unit 614 to fire.
  • the information transmission unit 612 acquires one piece of feature information acquired by the soma calculation unit 613 described later and a soma identifier of each of one or more soma to be determined for firing.
  • one feature information is usually one information amount.
  • the information transmission unit 612 may use combination information that identifies the combination of the soma group containing the soma determined to fire by the determination unit 614 (referred to as the first soma group) and another soma group (referred to as the second soma group), and acquire the soma identifiers of one or more soma included in the second soma group.
  • the soma identifiers of one or more soma included in the second soma group are identifiers of soma existing at positions close to the position of the first soma group.
  • the information transmission unit 612 uses the information transmission probability information included in the combination information stored in the combination information storage unit 53 to stochastically acquire the soma identifier of each of the one or more soma combined with the soma determined to fire by the determination unit 614. Acquiring stochastically means using the probability information (in this case, the information transmission probability information) to determine whether or not to transmit the information. Since processing that acquires information stochastically is a known technique, a detailed description thereof is omitted.
  • It is preferable that the information transmission unit 612 performs the information transmission process to the next soma only when the determination unit 614, which will be described later, determines, based on the usage combination information indicating the history of use of the AXON, Dendrites, synapse, or spine, that the AXON, Dendrites, synapse, or spine has not been used for a time equal to or longer than a predetermined time.
  • In that case, the usage combination information is configured and accumulated in the usage combination information storage unit 60. That is, when the information transmission unit 612 performs the information transmission process to the next soma, the information transmission unit 612 acquires timer information indicating the current time from a timer (not shown). Then, the information transmission unit 612 acquires one or more identifiers among the AXON identifier of the used AXON, the Dendrites identifier of the used Dendrites, the synapse identifier of the used synapse, and the spine identifier of the used spine.
  • the information transmission unit 612 configures the usage combination information including the timer information and the acquired one or more identifiers, and stores the usage combination information in the usage combination information storage unit 60.
  • the information transmission unit 612 acquires, for example, the combined information identifier of the connection (link) used for information transmission, acquires timer information indicating the current time from a timer (not illustrated), configures usage combination information including the combined information identifier and the timer information, and accumulates it in the usage combination information storage unit 60.
  • When the information transmission unit 612 performs the information transmission process to the next soma, it is preferable to reduce the energy amounts indicated by the stored energy amount information paired with the AXON identifier of the AXON used for the transmission and by the stored energy amount information paired with the Dendrites identifier of the Dendrites used for the transmission. It is assumed that the function for reducing the energy amount is stored in the storage unit 5, for example. The specific function is not limited; it is, for example, Equation 1 below.
  • (Equation 1 is not reproduced in this text.) In Equation 1, E, g, and c are parameters, and I is an input signal (request signal).
  • the ignition start point soma determination unit 6121 acquires, from the ignition start point information storage unit 55, one or more soma identifiers that are paired with an information identifier that identifies one or more pieces of feature information acquired by the feature information acquisition unit 611.
  • the information identifier may be included in the feature information acquired by the feature information acquisition unit 611, or may correspond to the feature information or information amount acquired by the feature information acquisition unit 611.
  • the combination detection unit 6122 detects one or more soma combined with the soma determined to be ignited by the determination unit 614 using the combination information in the combination information storage unit 53.
  • Soma detection is usually acquisition of a soma identifier.
  • the combination detection unit 6122 acquires, for example, the soma identifier of the soma determined to be ignited by the determination unit 614, and acquires one or more soma identifiers paired with the soma identifier from the combined information storage unit 53.
  • the combination detection unit 6122 acquires the soma identifier of the soma determined to fire by the determination unit 614, acquires the AXON identifier paired with that soma identifier from the soma related information storage unit 51, acquires the Dendrites identifier paired with the AXON identifier from the combined information storage unit 53, and acquires the soma identifier paired with the Dendrites identifier from the soma related information storage unit 51.
  • the binding detection unit 6122 acquires, for example, the soma identifier of the soma determined to fire by the determination unit 614, acquires the synapse identifier paired with the soma identifier from the soma related information storage unit 51, acquires the spine identifier paired with the synapse identifier from the combined information storage unit 53, and acquires the soma identifier paired with the spine identifier from the soma related information storage unit 51.
  • the combination detection unit 6122 acquires, for example, the soma group identifier of the soma group to which the soma determined to fire by the determination unit 614 belongs. If the soma determined to fire by the determination unit 614 is at the end of the soma group (there is no soma to which feature information is transmitted within the same soma group), the combination detection unit 6122 acquires, from the combined information storage unit 53, the soma group identifier of another soma group paired with the soma group identifier that identifies that soma group.
  • the combination detection unit 6122 acquires, for example, one or more soma identifiers that identify the soma at the start of the soma group specified by the acquired soma group identifier (soma that do not receive feature information from other soma in the same soma group).
  • Alternatively, when the soma identifier of the soma determined to fire by the determination unit 614 is stored in the storage unit 5 as the last soma of information transmission in the soma group to which the soma belongs, the soma group identifier of another soma group paired with the soma group identifier that identifies that soma group is acquired from the combined information storage unit 53.
  • the combination detection unit 6122 then acquires from the storage unit 5, for example, the soma identifier of the soma that is identified by one of the one or more soma identifiers paired with the acquired soma group identifier of the other soma group and that first receives information in that other soma group.
  • the transmission information acquisition means 6123 acquires information used for transmission of information between soma.
  • the transmission information acquisition unit 6123 acquires the characteristic information to be transmitted and the soma identifier of the transmission destination soma.
  • the transmission information acquisition unit 6123 acquires, for example, one or more pieces of feature information applied to the soma related information of the soma determined to fire by the determination unit 614 (or one or more pieces of feature information acquired from that feature information) and the soma identifiers of the one or more soma detected by the combination detection unit 6122.
  • the soma calculation means 613 performs an operation on two or more pieces of feature information passed from two or more other pieces of soma to obtain one piece of feature information.
  • the feature information here is usually an information amount. That is, the soma calculation means 613 normally calculates two or more information amounts passed from each of two or more other soma, and acquires one information amount.
  • the calculation is a predetermined calculation.
  • the calculation is, for example, a process of adding two or more information amounts passed from two or more other soma.
  • the calculation is, for example, a process of adding two or more information amounts passed from two or more other soma and then multiplying the sum by a constant less than 1.
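  • A minimal sketch of this soma calculation follows; the damping constant of 0.9 is an assumption, since this disclosure only says the calculation is predetermined.

```python
def soma_calculate(amounts: list, damping: float = 0.9) -> float:
    """Sum the information amounts passed from other soma, then scale by a constant < 1."""
    return sum(amounts) * damping
```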
  • the determination unit 614 determines whether or not the soma identified by each soma identifier fires, using the one or more pieces of feature information acquired by the information transmission unit 612 and the firing condition information paired with each of the one or more soma identifiers acquired by the information transmission unit 612.
  • the one or more feature information is, for example, an information amount.
  • the determining unit 614 acquires, from the soma-related information storage unit 51, for example, firing condition information that is paired with one or more soma identifiers acquired by the information transmitting unit 612. For example, the determination unit 614 determines whether the information amount acquired by the information transmission unit 612 matches a condition indicated by the acquired ignition condition information.
  • It is preferable that, for a soma determined to have fired, the determination unit 614 determines that the soma does not fire again until enough time has passed to satisfy a predetermined condition.
  • That is, the determination unit 614 refers to, for example, the ignition information storage unit 59 and acquires the latest timer information paired with the soma identifier of the determination target soma. Then, the determination unit 614 acquires, for example, current timer information from a timer (not shown) and obtains the elapsed time since the most recent firing from the current timer information and the latest timer information. Next, the determination unit 614 determines whether or not the elapsed time is smaller than (or equal to) a threshold value; if it is, it is preferable to determine that the soma does not fire.
  • It is preferable that the determination unit 614 uses the firing probability information to determine stochastically whether or not a soma fires even when the feature information and the firing condition information indicate firing. In other words, even if the same one or more pieces of feature information are given to one soma, it is preferable that the determination unit 614 determines that it fires, or that it does not fire, using the firing probability information corresponding to the soma. A sketch of this decision follows.
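  • The following hedged sketch combines the firing condition check, the refractory (elapsed-time) check, and the stochastic gate; the threshold-comparison form of the firing condition and the refractory period value are assumptions.

```python
import random
import time
from typing import Optional

REFRACTORY_SEC = 0.02  # assumed refractory period; the condition is left open here

def decides_to_fire(amount: float, threshold: float,
                    firing_probability: float,
                    last_fired_at: Optional[float] = None) -> bool:
    """Firing condition, refractory check, then a stochastic gate."""
    now = time.monotonic()
    if last_fired_at is not None and now - last_fired_at < REFRACTORY_SEC:
        return False   # fired too recently (elapsed time below the threshold)
    if amount < threshold:
        return False   # the firing condition information is not satisfied
    return random.random() < firing_probability
```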
  • the determination unit 614 determines that it will ignite or does not ignite according to external information.
  • Judgment means 614 stores, for example, the soma identifier of the fired soma and the timer information indicating the time of fire in the storage unit 5 in pairs. That is, when the determination unit 614 determines that soma has fired, the determination unit 614 acquires timer information from a timer (not shown). Then, ignition information including the soma identifier of the fired soma and the timer information is accumulated in the ignition information storage unit 59.
  • It is preferable that combined information that has been used once does not operate again until enough time has passed to satisfy a predetermined condition. That is, the determination unit 614 obtains, from the usage combination information storage unit 60, timer information indicating when the AXON, Dendrites, synapse, or spine used for information transmission was most recently used, compares it with timer information indicating the current time, and, when enough time has not passed to satisfy the predetermined condition, determines that information transmission using that AXON, Dendrites, synapse, or spine is not performed.
  • When the determination unit 614 determines that one soma fires, it is preferable to reduce the energy amount indicated by the stored energy amount information included in the soma related information of that soma. It is assumed that the function for reducing the energy amount is stored in the storage unit 5, for example. The specific function is not limited.
  • the function is, for example, Equation 2 below.
  • (Equation 2 is not reproduced in this text.) In Equation 2, t is time, and f(t) is the amount of stored energy.
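  • Because Equation 2 itself is not reproduced here, the following sketch merely assumes a simple exponential decay of the stored energy; the actual equation may differ.

```python
import math

def stored_energy(e0: float, t: float, tau: float = 1.0) -> float:
    """f(t) = e0 * exp(-t / tau): an assumed decay form, not the disclosed Equation 2."""
    return e0 * math.exp(-t / tau)
```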
  • the determination unit 614 determines, for each of the two or more color images received by the input information receiving unit 21 and for each of the one or more partial images, whether or not the soma identified by each soma identifier fires, using the one or more pieces of feature information acquired by the information transmission unit 612 and the firing condition information paired with each of the one or more soma identifiers acquired by the information transmission unit 612. For example, the determination unit 614 determines whether or not the one or more pieces of feature information acquired by the information transmission unit 612 satisfy the firing condition indicated by the firing condition information paired with the one or more soma identifiers acquired by the information transmission unit 612.
  • the ignition probability changing means 615 changes the ignition probability information corresponding to the soma determined to be ignited by the determining means 614 so that the probability of ignition increases. That is, the firing probability changing unit 615 acquires the soma identifier of the soma determined to be fired by the determining unit 614, and changes the firing probability information paired with the soma identifier so that the probability of firing increases.
  • the algorithm for changing the firing probability information does not matter.
  • the ignition probability changing unit 615 may add a predetermined value to the ignition probability information, may add a value of a predetermined ratio to the ignition probability information, or may determine the amount of increase based on the received one or more pieces of feature information. That is, the degree of increase in probability may be constant or may change dynamically.
  • the firing pattern acquisition unit 616 acquires a firing pattern including one or more soma identifiers that identify the soma determined by the determining unit 614 to have fired.
  • It is preferable that the firing pattern acquisition unit 616 applies the input information received by the input information receiving unit 21, or one or more pieces of feature information acquired from the input information, to the one or more pieces of learning information in the learning information storage unit 58, and acquires the firing pattern corresponding to that input information or feature information.
  • the ignition pattern here is an ignition pattern acquired using learning information.
  • the firing pattern acquisition unit 616 may acquire the firing pattern periodically, may acquire the firing pattern irregularly, or acquires the firing pattern every time the determination unit 614 detects the firing of soma. May be.
  • the time width of the ignition pattern acquired by the ignition pattern acquisition means 616 is not limited.
  • the firing pattern acquisition unit 616 may acquire, from the firing information storage unit 59, one or more soma identifiers paired with timer information indicating a time within a threshold of the current time, or the most recent time.
  • the firing pattern acquisition unit 616 may acquire one or more soma identifiers included in all the firing information in the firing information storage unit 59 from the firing information storage unit 59. It is preferable that the temporal width of the ignition pattern acquired by the ignition pattern acquisition unit 616 is dynamically changed.
  • the firing pattern acquisition unit 616 acquires, for each of the two or more color images received by the input information receiving unit 21 and for each of the one or more partial images, a firing pattern including one or more soma identifiers that identify the soma determined to fire by the determination unit 614.
  • the output information acquisition unit 617 acquires output information corresponding to the ignition pattern acquired by the ignition pattern acquisition unit 616 from the output management information storage unit 56.
  • the output information corresponding to the ignition pattern is usually output information that is paired with an ignition pattern that is similar to the ignition pattern acquired by the ignition pattern acquisition unit 616 and satisfies a predetermined condition.
  • the predetermined condition is, for example, that a number of soma identifiers equal to or greater than a threshold value, among the one or more soma identifiers included in the acquired firing pattern A, are also included in the stored firing pattern B.
  • the predetermined condition may also be, for example, that a number of soma identifiers equal to or greater than a threshold value, among the one or more soma identifiers included in the stored firing pattern B, are also included in the acquired firing pattern A.
  • the predetermined condition may also be, for example, that a ratio of soma identifiers equal to or greater than a threshold value, among the soma identifiers included in the firing pattern A, are also included in the firing pattern B.
  • the predetermined condition may also be, for example, that a ratio of soma identifiers equal to or greater than a threshold value, among the soma identifiers included in the firing pattern B, are also included in the firing pattern A. A sketch of these similarity checks follows.
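  • The following sketch treats firing patterns as sets of soma identifiers and covers the four conditions above; swapping the arguments tests the opposite direction, and which threshold applies is part of the output condition. All names are illustrative.

```python
def pattern_similar(pattern_a: set, pattern_b: set,
                    count_min: int = None, ratio_min: float = None) -> bool:
    """Check whether enough identifiers of pattern_a also appear in pattern_b.

    count_min and ratio_min correspond to the number-based and ratio-based
    thresholds; call pattern_similar(b, a, ...) for the opposite direction."""
    overlap = pattern_a & pattern_b
    if count_min is not None and len(overlap) >= count_min:
        return True
    if ratio_min is not None and pattern_a and len(overlap) / len(pattern_a) >= ratio_min:
        return True
    return False
```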
  • the output information acquisition unit 617 detects an ignition pattern that is similar, to the extent of satisfying a predetermined condition, to the ignition pattern acquired by the ignition pattern acquisition unit 616, determines stochastically whether or not the output condition is satisfied using the output probability information paired with that ignition pattern, and, when it is determined that the output condition is satisfied, acquires the output information paired with the output condition from the output management information storage unit 56.
  • the output information acquisition unit 617 may determine an output condition that matches the ignition pattern acquired by the ignition pattern acquisition unit 616 and the one or more pieces of external information received by the input information receiving unit 21, and acquire the output information paired with that output condition.
  • the output information acquisition unit 617 determines an output condition from the output conditions stored in the output management information storage unit 56, and acquires output information paired with the output condition.
  • For example, the output information acquisition unit 617 detects, from the output management information storage unit 56, an ignition pattern that is similar, to the extent of satisfying a predetermined condition, to the ignition pattern acquired by the ignition pattern acquisition unit 616, and acquires the output information paired with that ignition pattern and with the one or more pieces of external information.
  • Further, the output information acquisition unit 617 may use the output probability information paired with the detected ignition pattern and the one or more pieces of external information to determine stochastically whether or not the output condition is satisfied, and acquire the output information paired with the output condition from the output management information storage unit 56 only when it is satisfied.
  • Note that the output information acquisition unit 617 may not always be able to acquire output information.
  • the output information acquisition unit 617 acquires, for each of the two or more color images received by the input information receiving unit 21 and for each of the one or more partial images, the output information corresponding to the ignition pattern acquired by the ignition pattern acquisition unit 616 from the output management information storage unit 56, and constructs a changed image from the acquired one or more pieces of output information.
  • the control unit 618 performs control so that the process of the determination unit 614, the process of the ignition pattern acquisition unit 616, and the process of the information transmission unit 612 are repeated twice or more.
  • the learning detection unit 62 detects a learning condition that matches the ignition pattern acquired by the ignition pattern acquisition means 616.
  • When some of the soma identifiers of the ignition pattern acquired by the ignition pattern acquisition unit 616 and all or some of the soma identifiers constituting the ignition pattern included in the learning condition are similar enough to satisfy a predetermined condition, the learning detection unit 62 determines that the ignition pattern acquired by the ignition pattern acquisition unit 616 matches the learning condition.
  • the detection of the learning condition is, for example, acquisition of a learning condition identifier for identifying the learning condition, acquisition of information indicating that the learning condition is met, and the like.
  • the learning information accumulation unit 63 accumulates the learning information in the learning information storage unit 58 when the learning detection unit 62 detects a matching learning condition.
  • the learning information includes the input information that is the basis of the ignition pattern acquired by the ignition pattern acquisition unit 616 (or one or more pieces of feature information acquired from the input information) and a firing pattern having at least some of the soma identifiers of the ignition pattern acquired by the ignition pattern acquisition unit 616. The at least some soma identifiers are, for example, the one or more soma identifiers remaining after excluding, from the firing pattern acquired by the firing pattern acquisition unit 616, the one or more soma identifiers used for detecting the learning condition.
  • the growth unit 64 performs one or more of a soma generation process, a combined information generation process, a combined information growth process, and a glial cell generation process.
  • the soma generation process is a process for generating soma related information having a soma identifier.
  • the soma generation process, for example, generates a unique soma identifier, generates soma position information of a position that satisfies a predetermined condition relative to the position indicated by the soma position information included in the soma related information of the split-source soma, and accumulates soma related information having the soma identifier and the soma position information in the soma related information storage unit 51.
  • the soma generation process may be, for example, a process of copying a part of information constituting the soma related information of the split source soma and generating the soma related information of the split soma having the information.
  • the information to be copied is, for example, ignition condition information.
  • the growing unit 64 preferably uses the soma position information included in the soma related information to be generated as the position information that does not overlap with the position of other elements. Further, it is preferable that the growing unit 64 uses the soma position information included in the soma related information to be generated as information of a position that is close enough to satisfy a predetermined condition with the position indicated by the soma position information of the split-source soma.
  • the combined information generation process is a process of generating combined information and storing the combined information in the combined information storage unit 53.
  • the combined information generation process is, for example, a process of acquiring the soma identifiers of two soma to be combined, generating combined information having the two soma identifiers, and accumulating them in the combined information storage unit 53.
  • the combined information generation process is a process of acquiring, for example, the AXON identifier of the AXON to be combined and the Dendrites identifier of Dendrites, generating combined information having the two identifiers, and storing the combined information in the combined information storage unit 53.
  • The combined information generation process, for example, generates a synapse identifier of a synapse that transmits information and a spine identifier of a spine to which the information is transmitted, generates combined information having the two identifiers, and accumulates it in the combined information storage unit 53.
  • the combined information growth process is a process for growing combined information.
  • the combined information growth process is, for example, changing the position of the AXON position information included in the AXON information in a direction in which the AXON extends.
  • the combined information growth process is, for example, changing the position of the Dendrites position information included in the Dendrites information in a direction in which the Dendrites expands.
  • It is preferable that the growing unit 64 performs a soma generation process of generating soma related information of a split soma, which is a new soma obtained by splitting a soma for which the number or frequency of firings determined by the determination unit 614 is large enough to satisfy a predetermined condition, and accumulating the soma related information in the soma related information storage unit 51 (a sketch of this division follows below).
  • It is also preferable that the growing unit 64 performs a combined information generation process of generating combined information that couples a soma for which the number or frequency of firings determined by the determination unit 614 satisfies a predetermined condition with its split soma, and accumulating the combined information in the combined information storage unit 53.
  • the split soma is a soma obtained by splitting the soma from which the split soma was generated.
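  • The following hedged sketch shows one way such a split could be realized: a new soma is generated near a frequently firing parent, part of the parent's information (here, the firing condition) is copied, and the two are coupled. The dict layout, offset range, and identifier scheme are illustrative assumptions.

```python
import random
import uuid

def split_soma(parent: dict, store: dict, links: list) -> dict:
    """Generate a split soma near its parent and couple the two."""
    child_id = str(uuid.uuid4())  # a unique soma identifier
    offset = [random.uniform(-1.0, 1.0) for _ in range(3)]  # offset range is assumed
    child = {
        "soma_id": child_id,
        # copy part of the split-source soma's information, e.g. the firing condition
        "firing_condition": dict(parent["firing_condition"]),
        # a position close to, but not overlapping, the parent
        "position": tuple(p + o for p, o in zip(parent["position"], offset)),
    }
    store[child_id] = child                      # soma generation process
    links.append((parent["soma_id"], child_id))  # combined information generation process
    return child
```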
  • the growing unit 64 generates the soma related information of the split soma that is a new soma in which the connected glial cell information satisfies a predetermined condition, and stores the soma related information in the soma related information storage unit 51. It is preferable to perform the generation process.
  • the growing unit 64 performs a combined information growth process for growing AXON or Dendrites in which the connected glial cell information satisfies a predetermined condition.
  • It is preferable that the growth unit 64 performs a combined information generation process of generating combined information having the soma identifier of one soma and the soma identifier of the other soma to be combined with it, and accumulating the combined information in the combined information storage unit 53.
  • the growth unit 64 performs the following glial cell generation process. That is, for example, when the amount of energy held by an element that is a soma, an AXON, or a Dendrites decreases enough, relative to the required energy amount, to satisfy a predetermined condition, the growth unit 64 generates glial cell information for a glial cell connected to the element.
  • For example, when the growth unit 64 determines that the stored energy amount indicated by the stored energy amount information included in the information of an element (soma related information, AXON information, or Dendrites information) is small enough, compared with the required energy amount indicated by the required energy amount information included in that information, to satisfy a predetermined condition, the growth unit 64 generates glial cell information having an identifier (a soma identifier, an AXON identifier, or a Dendrites identifier) that identifies the element, and accumulates it in the glial cell information storage unit 54.
  • the apoptosis processing unit 65 may delete the soma related information from the soma related information storage unit 51.
  • When deleting soma-related information, the apoptosis processing unit 65 preferably also deletes the information on the AXON connected to the soma corresponding to that soma-related information and the information on the Dendrites connected to that soma.
  • The apoptosis processing unit 65 preferably deletes, from the combined information storage unit 53, the binding information having the soma identifier included in the deleted soma-related information, the binding information having an AXON identifier included in the deleted soma-related information, and the binding information having a Dendrites identifier included in the deleted soma-related information. Further, when deleting soma-related information from the soma-related information storage unit 51, the apoptosis processing unit 65 may delete the connection information to the AXON and Dendrites connected to the soma corresponding to that soma-related information.
  • the apoptosis processing unit 65 may delete the AXON information. Further, the apoptosis processing unit 65 may delete the Dendrites information. When soma does not undergo apoptosis and only AXON or Dendrites undergoes apoptosis, the apoptosis processing unit 65 deletes, for example, the AXON information of the apoptotic AXON or the Dendrites information of the apoptotic Dendrites from the soma-related information.
  • The apoptosis processing unit 65 preferably deletes the binding information having the AXON identifier of an apoptotic AXON and the binding information having the Dendrites identifier of apoptotic Dendrites from the binding information storage unit 53.
  • The apoptosis processing unit 65 deletes, from the glial cell information storage unit 54, the information regarding the connection between the deleted soma and the glial cells connected to it. That is, for example, the apoptosis processing unit 65 deletes the soma identifier from any glial cell information including the soma identifier of the deleted soma.
  • When an AXON or Dendrites is deleted, the apoptosis processing unit 65 deletes the AXON identifier or Dendrites identifier from the glial cell information that includes it. That is, when information on a soma, AXON, or Dendrites is deleted, it is preferable that the apoptosis processing unit 65 also delete the information on the connection to that soma, AXON, or Dendrites from the glial cell information.
  • the apoptosis processing unit 65 deletes the soma related information from the soma related information storage unit 51 according to a predetermined condition.
  • For example, the apoptosis processing unit 65 deletes soma-related information from the soma-related information storage unit 51 when the amount of soma-related information stored in the soma-related information storage unit 51 is large enough to satisfy a predetermined condition.
  • The apoptosis processing unit 65 determines a soma that is not connected to any other soma, Dendrites, or AXON, and deletes the soma-related information having the soma identifier of the determined soma from the soma-related information storage unit 51.
  • For example, the apoptosis processing unit 65 inspects the binding information storage unit 53, acquires a soma identifier whose number of occurrences is equal to or less than a threshold (or less than the threshold), and deletes the soma-related information including that soma identifier from the soma-related information storage unit 51.
  • The apoptosis processing unit 65 inspects the soma-related information storage unit 51 and deletes from it any soma-related information whose number of Dendrites information items, or whose number of AXON information items, is equal to or less than a threshold (or less than the threshold).
  • The apoptosis processing unit 65 preferably determines a soma whose connected AXON has not reached a predetermined goal, and deletes the soma-related information having the soma identifier of the determined soma from the soma-related information storage unit 51.
  • For example, the apoptosis processing unit 65 inspects the soma-related information storage unit 51, compares the AXON position information included in the soma-related information with the goal information to determine a soma whose AXON has not reached the predetermined goal, and deletes the soma-related information having the soma identifier of the determined soma from the soma-related information storage unit 51.
  • The apoptosis processing unit 65 preferably uses the one or more pieces of firing information in the firing information storage unit 59 to determine a soma identifier whose firing frequency is low enough to satisfy a predetermined condition, and deletes the soma-related information having that soma identifier from the soma-related information storage unit 51. For example, the apoptosis processing unit 65 determines a soma identifier whose number of appearances is equal to or less than a threshold (or less than the threshold), and deletes the soma-related information having that soma identifier from the soma-related information storage unit 51.
  • the apoptosis processing unit 65 deletes the information on AXON or the information on Dendrites according to a predetermined condition.
  • The information related to an AXON is, for example, the AXON information, the combined information including the AXON identifier, and the AXON identifier in glial cell information.
  • The information related to Dendrites is, for example, the Dendrites information, the binding information including the Dendrites identifier, and the Dendrites identifier in glial cell information.
  • When the apoptosis processing unit 65 determines that the network has become overcrowded and that there is an AXON or Dendrites that is not connected anywhere, it deletes the information regarding that AXON or the information regarding those Dendrites.
  • Becoming overcrowded means that the number of one or more types of information among the soma-related information, AXON information, Dendrites information, and glial cell information is large enough to satisfy a predetermined condition.
  • the elements are soma, AXON, Dendrites, glial cells, synapses, and spines.
  • The ignition information accumulation unit 66 configures ignition information having a soma identifier for identifying the soma determined by the judging means 614 to have fired, and stores the ignition information in the ignition information storage unit 59.
  • the ignition information accumulation unit 66 acquires, for example, a soma identifier for identifying the soma determined to be fired by the determining unit 614, acquires timer information indicating the current time from a timer (not shown), and the soma identifier and the timer information And the ignition information is stored in the ignition information storage unit 59.
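  • A sketch of the firing information just described: a soma identifier paired with timer information. This is a minimal Python model; the record and function names are assumptions.

```python
import time
from dataclasses import dataclass

@dataclass
class FiringInfo:
    soma_id: str
    timer: float   # timer information: when the firing was recorded

def accumulate_firing(unit_59: list, soma_id: str) -> None:
    """Sketch of the ignition information accumulation unit 66: pair the
    fired soma's identifier with the current time and store the record
    in the ignition information storage unit 59 (modeled as a list)."""
    unit_59.append(FiringInfo(soma_id, time.time()))
```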
  • the output unit 4 outputs various information.
  • the various information is, for example, output information.
  • the output information is, for example, two or more changed images.
  • Output refers to display on a display, projection using a projector, printing with a printer, sound output, vibration output by a vibrator, transmission to an external device, and accumulation in a recording medium, and is a concept that also includes delivery of processing results to other processing devices or other programs.
  • the output unit 4 outputs an attention region transition that is a change in one or more attention regions determined by the attention region determination unit 33 so as to be visually recognizable.
  • the information output unit 41 outputs the output information acquired by the output information acquisition unit 617.
  • The output destination of the output information is not limited.
  • the output destination of the output information may be outside the video processing apparatus B, or may be delivered to other processes in the video processing apparatus B.
  • The soma-related information storage unit 51, soma group information storage unit 52, combined information storage unit 53, glial cell information storage unit 54, firing start point information storage unit 55, output management information storage unit 56, learning condition storage unit 57, learning information storage unit 58, ignition information storage unit 59, and use combination information storage unit 60 are preferably non-volatile recording media, but can also be realized by volatile recording media.
  • information may be stored in the storage unit 5 or the like via a recording medium
  • information transmitted via a communication line or the like may be stored in the storage unit 5 or the like
  • information input via the input device may be stored in the storage unit 5 or the like.
  • The processing unit 6, change unit 61, learning detection unit 62, learning information storage unit 63, growth unit 64, apoptosis processing unit 65, ignition information accumulation unit 66, feature information acquisition means 611, information transmission means 612, soma calculation means 613, judging means 614, firing probability changing means 615, firing pattern obtaining means 616, output information obtaining means 617, and control means 618 can usually be realized by an MPU, a memory, or the like.
  • the processing procedure of the processing unit 6 and the like is usually realized by software, and the software is recorded on a recording medium such as a ROM. However, it may be realized by hardware (dedicated circuit).
  • the information output unit 41 may or may not include an output device such as a display or a speaker.
  • the information output unit 41 may be realized by output device driver software, or output device driver software and an output device.
  • Step S2101 The input information receiving unit 21 receives external information and temporarily stores it in the storage unit 5. Note that the input information receiving unit 21 may not receive external information.
  • Step S2102 The input information receiving unit 21 determines whether or not input information has been received. If the input information is accepted, the process goes to step S2103. If the input information is not accepted, the process goes to step S2104.
  • the input information here is, for example, a video.
  • Step S2103 The video processing apparatus B performs a firing transmission process.
  • the process returns to step S2101.
  • the ignition transmission process is a process in which information is transmitted between soma. Details of the example of the ignition transmission process will be described with reference to the flowchart of FIG.
  • Step S2104 The processing unit 6 determines whether or not to perform the firing pattern process. If the firing pattern processing is to be performed, the process proceeds to step S2105. If the firing pattern processing is not to be performed, the process proceeds to step S2106. Note that the processing unit 6 may determine that the firing pattern processing is always performed, or may determine that the firing pattern processing is performed at regular intervals.
  • Step S2105 The video processing apparatus B performs a firing pattern process.
  • the process returns to step S2101.
  • the firing pattern process is a process performed using the firing pattern, and includes, for example, a process for determining output information using the firing pattern and outputting the output information, and a learning process. Details of the example of the firing pattern processing will be described with reference to the flowchart of FIG.
  • Step S2106 The processing unit 6 determines whether to perform a growth process and an apoptosis process. If the growth process or the like is performed, the process proceeds to step S2107. If the growth process or the like is not performed, the process returns to step S2101. Note that the processing unit 6 may determine that the growth process is always performed, or may determine that the growth process is performed at regular intervals. The conditions for determining whether or not to perform the growth process or the like are not limited. Further, the growth process and the apoptosis process may be performed as a set, or it may be determined individually whether or not to perform the process.
  • Step S2107 The growth unit 64 performs a soma growth process.
  • An example of the soma growth process will be described with reference to the flowchart of FIG.
  • Step S2108 The growth unit 64 performs an AXON growth process.
  • An example of the AXON growth process will be described with reference to the flowchart of FIG.
  • Step S2109 The growth unit 64 performs a Dendrites growth process. An example of the Dendrites growth process will be described with reference to the flowchart of FIG.
  • Step S2110 The growth unit 64 performs a soma coupling process.
  • An example of the soma combining process will be described with reference to the flowchart of FIG.
  • Step S2111 The growth unit 64 performs a glial cell growth process. An example of the glial cell growth process will be described with reference to the flowchart of FIG.
  • Step S2112 The apoptosis processing unit 65 performs an apoptosis process.
  • the process returns to step S2101.
  • An example of the apoptosis process will be described with reference to the flowchart of FIG.
  • the firing transmission process, the firing pattern process, the growth process, the apoptosis process, and the like are performed in parallel.
  • the firing transmission process of each soma is also processed in parallel.
  • The processing ends when the power is turned off or an interrupt for ending the processing occurs.
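  • Read as a procedure, steps S2101 to S2112 form an event loop that dispatches the four kinds of processing. The following is a minimal single-threaded Python sketch; in the embodiment the firing transmission, firing pattern, growth, and apoptosis processes run in parallel, which is omitted here, and every method name on `device` is an assumption standing in for the corresponding unit.

```python
def main_loop(device):
    """Single-threaded sketch of the top-level flow (steps S2101-S2112)."""
    while device.powered_on:                      # ends on power-off / interrupt
        info = device.receive_input()             # S2101/S2102: accept input information
        if info is not None:
            device.fire_and_transmit(info)        # S2103: firing transmission process
        if device.should_run_firing_pattern():    # S2104
            device.firing_pattern_process()       # S2105: output and learning
        if device.should_run_growth():            # S2106
            device.grow_soma()                    # S2107
            device.grow_axons()                   # S2108
            device.grow_dendrites()               # S2109
            device.couple_soma()                  # S2110
            device.grow_glial_cells()             # S2111
            device.apoptosis()                    # S2112
```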
  • the feature information acquisition unit 611 acquires one or more feature information.
  • the one or more pieces of feature information here are, for example, red component information, green component information, blue component information, and luminance in the color information of each pixel of the color image.
  • the method by which the feature information acquisition unit 611 acquires one or more feature information is, for example, any one of the following methods (1) to (3).
  • the feature information acquisition unit 611 analyzes the input information received in step S2102 and acquires one or more feature information.
  • the one or more pieces of feature information are, for example, a set of an information identifier and an information amount, and one or more sets.
  • the information transmission unit 612 acquires one or more pieces of feature information applied to the soma related information of the soma determined to be ignited.
  • the one or more pieces of feature information are, for example, one information amount.
  • For example, the information transmission means 612 selects one or more pieces of feature information from among the one or more pieces of feature information applied to the soma-related information of the soma determined to have fired.
  • the information transmission unit 612 calculates one or more pieces of feature information received by soma determined to be ignited using a predetermined calculation formula, and acquires one or more pieces of feature information. An example of such calculation is addition, for example.
  • Step S2202 The information transmission unit 612 determines one or more soma to which the one or more feature information acquired in Step S2201 is passed.
  • the soma determination method is, for example, one of the following (1) to (3).
  • (1) The information transmission unit 612 acquires the predetermined one or more soma identifiers of soma-related information in the soma-related information storage unit 51.
  • the predetermined one or more soma identifiers of the soma are stored, for example, in the storage unit 5, and the information transmission unit 612 acquires the one or more soma identifiers from the storage unit 5.
  • the one or more soma identifiers are the soma identifiers of the soma that first accepts the one or more feature information acquired from the input information received from the outside.
  • (2) The information transmission unit 612 acquires, from the ignition start point information storage unit 55, the one or more soma identifiers paired with the information identifier included in the feature information acquired by the feature information acquisition means 611.
  • (3) The information transmission unit 612 refers to the combined information storage unit 53 and acquires the one or more soma identifiers of the soma coupled ahead of the soma to be processed. That is, the information transmission unit 612 realizes process (3) by, for example, any one of the following processes (a) to (c).
  • (a) The information transmission unit 612 acquires, from the combined information storage unit 53, for example, the Dendrites identifiers paired with the one or more AXON identifiers included in the soma-related information of the soma to be processed. Then, the information transmission unit 612 acquires, from the soma-related information storage unit 51, the one or more soma identifiers included in the soma-related information having each of the acquired Dendrites identifiers.
  • (b) The information transmission unit 612 acquires, from the combined information storage unit 53, for example, the one or more soma identifiers paired with the soma identifier included in the soma-related information of the soma to be processed.
  • (c) The information transmission unit 612 acquires, from the combined information storage unit 53, for example, the spine identifiers paired with the one or more synapse identifiers included in the soma-related information of the soma to be processed. Then, the information transmission means 612 acquires, from the soma-related information storage unit 51, the one or more soma identifiers included in the soma-related information having each of the acquired spine identifiers.
  • Step S2203 The processing unit 6 substitutes 1 for the counter i.
  • Step S2204 The processing unit 6 determines whether or not the i-th soma exists in the one or more somas determined in step S2202. If the i-th soma exists, the process goes to step S2205. If the i-th soma does not exist, the process returns to the upper process. Whether or not the i-th soma exists is determined by the processing unit 6 based on whether or not the i-th soma identifier of the soma identifiers acquired in step S2202 exists.
  • Step S2205 The determination means 614 performs an ignition determination process. Details of the firing determination process will be described with reference to the flowchart of FIG.
  • Step S2206 If the judgment result in step S2205 is "fired", the processing unit 6 goes to step S2207; if it is "not fired", the processing unit 6 goes to step S2211.
  • Step S2207 The ignition information accumulation unit 66 constitutes ignition information having a soma identifier for identifying the soma determined by the judging means 614 to have fired, and stores the ignition information in the ignition information storage unit 59. For example, the ignition information accumulation unit 66 acquires timer information from a timer (not shown), constitutes ignition information having the soma identifier and the timer information, and stores it in the ignition information storage unit 59.
  • Step S2208 The firing probability changing means 615 changes the firing probability information paired with the soma identifier of the soma determined by the judging means 614 to have fired, so that the firing probability increases. The changed firing probability information is information stored in the soma-related information storage unit 51.
  • Step S2209 The processing unit 6 determines whether or not to end the transmission of the feature information to the soma ahead of the i-th soma or the soma group ahead of the i-th soma. If the transmission is to end, the process goes to step S2211; if not, it goes to step S2210. Note that the processing unit 6, for example, refers to the combined information storage unit 53 and determines whether there is a soma ahead of the i-th soma; if there is no soma ahead of the i-th soma, the transmission of the feature information is ended.
  • Step S2210 The processing unit 6 executes the ignition transmission process for the soma ahead of the i-th soma or the soma group ahead of the i-th soma.
  • Such a process is an ignition transmission process. That is, the ignition transmission process is a recursive process.
  • Step S2211 The processing unit 6 increments the counter i by 1. The process returns to step S2204.
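  • The recursive character of the ignition transmission process (steps S2201-S2211) can be sketched as follows. `net` is an assumed object bundling the storage units; `judge_firing` corresponds to the firing determination process described next, and every method name here is an assumption, not the specification's API.

```python
def fire_and_transmit(net, soma_id, features):
    """Recursive sketch of the ignition transmission process (S2201-S2211)."""
    if not net.judge_firing(soma_id, features):        # S2205/S2206: firing determination
        return                                         # not fired: stop this branch
    net.accumulate_firing(soma_id)                     # S2207: record firing information
    net.raise_firing_probability(soma_id)              # S2208: firing becomes more likely
    out = net.transform_features(soma_id, features)    # S2201: e.g. summation of inputs
    for next_id in net.coupled_ahead(soma_id):         # S2202: soma ahead, via combined info
        fire_and_transmit(net, next_id, out)           # S2210: recursive transmission
```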
  • Step S2301 The determination unit 614 acquires the firing condition information included in the soma related information of the soma to be processed from the soma related information storage unit 51.
  • Step S2302 The determination unit 614 determines whether or not external information is received in Step S301. If external information has been received, the process proceeds to step S2303. If external information has not been received, the process proceeds to step S2304.
  • Step S2303 The determination unit 614 acquires the external information received in step S301 or one or more pieces of feature information acquired from the external information.
  • the process of acquiring one or more feature information from the external information may be performed by the feature information acquisition unit 611, the determination unit 614, or the like.
  • Step S2304 The judging means 614 applies one or more pieces of feature information or the like to the firing condition information acquired in Step S2301, and judges whether or not to fire.
  • the one or more feature information or the like is one or more feature information, or one or more feature information and external information.
  • Step S2305 The determination means 614 substitutes the determination result in Step S2304 for the variable “return value”. Return to upper process.
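  • A minimal sketch of the firing determination process (steps S2301-S2305), continuing the assumed `net` interface above. Modeling the firing condition information as a threshold on the summed feature amounts is an assumption; the specification leaves the form of the condition open.

```python
def judge_firing(net, soma_id, features, external=None):
    """Firing determination sketch (S2301-S2305): returns the judgment result."""
    condition = net.firing_condition(soma_id)          # S2301: firing condition information
    inputs = list(features)
    if external is not None:                           # S2302/S2303: external information
        inputs.extend(net.features_from_external(external))
    return sum(inputs) >= condition.threshold          # S2304/S2305: fire or not
```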
  • Step S2401 The ignition pattern acquisition means 616 refers to the ignition information storage unit 59 and acquires an ignition pattern having one or more soma identifiers. Moreover, it is preferable that the firing pattern acquisition means 616 also acquire, from the learning information storage unit 58, the firing pattern corresponding to the input information or to one or more pieces of feature information acquired from the input information. That is, it is preferable to use the firing pattern learned in association with the input information or the like for the following processing.
  • Step S2402 The output information acquisition unit 617 determines whether or not external information is accepted in Step S301. If external information is accepted, the process goes to step S2403, and if external information is not accepted, the process goes to step S2404.
  • Step S2403 The output information acquisition unit 617 acquires the external information received in Step S301 or one or more feature information acquired from the external information.
  • the process of acquiring one or more feature information from the external information may be performed by the feature information acquisition unit 611, the output information acquisition unit 617, or the like.
  • the external information accepted in step S301 is usually external information stored in the storage unit 5.
  • Step S2404 The output information acquisition means 617 substitutes 1 for the counter i.
  • Step S2405 The output information acquisition unit 617 determines whether or not the i-th output management information exists in the output management information storage unit 56. If the i-th output management information exists, the process goes to step S2406. If the i-th output management information does not exist, the process goes to step S2411.
  • Step S2406 The output information acquisition unit 617 acquires the output condition of the i-th output management information from the output management information storage unit 56.
  • Step S2407 The output information acquisition unit 617 determines whether the ignition pattern acquired in step S2401, or the ignition pattern acquired in step S2401 together with the information acquired in step S2403, matches the output condition acquired in step S2406. If the output condition is met, the process goes to step S2408; if not, it goes to step S2410.
  • Step S2408 The output information acquisition unit 617 acquires the output information included in the i-th output management information.
  • Step S2409 The information output unit 41 outputs the output information acquired in step S2408.
  • Step S2410 The output information acquisition means 617 increments the counter i by one. The process returns to step S2405.
  • Step S2411 The learning information storage unit 63 acquires the input information or one or more pieces of feature information acquired from the input information.
  • Step S2412 The learning detection unit 62 assigns 1 to the counter i.
  • Step S2413 The learning detection unit 62 determines whether or not the i-th learning condition exists in the learning condition storage unit 57. If the i-th learning condition exists, the process goes to step S2414; otherwise, the process goes to step S2417.
  • Step S2414 The learning detection unit 62 determines whether or not the ignition pattern acquired in Step S2401 matches the i-th learning condition. If they match, go to step S2415; otherwise, go to step S2417.
  • Step S2415 The learning information storage unit 63 acquires one or more soma identifiers to be stored from the firing pattern acquired in step S2401. Then, the learning information storage unit 63 configures learning information having the one or more soma identifiers and the input information or one or more feature information acquired in step S2411.
  • The one or more soma identifiers constitute a firing pattern.
  • Step S2416 The learning information accumulation unit 63 accumulates the learning information configured in step S2415.
  • Step S2417 The learning detection unit 62 increments the counter i by one. The process returns to step S2413.
  • Steps S2401 to S2410 are the output information output process, and steps S2411 to S2417 are the learning process.
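  • The two halves of the firing pattern process can be sketched together as follows, again on the assumed `net` interface; `matches` stands in for the unspecified matching of a firing pattern against an output condition or learning condition.

```python
def firing_pattern_process(net, external=None):
    """Sketch of the output information output process (S2401-S2410)
    followed by the learning process (S2411-S2417)."""
    pattern = net.acquire_firing_pattern()                 # S2401
    extra = net.features_from_external(external) if external else None
    for condition, out_info in net.output_management():    # S2405-S2410 loop
        if net.matches(pattern, extra, condition):         # S2407: output condition met?
            net.output(out_info)                           # S2408/S2409
    for learn_condition in net.learning_conditions():      # S2413-S2417 loop
        if net.matches(pattern, None, learn_condition):    # S2414: learning condition met?
            # S2415/S2416: pair the firing pattern with the input information
            net.store_learning_info(pattern, net.current_input())
```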
  • Next, an example of the soma growth process in step S2107 will be described using the flowchart of FIG.
  • Step S2501 The growth unit 64 substitutes 1 for the counter i.
  • Step S2502 The growth unit 64 determines whether or not the i-th soma related information exists in the soma related information storage unit 51. When the i-th soma related information exists, the process goes to step S2503, and when the i-th soma related information does not exist, the process returns to the higher-level process.
  • Step S2503 The growth unit 64 acquires the ignition information corresponding to the i-th soma identifier included in the i-th soma related information from the ignition information storage unit 59.
  • the ignition information is a history of the i-th soma firing.
  • The growth unit 64 judges whether or not the acquired ignition information satisfies a predetermined condition. For example, the growth unit 64 determines whether or not the number of pieces of ignition information including the i-th soma identifier is large enough to satisfy a predetermined condition; if so, it determines that the condition is satisfied.
  • the ignition information acquired by the growth unit 64 may be, for example, ignition information having timer information within a certain period from the present time.
  • Step S2504 The growth unit 64 acquires information regarding the glial cell information corresponding to the i-th soma.
  • the information on glial cell information here may be glial cell information of one or more glial cells bound to the i-th soma, or the number of glial cells bound to the i-th soma.
  • Step S2505 The growth unit 64 determines whether or not the information regarding glial cell information acquired in Step S2504 satisfies a predetermined condition. If the condition is satisfied, the process goes to step S2506. If the condition is not satisfied, the process goes to step S2508.
  • the predetermined condition is a condition for splitting soma.
  • the condition for splitting soma is a condition for generating split soma.
  • The predetermined condition is, for example, such that the larger the number of glial cells indicated by the information regarding glial cell information, the more readily the condition is satisfied.
  • For example, the growth unit 64 calculates the number of glial cell identifiers paired with the i-th soma identifier, and determines that the predetermined condition is satisfied when the number is equal to or greater than a threshold. Note that the growth unit 64 may also judge that the predetermined condition is satisfied when the number of one or more types of elements, among the number of soma-related information items, the number of AXON information items, the number of Dendrites information items, and the like, is equal to or less than a threshold (or less than the threshold).
  • Step S2506 The growth unit 64 generates the soma-related information of the split soma obtained by splitting the i-th soma, and accumulates it in the soma-related information storage unit 51. The method by which the growth unit 64 generates the soma-related information of the split soma is not limited.
  • Step S2507 The growth unit 64 generates bond information that combines the i-th soma and the split soma generated in step S2506, and accumulates them in the bond information storage unit 53.
  • Step S2508 The growth unit 64 increments the counter i by one. The process returns to step S2502.
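  • A sketch of the soma growth process (steps S2501-S2508) under the same assumed interface; `split_threshold` stands in for the predetermined condition on the glial cell count and is an assumed parameter.

```python
def soma_growth(net, split_threshold=3):
    """Soma growth sketch (S2501-S2508): split soma that fire often
    and have accumulated enough glial cells."""
    for soma in list(net.all_soma()):                        # S2502 loop
        if not net.fired_often_enough(soma.id):              # S2503: firing history check
            continue
        glia = net.glial_cells_bound_to(soma.id)             # S2504
        if len(glia) >= split_threshold:                     # S2505: condition to split
            child = net.create_split_soma(soma)              # S2506: accumulate split soma
            net.accumulate_combined_info(soma.id, child.id)  # S2507: couple parent and child
```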
  • Step S2601 The growth unit 64 substitutes 1 for the counter i.
  • Step S2602 The growth unit 64 determines whether or not the i-th AXON information exists in the soma related information storage unit 51. If the i-th AXON information exists, the process goes to step S2603, and if the i-th AXON information does not exist, the process returns to the upper level process.
  • Step S2603 The growth unit 64 acquires information regarding the glial cell information corresponding to the i-th AXON information.
  • the information on glial cell information here may be glial cell information of one or more glial cells bound to the i-th AXON, or the number of glial cells bound to the i-th AXON.
  • Step S2604 The growth unit 64 determines whether or not the information on glial cell information acquired in Step S2603 satisfies a predetermined condition. If the condition is satisfied, the process goes to step S2605. If the condition is not satisfied, the process goes to step S2606.
  • The predetermined condition is a condition for the AXON to expand. For example, the condition is such that the larger the number of glial cells indicated by the information regarding glial cell information, the more readily it is satisfied.
  • Step S2605 The growth unit 64 changes the AXON position information included in the i-th AXON information so that the AXON is expanded.
  • Step S2606 The growth unit 64 increments the counter i by one. The process returns to step S2602.
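  • The AXON growth process (steps S2601-S2606) reduces to moving the stored tip position outward when enough glial cells are bound. In the sketch below, the direction vector, step size, and threshold are assumed representations of "expanding"; the Dendrites growth process of steps S2701-S2706 is the same loop over Dendrites information.

```python
def axon_growth(net, step=1.0, growth_threshold=2):
    """AXON growth sketch (S2601-S2606); growth_threshold is assumed."""
    for axon in net.all_axons():                      # S2602 loop
        glia = net.glial_cells_bound_to(axon.id)      # S2603
        if len(glia) >= growth_threshold:             # S2604: condition to expand
            x, y, z = axon.tip_position
            dx, dy, dz = axon.direction               # assumed growth direction
            axon.tip_position = (x + step * dx,       # S2605: update position info
                                 y + step * dy,
                                 z + step * dz)
```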
  • Next, the Dendrites growth process in step S2109 will be described with reference to the flowchart of FIG.
  • Step S2701 The growth unit 64 substitutes 1 for the counter i.
  • Step S2702 The growth unit 64 determines whether or not the i-th Dendrites information exists in the soma related information storage unit 51. If the i-th Dendrites information exists, the process goes to step S2703, and if the i-th Dendrites information does not exist, the process returns to the upper process.
  • Step S2703 The growth unit 64 acquires information regarding the glial cell information corresponding to the i-th Dendrites information.
  • the information related to glial cell information here may be glial cell information of one or more glial cells bound to the i-th dendrites, or the number of glial cells bound to the i-th dendrites.
  • Step S2704 The growth unit 64 determines whether or not the information related to glial cell information acquired in Step S2703 satisfies a predetermined condition. If the condition is satisfied, the process goes to step S2705. If the condition is not satisfied, the process goes to step S2706.
  • The predetermined condition is a condition for the Dendrites to expand. For example, the condition is such that the larger the number of glial cells indicated by the information regarding glial cell information, the more readily it is satisfied.
  • Step S2705 The growth unit 64 changes the Dendrites position information included in the i-th Dendrites information so that the Dendrites are extended.
  • Step S2706 The growth unit 64 increments the counter i by one. The process returns to step S2702.
  • Next, the soma coupling process in step S2110 will be described with reference to the flowchart of FIG.
  • Step S2801 The growth unit 64 substitutes 1 for the counter i.
  • Step S2802 The growth unit 64 determines whether or not the i-th AXON information exists in the soma related information storage unit 51. If the i-th AXON information exists, the process goes to step S2803, and if the i-th AXON information does not exist, the process returns to the upper level process.
  • Step S2803 The growth unit 64 acquires the AXON position information included in the i-th AXON information from the soma-related information storage unit 51.
  • Step S2804 The growth unit 64 substitutes 1 for the counter j.
  • Step S2805 The growth unit 64 determines whether or not the j-th Dendrites information, that is, Dendrites information for Dendrites that are not an input to the soma to which the i-th AXON is connected, exists in the soma-related information storage unit 51. If the j-th Dendrites information exists, the process goes to step S2806; if it does not exist, the process goes to step S2810.
  • Step S2806 The growth unit 64 acquires the Dendrites position information included in the j-th Dendrites information from the soma-related information storage unit 51.
  • Step S2807 The growth unit 64 judges whether or not the i-th AXON and the j-th Dendrites can be combined, using the AXON position information included in the i-th AXON information and the Dendrites position information included in the j-th Dendrites information. If they can be combined, the process goes to step S2808; otherwise, it goes to step S2809. For example, when the growth unit 64 determines that the position indicated by the AXON position information included in the i-th AXON information overlaps the position indicated by the Dendrites position information included in the j-th Dendrites information, it determines that the i-th AXON and the j-th Dendrites can be combined.
  • Alternatively, when the position of the tip of the AXON indicated by the AXON position information included in the i-th AXON information matches the position of the tip of the Dendrites indicated by the Dendrites position information included in the j-th Dendrites information, or when the distance between the two tips is equal to or less than a threshold (or less than the threshold), the growth unit 64 determines that the i-th AXON and the j-th Dendrites can be combined.
  • Step S2808 The growth unit 64 configures combined information specifying the connection between the i-th AXON and the j-th Dendrites, and accumulates it in the combined information storage unit 53.
  • For example, the growth unit 64 configures combined information including the AXON identifier included in the i-th AXON information and the Dendrites identifier included in the j-th Dendrites information, and stores the combined information in the combined information storage unit 53.
  • Step S2809 The growth unit 64 increments the counter j by 1. The process returns to step S2805.
  • Step S2810 The growth unit 64 increments the counter i by one. It returns to step S2802.
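  • The soma coupling process (steps S2801-S2810) is, in effect, a proximity join between AXON tips and Dendrites tips. A sketch follows, with `join_threshold` standing in for the distance threshold named above; the interface on `net` remains an assumption.

```python
import math

def soma_coupling(net, join_threshold=0.5):
    """Soma coupling sketch (S2801-S2810)."""
    for axon in net.all_axons():                           # S2802 loop over AXON
        for dend in net.all_dendrites():                   # S2805 loop over Dendrites
            if net.same_soma(axon, dend):                  # skip a soma's own inputs
                continue
            gap = math.dist(axon.tip_position,             # S2807: tip-to-tip distance
                            dend.tip_position)
            if gap <= join_threshold:                      # combinable?
                net.accumulate_combined_info(axon.id, dend.id)  # S2808
```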
  • Next, the glial cell growth process of step S2111 will be described using the flowchart of FIG.
  • Step S2901 The growth unit 64 substitutes 1 for the counter i.
  • Step S2902 The growth unit 64 determines whether or not the i-th soma related information exists in the soma related information storage unit 51. If the i-th soma related information exists, the process goes to step S2903, and if the i-th soma related information does not exist, the process returns to the upper process.
  • Step S2903 The growth unit 64 acquires necessary energy amount information and retained energy amount information included in the i-th soma related information.
  • Step S2904 The growth unit 64 determines whether or not the necessary energy amount information and the stored energy amount information acquired in Step S2903 satisfy a predetermined condition. If the predetermined condition is satisfied, the process goes to step S2905. If the predetermined condition is not satisfied, the process goes to step S2907.
  • Step S2905 The growth unit 64 acquires the soma identifier included in the i-th soma related information.
  • Step S2906 The growth unit 64 configures glial cell information having the soma identifier acquired in step S2905 and accumulates it in the glial cell information storage unit 54.
  • Step S2907 The growth unit 64 substitutes 1 for the counter j.
  • Step S2908 The growth unit 64 determines whether or not the j-th AXON information exists in the i-th soma related information. If the j-th AXON information exists, the process goes to step S2909. If the j-th AXON information does not exist, the process goes to step S2914.
  • Step S2909 The growth unit 64 acquires necessary energy amount information and retained energy amount information included in the jth AXON information.
  • Step S2910 The growth unit 64 determines whether the necessary energy amount information and the stored energy amount information acquired in Step S2909 satisfy a predetermined condition. If the predetermined condition is satisfied, the process goes to step S2911. If the predetermined condition is not satisfied, the process goes to step S2913.
  • Step S2911 The growth unit 64 acquires the AXON identifier of the jth AXON information.
  • Step S2912 The growth unit 64 configures glial cell information having the AXON identifier acquired in step S2911 and accumulates it in the glial cell information storage unit 54.
  • Step S2913 The growth unit 64 increments the counter j by 1. The process returns to step S2908.
  • Step S2914 The growth unit 64 substitutes 1 for the counter j.
  • Step S2915 The growth unit 64 determines whether or not the j-th Dendrites information exists in the i-th soma related information. If the j-th Dendrites information exists, the process goes to step S2916; if it does not exist, the process goes to step S2921.
  • Step S2916 The growth unit 64 acquires the required energy amount information and the retained energy amount information included in the j-th Dendrites information.
  • Step S2917 The growth unit 64 determines whether the required energy amount information and the stored energy amount information acquired in Step S2916 satisfy a predetermined condition. If the predetermined condition is satisfied, the process goes to step S2918. If the predetermined condition is not satisfied, the process goes to step S2920.
  • Step S2918 The growth unit 64 acquires the Dendrites identifier of the j-th Dendrites information.
  • Step S2919 The growth unit 64 configures glial cell information having the Dendrites identifier acquired in step S2918, and accumulates it in the glial cell information storage unit 54.
  • Step S2920 The growth unit 64 increments the counter j by 1. It returns to step S2915.
  • Step S2921 The growth unit 64 increments the counter i by one. The process returns to step S2902.
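  • A sketch of the glial cell growth process (steps S2901-S2921): one pass that applies the same energy test to a soma and to each of its AXON and Dendrites. The attribute and method names are assumptions.

```python
def glial_cell_growth(net):
    """Glial cell growth sketch (S2901-S2921): generate glial cell
    information for any element whose stored energy has fallen below
    its required energy (one assumed reading of the condition)."""
    for soma in net.all_soma():                            # S2902 loop
        elements = [soma, *soma.axons, *soma.dendrites]    # soma, then S2908/S2915 loops
        for elem in elements:
            if elem.stored_energy < elem.required_energy:  # S2904/S2910/S2917
                net.accumulate_glial_cell_info(elem.id)    # S2906/S2912/S2919
```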
  • Next, the apoptosis process in step S2112 will be described with reference to the flowchart of FIG.
  • Step S3001 The apoptosis processing unit 65 substitutes 1 for the counter i.
  • Step S3002 The apoptosis processing unit 65 determines whether or not the i-th soma related information exists in the soma related information storage unit 51. When the i-th soma related information exists, the process goes to step S3003, and when the i-th soma related information does not exist, the process returns to the upper process.
  • Step S3003 The apoptosis processing unit 65 acquires the soma identifier of the i-th soma related information from the soma related information storage unit 51.
  • Step S3004 The apoptosis processing unit 65 acquires the ignition information including the soma identifier acquired in step S3003 from the ignition information storage unit 59.
  • The ignition information acquired here is preferably ignition information having timer information indicating a time whose difference from the current time is within a threshold (or less than the threshold).
  • Step S3005 The apoptosis processing unit 65 determines whether or not the condition for apoptosis is satisfied using the ignition information acquired in step S3004. If the condition for apoptosis is satisfied, the process goes to step S3006. If the condition for apoptosis is not satisfied, the process goes to step S3008.
  • Step S3006 The apoptosis processing unit 65 deletes the i-th soma related information from the soma related information storage unit 51.
  • Step S3007 The apoptosis processing unit 65 deletes the binding information corresponding to the i-th soma from the binding information storage unit 53.
  • Step S3008 The apoptosis processing unit 65 increments the counter i by one. The process returns to step S3002.
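  • The apoptosis process (steps S3001-S3008) can be sketched as pruning soma whose recent firing history is sparse; `min_recent_firings` and `window` are assumed parameters for the condition on the firing information, and the `timer` field matches the firing record sketched earlier.

```python
def apoptosis(net, min_recent_firings=1, window=60.0):
    """Apoptosis sketch (S3001-S3008)."""
    now = net.current_time()
    for soma in list(net.all_soma()):                 # S3002 loop (list() allows deletion)
        recent = [f for f in net.firings_of(soma.id)  # S3004: recent firing information
                  if now - f.timer <= window]
        if len(recent) < min_recent_firings:          # S3005: condition for apoptosis
            net.delete_soma_related_info(soma.id)     # S3006
            net.delete_combined_info_of(soma.id)      # S3007
```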
  • The processing in the present embodiment may be realized by software, and this software may be distributed by software download or the like. Further, this software may be recorded on a recording medium such as a CD-ROM and distributed. This also applies to the other embodiments in this specification.
  • The software that implements the video processing apparatus B in the present embodiment is the following program. That is, this program causes a computer to function as: an input information reception unit that receives a video having two or more color images; a changing unit that obtains two or more changed images by performing, on each of the two or more color images included in the video, one or more of a first process of monochromizing a partial area of the color image, a second process of acquiring red color information of at least a partial area of a color image temporally preceding the color image and adding the color information to the color image, and a third process of setting a partial area of the color image to black or white; and an output unit that outputs the two or more changed images.
  • The computer-accessible recording medium includes: a soma-related information storage unit that stores two or more pieces of soma-related information, each including a soma identifier for identifying a soma and firing condition information relating to the condition for the soma to fire; a combined information storage unit that stores one or more pieces of combined information for specifying the connection between two or more soma; and an output management information storage unit that stores one or more pieces of output management information, each having an output condition, which is a condition for output using a firing pattern having one or more soma identifiers, and output information, which is the information to be output. The program causes the computer to function such that the changing unit includes: feature information acquisition means for acquiring, for each of the two or more color images included in the video and from each of the one or more partial images constituting each color image, one or more pieces of feature information, which are one or more pieces of information among R, G, B, and Y; information transmission means for acquiring, for each of the two or more color images and for each of the one or more partial images, the one or more pieces of feature information acquired by the feature information acquisition means, the soma identifier identifying the soma that fires first, and, for each of the one or more soma subject to firing determination, one or more pieces of feature information acquired from one or more other soma or from the one or more pieces of feature information; judging means for judging, for each of the two or more color images and for each of the one or more partial images, whether or not the soma identified by each soma identifier fires, using the one or more pieces of feature information acquired by the information transmission means and the firing condition information paired with each of the one or more soma identifiers acquired by the information transmission means; firing pattern acquisition means for acquiring, for each of the two or more color images and for each of the one or more partial images, a firing pattern including the one or more soma identifiers identifying the soma determined to fire by the judging means; and output information acquisition means for acquiring, from the output management information storage unit, for each of the two or more color images and for each of the one or more partial images, the output information corresponding to the firing pattern acquired by the firing pattern acquisition means, and constituting a changed image from the one or more pieces of acquired output information. The program causes the computer to function in this way.
  • FIG. 32 shows the external appearance of a computer that executes the programs described in this specification to realize the video processing apparatuses according to the various embodiments described above.
  • the above-described embodiments can be realized by computer hardware and a computer program executed thereon.
  • FIG. 32 is a schematic view of the computer system 300
  • FIG. 33 is a block diagram of the system 300.
  • the computer system 300 includes a computer 301 including a CD-ROM drive 3012, a keyboard 302, a mouse 303, and a monitor 304.
  • a computer 301 includes a CD-ROM drive 3012, an MPU 3013, a bus 3014, a ROM 3015, a RAM 3016, and a hard disk 3017.
  • the ROM 3015 stores programs such as a bootup program.
  • the RAM 3016 is connected to the MPU 3013 and temporarily stores application program instructions and provides a temporary storage space.
  • the hard disk 3017 normally stores application programs, system programs, and data.
  • the computer 301 may further include a network card that provides connection to a LAN.
  • a program that causes the computer system 300 to execute the functions of the video processing apparatus according to the above-described embodiment may be stored in the CD-ROM 3101, inserted into the CD-ROM drive 3012, and further transferred to the hard disk 3017. Further, the program may be transmitted to the computer 301 via a network (not shown) and stored in the hard disk 3017. The program is loaded into the RAM 3016 at the time of execution. The program may be loaded directly from the CD-ROM 3101 or the network.
  • the program does not necessarily include an operating system (OS) or a third-party program that causes the computer 301 to execute the functions of the video processing apparatus according to the above-described embodiment.
  • The program need only include those instructions that call the appropriate modules in a controlled manner to achieve the desired results. How the computer system 300 operates is well known and will not be described in detail.
  • the computer that executes the program may be singular or plural. That is, centralized processing may be performed, or distributed processing may be performed.
  • each process may be realized by centralized processing by a single device, or may be realized by distributed processing by a plurality of devices.
  • the video processing device has an effect of being able to obtain a video obtained by imitating the function of the human eye, and is useful as a video processing device or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Processing Of Color Television Signals (AREA)

Abstract

The present invention addresses the problem that, conventionally, it has not been possible to obtain a video that simulates the functions of the human eye. A video simulating the functions of the human eye can be obtained by means of a video processing device comprising: an input information reception unit that receives a video including at least two color images; a modification unit that obtains at least two modified images by performing, on each of said two color images included in the video, at least one process among a first process of monochromatizing a partial region of the color image, a second process of obtaining red color information of at least a partial region of a preceding color image that temporally precedes the color image and adding the color information to the color image, and a third process of changing the partial region of the color image to black or white; and an output unit that outputs said two modified images.
PCT/JP2017/019991 2017-05-30 2017-05-30 Dispositif de traitement vidéo, procédé de traitement vidéo et programme WO2018220694A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/019991 WO2018220694A1 (fr) 2017-05-30 2017-05-30 Dispositif de traitement vidéo, procédé de traitement vidéo et programme
JP2019521556A JP6924829B2 (ja) 2017-05-30 2017-05-30 映像処理装置、映像処理方法、およびプログラム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/019991 WO2018220694A1 (fr) 2017-05-30 2017-05-30 Dispositif de traitement vidéo, procédé de traitement vidéo et programme

Publications (1)

Publication Number Publication Date
WO2018220694A1 true WO2018220694A1 (fr) 2018-12-06

Family

ID=64455836

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/019991 WO2018220694A1 (fr) 2017-05-30 2017-05-30 Dispositif de traitement vidéo, procédé de traitement vidéo et programme

Country Status (2)

Country Link
JP (1) JP6924829B2 (fr)
WO (1) WO2018220694A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07154827A (ja) * 1993-11-29 1995-06-16 Canon Inc 複数画像合成装置及び画像表示装置
JP2012029236A (ja) * 2010-07-27 2012-02-09 Toshiba Corp 映像処理装置及び映像処理方法
JP2015057659A (ja) * 2014-10-24 2015-03-26 ソニー株式会社 表示装置、表示方法、プログラム
JP2017011589A (ja) * 2015-06-24 2017-01-12 凸版印刷株式会社 表示装置、視差画像表示プログラム、および、視差画像の提供方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2259601B1 (fr) * 2008-04-03 2016-09-07 NLT Technologies, Ltd. Procédé de traitement d'image, dispositif de traitement d'image, et support d'enregistrement

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07154827A (ja) * 1993-11-29 1995-06-16 Canon Inc 複数画像合成装置及び画像表示装置
JP2012029236A (ja) * 2010-07-27 2012-02-09 Toshiba Corp 映像処理装置及び映像処理方法
JP2015057659A (ja) * 2014-10-24 2015-03-26 ソニー株式会社 表示装置、表示方法、プログラム
JP2017011589A (ja) * 2015-06-24 2017-01-12 凸版印刷株式会社 表示装置、視差画像表示プログラム、および、視差画像の提供方法

Also Published As

Publication number Publication date
JPWO2018220694A1 (ja) 2020-04-02
JP6924829B2 (ja) 2021-08-25

Similar Documents

Publication Publication Date Title
US20070091106A1 (en) Adaptive lexical classification system
CN107862653B (zh) 图像显示方法、装置、存储介质和电子设备
KR102411237B1 (ko) 얼굴 화상 처리 시스템, 얼굴 화상 처리 방법 및 얼굴 화상 처리 프로그램
Bertalmío Vision models for high dynamic range and wide colour gamut imaging: techniques and applications
CN106104300A (zh) 深度感测设备中的定时脉冲
US9299011B2 (en) Signal processing apparatus, signal processing method, output apparatus, output method, and program for learning and restoring signals with sparse coefficients
CN110493506A (zh) 一种图像处理方法和系统
Clemens et al. Computational principles underlying the recognition of acoustic signals in insects
CN101080744B (zh) 产生合成图像的方法
CN113534596B (zh) Rgbd立体相机及成像方法
CN110889426A (zh) 三维表达体生成系统
GB2541073B (en) Information processing apparatus, method for controlling information processing apparatus, image processing system and storage medium
WO2018220694A1 (fr) Dispositif de traitement vidéo, procédé de traitement vidéo et programme
KR102140045B1 (ko) 미러 패딩을 이용한 손 동작 인식 장치 및 그 방법
CN110663249B (zh) 用于图像处理的装置和方法
WO2018189792A1 (fr) Dispositif et procédé de traitement d'informations, et programme
Llinás et al. The olivo-cerebellar circuit as a universal motor control system
JP2011139364A (ja) 画像評価装置、画像評価方法、及びコンピュータプログラム
CN113014805B (zh) 一种仿视网膜中央凹与外周的联合采样方法及装置
CN105049822A (zh) 图像处理设备及图像处理方法
CN108603809A (zh) 汽车测试系统、方法及计算机程序产品
WO2018189793A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP6864084B2 (ja) 情報処理装置、情報処理方法、およびプログラム
CN114399973B (zh) 基于环境感知的自适应显示屏
JP2006031171A (ja) 擬似的3次元データ生成方法、装置、プログラム、および記録媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17911786

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019521556

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17911786

Country of ref document: EP

Kind code of ref document: A1