CN106406504B - Atmosphere rendering system and method for a human-computer interaction interface - Google Patents

Atmosphere rendering system and method for a human-computer interaction interface

Info

Publication number
CN106406504B
Authority
CN
China
Prior art keywords
color
rendering
atmosphere
tone
software
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510448258.2A
Other languages
Chinese (zh)
Other versions
CN106406504A (en)
Inventor
祁高进
董建飞
郑拓
季永康
张国旗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guizhou zhongshengtaike Intelligent Technology Co.,Ltd.
Original Assignee
Changzhou Wujin Semiconductor Lighting Application Technology Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changzhou Wujin Semiconductor Lighting Application Technology Institute filed Critical Changzhou Wujin Semiconductor Lighting Application Technology Institute
Priority to CN201510448258.2A priority Critical patent/CN106406504B/en
Publication of CN106406504A publication Critical patent/CN106406504A/en
Application granted granted Critical
Publication of CN106406504B publication Critical patent/CN106406504B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

An atmosphere rendering system and method for a human-computer interaction interface extracts the dominant color of a video, image, or software interface and performs atmosphere rendering according to the extracted dominant color information. Atmosphere rendering can be applied not only to videos and images but also to popular software. By rendering in response to user operations, the user can judge the correctness of an operation result or the satisfaction of an operation effect upon perceiving the change in atmosphere, and then operate further. In addition, the regions of a video or image can be divided according to human visual characteristics, so that the rendering effect is greatly improved.

Description

Atmosphere rendering system and method for a human-computer interaction interface
Technical field
The present invention relates to the field of atmosphere rendering for computer interfaces, and more particularly to an atmosphere rendering system and method for a human-computer interaction interface.
Background art
Human-computer interaction is the discipline that studies the interactive relationship between a system and its users. The system may be any of a wide variety of machines, or a computerized system and its software. The human-computer interaction interface usually refers to the part visible to the user, through which the user communicates with and operates the system. Human-computer interaction has gone through the following stages of development: interaction based on keyboards and character displays, interaction based on mice and graphical displays, and interaction based on multimedia technology.
It is well known that color is the most intuitive artistic expression of a work; it distills the atmosphere of a picture and directly expresses the artist's emotion. Artists combine color knowledge with creative experience, converting emotion into the language of color and image, organizing color into the picture in an orderly way, expressing the emotion of creation in the most intuitive visual manner, and conveying the thematic atmosphere of the work, thereby forming a color language and artistic style with strong personal character.
In addition, the stimulation of light affects people's mood, and this stimulation must match the atmosphere that a space should have. Light exists in an immaterial form that cannot be felt directly, yet it can illuminate objects and render the atmosphere of a room; its very form renders the environment.
Through the effect of light, a presentation can be made more vivid and intuitive: the form of light renders the environment through the spatial structure it creates, the color of light renders the atmosphere of the environment, and shadow effects render atmosphere as well. This way of constructing atmosphere increases the user's intuitive perception and adjusts the user's satisfaction, so that different operations create different atmospheres and please body and mind. This is especially important in the rendering of videos, images, and software, and has attracted wide attention.
In the prior art, the objects of illumination rendering are limited: light sources are usually processed and generated only for video images, while other common and powerful software cannot be handled, which is a significant functional limitation. In addition, when obtaining sub-regions of a video image, the prior art generally uses simple random or predefined partitioning and converts simple quantities such as the dominant color or intensity of a sub-region into light-color parameters. The technical means are simple, the flexibility is poor, and the rendering effect is poor. There is an urgent need to overcome the defects of the prior art, improve the rendering effect, and increase user satisfaction.
Summary of the invention
It is an object of the present invention to provide an atmosphere rendering system and method for a human-computer interaction interface that can independently render images or videos in the human-computer interaction interface, and can also interact and communicate with the user and with software in the human-computer interaction device.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface that can render not only video images but also popular software, giving it a wider scope of application.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface that can render the user's operations, so that the user can judge the correctness of an operation result or the satisfaction of an operation effect upon perceiving the change in atmosphere, and can operate further according to the atmosphere rendering result.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface in which the sub-regions of a video or image are divided according to human visual characteristics, and the video or image is further divided differently according to rendering demands and functions, so that the rendering effect is greatly improved.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface that renders by dividing regions according to human vision and dividing sub-regions for different rendering modes, yielding a good rendering effect.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface that uses a multi-threaded call flow, which greatly improves operation efficiency and facilitates the addition of subsequent algorithms.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface with a wide range of applications, suitable for use in various software, for example application software, teaching software, development software, entertainment software, network software, image-processing software, programming software, and security software.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface that performs illumination rendering of the user's operations in real time, with rendering effects including tool-selection prompts, effect rendering, shadow-effect simulation or extension, and overall atmosphere rendering.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface that can be specially predefined according to the characteristics of each software and also allows user-defined settings on top of the original functions.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface in which communication between the system and the software can take various forms.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface in which the system can, on the one hand, independently render the human-computer interaction interface, proposing the use of human-vision methods and the concept of a visually attractive color, with versatile rendering functions and a good rendering effect.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface in which the system can, on the other hand, interact and communicate with the user and with software in the human-computer interaction device, greatly increasing the number of processing objects and widening the range of application.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface that increases the user's intuitive perception, pleases the user, and improves user satisfaction, adjusting the atmosphere according to the user's mood and local environment and using color to create a pleasant atmosphere with a color scheme suited to the circumstances.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface that can improve the efficiency of work software by creating a better working atmosphere, and enhance the entertainment value of entertainment software.
Another object of the present invention is to provide an atmosphere rendering system and method for a human-computer interaction interface that reduces lamp-bead flicker caused by frequent changes between similar colors in two adjacent frames, improving the visual experience of picture extension.
To achieve the above and other objects and advantages of the present invention, the present invention provides an atmosphere rendering method for a human-computer interaction interface, the method comprising the following steps:
(A) extracting the dominant color of a video/image/software;
(B) performing atmosphere rendering according to the extracted dominant color information.
According to an embodiment of the present invention, the step (A) comprises the following steps:
(A.1) delimiting regions of the picture;
(A.2) analyzing and processing using image-processing algorithms and pattern-recognition methods; and
(A.3) obtaining light-color rendering parameters recognizable by the lighting equipment.
In step (A.1), the regions are delimited by dividing the picture into blocks, wherein the picture is suitably divided into blocks in a clockwise or counterclockwise manner.
The step (A.2) comprises the following steps:
(A.21) successively extracting the RGB pixel matrix corresponding to each region;
(A.22) converting the RGB matrix into an HSV histogram;
(A.23) computing the HSV histogram statistics to obtain the dominant-color HSV value;
(A.24) calculating the color difference between the dominant color and the dominant color of the previous frame, and comparing them;
(A.25) if the color difference is within the allowed range, executing step (A.26); if the color difference is not within the allowed range, executing step (A.27);
(A.26) changing the lamp-bead color corresponding to the region to the dominant color of the current frame; and
(A.27) keeping the lamp-bead color corresponding to the region unchanged, identical to the dominant color of the previous frame.
Preferably, corresponding lighting equipment is provided for each region, so that atmosphere rendering is performed for each region separately.
Preferably, in step (A.21), the RGB pixel matrix is sampled every N rows and M columns, where N ≥ 1 and M ≥ 1.
According to an embodiment of the present invention, the dominant color information in step (B) is visually attractive color information, and the step (A) comprises the following steps:
(A.4) delimiting a dominant-color extraction region;
(A.5) determining the visually attractive color region;
(A.6) determining and extracting the visually attractive color; and
(A.7) verifying the visually attractive color.
The step (A.5) comprises the following steps:
(A.51) extracting the RGB pixel matrix corresponding to the dominant-color region;
(A.52) converting the RGB pixel matrix into an HSV matrix;
(A.53) computing the HSV histogram;
(A.54) setting saturation, brightness, and quantity thresholds; and
(A.55) finding all HSV intervals that satisfy the threshold conditions, and taking the HSV interval containing the largest number of pixels as the visually attractive color interval.
In step (A.7), the dominant color is verified by performing pseudo-binarization.
In step (A.4), the dominant-color region is suitably divided into two, three, four, or more blocks, and corresponding lighting equipment is provided for each block according to the different ways of delimiting the regions.
Preferably, in step (B), the visually-attractive-color extraction algorithm is called and the lamp beads display the visually attractive color; or the block-based dominant-color extraction algorithm is called and the lamp beads display the dominant colors extracted block by block.
A method of performing atmosphere rendering by communicating with software in a human-computer interaction device, suitable for a user to render atmosphere by interacting and communicating with the software in the human-computer interaction device, is characterized in that the method comprises the following steps:
(1) detecting the way in which the software sends messages;
(2) detecting the user's operation intention;
(3) sending an instruction to the lighting equipment; and
(4) performing atmosphere rendering according to the user's operation intention.
When the user uses a menu or tool of the software, the user needs to click or double-click a menu option or toolbar icon, use a shortcut key, or type text or code; the software makes the corresponding change according to the user's operation, and this change can be captured by the atmosphere rendering unit, which predicts before the user's operation and renders the atmosphere after the operation.
In word-processing operations, the atmosphere rendering unit reacts when the mouse hovers over the color-palette icon and when the color-palette icon is clicked. When the mouse hovers over the color-palette icon, the atmosphere rendering unit obtains the exact position of the mouse, determines the corresponding tool icon, its type, and the color in the palette, analyzes and processes this information, and sends it to the lighting equipment, which displays or flashes the color identical to the color icon selected by the mouse to prompt the user about the selection. When the user clicks a color icon in the palette, the background of the selected text changes to the color of that icon; the atmosphere rendering unit determines both the displayed color and the position of the text whose background changed, so that the lighting equipment at the position corresponding to the text displays the color of the color icon, thereby performing local rendering and making the text information more prominent.
In presentation-software operations, when a fade-in or fade-out animation is applied to a slide, the atmosphere rendering unit, after receiving the Windows message or locating the mouse position and the corresponding icon in real time, sends a message to the lighting equipment so that the lighting equipment changes a certain color regularly from deep to shallow, from shallow to deep, from bright to dark, or from dark to bright; the color is determined according to the dominant color or secondary dominant color of the slide, the visually attractive color, or other methods.
In the programming software used by software engineers, when the programming software is running, the atmosphere rendering unit displays light that both enhances attention and relaxes the eyes, keeping the light intensity and color constant, and automatically changes the color or frequency of the light over time to prompt the engineer to adjust and rest appropriately.
For photo and photography software, the atmosphere rendering unit is adapted to detect different scenes, obtain their basic color information, and render the different scenes so that the scene of the image extends into the whole space, enhancing the overall atmosphere of the photo or of the shooting.
For security software, the atmosphere rendering unit captures the various security data in the software in real time; when certain data fall below the minimum safety standard or exceed a safety value, the lighting equipment immediately issues a light warning.
For teaching software, the atmosphere rendering unit displays or flashes one or several colors to prompt the learner to concentrate or to rest appropriately, wherein the time interval for prompting concentration and the time interval for appropriate rest are configured automatically by the system according to experimental data, or configured by the user according to personal habits.
For auxiliary software, the atmosphere rendering unit is adapted to render according to the colors and light sources of works created in 3D design software.
In home decoration, the atmosphere rendering unit detects the intensity and color of the light at each position in the room and renders in real time according to their distribution and variation.
For picture browsing, during normal picture preview the atmosphere rendering unit is adapted to render the next picture or the previous picture; the rendering effect follows the color characteristics of the next or previous picture, with the rendering region corresponding to the next picture on the right side of the picture display area and the rendering region corresponding to the previous picture on the left side.
When picture information is processed, the atmosphere rendering unit detects changes in the picture information in real time and renders the changed picture after a change is detected.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of an atmosphere rendering system for a human-computer interaction interface according to the present invention.
Fig. 2 is a block diagram of an atmosphere rendering system for a human-computer interaction interface according to the present invention.
Fig. 3 is a flowchart of atmosphere rendering according to a first preferred embodiment of the present invention.
Fig. 4 is a schematic diagram of region delimitation in the above first preferred embodiment of the present invention.
Fig. 5 is a flowchart of the picture-background extension method of the above first preferred embodiment of the present invention.
Fig. 6 is a schematic front view of the human eye according to a second preferred embodiment of the present invention.
Fig. 7 is a schematic diagram of extracting the visually attractive color in the above second preferred embodiment of the present invention.
Fig. 8 is a flowchart of atmosphere rendering according to the visually attractive color in the above second preferred embodiment of the present invention.
Fig. 9 is a flowchart of extracting the visually attractive color interval in the above second preferred embodiment of the present invention.
Figs. 10A to 10C are schematic diagrams of the division modes of the dominant-color extraction region in the above second preferred embodiment of the present invention.
Figs. 11A to 11C are schematic diagrams of the positions of the lighting equipment corresponding to each region in the above second preferred embodiment of the present invention.
Fig. 12 is a flowchart of extracting the visually attractive color in the above second preferred embodiment of the present invention.
Figs. 13A and 13B are schematic diagrams of the calling methods of the two algorithms used for dominant-color extraction in the above second preferred embodiment of the present invention.
Fig. 14 is a flowchart of atmosphere rendering according to a third preferred embodiment of the present invention.
Specific embodiments
The following description is provided to disclose the present invention so that those skilled in the art can practice it. The preferred embodiments in the following description are given only as examples, and other obvious modifications will occur to those skilled in the art. The basic principles of the invention defined in the following description can be applied to other embodiments, variations, improvements, equivalents, and other technical solutions that do not depart from the spirit and scope of the present invention.
As shown in Figs. 1 and 2, an atmosphere rendering system for a human-computer interaction interface comprises a human-computer interaction device 10, a cloud server 20, and an atmosphere rendering unit 30, wherein the human-computer interaction device 10, the cloud server 20, and the atmosphere rendering unit 30 are connected, and the atmosphere rendering unit 30 provides atmosphere rendering for the human-computer interaction device 10 to meet the needs of the user.
Further, the atmosphere rendering unit 30 comprises a plurality of lighting devices 31 and a plurality of analysis and processing units 32, wherein the lighting devices 31 are connected to the analysis and processing units 32, execute the instructions sent by the analysis and processing units 32, and perform atmosphere rendering of the human-computer interaction interface 11 of the human-computer interaction device 10 according to the light-color rendering parameters of the analysis and processing units 32. On the one hand, the atmosphere rendering system of the human-computer interaction interface is suitable for independently rendering the human-computer interaction interface; on the other hand, it is suitable for interacting and communicating with the user and with software in the human-computer interaction device.
It should be noted that the lighting devices 31 are lighting devices for atmosphere rendering that are independent of the human-computer interaction device 10.
Embodiment one
For human-computer interaction interfaces containing pictures, such as images and videos, the system can perform atmosphere rendering on them. As shown in Fig. 3, the rendering method comprises the following steps: (1) delimiting regions of the picture (101); (2) analyzing and processing using image-processing algorithms and pattern-recognition methods, and extracting the dominant color (102); (3) obtaining light-color rendering parameters recognizable by the lighting equipment (103); and (4) performing atmosphere rendering according to the extracted dominant color information (104).
Atmosphere rendering is performed by extending the background of an image or video frame, where the picture extension uses block-based dominant-color extraction.
In this embodiment, the picture is extended in four directions — up, down, left, and right — with a certain number of lighting devices 31 in each direction, where the lighting devices 31 may be implemented as lamps or lamp beads; colored lamp beads are preferred in this embodiment. In general, the number of lighting devices 31 above preferably equals the number below, and the number on the left equals the number on the right. For displays of human-computer interaction devices 10 of different sizes, the number of lighting devices 31 required may vary; according to the number of lighting devices 31, the image edge is divided into blocks using a one-to-one or many-to-one mapping, as sketched below.
In step (1), the picture includes image pictures, video pictures, and software pictures; the analysis and processing unit 32 delimits regions of the image or video picture in a clockwise or counterclockwise direction, i.e., divides it into blocks. Fig. 4 is a schematic diagram of clockwise region delimitation. In this preferred embodiment, for a frame 40 of an image or video, multiple small regions are delimited clockwise along the picture edge 41 (the specified edge width is W); it is worth mentioning that the small regions can also be delimited counterclockwise. These small regions become blocks 42; one side of each block 42 is determined by W, and the other side is jointly determined by the width of the display and the number of lamp beads.
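As an illustration of the edge-block division described above, the following sketch (a non-authoritative example; the edge width and the per-edge lamp-bead counts are assumed parameters, not values fixed by the patent) computes the pixel rectangles of the blocks 42 in clockwise order, one block per lamp bead.
```python
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height) in pixels

def edge_blocks_clockwise(frame_w: int, frame_h: int, edge_w: int,
                          n_top: int, n_right: int,
                          n_bottom: int, n_left: int) -> List[Rect]:
    """Divide the picture edge of width edge_w into blocks, one per lamp bead,
    walking clockwise: top (left->right), right (top->bottom),
    bottom (right->left), left (bottom->top)."""
    blocks: List[Rect] = []
    step = frame_w // n_top
    for i in range(n_top):                       # top edge
        blocks.append((i * step, 0, step, edge_w))
    step = frame_h // n_right
    for i in range(n_right):                     # right edge
        blocks.append((frame_w - edge_w, i * step, edge_w, step))
    step = frame_w // n_bottom
    for i in range(n_bottom):                    # bottom edge, right to left
        blocks.append((frame_w - (i + 1) * step, frame_h - edge_w, step, edge_w))
    step = frame_h // n_left
    for i in range(n_left):                      # left edge, bottom to top
        blocks.append((0, frame_h - (i + 1) * step, edge_w, step))
    return blocks
```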
Fig. 5 shows the picture-background extension flowchart. In step (2), the analysis and processing unit 32 analyzes and processes using image-processing algorithms and pattern-recognition algorithms. In this preferred embodiment, step (2) comprises the following steps:
(2.1) extracting the RGB pixel matrix corresponding to the first block 42;
(2.2) converting the RGB matrix into an HSV matrix;
(2.3) computing the HSV histogram and obtaining the dominant-color HSV value of the block 42;
(2.4) detecting whether this is the last block: if so, executing step (2.5); if not, executing step (2.10);
(2.5) comparing the dominant color of the current frame region with the dominant color of the same region in the previous frame, and calculating the color difference between them;
(2.6) if the color difference is within the allowed range, executing step (2.7); if not, executing step (2.8);
(2.7) keeping the lamp-bead color corresponding to the region unchanged, identical to the dominant color of the previous frame;
(2.8) changing the lamp-bead color corresponding to the region to the dominant color of the current frame;
(2.9) detecting whether this is the last block region: if so, the extension of the current frame ends; if not, extracting the dominant color of the next block region counterclockwise or clockwise;
(2.10) extracting the RGB pixel matrix of the next block region counterclockwise or clockwise, sampling every N rows and M columns, and returning to step (2.3), extending the picture block by block.
It should be noted that the RGB pixel matrix of a given block 42 of the picture is extracted according to the block-division result. To save time, the matrix can be sampled every N rows and M columns, where N ≥ 1 and M ≥ 1; preferably N > 1 and M > 1. The extracted pixel matrix is completely converted into an HSV matrix, the HSV histogram is computed, and the dominant-color HSV value of the block is obtained. In the present invention, the ranges of H, S, and V are 0-360, 1-100, and 0-250 respectively, and the HSV bin widths of the histogram can be chosen as the group with the best effect according to test results.
After the dominant color is extracted, it must be compared with the dominant color of the corresponding region in the previous frame. The present invention calculates the color difference: when the color difference is sufficiently small, the lamp-bead color corresponding to the region is kept unchanged; otherwise the new dominant-color value is displayed on the corresponding lamp beads. The allowed color-difference range is configured in advance by the analysis and processing unit 32. The purpose of this is to reduce lamp-bead flicker caused by frequent changes between similar colors in two adjacent frames, improving the visual experience of picture extension.
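The following sketch, assuming OpenCV-style HSV conversion and hypothetical bin counts and thresholds, illustrates steps (2.1)-(2.8): sampling every N rows and M columns, building an HSV histogram per block, taking the most populous bin as the dominant color, and updating the lamp-bead color only when the difference from the previous frame exceeds the allowed range.
```python
import cv2
import numpy as np

H_BINS, S_BINS, V_BINS = 36, 10, 10   # assumed bin counts, to be tuned by experiment
DIFF_THRESHOLD = 20.0                  # assumed allowed color-difference range

def block_dominant_hsv(frame_bgr, rect, n=2, m=2):
    """Dominant HSV of one edge block, sampling every n rows and m columns."""
    x, y, w, h = rect
    block = frame_bgr[y:y + h:n, x:x + w:m]
    hsv = cv2.cvtColor(block, cv2.COLOR_BGR2HSV).reshape(-1, 3)
    hist, edges = np.histogramdd(hsv, bins=(H_BINS, S_BINS, V_BINS))
    hi, si, vi = np.unravel_index(np.argmax(hist), hist.shape)
    # Use the bin centers as the dominant-color HSV value
    centers = [(e[i] + e[i + 1]) / 2 for e, i in zip(edges, (hi, si, vi))]
    return np.array(centers)

def update_bead_color(prev_hsv, new_hsv):
    """Keep the previous color when the difference is small, to avoid flicker."""
    if prev_hsv is not None and np.linalg.norm(new_hsv - prev_hsv) < DIFF_THRESHOLD:
        return prev_hsv
    return new_hsv
```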
In step (4), the analysis and processing unit 32 sends the obtained light-color rendering parameters to the lighting devices 31; after receiving the instruction, the lighting devices 31 perform atmosphere rendering of the picture according to the light-color rendering parameters to meet the needs of the user.
Embodiment two
When watching a video, people are easily attracted by persons or things that are more colorful, brighter, or moving faster, so more attention is placed on them, and the surrounding things and background may even be ignored. For example, when the surroundings of a picture are green branches and leaves and the middle is full of red roses, we would rather highlight the brilliant red than render the green.
1. Human-vision atmosphere extension
The visual angle of the human eye is small. Measured by what can be observed, the visual angle of the human eye is about 150 degrees, but measured by what can be seen clearly, the visual angle is only about 5 degrees. Precisely because of this, to artificially expand the field of view, people must constantly roll their eyes, look left and right, and sometimes even turn their heads.
The clear visual range of a person, i.e., the central field of view, lies within 10-15°, where the ability to resolve the colors and details of an image is strongest. Within 20°, information such as figures can be correctly recognized; this is the effective field of view. At 20-30°, visual acuity and color discrimination begin to decline, but sensitivity to motion information remains high; beyond 30°, visual acuity declines considerably.
During human-computer interaction, the image or video picture observed by the eye differs depending on the relative position of the person and the device, just as viewers in different seats in a cinema have different visual angles and different viewing experiences. In general, viewers close to the screen need to roll their eyes or turn their necks to see the whole picture clearly, while viewers far from the screen can easily see the whole picture, although too great a distance affects clarity, and viewers off to the side of the screen see the things in the picture with varying degrees of distortion.
The present invention proposes front-view-angle atmosphere accentuation to highlight the key things in a video frame that attract the onlooker's eye. The so-called "front view angle" is the angle from which the image or video achieves the best effect for the human eye when the person and the interactive device are in a certain relative position. Fig. 6 shows the front view of the human eye, in which the relative position of the person and the interactive device is clearly marked: D denotes the vertical distance from the human eye to the human-computer interaction device, with the foot of the perpendicular at the center of the device of width W and height H. The actual value of D varies from person to person; the criterion is that the onlooker's eyes are comfortable and the image or video is seen clearly. The elliptical region in the figure is the effective region of the human eye, within which the eye is more sensitive to information such as color and motion; its size is determined by the effective field-of-view angle α of the human eye and the distance D between the eye and the human-computer interaction device, where α is generally between 20° and 30°.
The effective field of view in the front view is thus determined. When watching a picture or video, people spend a large part of the time within the effective field of view, whether from the angle of comfort and clarity or from the angle of the picture or video itself. Therefore, by reducing the image or video to the size of the effective field of view, atmosphere expression or visually-attractive-color extraction can be performed within the effective field of view, where the visually attractive color is generally the dominant color that attracts the eye. If the distance D from the human eye to the human-computer interaction device is large, or the width W and height H of the device are small, the onlooker's effective field of view may reach the edges of the image or video or even exceed its size; in that case the effective field of view is taken as the entire image or video.
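A minimal sketch of the geometry just described, under stated assumptions: the radius of the effective region on the screen follows from the viewing distance D and the effective field-of-view angle α, and is clamped to the whole picture when it exceeds the display size. The pixel-density conversion factor is an assumed parameter, not part of the patent.
```python
import math

def effective_region_px(distance_d_cm: float, alpha_deg: float,
                        screen_w_px: int, screen_h_px: int,
                        px_per_cm: float):
    """Half-width and half-height (in pixels) of the effective field of view,
    centered on the display; clamped to the whole picture when larger."""
    radius_cm = distance_d_cm * math.tan(math.radians(alpha_deg) / 2.0)
    radius_px = radius_cm * px_per_cm
    half_w = min(radius_px, screen_w_px / 2.0)
    half_h = min(radius_px, screen_h_px / 2.0)
    return half_w, half_h

# Example: a viewer 60 cm from a 1920x1080 display (~40 px/cm) with alpha = 25 degrees
print(effective_region_px(60, 25, 1920, 1080, 40))
```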
To highlight the key content, the present invention proposes an algorithm for extracting the visually attractive color. Fig. 7 is a schematic diagram of the whole picture, in which the middle part is the region used to extract and judge the visually attractive color of the frame. This region may be the middle region of the picture or the range that the human eye can face directly, and it is the region the eye notices first and most easily; the distance from the dominant-color extraction region to the picture edge is W.
2. Determining the dominant-color HSV interval of the region
As shown in Figs. 8 to 13B, the present invention extracts the visually attractive color and then performs atmosphere rendering of the image or video using the following method:
(A) delimiting the dominant-color extraction region (201);
(B) determining the visually attractive color region (202);
(C) determining and extracting the visually attractive color (203);
(D) verifying the visually attractive color (204); and
(E) performing atmosphere rendering according to the visually attractive color (205).
After the visually attractive color region is determined, in step (C) the RGB pixel matrix of the region can be extracted; to improve efficiency, the matrix can be sampled every N rows and M columns as described above. The collected RGB pixel matrix is entirely converted into an HSV matrix, and the HSV histogram is then computed. In general, things that are brightly colored, brighter, and numerous enough (i.e., covering a large area) attract the eye most easily. The present invention sets a saturation threshold S-threshold, a brightness threshold V-threshold, and a count percentage Ratio-threshold as the three criteria for screening the visually attractive color. All HSV bins exceeding the three criteria are selected from the HSV histogram; if no bin satisfies all three criteria, the frame has no particularly prominent thing, i.e., no visually attractive color. If the number of HSV bins satisfying the three criteria is greater than zero, the bin with the largest count is chosen as the visually attractive color interval, and the number MaxNum of picture points contained in that HSV bin is recorded.
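A sketch of the three-criterion screening described above, assuming an HSV pixel array has already been sampled from the effective region; the threshold values are illustrative assumptions, not taken from the patent.
```python
import numpy as np

S_THRESHOLD = 60          # assumed saturation threshold
V_THRESHOLD = 60          # assumed brightness threshold
RATIO_THRESHOLD = 0.15    # assumed minimum share of all sampled points

def attractive_color_bin(hsv_pixels: np.ndarray, h_bins=36, s_bins=10, v_bins=10):
    """Return ((hi, si, vi), MaxNum) of the visually attractive color interval,
    or None if no bin passes the saturation/brightness/ratio criteria."""
    hist, edges = np.histogramdd(hsv_pixels, bins=(h_bins, s_bins, v_bins))
    total = hsv_pixels.shape[0]
    best, best_count = None, 0
    for hi in range(h_bins):
        for si in range(s_bins):
            for vi in range(v_bins):
                count = hist[hi, si, vi]
                s_center = (edges[1][si] + edges[1][si + 1]) / 2
                v_center = (edges[2][vi] + edges[2][vi + 1]) / 2
                if (s_center > S_THRESHOLD and v_center > V_THRESHOLD
                        and count / total > RATIO_THRESHOLD and count > best_count):
                    best, best_count = (hi, si, vi), count
    if best is None:
        return None                    # no visually attractive color in this frame
    return best, int(best_count)       # best_count is MaxNum
```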
2.1 Determining the visually attractive color region
In step (B), the extracted visually attractive color region is further divided. The purpose is to refine the visually attractive color region and to display the visually attractive color information of each refined sub-region on the corresponding atmosphere-rendering lamp beads, so that atmosphere rendering (including brightness, color, color temperature, etc.) of the dominant color can be performed more accurately.
There are many ways to refine the dominant color, and the specific division depends on the image or video being rendered. Figs. 10A to 10C show three division modes, all of which divide the dominant-color region into four blocks; of course it can also be divided into two, three, or more blocks. This embodiment is mainly described with four blocks.
In Fig. 10A, the center lines of the dominant-color region are the boundaries, dividing the region into four blocks: upper-left UpLeft, lower-left LowLeft, upper-right UpRight, and lower-right LowRight. In Fig. 10B, the diagonals of the dominant-color region are the boundaries, dividing it into four blocks: upper Up, lower Low, left Left, and right Right. These two division modes have a wide range of application and are suitable for most images or videos; since the lamp-bead positions corresponding to the two modes differ, the rendering effects also differ. Fig. 10C is obtained by removing the middle part of the dominant-color region on the basis of Fig. 10B; this division mode is mainly suitable for situations in which the color and brightness of the middle part of the image or video are fixed or vary little. For example, when a turntable appears in the image, the color composition of its middle region stays constant while it rotates, whereas the surrounding parts change constantly, and the faster the rotation the faster the surrounding colors change. The size of the middle part in Fig. 10C can be determined in a predefined way or monitored in real time by image processing and calculation.
After the dominant-color division regions are determined, the divided regions are processed; Fig. 10A is taken as an example to describe the processing in detail.
The dominant-color HSV interval of the visually-attractive-color extraction region has been obtained from the flowchart of Fig. 9; the next step is to count the number of points of the dominant-color HSV interval distributed in the four regions UL, LL, UR, and LR, denoted ULNum, LLNum, URNum, and LRNum respectively. They satisfy formula (1). The sizes of the four variables reflect the distribution of the dominant color over the four regions: the larger the number, the more dominant-color points in that region and the more easily the region attracts attention.
ULNum + LLNum + URNum + LRNum = MaxNum (1)
To emphasize the dominant-color region, the dominant-color percentage of each region can be obtained by calculating the ratio of the dominant-color count in that region to the total count MaxNum. The dominant-color percentage reflects the distribution of the dominant color in each region more clearly than the raw count. RatioOne (for example 60%), RatioTwo (for example 80%), and RatioThree (for example 90%) are set as the thresholds for the dominant-color percentage of one, two, and three regions respectively. If the dominant-color percentage of some region is greater than RatioOne, that region is the dominant-color region and its corresponding lamp beads display the dominant color. If the sum of the percentages of two regions is greater than RatioTwo, the two regions together constitute the dominant-color region, and the lamp beads corresponding to them display the dominant color. Similarly, RatioThree is used as the threshold for the sum of the percentages of three regions: if the sum of three regions' percentages is greater than RatioThree, those three regions are the dominant-color region, and the lamp beads corresponding to them display the dominant color. If the four regions do not meet any of the above three conditions, the dominant color is evenly distributed over the four regions, and all lamp beads display the dominant color. Figs. 11A to 11C show how the lamp-bead positions corresponding to the four regions are set: lamp beads are arranged correspondingly on each region so that each region is rendered separately, which greatly improves the rendering effect and better meets the user's needs.
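The following sketch, under the assumption of the Fig. 10A division, counts dominant-color points per quadrant and decides which lamp-bead groups display the dominant color, following the RatioOne/RatioTwo/RatioThree rule above (the 60%/80%/90% values are the examples given in the text).
```python
from itertools import combinations

RATIO_ONE, RATIO_TWO, RATIO_THREE = 0.60, 0.80, 0.90  # example thresholds from the text

def regions_to_light(counts, max_num):
    """counts: dict like {'UL': ULNum, 'LL': LLNum, 'UR': URNum, 'LR': LRNum};
    returns the set of regions whose lamp beads should show the dominant color."""
    ratios = {k: v / max_num for k, v in counts.items()}
    for k, r in ratios.items():                      # one region dominates
        if r > RATIO_ONE:
            return {k}
    for pair in combinations(ratios, 2):             # two regions together dominate
        if sum(ratios[k] for k in pair) > RATIO_TWO:
            return set(pair)
    for trio in combinations(ratios, 3):             # three regions together dominate
        if sum(ratios[k] for k in trio) > RATIO_THREE:
            return set(trio)
    return set(ratios)                               # evenly distributed: light all beads

# Example: ULNum + LLNum + URNum + LRNum = MaxNum = 1000
print(regions_to_light({'UL': 700, 'LL': 150, 'UR': 100, 'LR': 50}, 1000))  # {'UL'}
```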
2.2 Determining the visually attractive color
In general, the left boundary (or median, or right boundary) of the H, S, and V bins of the interval could simply be chosen as the dominant-color HSV value and then converted to an RGB value. This approach is simple, but the artificially assumed HSV value does not necessarily exist in the picture, which easily causes color differences. Moreover, the accuracy of this approach is affected by the HSV bin width: the larger the bin, the larger the influence on the HSV value and hence on the finally calculated dominant-color RGB value. To avoid these problems, the present invention directly chooses a true color of the picture itself. Since the HSV pixel matrix is converted from the RGB pixel matrix of the picture, it suffices to choose an HSV value that actually exists and convert it to RGB. The dominant-color interval has been determined above, and the visually attractive color must lie within the dominant-color interval, so it suffices to find an HSV value in the picture that falls in the dominant-color interval. However, the visually attractive color is biased toward high saturation and high brightness, so two additional conditions are imposed when searching: the S value must be greater than the median S of the dominant-color interval, and the V value must be greater than the median V of the dominant-color interval. This takes into account high saturation, high brightness, and the true colors of the picture. If no HSV in the picture satisfies these conditions, one of saturation or brightness can be chosen, and the HSV value with the maximum saturation (or maximum brightness) among all points in the picture that fall in the dominant-color interval is selected. The visually attractive color value is thus obtained, and the dominant color is then displayed on the lamp beads corresponding to the dominant-color region.
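A sketch of the selection rule just described: among sampled pixels whose HSV falls in the dominant-color interval, prefer one whose S and V exceed the interval medians; otherwise fall back to the in-interval pixel with maximum saturation. Representing the interval by simple (low, high) bounds is an assumption made for illustration.
```python
import numpy as np

def pick_attractive_color(hsv_pixels, h_rng, s_rng, v_rng):
    """hsv_pixels: (N, 3) array; *_rng: (low, high) bounds of the dominant-color bin.
    Returns one HSV triple actually present in the picture, or None."""
    h, s, v = hsv_pixels[:, 0], hsv_pixels[:, 1], hsv_pixels[:, 2]
    in_bin = ((h >= h_rng[0]) & (h < h_rng[1]) &
              (s >= s_rng[0]) & (s < s_rng[1]) &
              (v >= v_rng[0]) & (v < v_rng[1]))
    candidates = hsv_pixels[in_bin]
    if candidates.size == 0:
        return None
    s_mid, v_mid = sum(s_rng) / 2, sum(v_rng) / 2
    strict = candidates[(candidates[:, 1] > s_mid) & (candidates[:, 2] > v_mid)]
    if strict.size:
        return strict[0]                          # any pixel meeting both conditions
    # Relax: take the in-bin pixel with the highest saturation (or brightness)
    return candidates[np.argmax(candidates[:, 1])]
```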
2.3 Fast verification of the visually attractive color
To achieve a good effect, every frame played in a video should have its visually attractive color extracted by the above method, but doing so greatly increases the amount of computation and reduces efficiency. In general, video playback has rhythm and continuity, and two consecutive frames may be very similar; if a certain frame has a visually attractive color interval, the adjacent next frame is likely to have the same or a similar dominant-color interval. Based on this characteristic, a fast method of verifying the dominant color is proposed here; the method is built on the premise that the previous frame or frames have a visually attractive color interval.
For example, if a certain frame has a visually attractive HSV color interval, the five frames after it first undergo pseudo-binarization. The binarization here is not traditional binarization. Traditional binarization sets the gray value of each pixel of an image to 0 or 255, so that the whole image shows only an obvious black-and-white visual effect. Pseudo-binarization in the present invention borrows the threshold-comparison method of traditional binarization, but the result of processing the image is a combination of black and a certain approximate color. Traditional binarization sets a threshold T: pixel values greater than T are set to 255, and pixel values less than T are set to 0. The threshold of pseudo-binarization is an HSV interval of a certain width: pixels whose values fall within the interval keep their HSV unchanged, while pixels outside the interval are set to black, i.e., their brightness V is set to 0. This yields a picture in which only the points in the dominant-color interval are colored and all other points are black. The dominant-color point count MaxNum is then computed on this special picture, and the algorithms for determining the visually attractive color region and determining the visually attractive color are called.
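A sketch of the pseudo-binarization just described, assuming an OpenCV BGR frame: pixels inside the dominant-color HSV interval keep their values, everything else has its brightness V set to 0 (black), and MaxNum is the count of surviving points.
```python
import cv2
import numpy as np

def pseudo_binarize(frame_bgr, hsv_low, hsv_high):
    """Keep pixels whose HSV lies in [hsv_low, hsv_high]; blacken the rest.
    Returns the pseudo-binarized BGR image and MaxNum, the number of kept points."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    hsv[mask == 0, 2] = 0                 # V = 0 outside the interval -> black
    max_num = int(np.count_nonzero(mask))
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR), max_num
```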
Fig. 12 is the specific flowchart of extracting the dominant-color region. First, it is detected whether the previous frame or the previous five frames have a visually attractive color interval. If so, pseudo-binarization is performed as described above, the visually attractive color region is determined, the visually attractive color is determined, and it is displayed for the visually attractive color region. If the detection finds that the previous frame or the previous five frames have no visually attractive color interval, the visually attractive color interval of the extraction region must be re-determined using the extraction method described above, after which the visually attractive color region is determined, the visually attractive color is determined, and it is displayed for that region. If, after re-extraction, no visually attractive color interval can be found, it is proved that there is no visually attractive color, and the dominant color must then be extracted and atmosphere rendering performed in the manner of embodiment one.
Figs. 13A and 13B are call flowcharts of the above two algorithms, where Fig. 13A is the ordinary call flowchart and Fig. 13B is the multi-threaded call flowchart.
In general, it is necessary to detect whether there is something in the picture that needs to be emphasized: if so, all lamp beads display the dominant color of that thing, i.e., the visually attractive color; if not, the surroundings of the picture must be extended and the dominant color extracted block by block. The order of using the above two algorithms is therefore as shown in Figs. 13A and 13B. A multi-threaded call flow is used here, which greatly improves operation efficiency and facilitates the addition of subsequent algorithms.
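One way to read the multi-threaded call flow of Fig. 13B — run the visually-attractive-color extraction and the block-based dominant-color extraction concurrently, prefer the attractive color when it exists, and fall back to block extension otherwise — is sketched below with a thread pool. The two extraction callables and the `beads` interface are placeholders for the algorithms and lighting equipment described above, not names defined by the patent.
```python
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame, beads, extract_attractive_color, extract_block_colors):
    """Run both extraction algorithms in parallel (cf. Fig. 13B); prefer the
    visually attractive color when one exists, otherwise extend block by block."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        attractive_future = pool.submit(extract_attractive_color, frame)
        blocks_future = pool.submit(extract_block_colors, frame)
        attractive = attractive_future.result()
        if attractive is not None:
            beads.show_all(attractive)            # every lamp bead shows the attractive color
        else:
            beads.show_per_block(blocks_future.result())
```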
Embodiment three
Atmosphere rendering is performed by interacting and communicating with the software in the human-computer interaction device.
When the user uses software, the software makes a corresponding change in response to the user's operation, and the lighting system then performs a corresponding illumination rendering for the software's change. Upon perceiving the change in atmosphere, the user can judge the correctness of the operation result or the satisfaction of the operation effect, and operate further according to the result. The user's operations can be illumination-rendered in real time, with rendering effects including tool-selection prompts, effect rendering, shadow-effect simulation or extension, and overall atmosphere rendering.
As shown in Fig. 14, the method of performing atmosphere rendering by interacting and communicating with the software in the human-computer interaction device comprises the following steps: (1) detecting the way in which the software sends messages (301); (2) detecting the user's operation intention (302); (3) sending an instruction to the lighting equipment (303); and (4) performing atmosphere rendering according to the user's operation intention (304).
In general, software can be classified as application software, teaching software, development software, entertainment software, network software, image-processing software, programming software, security software, and so on. Each kind of software is further subdivided into many specific programs; for example, industry application software includes catering software, beauty and health software, membership-management software, cash-register software, and so on. For software with different functions, the system has both a common rendering function applicable to most software and special rendering functions for one or several specific kinds of software.
3.1 The system's common rendering function for software
The so-called common rendering function refers to a generally usable rendering function applicable to most software. For most software, menus and tools are the most basic functional units of normal operation; every simple or complex application is realized through these elementary units. When using a menu or tool, the user must follow certain rules, such as clicking or double-clicking a menu option or toolbar icon, using a shortcut key, or typing text or code; the software makes the corresponding change according to the user's operation intention, and this change can be captured by the atmosphere rendering unit 30, which predicts before the user's operation and renders the atmosphere after the operation.
For example, in the word processors of OFFICE office suites, such as Word or similar word-processing software, there is a tool icon that sets the background color of text to highlight it. When the user clicks this icon, a palette of different color blocks appears, and the user only needs to select and click a color icon in the palette to change the background color of the selected text, whereas merely placing the mouse over a color icon does nothing. The system can react at two key moments: when the mouse hovers over the palette icon and when the palette icon is clicked, as sketched below. When the mouse hovers over the palette icon, the atmosphere rendering unit 30 obtains the exact position of the mouse, either from the Windows message carrying the mouse position or by directly locating the mouse, and thus determines the corresponding tool icon, its type, and the color in the palette; after a series of analysis and processing it sends the information to the lighting devices 31, which display or flash the color identical to the color icon under the mouse to prompt the user about the color being selected. When the user clicks a color icon in the palette, the background of the selected text quickly changes to the color of that icon; at this moment the atmosphere rendering unit 30 determines not only the displayed color but also the position of the text whose background changed, and the lighting devices 31 at the position corresponding to the text display the color of the color icon, achieving accurate local rendering and making the text information more prominent.
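As an illustration only (not the patent's implementation — the event fields, `palette.color_at`, and the `lights` methods are hypothetical helpers), the two key moments could be handled roughly as follows: on hover the palette color under the mouse is shown on all beads, on click the beads near the changed text show that color.
```python
def on_palette_event(event, palette, lights):
    """event.kind is 'hover' or 'click'; event.pos is the mouse position.
    palette.color_at() and the lights methods are hypothetical helpers."""
    color = palette.color_at(event.pos)
    if color is None:
        return
    if event.kind == "hover":
        lights.flash_all(color)                 # prompt the color being selected
    elif event.kind == "click":
        region = event.changed_text_region      # where the text background changed
        lights.show_near(region, color)         # local rendering beside the text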
In presentation-software operations, for example when making a PPT (PowerPoint) presentation, a fade-in or fade-out animation must be applied to a certain slide, so the user needs to select the fade-in or fade-out tool icon in the animation menu. After receiving the Windows message or locating the mouse position and the corresponding icon in real time, the atmosphere rendering unit 30 can react quickly, so that the lighting equipment changes a certain color regularly from deep to shallow, from shallow to deep, from bright to dark, or from dark to bright; the color is determined according to the dominant color or secondary dominant color of the slide, the visually attractive color, or other methods.
Obtaining the messages issued by the software itself as the data source of the atmosphere rendering unit is a convenient, accurate, and safe method, but its premise is that the software vendor agrees to cooperate. If the messages cannot be obtained, the tool icons, menu text, and so on in the software can be predefined through analysis of the software's characteristics; a small database is established from the analysis and processing, and during operation items can be extracted from and compared against the database directly.
3.2 Functions specific to one or several kinds of software
These functions generally require predefined settings according to the specific specialty of the software itself, and user-defined settings can also be allowed on top of the original functions.
For example, software engineers need to maintain a high level of attention when working with programming software, and long programming sessions harm their health. Therefore, when the programming software is running, the atmosphere rendering unit 30 can guarantee a display that both enhances attention and relaxes the eyes, keeping the light intensity and color constant, with the goal of improving the engineer's working efficiency and relieving eye fatigue. In addition, after a period of time such as one hour, the color or frequency of the light is changed to prompt the engineer to adjust and rest appropriately.
For example, consider the photo and photography software popular at present. Its special-effects camera has scenes of many different tones for rendering photos or footage, such as aesthetic, blues, and natural. The lighting system can detect the different scenes, obtain their basic color information (color, saturation, brightness, etc.), and then render the different scenes, so that the scene of the image extends into the whole space and the overall atmosphere of the photo or of the shooting is enhanced.
For example, when security software is in use or running in the background, the atmosphere rendering unit 30 constantly captures the various security data in the software; when certain data fall below the minimum safety standard or exceed a safety value, the lighting devices 31 immediately issue a light warning, informing the user to upgrade the security software, run a virus scan, or take other remedial measures at once.
For example, in teaching software, the atmosphere rendering unit 30 can display or flash one or several colors to prompt the learner to concentrate or to rest appropriately. The time interval for prompting concentration and the time interval for appropriate rest can be configured automatically by the system according to experimental data, or configured by the user according to personal habits.
For example, in auxiliary software such as 3D design software, take 3DS MAX as an example. 3DS MAX is three-dimensional animation rendering and production software widely used in advertising, film and television, industrial design, architectural design, three-dimensional animation, multimedia production, games, aided teaching, engineering visualization, and other fields. The atmosphere rendering unit 30 can render according to the colors of the work, its light sources, and so on.
For example, in home decoration, a room has one or more light sources; the direct irradiation of light and its reflection from object surfaces make the light distribution in the room uneven. The atmosphere rendering unit 30 can detect the intensity, color, etc. of the light at each position in the room and render according to their distribution. This rendering is of course in real time: if the light in the room changes, the atmosphere rendering unit 30 automatically follows the change and adapts the rendering.
For example, in picture browsing, pictures can be paged forward and backward with the keyboard or mouse, so that they are presented in a certain order. The atmosphere rendering unit provides a picture-preview function that, during normal preview, renders the next picture or the previous picture; the rendering result follows the color characteristics of the next or previous picture (dominant color, secondary dominant color, visually attractive color, illumination, etc.). The rendering region corresponding to the next picture is on the right side of the picture display area, and the rendering region corresponding to the previous picture is on the left side. This correspondence matches the usual way of browsing pictures: the left side indicates the position corresponding to the previous picture, and the right side indicates the position corresponding to the next picture.
For example, when a certain picture is opened with Photoshop, before any processing, whole or local rendering can be performed according to the characteristics of the picture itself (the dominant color or secondary dominant color of the whole picture, or of some region of the picture). When an operation such as color balancing is applied to the image, the information of the picture (color, background, etc.) changes accordingly, and there are two possible approaches. The first is to obtain the exact position of the mouse through messages or real-time positioning, delimit a small area around the mouse according to its position, analyze the information in that small area (icon information, menu information, etc.), and make a preliminary or accurate judgment of the operation from this information. However, this method has high precision requirements and its processing speed is affected by many factors, so the second approach can be used instead. The second approach processes only the picture information: the atmosphere rendering unit detects changes in the picture information (dominant color, secondary dominant color, etc.) in real time, and rapidly renders the changed picture after a change is detected.
The communication between the system and the software can take several forms. In the first, the software communicates directly with the system: the data to be rendered comes from the software vendor, the vendor provides the rendering data, and the system only needs to carry out the illumination rendering according to that data, either directly or after slight processing. In the second, the software and the system communicate through a medium, which may be a custom control, a DLL or some type of interface. Its characteristic is that the medium can communicate with the software, or act as part of the software's functionality, and the medium can analyze and process the data to be rendered, finally passing the processed light information to the system to carry out the illumination rendering.
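The second communication form, with a medium sitting between the software and the system, can be outlined as a small bridge object. The JSON message format, the socket address and the processing step below are assumptions made only for illustration; the specification does not fix a particular transport or protocol.

import json
import socket

class LightingBridge:
    # Sketch of the "medium": it accepts raw rendering data from the software,
    # does the analysis/processing, and hands the processed light information
    # to the lighting system over a socket.

    def __init__(self, host="127.0.0.1", port=9100):           # assumed lighting-system address
        self.addr = (host, port)

    def process(self, raw):
        # Placeholder processing: clamp a 3-element RGB list and attach a brightness field.
        r, g, b = (max(0, min(255, int(c))) for c in raw["rgb"])
        return {"rgb": [r, g, b], "brightness": raw.get("brightness", 100)}

    def send(self, raw):
        message = json.dumps(self.process(raw)).encode("utf-8")
        with socket.create_connection(self.addr, timeout=1.0) as conn:
            conn.sendall(message)

# Usage: the software, or a plug-in/DLL acting on its behalf, would call
# LightingBridge().send({"rgb": [200, 120, 40], "brightness": 80})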
It should be understood by those skilled in the art that the foregoing description and the embodiments of the present invention shown in the drawings are illustrative only and are not intended to limit the present invention. The objectives of the present invention have been fully and effectively achieved. The functions and structural principles of the present invention have been shown and explained in the embodiments, and embodiments of the present invention may be varied or modified in any way without departing from those principles.

Claims (18)

1. A method for atmosphere rendering of a human-computer interaction interface, characterized in that the method comprises the following steps:
(A) extracting a dominant color of a video/image/software;
(B) carrying out atmosphere rendering according to the extracted dominant color information;
wherein the dominant color information in step (B) is vision-attracting color information, and carrying out atmosphere rendering on the dominant color information comprises carrying out brightness rendering, color rendering or color temperature rendering on a lighting apparatus; step (A) comprises the following steps:
(A.4) delimiting a dominant-color extraction region;
(A.5) determining a vision-attracting color region;
(A.6) determining and extracting the vision-attracting color; and
(A.7) verifying the vision-attracting color;
wherein step (A.5) comprises the following steps:
(A.51) extracting the RGB pixel matrix corresponding to the dominant-color region;
(A.52) converting the RGB pixel matrix into an HSV matrix;
(A.53) computing an HSV histogram;
(A.54) setting saturation, brightness and count thresholds; and
(A.55) finding all HSV intervals that satisfy the threshold conditions, the HSV interval with the largest count being taken as the vision-attracting color region;
wherein, in step (A.7), the dominant color is verified by performing a pseudo-binarization process.
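The histogram procedure recited in steps (A.51) to (A.55) can be illustrated with a short, non-normative sketch assuming OpenCV and NumPy. The bin count, the threshold values and the choice of a mean HSV value as the representative color are placeholder assumptions, and the pseudo-binarization verification of step (A.7) is not shown.

import cv2
import numpy as np

def vision_attracting_color(region_bgr, s_min=0.3, v_min=0.3, count_min=50, hue_bins=18):
    # Among hue intervals whose pixels satisfy the saturation, brightness and
    # count thresholds, pick the interval with the largest count and return a
    # representative HSV color for it.
    # (A.51)-(A.52): RGB pixel matrix (BGR order in OpenCV) -> HSV matrix
    hsv = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2HSV)
    hue = hsv[..., 0].astype(float)
    sat = hsv[..., 1].astype(float) / 255.0
    val = hsv[..., 2].astype(float) / 255.0

    # (A.54): saturation and brightness thresholds
    mask = (sat >= s_min) & (val >= v_min)

    # (A.53): hue histogram over the thresholded pixels (OpenCV hue range is 0..179)
    hist, edges = np.histogram(hue[mask], bins=hue_bins, range=(0, 180))

    # (A.55): intervals meeting the count threshold; the largest one wins
    candidates = [i for i in range(hue_bins) if hist[i] >= count_min]
    if not candidates:
        return None                        # no interval satisfies the thresholds
    best = max(candidates, key=lambda i: hist[i])

    in_bin = mask & (hue >= edges[best]) & (hue < edges[best + 1])
    return hsv[in_bin].mean(axis=0)        # representative (H, S, V) value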
2. The method according to claim 1, wherein step (A) comprises the following steps:
(A.1) delimiting regions of the picture;
(A.2) analyzing and processing them with image processing algorithms and pattern recognition methods; and
(A.3) obtaining light-color rendering parameters recognizable by the lighting apparatus.
3. The method according to claim 2, wherein, in step (A.1), the regions are delimited by dividing the picture into blocks, the picture being suitably divided into blocks in a clockwise or counterclockwise order.
4. The method according to claim 2 or 3, wherein step (A.2) comprises the following steps:
(A.21) extracting in turn the RGB pixel matrix corresponding to each region;
(A.22) converting the RGB matrix into an HSV histogram;
(A.23) computing statistics of the HSV histogram to obtain the dominant HSV value;
(A.24) calculating the color difference between the dominant color and the dominant color of the previous frame, and comparing them;
(A.25) if the color difference is within the allowable range, executing step (A.26); if the color difference is not within the allowable range, executing step (A.27);
(A.26) the lamp bead color corresponding to the region becomes the dominant color of the current frame; and
(A.27) the lamp bead color corresponding to the region remains unchanged, identical to the dominant color of the previous frame.
5. The method according to claim 4, wherein a corresponding lighting apparatus is provided for each region, and atmosphere rendering is carried out for each region separately.
6. The method according to claim 4, wherein, in step (A.21), the RGB pixel matrix is extracted every N rows and every M columns, where N ≥ 1 and M ≥ 1.
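Steps (A.21) to (A.27), together with the every-N-rows/every-M-columns sampling of claim 6, can be sketched as a per-region update loop. The color-difference metric and the threshold value below are illustrative assumptions only; the claims do not prescribe a particular metric, and the returned colors would be pushed to the lamp beads by the caller.

import cv2
import numpy as np

def region_dominant_hsv(region_bgr, n=2, m=2, hue_bins=18):
    # (A.21)-(A.23) with the claim 6 sampling: take every n-th row and every
    # m-th column, convert to HSV and use the hue-histogram peak as the
    # region's dominant color.
    sampled = np.ascontiguousarray(region_bgr[::n, ::m])
    hsv = cv2.cvtColor(sampled, cv2.COLOR_BGR2HSV)
    hue = hsv[..., 0]
    hist, edges = np.histogram(hue, bins=hue_bins, range=(0, 180))
    best = int(np.argmax(hist))
    in_bin = (hue >= edges[best]) & (hue < edges[best + 1])
    return hsv[in_bin].mean(axis=0)

def update_region_colors(frame_bgr, regions, prev_colors, max_delta=40.0):
    # (A.24)-(A.27): per region, compare this frame's dominant color with the
    # previous frame's; as the claim recites, the lamp-bead color is updated
    # only when the color difference stays within the allowable range,
    # otherwise the previous color is kept.
    new_colors = {}
    for name, (y0, y1, x0, x1) in regions.items():
        dominant = region_dominant_hsv(frame_bgr[y0:y1, x0:x1])
        previous = prev_colors.get(name, dominant)
        delta = float(np.linalg.norm(dominant - previous))     # illustrative color-difference metric
        new_colors[name] = dominant if delta <= max_delta else previous
    return new_colors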
7. The method according to claim 1, wherein, in step (A.4), the dominant-color region is suitably delimited into two, three, four or more blocks, and a corresponding lighting apparatus is provided for each block according to the manner in which the regions are delimited.
8. The method according to claim 7, wherein, in step (B), a vision-attracting color extraction algorithm is invoked and the lamp beads display the vision-attracting color; or a block-wise dominant color extraction algorithm is invoked and the lamp beads display the dominant colors extracted block by block.
9. A method for atmosphere rendering by communicating with software in a human-computer interaction device, suitable for a user to render an atmosphere by interacting and communicating with the software in the human-computer interaction device, characterized in that the method comprises the following steps:
(1) detecting the manner in which the software sends messages;
(2) detecting the user's operational intention;
(3) sending instructions to a lighting apparatus; and
(4) carrying out atmosphere rendering according to the user's operational intention;
wherein, when using a menu or tool of the software, the user clicks or double-clicks a menu option or a toolbar icon, uses a shortcut key, or types text or code, and the software changes correspondingly in response to the user's operation; this change can be captured by the atmosphere rendering unit, which makes a prediction before the user's operation and performs the atmosphere rendering after the operation;
wherein, in word-processing software operation, the atmosphere rendering unit responds when the mouse selects a color-swatch icon and when the color-swatch icon is clicked; when the mouse selects the color-swatch icon, the atmosphere rendering unit obtains the exact position of the mouse, determines the corresponding tool icon, its type and the color in the swatch, analyzes and processes this information and sends it to the lighting apparatus, and the lighting apparatus displays, or flashes, throughout the same color as the color icon selected by the mouse to prompt the user's selection; when the user clicks a color icon in the swatch, the background of the selected text changes to the color of that icon, and the atmosphere rendering unit determines the displayed color, the background change and the position of the text, so that the lighting apparatus at the position corresponding to the text displays the color of the color icon, thereby performing a partial rendering that makes the text information more prominent.
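The swatch-icon behaviour recited in claim 9 amounts to mapping two user-interface events onto two lighting actions. The following minimal sketch assumes a generic event object and a hypothetical lights interface with set_all and set_region methods; the actual Windows message handling is omitted.

def on_swatch_event(event, lights):
    # A selected swatch makes all lamps show that color as a whole-field prompt;
    # a clicked swatch lights only the lamps at the position of the selected text
    # with that color, i.e. the partial rendering described in the claim.
    if event.type == "swatch_selected":
        lights.set_all(event.color)
    elif event.type == "swatch_clicked":
        lights.set_region(event.text_position, event.color)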
10. The method according to claim 9, wherein, in PowerPoint software operation, when a fade-in or fade-out animation is applied to a slide, the atmosphere rendering unit, after receiving the Windows message or locating the mouse position and the corresponding icon in real time, sends a message to the lighting apparatus so that the lighting apparatus transitions a certain color in a regular pattern from dark to bright, from light to deep, from bright to dark or from deep to light, the color being determined from the dominant color, the secondary dominant color or the vision-attracting color of the slide, or by other methods.
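The dark-to-bright or bright-to-dark transition recited in claim 10 is essentially a brightness ramp applied to a single color. A minimal sketch, assuming a hypothetical lighting interface with a set_hsv call; the step count and duration are arbitrary illustrative values:

import time

def fade(lights, hue, saturation, duration=1.0, steps=20, fade_in=True):
    # Ramp the value (brightness) of a fixed hue/saturation up or down,
    # following the slide fade-in or fade-out animation.
    for i in range(steps + 1):
        t = i / steps
        value = t if fade_in else 1.0 - t
        lights.set_hsv(hue, saturation, value)    # assumed lighting call
        time.sleep(duration / steps)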
11. The method according to claim 9, wherein, during the programming work of a software engineer, the atmosphere rendering unit displays light that enhances attention while relaxing the eyes, keeping the light intensity and color unchanged, and automatically changes the color or frequency of the light after a period of time to prompt the engineer to take an appropriate rest.
12. The method according to claim 9, wherein, for photo and photography software, the atmosphere rendering unit is adapted to detect different scenes, obtain their basic color information and render the different scenes, so that the scene of the image extends to the entire space and the overall atmosphere of the photo or photograph is enhanced.
13. The method according to claim 9, wherein, for security software, the atmosphere rendering unit captures the various security data in the software in real time, and when a certain item of data falls below the minimum safety standard or exceeds the safety value, the lighting apparatus immediately issues a light warning.
14. The method according to claim 9, wherein, for teaching software, the atmosphere rendering unit displays, or flashes, one or more colors to prompt the learner to concentrate or to take an appropriate rest, wherein the time intervals for the concentration prompt and the rest prompt are configured automatically by the system according to experimental data, or configured by the user according to his or her own habits.
15. The method according to claim 9, wherein, for auxiliary software, the atmosphere rendering unit is adapted to render a work produced in 3D layout software according to the colors of the work and the sources of its light.
16. The method according to claim 9, wherein, in home decoration, the atmosphere rendering unit detects the intensity and color of the light at each position in a room and renders it in real time according to its distribution and variation.
17. The method according to claim 9, wherein, in picture browsing, while a picture is being previewed normally, the atmosphere rendering unit is adapted to render the next picture or the previous picture, the rendering effect following the color characteristics of the next or previous picture, the rendering region corresponding to the next picture lying to the right of the picture display area and the rendering region corresponding to the previous picture lying to the left.
18. The method according to claim 9, wherein, when picture information is processed, the atmosphere rendering unit detects changes in the picture information in real time and renders the changed picture after a change is detected.
CN201510448258.2A 2015-07-27 2015-07-27 The atmosphere rendering system and method for human-computer interaction interface Active CN106406504B (en)

Priority Applications (1)

Application Number: CN201510448258.2A; Priority Date: 2015-07-27; Filing Date: 2015-07-27; Title: The atmosphere rendering system and method for human-computer interaction interface

Publications (2)

Publication Number: CN106406504A (en); Publication Date: 2017-02-15
Publication Number: CN106406504B (en); Publication Date: 2019-05-07

Family

Family ID: 58008583

Family Applications (1)

Application Number: CN201510448258.2A; Status: Active; Publication: CN106406504B (en); Title: The atmosphere rendering system and method for human-computer interaction interface

Country Status (1)

Country: CN (1); Link: CN106406504B (en)

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220308

Address after: Building B7-1, shawen Ecological Industrial Park, national high tech Industrial Development Zone, Guiyang City, Guizhou Province

Patentee after: Guizhou zhongshengtaike Intelligent Technology Co.,Ltd.

Address before: 213100, 7th floor, block B, building 1, Chuangyan port, science and Education City, Changzhou City, Jiangsu Province

Patentee before: CHANGZHOU INSTITUTE OF TECHNOLOGY RESEARCH FOR SOLID STATE LIGHTING