CN114158160A - Immersive atmosphere lamp system based on video content analysis - Google Patents


Info

Publication number
CN114158160A
CN114158160A (application CN202111421096.5A)
Authority
CN
China
Prior art keywords
atmosphere
lamp
atmosphere lamp
ambience
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111421096.5A
Other languages
Chinese (zh)
Other versions
CN114158160B (en)
Inventor
陈勇
孙彦龙
傅一丹
楼一品
裘昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Arcvideo Technology Co ltd
Original Assignee
Hangzhou Arcvideo Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Arcvideo Technology Co ltd filed Critical Hangzhou Arcvideo Technology Co ltd
Priority to CN202111421096.5A priority Critical patent/CN114158160B/en
Publication of CN114158160A publication Critical patent/CN114158160A/en
Application granted granted Critical
Publication of CN114158160B publication Critical patent/CN114158160B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/643Hue control means, e.g. flesh tone control
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/155Coordinated control of two or more light sources
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/165Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Arrangements Of Lighting Devices For Vehicle Interiors, Mounting And Supporting Thereof, Circuits Therefore (AREA)

Abstract

The invention discloses an immersive atmosphere lamp system based on video content analysis. An atmosphere lamp configuration module maps the space in which atmosphere lamps are to be installed onto a planar space, makes the mapped planar space correspond to the plane of the played video picture, and divides the space into a number of independent atmosphere blocks according to the areas whose rendering effect must be controlled independently; the atmosphere lamp IDs and switch-state data of all independent atmosphere blocks form the atmosphere lamp configuration information. A video playing module plays a specified video, decodes the video data, and simultaneously outputs each decoded frame to the image tone analysis module. The image tone analysis module receives the image frame data and the atmosphere lamp configuration information and resolves, for each independent atmosphere block, its position and size, its atmosphere lamp state, and its dominant hue. The atmosphere lamp rendering module parses the received atmosphere lamp configuration information, turns the corresponding atmosphere lamps on or off according to their state, and sets each lit lamp to the dominant-hue color value returned by the image hue analysis module.

Description

Immersive atmosphere lamp system based on video content analysis
Technical Field
The invention belongs to the technical field of video processing, and particularly relates to an immersive atmosphere lamp system based on video content analysis.
Background
At present, atmosphere lamps are increasingly used in spaces such as KTV rooms, bars, and automobiles to create an appealing ambience.
In automobiles, atmosphere lamps are often mounted in center consoles, door armrests, speakers, seats, footwells, ceilings, and the like. Major automakers currently ship a variety of preset effect modes; users can switch between them according to the situation, and the color, frequency, and other properties of the atmosphere lamps can change automatically with conditions such as weather, temperature, and music.
In KTV rooms, only a limited set of atmosphere light effects is currently available, and these effects are not strongly correlated with the content of the audiovisual programs being played. In a household indoor environment, atmosphere lamps may be installed on the wall holding the television, but the available modes are limited and their range of action is a single plane. With the further development of networks and the spread of personalization, each viewer in the business seats of high-speed trains and airplanes, or in the rooms of entertainment venues such as movie bars, theaters, and KTVs, may wish to experience either a shared or a personalized atmosphere while watching video content.
In all of the above scenarios, there is as yet no atmosphere effect that changes continuously with the video picture while the user plays a video.
Disclosure of Invention
In view of the above problems, the embodiments of the present invention provide an immersive atmosphere lamp system based on video content analysis that, by analyzing the dominant hue of the video content, produces an atmosphere effect that follows the changes in the video picture.
In order to solve the technical problems, the invention adopts the following technical scheme:
the embodiment of the invention provides an immersive atmosphere lamp system based on video content analysis in a first aspect, which comprises an atmosphere lamp configuration module, a video playing module, an image tone analysis module and an atmosphere lamp rendering module,
the atmosphere lamp configuration module is used for mapping the space of an atmosphere lamp to be set into a plane space, corresponding the mapped plane space to a played video picture plane, dividing the space of the atmosphere lamp to be set into a plurality of independent atmosphere blocks according to an area needing to independently control rendering effects, and forming atmosphere lamp configuration information by using all the independent atmosphere blocks including corresponding atmosphere lamp IDs and switch state data;
the video playing module is used for playing a specified video, decoding video data and outputting each decoded frame data to the image tone analysis module at the same time;
the image tone analysis module is used for receiving image frame data and atmosphere lamp configuration information and analyzing corresponding position and width and height information, atmosphere lamp states and atmosphere lamp main tones of the independent atmosphere blocks;
the atmosphere lamp rendering module is used for analyzing after receiving atmosphere lamp configuration information, lighting or closing the corresponding atmosphere lamp according to the state of the atmosphere lamp, and the lighted atmosphere lamp is set according to the dominant hue color value returned by the image hue analysis module.
In one possible design of the first aspect, when the atmosphere lamp state of an independent atmosphere block is off, the atmosphere lamp rendering module sends a turn-off control signal, addressed by atmosphere lamp ID, to the corresponding atmosphere lamp controller, which turns off the lamp; for a lit atmosphere lamp, the module receives the dominant-hue color value returned by the image tone analysis module and sends a color control signal to the corresponding atmosphere lamp controller, which sets the lamp to the dominant-hue color.
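The on/off dispatch described in this design can be sketched as follows. This is a minimal illustration, not the patent's implementation: the controller interface (`send_off`, `send_color`) and the dictionary layout of a block are assumed names introduced here.

```python
class RecordingController:
    """Stand-in for a real atmosphere lamp controller; records the
    signals it is sent so the dispatch logic can be demonstrated."""
    def __init__(self):
        self.signals = []

    def send_off(self, lamp_id):
        self.signals.append((lamp_id, "off"))

    def send_color(self, lamp_id, rgb):
        self.signals.append((lamp_id, rgb))

def render_blocks(blocks, dominant_colors, controller):
    """For each independent atmosphere block: if its lamp state is off,
    send a turn-off signal addressed by lamp ID; otherwise push the
    analysed dominant-hue color value to the lamp controller."""
    for block in blocks:
        lamp_id = block["lamp_id"]
        if not block["on"]:
            controller.send_off(lamp_id)
        else:
            controller.send_color(lamp_id, dominant_colors[lamp_id])

ctrl = RecordingController()
render_blocks(
    [{"lamp_id": "driver", "on": True}, {"lamp_id": "rear", "on": False}],
    {"driver": (200, 40, 40)},
    ctrl,
)
```

In a real vehicle or KTV installation the controller would translate these calls into the bus or wireless protocol the lamps actually use.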
In a possible design of the first aspect, the image tone analysis module resolves the position and size of each independent atmosphere block as follows: the image width and height are parsed from the image frame data, the block information is parsed from the atmosphere lamp configuration information, and for each block whose atmosphere lamp state is lit, its starting coordinate, width, and height are obtained and converted into the starting coordinate, width, and height of the corresponding region of the image.
In a possible design of the first aspect, each independent atmosphere block may be subdivided into several smaller independent atmosphere units, the atmosphere lamps of each unit being controlled independently.
A second aspect of the embodiments of the present invention provides an automobile on which an immersive atmosphere lamp system based on video content analysis as described in any one of the above is disposed.
In a possible design of the second aspect, the system further includes a riding scene mode selection module; after a riding scene mode is selected, the atmosphere light effect of the video is rendered within the range defined by that mode, combined with the independent atmosphere block configuration information of the atmosphere lamp configuration module.
In a possible design of the second aspect, the definition of the riding scene mode includes a whole vehicle atmosphere mode, and atmosphere lamps of all independent atmosphere blocks in a vehicle in the whole vehicle mode participate in atmosphere rendering.
In a possible design of the second aspect, the definition of the riding scene mode includes a driver atmosphere mode in which atmosphere lights of corresponding individual atmosphere blocks for the driver participate in atmosphere rendering.
In a possible design of the second aspect, the definition of the riding scene mode includes a front-row atmosphere mode in which the atmosphere lamps of the independent atmosphere blocks corresponding to the front seats participate in atmosphere rendering.
In a possible design of the second aspect, the definition of the riding scene mode includes a back-row atmosphere mode in which an independent atmosphere block atmosphere lamp corresponding to the back-row seat participates in atmosphere rendering.
In a possible design of the second aspect, the definition of the riding scene mode includes a whole-vehicle passenger atmosphere mode in which the atmosphere lamps of all independent atmosphere blocks except the driver's participate in atmosphere rendering.
In a possible design of the second aspect, the definition of the riding scene mode includes a custom atmosphere mode in which each independent atmosphere block in the vehicle can be separately configured with an activated or deactivated block atmosphere lamp state.
In a possible design of the second aspect, the definition of the riding scene mode includes an intelligent atmosphere mode in which the seating condition of the driver and passengers is acquired automatically and the block atmosphere lamp states are configured and activated accordingly.
A third aspect of the embodiments of the present invention provides a KTV box, in which is disposed an immersive ambience lamp system based on video content analysis as described in any one of the above.
The invention has the following beneficial effects:
(1) the effect expression of the atmosphere can be correspondingly changed according to the change of the content of the video picture;
(2) the method of arranging the atmosphere lamps in the blocks can enable the atmosphere expression to change according to the change of the scene, and the method is more immersive. Furthermore, the intelligent switching of the atmosphere lamp block is realized through seat use state sensing, so that the purpose of energy conservation is achieved while the atmosphere wrapping feeling is enhanced.
Drawings
FIG. 1 is a schematic diagram of the structure of an immersive atmosphere lamp system based on video content analysis in accordance with an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a space in which an atmosphere lamp is to be disposed in an automobile according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a space in which an atmosphere lamp is to be arranged in a KTV box according to an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the immersive atmosphere lamp system based on video content analysis of the embodiment of the present invention includes an atmosphere lamp configuration module 10, a video playing module 20, an image hue analysis module 30, and an atmosphere lamp rendering module 40. The atmosphere lamp configuration module 10 maps the space in which atmosphere lamps are to be installed onto a planar space, makes the mapped planar space correspond to the plane of the played video picture, and divides the space into a number of independent atmosphere blocks according to the areas whose rendering effect must be controlled independently; the atmosphere lamp IDs and on-off state data of all independent atmosphere blocks form the atmosphere lamp configuration information. The video playing module 20 plays a specified video, decodes the video data, and simultaneously outputs each decoded frame to the image hue analysis module. The image hue analysis module 30 receives the image frame data and the atmosphere lamp configuration information and resolves, for each independent atmosphere block, its position and size, its atmosphere lamp state, and its dominant hue.
The atmosphere lamp rendering module 40 parses the received atmosphere lamp configuration information, turns the corresponding atmosphere lamps on or off according to their state, and sets each lit lamp to the dominant-hue color value returned by the image hue analysis module 30.
In an embodiment of the present invention, the planar space is defined as a plane of size W × H corresponding to the video picture, W being the width and H the height. Each independent atmosphere block has, in this plane, the coordinate (α_j·W, β_j·H), width γ_j·W, and height δ_j·H, where α_j, β_j, γ_j, δ_j are real numbers in the range 0 to 1. The width and height of the image are parsed from the image frame and denoted W_p and H_p. The independent atmosphere block information is parsed from the configuration information; for each block whose atmosphere lamp is in the activated state, its coordinate, width, and height are obtained and converted into the coordinate, width, and height of the corresponding image region: the abscissa is α_j·W_p, the ordinate β_j·H_p, the width γ_j·W_p, and the height δ_j·H_p.
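The block-to-image conversion described above can be sketched as a small function. The function and variable names are illustrative, not from the patent; the fractions α_j, β_j, γ_j, δ_j are passed directly, so the notional plane dimensions W and H cancel out of the mapping.

```python
def map_block_to_image(alpha, beta, gamma, delta, img_w, img_h):
    """Map an independent atmosphere block, given as normalized plane
    fractions in [0, 1], to a pixel region (x, y, w, h) of a decoded
    frame of size img_w x img_h (W_p x H_p in the patent's notation)."""
    x = int(alpha * img_w)   # abscissa: alpha_j * W_p
    y = int(beta * img_h)    # ordinate: beta_j  * H_p
    w = int(gamma * img_w)   # width:    gamma_j * W_p
    h = int(delta * img_h)   # height:   delta_j * H_p
    return x, y, w, h

# e.g. a block covering the left half of a 1920x1080 frame
region = map_block_to_image(0.0, 0.0, 0.5, 1.0, 1920, 1080)
```

Each lit block's region would then be cropped from the frame for dominant-hue analysis.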
In an embodiment of the present invention, the dominant-hue color may be computed as the average of the RGB components of all pixels of the image unit, as the color value occupying the largest proportion of the image unit, with a K-Means clustering algorithm, or with a color quantization algorithm such as median cut or an octree. Different algorithms yield somewhat different dominant hues, so the computation method used by the image hue analysis module is a matter of choice.
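The two simplest of the optional methods named above, the per-channel average and the most frequent color value, can be sketched as follows (a minimal illustration assuming pixels are given as a flat list of RGB triples):

```python
from collections import Counter

def dominant_color_mean(pixels):
    """Average of the RGB components of all pixels of the image unit.
    pixels: list of (r, g, b) integer triples."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))

def dominant_color_mode(pixels):
    """The color value occupying the largest proportion of the unit."""
    return Counter(pixels).most_common(1)[0][0]

# three red pixels and one blue pixel
pixels = [(255, 0, 0), (255, 0, 0), (0, 0, 255), (255, 0, 0)]
```

The mean is cheap but can produce muddy in-between colors on mixed scenes, which is one reason the patent leaves the choice of algorithm open.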
On the basis of the immersive atmosphere lamp system based on video content analysis in the embodiment of the invention, the embodiment of the invention provides an automobile, and the immersive atmosphere lamp system based on video content analysis is arranged on the automobile.
In an embodiment of the invention, the atmosphere lamp configuration module is mounted in the central in-vehicle infotainment system and lets the user define the range of influence of all atmosphere lamps in the vehicle under different riding scenes. The module presents a model on the central control screen showing the layout of the whole cabin, divided into independent atmosphere blocks by seat (see fig. 2). The independent atmosphere units in a block correspond to all atmosphere lamps within that block's area; for the front-left driver block, for example, these include the driver-side center console, door armrest, seat, and footwell atmosphere lamps. The atmosphere lamps in each independent atmosphere block can be set to an activated or closed state, and the position and size of each sub-block can be adjusted by keys, touch, or the like.
In an embodiment of the invention, the automobile further comprises a riding scene mode selection module offering preset modes such as whole-vehicle atmosphere, driver atmosphere, front-row atmosphere, rear-row atmosphere, whole-vehicle passenger atmosphere, custom atmosphere, and intelligent atmosphere. After a mode is selected, the video atmosphere effect presentation module of the configuration stage starts to play a video, and the atmosphere light effect is rendered within the range defined by the mode, combined with the block configuration information. The riding scene modes are as follows:
(1) Whole-vehicle atmosphere: the atmosphere lamps of all independent atmosphere blocks in the vehicle participate in atmosphere rendering;
(2) Driver atmosphere: only the atmosphere lamps of the independent atmosphere block corresponding to the driver participate in atmosphere rendering;
(3) Front-row atmosphere: only the atmosphere lamps of the independent atmosphere blocks corresponding to the front seats participate in atmosphere rendering, typically the driver and front-passenger blocks;
(4) Rear-row atmosphere: only the atmosphere lamps of the independent atmosphere blocks corresponding to the rear seats participate in atmosphere rendering;
(5) Whole-vehicle passenger atmosphere: the atmosphere lamps of all independent atmosphere blocks except the driver's participate in atmosphere rendering;
(6) Custom atmosphere: each independent atmosphere block in the vehicle can be separately configured with an activated or closed block atmosphere lamp state;
(7) Intelligent atmosphere: the seating condition of the driver and passengers is acquired automatically and one of modes (1) to (6) is entered accordingly. Specifically, when only the driver is detected, mode (2) is entered; when a front-row passenger is detected and no rear-row passenger, mode (3); when the front passenger seat is empty but the rear row is occupied, mode (4); and when both the front passenger seat and the rear row are occupied, mode (5). The system switches automatically among the corresponding modes (1) to (6) as the seating of the occupants changes.
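The intelligent-atmosphere decision logic can be sketched as a small function. The name, signature, and boolean occupancy inputs are illustrative assumptions; a real vehicle would read seat occupancy sensors.

```python
def intelligent_mode(driver, front_passenger, rear_occupied):
    """Return the riding scene mode number, following the case analysis
    for the intelligent atmosphere: booleans indicate detected occupancy
    of the driver seat, front passenger seat, and rear row."""
    if driver and not front_passenger and not rear_occupied:
        return 2  # only the driver: driver atmosphere
    if front_passenger and not rear_occupied:
        return 3  # front-row passenger, empty rear row: front-row atmosphere
    if rear_occupied and not front_passenger:
        return 4  # empty front passenger seat, occupied rear row: rear-row atmosphere
    if front_passenger and rear_occupied:
        return 5  # both occupied: whole-vehicle passenger atmosphere
    return 1      # fallback: whole-vehicle atmosphere
```

Re-evaluating this function whenever seat occupancy changes gives the automatic mode switching the text describes.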
The configuration can also be performed in the automaker's mobile app and synchronized to the in-vehicle system.
The system receives the decoded video pictures played on the in-vehicle screen, parses the width and height of the image, denoted W_p and H_p, and obtains the current riding mode and the block atmosphere lamp configuration from the central control host. The independent atmosphere units participating in rendering under the current riding mode are read from the configuration, and their coordinates, widths, and heights are converted into those of the corresponding image regions: the abscissa is α_j·W_p, the ordinate β_j·H_p, the width γ_j·W_p, and the height δ_j·H_p.
The image is divided into sub-image units according to the converted regions, and dominant-hue analysis is performed on each sub-image unit to obtain its dominant-hue color value. The dominant hue may be computed as the average of the RGB components of all pixels of the unit, as the unit's most frequent color value, with a K-Means clustering algorithm, or with a color quantization algorithm such as median cut or an octree; different algorithms yield somewhat different dominant hues, so the computation method is a matter of choice.
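As one illustration of the K-Means option, here is a tiny self-contained K-Means over RGB triples that returns the centroid of the largest cluster as the dominant color. This is a sketch only; a production system would use an optimized library implementation, and the function name and parameters are assumptions introduced here.

```python
import random

def kmeans_dominant(pixels, k=3, iters=10, seed=0):
    """Cluster (r, g, b) triples with a naive K-Means and return the
    centroid of the largest cluster as the dominant color."""
    rng = random.Random(seed)
    centers = rng.sample(pixels, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            # assign each pixel to its nearest center (squared distance)
            i = min(range(k),
                    key=lambda c: sum((p[d] - centers[c][d]) ** 2 for d in range(3)))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # empty clusters keep their old center
                centers[i] = tuple(sum(p[d] for p in cl) // len(cl) for d in range(3))
    largest = max(clusters, key=len)
    n = len(largest)
    return tuple(sum(p[d] for p in largest) // n for d in range(3))

# a mostly red sub-image unit with a little blue
pixels = [(250, 10, 10)] * 8 + [(10, 10, 250)] * 2
dominant = kmeans_dominant(pixels)
```

Unlike the plain average, clustering keeps a saturated dominant color even when a unit mixes contrasting hues.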
On the basis of the immersive atmosphere lamp system based on video content analysis, the embodiment of the invention provides a KTV box, and the immersive atmosphere lamp system based on video content analysis is arranged in the KTV box.
In an embodiment of the present invention, the atmosphere lamp configuration module of a KTV room presents the layout of the area surrounded by the room's audiovisual seating, shown in fig. 3. Each region may be divided into independent atmosphere blocks differently, and each block is divided into independent atmosphere units corresponding to all atmosphere lamps within the block, such as those in the back, seat, and footrest of the sofa in each zone. The atmosphere lamps in each independent atmosphere block can be set to an activated or closed state, and the position and size of each sub-block can be adjusted by keys, touch, or the like. The dominant-hue analysis module receives the decoded video pictures played on the television and parses the width and height of the image, denoted W_p and H_p. It obtains the atmosphere lamp configuration from the atmosphere lamp configuration module, reads the independent atmosphere units participating in rendering, and converts their coordinates, widths, and heights into those of the corresponding image regions: the abscissa is α_j·W_p, the ordinate β_j·H_p, the width γ_j·W_p, and the height δ_j·H_p. The image is then divided into sub-image units according to the converted regions, and dominant-hue analysis is performed on each sub-image unit to obtain its dominant-hue color value.
The dominant hue may be computed as the average of the RGB components of all pixels of the image unit, as the unit's most frequent color value, with a K-Means clustering algorithm, or with a color quantization algorithm such as median cut or an octree; different algorithms yield somewhat different dominant hues, so the computation method used by this analysis unit is a matter of choice.
It is to be understood that the exemplary embodiments described herein are illustrative and not restrictive. Although one or more embodiments of the present invention have been described with reference to the accompanying drawings, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.

Claims (14)

1. An immersive atmosphere lamp system based on video content analysis is characterized by comprising an atmosphere lamp configuration module, a video playing module, an image hue analysis module and an atmosphere lamp rendering module,
the atmosphere lamp configuration module is used for mapping the space of an atmosphere lamp to be set into a plane space, corresponding the mapped plane space to a played video picture plane, dividing the space of the atmosphere lamp to be set into a plurality of independent atmosphere blocks according to an area needing to independently control rendering effects, and forming atmosphere lamp configuration information by using all the independent atmosphere blocks including corresponding atmosphere lamp IDs and switch state data;
the video playing module is used for playing a specified video, decoding video data and outputting each decoded frame data to the image tone analysis module at the same time;
the image tone analysis module is used for receiving image frame data and atmosphere lamp configuration information and analyzing corresponding position and width and height information, atmosphere lamp states and atmosphere lamp main tones of the independent atmosphere blocks;
the atmosphere lamp rendering module is used for analyzing after receiving atmosphere lamp configuration information, lighting or closing the corresponding atmosphere lamp according to the state of the atmosphere lamp, and the lighted atmosphere lamp is set according to the dominant hue color value returned by the image hue analysis module.
2. The video content analysis based immersive atmosphere lamp system of claim 1, wherein, in the atmosphere lamp rendering module, when the atmosphere lamp state of an independent atmosphere block is off, a turn-off control signal is sent via the atmosphere lamp ID to the corresponding atmosphere lamp controller, which turns off the corresponding atmosphere lamp; and for a lit atmosphere lamp, the dominant-hue color value returned by the image hue analysis module is received and a color control signal is sent to the corresponding atmosphere lamp controller, which controls the color of the corresponding atmosphere lamp according to the dominant hue.
3. The video content analysis based immersive ambience lamp system of claim 1, wherein the image tone analysis module resolving the corresponding location and width and height information of the individual ambience blocks comprises: analyzing the width and height of an image from image frame data, analyzing information of an independent atmosphere block from atmosphere lamp configuration information, acquiring initial coordinate position and width and height information of the independent atmosphere block when the atmosphere lamp state of the independent atmosphere block is in a lighting state, and converting the initial coordinate position and the width and height information into the initial coordinate position and the width and height of a corresponding area of the image.
4. The video content analysis based immersive ambience lamp system of claim 1, wherein the independent ambience block is divided into a number of smaller independent ambience cells, the ambience lamp of each independent ambience cell being independently controlled.
5. An automobile, characterized in that the automobile is provided with an immersive ambience light system based on video content analysis as claimed in any one of claims 1 to 4.
6. The vehicle of claim 1, further comprising a riding scene mode selection module, wherein after a riding scene mode is selected, the video ambience light effect is rendered according to the range defined by the mode in combination with the individual ambience block configuration information of the ambience light configuration module.
7. The vehicle of claim 6, wherein the definition of the riding scene mode comprises a full vehicle atmosphere mode, and atmosphere lights of all independent atmosphere blocks in the vehicle participate in atmosphere rendering in the full vehicle mode.
8. The vehicle of claim 6, wherein the definition of the riding scene mode comprises a driver atmosphere mode in which atmosphere lights for respective individual atmosphere zones of the driver participate in the atmosphere rendering.
9. The vehicle of claim 6, wherein the definition of the riding scene mode comprises a front row atmosphere mode in which atmosphere lights of individual atmosphere tiles corresponding to front seats participate in atmosphere rendering.
10. The automobile of claim 6, wherein the definition of the riding scene mode comprises a rear-row ambience mode in which a rear-row seat corresponding independent ambience tile ambience light participates in the ambience rendering.
11. The automobile of claim 6, wherein the definition of the riding scene mode includes a whole-vehicle passenger atmosphere mode in which the atmosphere lamps of all independent atmosphere blocks inside the vehicle except the driver's participate in atmosphere rendering.
12. The automobile of claim 6, wherein the definition of the riding scene mode includes a custom atmosphere mode in which each independent atmosphere block in the vehicle can be separately configured with an activated or deactivated block atmosphere lamp state.
13. The automobile of claim 6, wherein the definition of the riding scene mode includes an intelligent atmosphere mode in which the seating condition of the occupants is automatically acquired and the activated block atmosphere lamp states are configured accordingly.
14. A KTV room having the immersive ambience lamp system based on video content analysis of any one of claims 1 to 4 disposed therein.
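The riding scene modes of claims 7 to 13 reduce to a mapping from the selected mode to the set of independent atmosphere zones whose lamps participate in rendering. A minimal sketch of that mapping, assuming hypothetical zone names, mode identifiers, and an `active_zones` helper (none of which appear in the patent itself):

```python
from enum import Enum, auto

class Zone(Enum):
    """Hypothetical independent atmosphere zones of a five-seat vehicle."""
    DRIVER = auto()
    FRONT_PASSENGER = auto()
    REAR_LEFT = auto()
    REAR_CENTER = auto()
    REAR_RIGHT = auto()

# Fixed mode -> zone sets, mirroring claims 7-11.
MODES = {
    "full_vehicle": set(Zone),                                        # claim 7
    "driver": {Zone.DRIVER},                                          # claim 8
    "front_row": {Zone.DRIVER, Zone.FRONT_PASSENGER},                 # claim 9
    "rear_row": {Zone.REAR_LEFT, Zone.REAR_CENTER, Zone.REAR_RIGHT},  # claim 10
    "passengers": set(Zone) - {Zone.DRIVER},                          # claim 11
}

def active_zones(mode, custom=None, occupied=None):
    """Return the zones whose atmosphere lamps participate in rendering."""
    if mode == "custom":        # claim 12: per-zone states chosen by the user
        return set(custom or ())
    if mode == "intelligent":   # claim 13: follow detected seat occupancy
        return set(occupied or ())
    return MODES[mode]
```

The video atmosphere effect would then be rendered only on the zones returned here, with claim 6's atmosphere lamp configuration module supplying the per-zone lamp layout.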
CN202111421096.5A 2021-11-26 2021-11-26 Immersive atmosphere lamp system based on video content analysis Active CN114158160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111421096.5A CN114158160B (en) 2021-11-26 2021-11-26 Immersive atmosphere lamp system based on video content analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111421096.5A CN114158160B (en) 2021-11-26 2021-11-26 Immersive atmosphere lamp system based on video content analysis

Publications (2)

Publication Number Publication Date
CN114158160A true CN114158160A (en) 2022-03-08
CN114158160B CN114158160B (en) 2024-03-29

Family

ID=80458220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111421096.5A Active CN114158160B (en) 2021-11-26 2021-11-26 Immersive atmosphere lamp system based on video content analysis

Country Status (1)

Country Link
CN (1) CN114158160B (en)


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101548551A (en) * 2006-12-08 2009-09-30 Koninklijke Philips Electronics N.V. Ambient lighting
KR20130100628A (en) * 2012-03-02 2013-09-11 Samsung Electronics Co., Ltd. The method of controlling light using image and the system using the same
CN105744177A (en) * 2016-02-17 2016-07-06 广州视睿电子科技有限公司 Video exhibition stand light control method and system
CN105976767A (en) * 2016-06-28 2016-09-28 凌云光技术集团有限责任公司 Area source brightness uniformity adjusting method, device and system
CN106406504A (en) * 2015-07-27 2017-02-15 常州市武进区半导体照明应用技术研究院 Atmosphere rendering system and method of man-machine interaction interface
US20180284953A1 (en) * 2017-03-28 2018-10-04 Osram Sylvania Inc. Image-Based Lighting Controller
US20190116647A1 (en) * 2016-04-06 2019-04-18 Philips Lighting Holding B.V. Controlling a lighting system
CN113132756A (en) * 2021-03-12 2021-07-16 杭州当虹科技股份有限公司 Video coding and transcoding method
CN113352986A (en) * 2021-05-20 2021-09-07 浙江吉利控股集团有限公司 Vehicle voice atmosphere lamp partition interaction control method and system
CN113613370A (en) * 2021-08-30 2021-11-05 江苏惠通集团有限责任公司 Atmosphere lamp control method and device, computer readable storage medium and terminal
CN113630932A (en) * 2020-12-11 2021-11-09 萤火虫(深圳)灯光科技有限公司 Light control method, controller, module and storage medium based on boundary identification


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
S. Han, X. Zhong, Y. Ding, W. Li, S. Liu and P. Liu: "Intelligent Dimming LED for Moonlight Simulation", 2015 2nd International Conference on Information Science and Control Engineering *
Wen Yuan, Wang Yukun, Zhang Baoping: "Design of an Ambient Light Adapting to the Screen Theme", Technology Innovation and Application *
Zhao Xin: "Design of an MCU-Controlled LED Linear-Array Display Device", Journal of Wuhan Technical College of Communications, vol. 22, no. 3 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114698207A (en) * 2022-03-30 2022-07-01 阿波罗智联(北京)科技有限公司 Light adjusting method and device, electronic equipment and storage medium
CN115334099A (en) * 2022-07-20 2022-11-11 榜威电子科技(上海)有限公司 Linkage system, method and storage medium of streaming media audio/video data and lamp
WO2024016603A1 (en) * 2022-07-20 2024-01-25 榜威电子科技(上海)有限公司 System and method for linking streaming media audio/video data with lamp, and storage medium
CN115334099B (en) * 2022-07-20 2024-02-27 榜威电子科技(上海)有限公司 Linkage system, method and storage medium for streaming media audio/video data and lamp
CN115810338A (en) * 2023-02-03 2023-03-17 深圳市蔚来芯科技有限公司 Display processing method and system based on image scene
CN117412451A (en) * 2023-12-13 2024-01-16 深圳市千岩科技有限公司 Atmosphere lamp equipment, mapping color matching method thereof, corresponding device and medium
CN117412451B (en) * 2023-12-13 2024-03-15 深圳市千岩科技有限公司 Atmosphere lamp equipment, mapping color matching method thereof, corresponding device and medium

Also Published As

Publication number Publication date
CN114158160B (en) 2024-03-29

Similar Documents

Publication Publication Date Title
CN114158160A (en) Immersive atmosphere lamp system based on video content analysis
CN101438579B (en) Adaptive rendering of video content based on additional frames of content
CN105431900B (en) For handling method and apparatus, medium and the equipment of audio data
CN112061049B (en) Scene triggering method, device, equipment and storage medium
CN103649904A (en) Adaptive presentation of content
EP2926626B1 (en) Method for creating ambience lighting effect based on data derived from stage performance
WO2010118296A2 (en) System and method for generating and rendering multimedia data including environmental metadata
CN1836202A (en) A visual content signal apparatus and a method of displaying a visual content signal thereof
KR102247264B1 (en) Performance directing system
CN114416000A (en) Multi-screen interaction method and multi-screen interaction system applied to intelligent automobile
CN111123851A (en) Method, device and system for controlling electric equipment according to user emotion
JP7372991B2 (en) Performance production system and its control method
CN107592588B (en) Sound field adjusting method and device, storage medium and electronic equipment
JP5166794B2 (en) Viewing environment control device and viewing environment control method
US11368725B1 (en) System and method for creating a virtual journey experience
CN115775569A (en) Vehicle cabin music playing method and device, vehicle and storage medium
CN115891823A (en) Vehicle cabin lighting control method, control device, vehicle and storage medium
CN115352376A (en) Interaction method and device for vehicle cabin, vehicle and storage medium
CN114760434A (en) Automobile intelligent cabin capable of realizing multi-person online video conference and method
CN104735597A (en) Immersion type holographic sound and 3D image fusion achieving system
CN115095843A (en) Car lamp structure capable of realizing sound and light integration and control method thereof
CN115103485A (en) Scene-based control method and system for rhythm of automobile atmosphere lamp along with sound source
CN103581692A (en) Airborne entertainment system based on combination of streaming media unicast and multicast and unicast control method thereof
Buqi et al. Cruise cabin as a home: Smart approaches to improve cabin comfort
Wilkinson Theatre in an expanded field? All That Fall and Embers reimagined by Pan Pan

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant