CN113474822A - System and method for providing weather effects in images - Google Patents


Info

Publication number
CN113474822A
CN113474822A (application CN202080015016.3A)
Authority
CN
China
Prior art keywords
image
weather
texture
server
segments
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080015016.3A
Other languages
Chinese (zh)
Inventor
金启显
崔闰熙
权槿周
金范锡
李相沅
李有真
张台永
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority claimed from PCT/KR2020/001627 (published as WO2020171425A1)
Publication of CN113474822A
Legal status: Pending (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/60: Editing figures and text; Combining figures or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/24: Classification techniques
    • G06F 18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2413: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00: Animation
    • G06T 13/20: 3D [Three Dimensional] animation
    • G06T 13/60: 3D [Three Dimensional] animation of natural phenomena, e.g. rain, snow, water or plants
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/50: Depth or shape recovery
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/26: Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82: Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20081: Training; Learning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20084: Artificial neural networks [ANN]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30181: Earth observation
    • G06T 2207/30192: Weather; Meteorology

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A system and method for providing a weather effect in an image. The method includes receiving object identification information regarding the shape of an object in the image and spatial information regarding the depth of the object in the image, selecting at least one weather texture image indicating weather, and providing the weather effect in the image by overlaying the selected weather texture image on the image.

Description

System and method for providing weather effects in images
Technical Field
The present invention relates to a system and method for providing weather effects in images.
Background
Currently, televisions and mobile devices display weather information by using text or pre-designed media. However, text and icons may not convey weather information to the user effectively, and providing weather information in the form of video data requires time and resources to generate the video data.
Disclosure of Invention
Aspects of the present invention provide a system that allows a device to simulate weather effects in an image by using few computing resources.
Aspects of the present invention also provide a system capable of providing a 3D weather effect by predicting three-dimensional (3D) space in a two-dimensional (2D) image using an artificial intelligence model.
Aspects of the present invention also provide a system capable of effectively using a storage space by simulating a weather effect in an image in real time using a weather texture image.
Additional aspects will be set forth in the description which follows and will be apparent from the description, or may be learned by practice of the presented embodiments of the disclosure.
Drawings
The above and other aspects, features and advantages of certain embodiments of the present invention will become more apparent from the following description when taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic diagram illustrating an example of a system for providing weather effects in a particular image in accordance with an embodiment of the present invention;
FIG. 2 is a diagram illustrating an example of reflecting a weather effect in an image according to an embodiment of the present invention;
FIG. 3 is a flow diagram of a method performed by a system for providing weather effects in an image, according to an embodiment of the invention;
FIG. 4 is a flow diagram of a method performed by a device of selecting a weather texture image corresponding to current weather, in accordance with an embodiment of the present invention;
FIG. 5 is a diagram illustrating an example of a plurality of weather texture images corresponding to weather according to an embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating an example of object identification information indicating an object identified in an image according to an embodiment of the present invention;
FIG. 7 is a schematic diagram illustrating an example of spatial information indicating a space analyzed in an image according to an embodiment of the present disclosure;
FIG. 8 is a diagram illustrating an example of weather information according to an embodiment of the present invention;
FIG. 9 is a flow diagram of a method performed by a system of providing a weather effect in an image based on criteria information for simulating the weather effect, in accordance with an embodiment of the present disclosure;
FIG. 10 is a flow diagram of a method performed by a server of the system of determining criteria for simulating a plurality of weather texture images in accordance with an embodiment of the present invention;
FIG. 11 is a flow diagram of a method performed by a server to determine criteria for simulating a weather texture image for each depth range in accordance with an embodiment of the present disclosure;
FIG. 12 is a schematic diagram illustrating an example of locations and spacings between image segments to be cropped from a weather texture image according to an embodiment of the present invention;
FIG. 13 is a schematic diagram illustrating an example of locations and spacings between image segments to be cropped from a weather texture image in accordance with an embodiment of the invention;
FIG. 14 is a diagram illustrating an example of cropping image segments from different locations of a weather texture image based on weather, according to an embodiment of the invention;
FIG. 15 is a schematic diagram illustrating an example of sequentially simulating image segments at a particular period according to an embodiment of the invention;
FIG. 16 is a diagram illustrating an example of simulating image segments in different cycles based on depth range according to an embodiment of the invention;
FIG. 17 is a diagram illustrating an example of cropping image segments in different shapes based on depth range, according to an embodiment of the invention;
FIG. 18 is a schematic diagram illustrating an example of masking or adjusting the transparency of portions of an image segment based on a depth range according to an embodiment of the disclosure;
FIG. 19 is a table illustrating an example of information provided to a device simulating a weather effect according to an embodiment of the disclosure;
FIG. 20 is an image of a Graphical User Interface (GUI) for setting criteria for simulating a weather effect according to an embodiment of the present disclosure;
FIG. 21 is a flow diagram of a method performed by a device to simulate a weather effect in an image using a weather texture image in accordance with an embodiment of the present disclosure;
FIG. 22 is a flow diagram of a method performed by a device of the system to simulate a weather effect in an image by using a weather texture image received from a server in accordance with an embodiment of the present disclosure;
FIG. 23 is a flow diagram of a method performed by the system to simulate a weather effect in an image from which a weather object has been deleted in accordance with an embodiment of the present disclosure;
FIG. 24 is a schematic diagram showing an example of an image from which a weather object is deleted according to an embodiment of the present invention;
FIG. 25 is a flow diagram of a method performed by the system to simulate a weather effect in a reference image corresponding to a current time in accordance with an embodiment of the present disclosure;
FIG. 26 is a diagram illustrating an example of a reference image corresponding to a preset time period according to an embodiment of the present invention;
FIG. 27 is a flow diagram of a method performed by a system of changing colors of a reference image based on a color change path and simulating a weather effect in the color changed reference image in accordance with an embodiment of the present disclosure;
FIG. 28 is a schematic diagram showing an example of a color change path between reference images according to an embodiment of the present invention;
FIG. 29 is a schematic diagram showing an example of changing a color pattern of an image according to an embodiment of the present invention;
FIG. 30 is a schematic diagram illustrating an example of an image reflecting a rain effect according to an embodiment of the present disclosure;
FIG. 31 is a block diagram of a server according to an embodiment of the present invention;
FIG. 32 is a block diagram of an apparatus according to an embodiment of the invention;
FIG. 33 is a diagram illustrating an example in which an external device sets criteria for simulating a weather effect and the device receives information on the set criteria through an external Database (DB) and provides the weather effect in an image, according to an embodiment of the present disclosure;
FIG. 34 is a schematic diagram showing an example in which an external device sets criteria for simulating a weather effect and the device receives information about the set criteria through a server and provides the weather effect in an image, according to an embodiment of the present disclosure; and
FIG. 35 is a schematic diagram illustrating an example in which an external device sets a criterion for simulating a weather effect through a server and the device receives information on the set criterion through an external DB and provides the weather effect in an image, according to an embodiment of the present disclosure.
Detailed Description
According to an embodiment of the present invention, there is provided a method, performed by a device, of providing a weather effect in an image, the method including obtaining an image to which the weather effect is to be applied; obtaining at least one weather texture image showing weather; and providing the weather effect in the image, based on the weather texture image, by sequentially overlapping a plurality of image segments obtained from the weather texture image on the image.
According to another embodiment of the present invention, an apparatus for providing weather effects in an image includes a display; a memory storing one or more instructions; and a processor configured to execute the one or more instructions to obtain an image to which a weather effect is to be applied, select at least one weather texture image showing the weather effect, and provide the weather effect in the image on the display based on the weather texture image by sequentially overlapping a plurality of image segments obtained from the weather texture image on the image.
According to another embodiment of the present invention, a computer-readable recording medium has recorded thereon a computer program for executing the above-described method.
[Modes for the Invention]
Hereinafter, the present disclosure will be described in detail by explaining embodiments with reference to the accompanying drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments of the invention set forth herein. In the drawings, portions irrelevant to the present invention are not shown for clarity of explanation, and like reference numerals denote like elements.
It will be understood that when an element is referred to as being "connected to" another element, it can be "directly connected to" the other element or "electrically connected to" the other element through intervening elements. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated elements, but do not preclude the presence or addition of one or more other elements, unless the context clearly dictates otherwise.
Throughout the disclosure, the expression "at least one of a, b, or c" indicates only a, only b, only c, both a and b, both a and c, both b and c, all of a, b, and c, or variations thereof.
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings.
Fig. 1 is a schematic diagram illustrating an example of a system for providing a weather effect in a specific image according to an embodiment of the present invention.
Referring to fig. 1, a system for providing weather effects may include a device 1000 and a server 2000.
Device 1000 can select a particular image and display the image so as to include a three-dimensional (3D) weather effect. The device 1000 may select an image and receive information for simulating a weather effect in the selected image from the server 2000. Device 1000 can provide a weather effect in the selected image by simulating a weather texture image overlaid on, or otherwise included in, the selected image. The weather texture image may be an image of an object indicating a specific weather condition, and may include, for example, a raindrop image, a snowflake image, or a fog image, but the type of weather texture image is not limited thereto.
The device 1000 may provide a weather effect in an image by cropping a plurality of image segments from a weather texture image and superimposing the image segments on the image in sequence. The device 1000 may analyze the depth of an object in the image and superimpose an image segment reflecting the current weather on the image based on the analyzed depth. To provide a weather effect in an original picture, or in a picture converted from the original picture, the device 1000 may adjust the transparency of an image segment based on depth and composite the transparency-adjusted image segment with the image.
The device 1000 may reflect weather effects in a particular image in the ambient mode. The ambient mode may be an operational mode for providing only some functions of the device 1000 at low power. For example, in the ambient mode, most functions of the device 1000 may not be activated, and only input/output functions and some preset functions of the device 1000 may be activated on the display. Alternatively, the apparatus 1000 may display an image in which a weather effect is reflected as a background image of the apparatus 1000.
Device 1000 may be, for example, a smartphone, a tablet PC, a smart television, a mobile phone, a Personal Digital Assistant (PDA), a laptop, a media player, a microserver, a Global Positioning System (GPS) device, an e-book reader, a digital broadcast receiver, a navigation system, a kiosk, an MP3 player, a digital camera, a household appliance, or another mobile or non-mobile computing device, but is not limited to such.
Fig. 2 is a schematic diagram illustrating an example of reflecting a weather effect in an image according to an embodiment of the present invention.
Referring to FIG. 2, the image may be analyzed using one or more artificial intelligence models. For example, the image may be a two-dimensional (2D) or three-dimensional (3D) image. The image may be input to a first artificial intelligence model to detect and recognize an object in the image, and thus object recognition information indicating the recognized object in the image may be output from the first artificial intelligence model. The object identification information may include information on the position and shape of the object in the image. The image may be input to a second artificial intelligence model to analyze a space in the image, and thus spatial information regarding a depth of the space in the image may be output from the second artificial intelligence model. The second artificial intelligence model can be used to estimate 3D space in the 2D image by analyzing spatial depth in the 2D image. The image may be input to a third artificial intelligence model for deleting the weather object in the image, and thus the image from which the weather object is deleted may be output from the third artificial intelligence model.
The first to third artificial intelligence models may be constructed in consideration of an application field of the recognition model, a purpose of training, or a computational performance of the apparatus. The first to third artificial intelligence models may be, for example, artificial neural network-based models. The artificial neural network may include, for example, a Convolutional Neural Network (CNN), a Deep Neural Network (DNN), a Recurrent Neural Network (RNN), a Restricted Boltzmann Machine (RBM), a Deep Belief Network (DBN), a Bidirectional Recurrent Deep Neural Network (BRDNN), or a deep Q network, but the type of the artificial neural network is not limited thereto.
A single artificial intelligence model may be used that provides the functionality of two or more of the first through third artificial intelligence models described above.
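For illustration only, the analysis pipeline described above can be sketched in Python as follows; the model handles are hypothetical placeholders, since the disclosure does not fix concrete networks:

    import numpy as np

    def analyze_image(image: np.ndarray,
                      segmentation_model,  # first model: object recognition
                      depth_model,         # second model: spatial (depth) analysis
                      inpainting_model):   # third model: weather-object removal
        """Run the three analysis steps described above on one RGB image.
        Each model handle is assumed to be a pre-trained callable."""
        object_info = segmentation_model(image)  # e.g., HxW per-pixel class map
        spatial_info = depth_model(image)        # e.g., HxW per-pixel depth map
        clean_image = inpainting_model(image)    # RGB image, weather objects removed
        return object_info, spatial_info, clean_image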
The apparatus 1000 may obtain the object identification information, the spatial information, and the image from which a weather object is deleted, and obtain a weather texture image corresponding to the current weather. The device 1000 may crop a plurality of image segments from the weather texture image based on preset criteria. The apparatus 1000 may provide an image reflecting a weather effect related to the current weather by sequentially overlapping the plurality of image segments on the image from which the weather object is deleted.
FIG. 3 is a flow diagram of a method performed by a system to provide a weather effect in an image according to an embodiment of the disclosure.
In operation S300, the apparatus 1000 may select an image. The device 1000 may display a Graphical User Interface (GUI) for selecting an image to be displayed on a screen in the environment mode, and select the image based on a user input received through the displayed GUI. For example, device 1000 can select pictures stored in device 1000 or pictures taken in real-time.
In operation S305, the device 1000 may transmit an image to the server 2000. The apparatus 1000 may provide the image to the server 2000 and request the server 2000 to provide data required to simulate a weather effect in the image.
Although the device 1000 selects an image and transmits it to the server 2000 in operations S300 and S305, the present disclosure is not limited thereto. Server 2000 may provide device 1000 with a list of images stored in server 2000, and device 1000 may select a particular image in the list of images.
In operation S310, the server 2000 may detect and recognize an object in the image by using the first artificial intelligence model. The server 2000 may obtain object recognition information indicating the object recognized in the image by inputting the image into the first artificial intelligence model for recognizing objects in images. For example, objects may include people, sky, buildings, and trees. The object identification information is information indicating an object identified in the image, and may include, for example, information on the shape of the object included in the image, information on the position of the object, and identification information of the object, but the object identification information is not limited thereto. The first artificial intelligence model may be a model pre-trained to recognize objects in images, and may be, for example, an artificial neural network-based model.
In operation S315, the server 2000 may obtain spatial information regarding the depth of the object in the image by using the second artificial intelligence model. The server 2000 may obtain spatial information on spatial depth in the image by inputting the image into a second artificial intelligence model for analyzing a space in the image. The spatial information is information indicating a depth of a space in the image, and may include, for example, information on a depth of an object included in the image, information on a depth of a background in the image, and information on a depth relationship between objects in the image, but the spatial information is not limited thereto. For example, when an object in an image is a tree and a background of the image is a sky, the spatial information may include information indicating a depth of the tree, information indicating a depth of the sky, and information indicating that the tree is placed at a reference position closer than the sky. The depth of objects in the image may indicate whether each object is placed at a near or far location in the image. For example, when the depth range of the object in the image is 0 to 100, the values of 0 to 40 may be set to a short distance, the values of 40 to 70 may be set to a medium distance, and the values of 70 to 100 may be set to a long distance. The second artificial intelligence model may be a model pre-trained to analyze space in the image, and may be, for example, an artificial neural network-based model.
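As a concrete illustration of the depth ranges above, a depth value could be bucketed as follows (a minimal sketch; the thresholds 40 and 70 are the example values from the text):

    def depth_to_range(depth: float) -> str:
        """Bucket a depth value in 0..100 into the distance classes of the example."""
        if depth < 40:
            return "short"   # near objects
        if depth < 70:
            return "medium"
        return "long"        # far objects / background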
The first and second artificial intelligence models can be implemented by a single artificial intelligence model. In this case, the image may be input into a single artificial intelligence model to provide functions of the first artificial intelligence model and the second artificial intelligence model, so that spatial information about a space in the image and object recognition information about an object in the image may be output.
The server 2000 may transmit the object identification information and the spatial information to the device 1000 in operation S320. The server 2000 may transmit object recognition information and spatial information obtained by analyzing an image selected by the apparatus 1000 using at least one artificial intelligence model to the apparatus 1000.
In operation S330, the device 1000 may identify a current weather condition corresponding to a current location of the device 1000. Device 1000 may identify the current weather in the area where device 1000 is located, or the current weather in an area selected by the user that is different from the area where device 1000 is located. The device 1000 may receive weather information indicating the current weather from the server 2000 in real time, at a preset period, or according to a request of the device 1000. The weather information indicating the current weather may include, for example, information indicating cloud, snow, rain, fog, lightning, wind, precipitation, rainfall, fog density, cloud cover, wind direction, gust, wind speed, relative humidity, temperature, sensible (feels-like) temperature, air pressure, solar radiation, visibility, Ultraviolet (UV) index, and dew point, but the weather information is not limited thereto. The weather information may include, for example, information about weather forecasts, hourly weather forecasts, and weekly weather forecasts.
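One possible shape for such a weather information record is sketched below; the field names and units are illustrative assumptions, not mandated by the disclosure:

    from dataclasses import dataclass

    @dataclass
    class WeatherInfo:
        condition: str            # e.g., "rain", "snow", "fog"
        precipitation: float      # mm/h
        wind_speed: float         # km/h
        wind_direction: float     # degrees
        fog_density: float
        cloud_cover: float
        temperature: float        # degrees Celsius
        sensible_temperature: float
        relative_humidity: float  # percent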
In operation S335, the apparatus 1000 may obtain a weather texture image corresponding to current weather. The apparatus 1000 may pre-store a plurality of weather texture images related to various weather types and select at least one weather texture image suitable for current weather from the plurality of pre-stored weather texture images. For example, device 1000 may store in memory a plurality of weather texture images indicating rainy weather, a plurality of weather texture images indicating snowy weather, and a plurality of weather texture images indicating foggy weather.
The device 1000 may pre-store weather texture images related to predicted weather, taking into account the weather forecast, the hourly weather forecast, and the weekly weather forecast. For example, the device 1000 may request the server 2000 to provide weather texture images related to the weather predicted in the weekly weather forecast, and store the weather texture images received from the server 2000 in the memory. In this case, since the device 1000 stores in the memory only the weather texture images related to the weather predicted for a predetermined time, the memory of the device 1000 can be managed efficiently.
The plurality of weather texture images registered in advance for each weather type may respectively correspond to depth ranges. The plurality of weather texture images previously registered for a given weather type may differ from each other based on the depth range. A depth range may be a range of depth values used to partition the space in the image. For example, when the depth range of the space in the image is 0 to 100, values of 0 to 40 may be set as the first depth range, values of 40 to 70 may be set as the second depth range, and values of 70 to 100 may be set as the third depth range. The weather may be expressed in greater detail as the number of depth ranges increases, and the resources used to provide the weather effect may be reduced by decreasing the number of depth ranges. The number of depth ranges may be, for example, 2 to 10.
The apparatus 1000 may select a weather texture image corresponding to the characteristics of the current weather from the plurality of weather texture images. Device 1000 can select a weather texture image based on the characteristics of the current weather and the depth of the space in the image. For example, the apparatus 1000 may select weather texture images corresponding to a rainfall of 10 mm/h and a wind speed of 7 km/h from among the plurality of weather texture images, and select, from among the selected weather texture images, the weather texture images corresponding to the first and second depth ranges of the space in the image.
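A minimal sketch of this selection step, assuming an in-memory store of texture records with condition, depth_range, precipitation, and wind_speed attributes (hypothetical names):

    def select_textures(store, condition, precipitation, wind_speed, depth_ranges):
        """For each requested depth range, pick the stored texture of the given
        weather condition whose characteristics best match the current weather."""
        chosen = {}
        for depth_range in depth_ranges:
            candidates = [t for t in store
                          if t.condition == condition and t.depth_range == depth_range]
            # Nearest match on precipitation and wind speed (simple L1 distance).
            chosen[depth_range] = min(
                candidates,
                key=lambda t: abs(t.precipitation - precipitation)
                            + abs(t.wind_speed - wind_speed))
        return chosen

If no stored texture matches, the device would fall back to requesting one from the server, as described next.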
When the weather texture image suitable for the current weather is not stored in the memory, the apparatus 1000 may request the server 2000 to provide the weather texture image suitable for the current weather and receive the requested weather texture image from the server 2000.
In operation S340, the apparatus 1000 may simulate a weather effect indicating the current weather in the image by using the obtained weather texture image. Device 1000 can select an image segment to be cropped from the selected weather texture image. The size of the weather texture image may be larger than the size of the image selected by the device 1000, and the device 1000 may select an image segment from the weather texture image to fit the size of the selected image. Device 1000 can crop successive image segments from the weather texture image by moving the cropping position. The cropping position may be determined variously according to weather characteristics. For example, the device 1000 may set the cropping position in an area of the rain texture image having dense raindrops when the rainfall is high, and in an area of the rain texture image having sparse raindrops when the rainfall is low. The degree of movement may be determined differently based on the depth to which the weather texture image is to be applied. For example, since raindrops in near space move fast, the device 1000 may crop image segments from a rain texture image by moving the cropping position at large intervals to display raindrops in near space in the image. Since raindrops in far space move slowly, the device 1000 may crop image segments from the rain texture image by moving the cropping position at small intervals to display raindrops in far space in the image.
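The cropping behavior described here can be sketched as follows, under the assumptions that the texture is a NumPy array larger than the segment size and that the crop window wraps around:

    import numpy as np

    def crop_segments(texture: np.ndarray, seg_h: int, seg_w: int,
                      start_xy, step_xy, count: int):
        """Crop `count` segments from a weather texture image, moving the crop
        window by `step_xy` between segments. A large step reads as fast-moving
        near raindrops; a small step as slow-moving distant ones. `start_xy`
        selects a dense or sparse part of the texture depending on rainfall."""
        x, y = start_xy
        dx, dy = step_xy
        segments = []
        for _ in range(count):
            segments.append(texture[y:y + seg_h, x:x + seg_w].copy())
            x = (x + dx) % (texture.shape[1] - seg_w)  # wrap horizontally
            y = (y + dy) % (texture.shape[0] - seg_h)  # wrap vertically
        return segments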
The device 1000 may provide a weather effect in an image by sequentially overlapping, interleaving, or otherwise merging image segments over the image at a preset period. For example, the device 1000 may superimpose an image segment obtained from a weather texture image corresponding to a first depth range on an image together with an image segment obtained from a weather texture image corresponding to a second depth range. A period of sequentially reproducing the image segments obtained from the weather texture image corresponding to the first depth range may be different from a period of sequentially reproducing the image segments obtained from the weather texture image corresponding to the second depth range.
The device 1000 may simulate weather effects in an image, such as snow, rain, or fog effects, by merging image segments cropped from a weather texture image onto the image as successive frames.
The apparatus 1000 may adjust transparency of the image segment differently based on the depth, and simulate a weather effect by using the transparency-adjusted image segment. For example, when the spatial depth in the image is between 0 and 100, the apparatus 1000 may adjust the transparency of the image segment corresponding to the depth between 0 and 40 to 30%, the transparency of the image segment corresponding to the depth between 40 and 70 to 50%, and the transparency of the image segment corresponding to the depth from 70 to 100 to 70%.
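The transparency adjustment in this example can be sketched as follows; note that a transparency of 30% is kept as an alpha factor of 0.7, and the values are the illustration above, not fixed constants:

    import numpy as np

    TRANSPARENCY = {"short": 0.30, "medium": 0.50, "long": 0.70}  # example values

    def apply_transparency(segment_rgba: np.ndarray, depth_range: str) -> np.ndarray:
        """Scale a segment's alpha channel by the transparency chosen for its
        depth range before it is composited onto the image."""
        out = segment_rgba.copy()
        keep = 1.0 - TRANSPARENCY[depth_range]
        out[..., 3] = (out[..., 3] * keep).astype(out.dtype)
        return out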
Fig. 4 is a flowchart of a method performed by the apparatus 1000 of selecting a weather texture image corresponding to current weather according to an embodiment of the present disclosure.
In operation S400, the apparatus 1000 may obtain a plurality of weather texture images indicating a first weather condition and corresponding to a plurality of depth ranges. The plurality of weather texture images indicating the first weather condition may respectively correspond to the plurality of depth ranges. For example, device 1000 can obtain, from memory, a weather texture image corresponding to a first weather condition and a first depth range, a weather texture image corresponding to the first weather condition and a second depth range, and a weather texture image corresponding to the first weather condition and a third depth range. The first weather condition may correspond to at least one weather characteristic and an intensity or value thereof. The weather characteristics may include, for example, information indicating cloud, snow, rain, fog, lightning, wind, precipitation, fog density, cloud cover, wind direction, wind gust, wind speed, relative humidity, temperature, sensible temperature, atmospheric pressure, solar radiation, visibility, UV index, and dew point, but the weather characteristics are not limited thereto.
In operation S410, the apparatus 1000 may obtain a plurality of weather texture images indicating a second weather condition and corresponding to a plurality of depth ranges. The plurality of weather texture images indicating the second weather condition may respectively correspond to the plurality of depth ranges. For example, device 1000 can obtain, from memory, a weather texture image corresponding to a second weather condition and a first depth range, a weather texture image corresponding to the second weather condition and a second depth range, and a weather texture image corresponding to the second weather condition and a third depth range. The second weather condition may correspond to at least one weather characteristic and an intensity or value thereof. The weather characteristics may include, for example, information indicating cloud, snow, rain, fog, lightning, wind, precipitation, fog density, cloud cover, wind direction, wind gust, wind speed, relative humidity, temperature, sensible temperature, atmospheric pressure, solar radiation, visibility, UV index, and dew point, but the weather characteristics are not limited thereto.
In operation S420, the apparatus 1000 may select a weather texture image corresponding to the current weather. The current weather may be the weather at a location corresponding to the location of device 1000, or the weather at a location selected by a user of device 1000 that is different from the current location of device 1000. The apparatus 1000 may select a weather texture image corresponding to the characteristics of the current weather from the plurality of weather texture images. Device 1000 can select a weather texture image based on the characteristics of the current weather and the depth of the space in the image. For example, the apparatus 1000 may select weather texture images corresponding to a rainfall of 10 mm/h and a wind speed of 7 km/h from among the plurality of weather texture images, and select, from among the selected weather texture images, the weather texture images corresponding to the first and second depth ranges of the space in the image.
When the weather texture image suitable for the current weather is not stored in the memory, the apparatus 1000 may request the server 2000 to provide the weather texture image suitable for the current weather and receive the requested weather texture image from the server 2000.
Although the apparatus 1000 obtains a plurality of weather texture images indicating a first weather condition and a plurality of weather texture images indicating a second weather condition in FIG. 4, obtaining the weather texture images is not limited thereto. The device 1000 may obtain weather texture images indicating various preset weather types. The device 1000 may receive weather texture images indicating various preset weather types from the server 2000 in advance, and thus can promptly use a weather texture image suitable for the current weather even when the current weather changes.
Fig. 5 is a schematic diagram illustrating an example of a plurality of weather texture images corresponding to weather according to an embodiment of the present invention.
Referring to fig. 5, the plurality of weather texture images indicating snowing weather may include a first snow texture image 50, a second snow texture image 51, and a third snow texture image 52. The first snow texture image 50 may correspond to "snow" and a first depth range, the second snow texture image 51 may correspond to "snow" and a second depth range, and the third snow texture image 52 may correspond to "snow" and a third depth range.
The first depth range may be shallower than the second depth range, and the second depth range may be shallower than the third depth range. For example, when the depth range of a space or an object in an image is 0 to 100, values of 0 to 40 may be set as the first depth range, values of 40 to 70 may be set as the second depth range, and values of 70 to 100 may be set as the third depth range. The first snow texture image 50 corresponds to a depth shallower than the depths of the second and third snow texture images 51 and 52, and thus the snowflakes included in the first snow texture image 50 may be larger than the snowflakes included in the second and third snow texture images 51 and 52.
The second snow texture image 51 corresponds to a depth deeper than the depth of the first snow texture image 50 and shallower than the depth of the third snow texture image 52, and thus, the snowflakes included in the second snow texture image 51 may appear smaller in size than the snowflakes included in the first snow texture image 50 and may be larger in size than the snowflakes included in the third snow texture image 52.
Alternatively, for example, the plurality of weather texture images indicating rainy weather may include the first rain texture image 55, the second rain texture image 56, and the third rain texture image 57. The first rain texture image 55 may correspond to "rain" and a first depth range, the second rain texture image 56 may correspond to "rain" and a second depth range, and the third rain texture image 57 may correspond to "rain" and a third depth range.
The device 1000 may store the first to third snow texture images 50 to 52, the first to third rain texture images 55 to 57, and the like, in the memory in association with various weather characteristics and various depth ranges.
Although the weather texture image corresponds to one weather characteristic in fig. 5, the weather texture image is not limited thereto. The weather texture image may correspond to a plurality of weather characteristics. For example, a weather texture image may correspond to rain, precipitation, wind speed, and wind direction. In this case, the size, density, direction, and the like of an object (e.g., raindrops) included in the weather texture image may be different based on the weather characteristics corresponding to the weather texture image.
The weather texture image may be an image of a weather object displayed on a transparent layer. In this way, only weather objects may be displayed on the image when the weather texture image is overlaid on the image.
Fig. 6 is a schematic diagram illustrating an example of object identification information 62 indicating an object identified in an image 60 according to an embodiment of the present disclosure.
Referring to fig. 6, the server 2000 may recognize an object in the image 60 by inputting the image 60 to the first artificial intelligence model and obtain object recognition information 62 regarding the shape and position of the object. Although the object identification information 62 has the form of an image in fig. 6, the object identification information 62 is not limited thereto, and may include various format data capable of identifying the position, shape, and the like of an object.
The object identification information 62 is information indicating an object identified in the image 60, and may include, for example, information on the shape of the object included in the image 60, information on the position of the object, and identification information of the object, but the object identification information is not limited thereto. The first artificial intelligence model may be a model pre-trained to detect and recognize objects in the image 60, and may be, for example, an artificial neural network-based model. The first artificial intelligence model may be, for example, an artificial intelligence model for semantic image segmentation. The first artificial intelligence model may detect and identify objects and the locations of objects in the image 60 by estimating the classes of pixels in the image 60.
Fig. 7 is a schematic diagram illustrating an example of spatial information 72 indicating a space analyzed in an image 70 according to an embodiment of the present disclosure.
Referring to FIG. 7, the server 2000 may analyze a space or region in the image 70 by inputting the image 70 into a second artificial intelligence model and obtain spatial information 72 indicative of a depth of the space in the image 70. Although the spatial information 72 has the form of an image in fig. 7, the spatial information 72 is not limited thereto, and may include various format data capable of identifying the depth of a space in the image 70. The spatial information 72 is information indicating a depth of a space in the image 70, and may include, for example, information on a depth of an object included in the image 70, information on a depth of a background in the image 70, and information on a depth relationship between objects in the image 70, but is not limited thereto. The second artificial intelligence model may be a model pre-trained to analyze spatial depth in the image 70, and may be, for example, an artificial neural network-based model. The second artificial intelligence model may be, for example, an artificial intelligence model for depth prediction/estimation.
Fig. 8 is a schematic diagram illustrating an example of weather information according to an embodiment of the present invention.
Referring to fig. 8, the weather information may include, for example, information indicating a location, time, cloud, snow, rain, relative humidity, temperature, sensible temperature, weather, atmospheric pressure, solar radiation, visibility, wind direction, gust, wind speed, UV index, and dew point.
Device 1000 can simulate a 3D image effect in an image by using a weather texture image indicating weather (e.g., snow, rain, sunlight, clouds, fog, lightning, or wind). The apparatus 1000 may reflect 3D image effects related to one or more of visibility, wind speed, and temperature in the original image. The device 1000 may reflect 3D image effects in the image taking into account, for example, weather parameters such as wind intensity, rainfall, snowfall, fog density, air resistance, distance, and direction. Accordingly, the intensity of the weather effect at the location of the device 1000, or at a specific location selected by the user, may be reflected in the image.
Fig. 9 is a flowchart of a method performed by a system of providing a weather effect in an image based on criteria information for simulating the weather effect, according to an embodiment of the present disclosure.
Operations S900 to S915 correspond to operations S300 to S315 of fig. 3, and thus redundant description thereof is omitted.
In operation S920, the server 2000 may obtain criterion information for simulating a weather effect. The server 2000 may determine criteria for simulating a weather texture image. For example, the server 2000 may determine a criterion for which weather texture image to use based on the weather, a criterion for the portion of the weather texture image from which image segments are to be cropped, a criterion for the interval between image segments, and a criterion for the time at which image segments are displayed based on the depth range. The server 2000 may obtain criterion information regarding the determined criteria. The criterion information for simulating the weather texture image may include, for example, information on the identifier of the weather texture image to be used based on the weather, the portion of the weather texture image from which image segments are to be cropped, the interval between image segments, and the time at which image segments are displayed. The criterion information may be parameter-type data used to download data regarding specific criteria for simulating a weather texture image. The criteria and the criterion information for simulating the weather texture image will be described in detail below.
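As an illustration only, such parameter-type criterion information might look like the following; all keys and values are assumptions about what the payload could contain:

    criterion_info = {
        "texture_id": "rain_10mmh_7kmh",   # which weather texture image to use
        "crop_origin": [120, 300],         # where in the texture to start cropping
        "crop_step": [18, 6],              # interval between successive segments
        "display_period_s": {              # simulation period per depth range
            "first": 0.1,
            "second": 0.2,
            "third": 0.3,
        },
    }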
In operation S925, the server 2000 may provide the criterion information about the determined criterion, the object identification information, and the spatial information to the device 1000. The criterion information, the object identification information, and the spatial information may be provided to the apparatus 1000 in the form of parameter values.
The device 1000 may identify current weather based on the location of the device 1000 or a user-selected location in operation S930, and obtain a weather texture image corresponding to the current weather in operation S935. The device 1000 may obtain a weather texture image corresponding to current weather based on the criterion information received from the server 2000. For example, the device 1000 may check an identifier of a weather texture image corresponding to current weather according to the criterion information, and receive the weather texture image from the server 2000 based on the identifier of the weather texture image. Alternatively, the device 1000 may extract the weather texture image from the memory based on the identifier of the weather texture image. Device 1000 can select a weather texture image based on the current weather and the depth of objects and spaces in the image.
In operation S940, the apparatus 1000 may simulate a weather effect in the image based on the criterion information. The apparatus 1000 may select a plurality of image segments from the weather texture image according to the current weather and the depth of objects and space in the image based on the criterion information, and sequentially overlap the plurality of image segments on the image at a specific period. For example, the device 1000 may overlap the plurality of image segments on the image by using alpha blending. In this case, the apparatus 1000 may apply a specific transparency to an image segment based on the depth range corresponding to the image segment and overlap the transparency-applied image segment on the image. The device 1000 may mask a portion of an image segment corresponding to a long-range region based on certain criteria.
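Alpha blending here follows the standard compositing formula out = a * seg + (1 - a) * base; a minimal NumPy sketch:

    import numpy as np

    def alpha_blend(base_rgb: np.ndarray, seg_rgba: np.ndarray) -> np.ndarray:
        """Composite one RGBA image segment over an RGB base image."""
        alpha = seg_rgba[..., 3:4].astype(np.float32) / 255.0
        out = (alpha * seg_rgba[..., :3].astype(np.float32)
               + (1.0 - alpha) * base_rgb.astype(np.float32))
        return out.astype(np.uint8)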
Fig. 10 is a flow chart of a method performed by the server 2000 of the system to determine criteria for simulating a plurality of weather texture images according to an embodiment of the present disclosure.
In operation S1000, the server 2000 may select a weather texture image corresponding to a specific depth range. The server 2000 may select a weather texture image corresponding to a specific weather and a specific depth range from weather texture images stored in the Database (DB).
In operation S1010, the server 2000 may determine the locations of image segments to be used for simulating a weather effect in the weather texture image. The server 2000 may determine criteria for the portions of the weather texture image from which the image segments are obtained, and for the spacing between the image segments, based on weather characteristics. For example, when the amount of rainfall is large, the server 2000 may set the positions of the image segments in such a manner that the image segments are cropped at large intervals from a portion of the rain texture image having dense raindrops.
In operation S1020, the server 2000 may determine a period for simulating the image segment. Server 2000 may determine a period for simulating an image segment based on the depth range. For example, the server 2000 may set a short simulation period for an image segment cut out from a weather texture image corresponding to a shallow depth range, and set a long simulation period for an image segment cut out from a weather texture image corresponding to a deep depth range. Thus, weather effects to be displayed at different depths of the image can be independently reproduced.
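One simple way the server could derive such periods is sketched below; the base period and the linear scaling are assumed design choices, not requirements of the disclosure:

    BASE_PERIOD_S = 0.1  # assumed base simulation period

    def simulation_period(depth_range_index: int) -> float:
        """Index 0 = shallowest (nearest) range; the period grows with depth,
        so nearer weather objects are redrawn more often and appear faster."""
        return BASE_PERIOD_S * (depth_range_index + 1)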
Fig. 11 is a flow chart of a method performed by the server 2000 to determine criteria for simulating a weather texture image by depth range in accordance with an embodiment of the present disclosure.
In operation S1100, the server 2000 may select a weather texture image corresponding to a first depth range. The first depth range may be a range in which a depth is shallower than a depth of a specific object in the image, and the weather texture image corresponding to the first depth range may include a weather object having a size greater than a preset value. The weather object may be an object indicating a specific weather, such as a raindrop or a snowflake. For example, when the depth range of the space in the image is 0 to 100 and the depth of the nearest object in the image is 40, the server 2000 may determine the depth range of 0 to 40 as the first depth range.
In operation S1110, the server 2000 may determine a criterion for simulating a weather texture image corresponding to a first depth range. The server 2000 may determine the location of the image segment to be obtained from the weather texture image, the interval between the image segments, and the criterion of the simulation period of the image segment based on the characteristics of weather and the depth range of the weather texture image. For example, when the weather texture image corresponding to the first depth range is a rain texture image corresponding to rainy weather, the server 2000 may set a criterion for a portion of the rain texture image from which the image segment is to be obtained, based on at least one of the amount of precipitation, the wind speed, or the wind direction. The server 2000 may set the interval between image segments to be obtained based on at least one of precipitation, wind speed, or wind direction. The server 2000 may set a simulation period for sequentially simulating the image segments based on the depth range of the weather texture image. For example, the first depth range may be a shallower depth range than a second depth range described below, and image segments obtained from a weather texture image corresponding to the first depth range may be simulated in the image at a shorter period than image segments cut from the weather texture image corresponding to the second depth range described below.
In operation S1120, the server 2000 may select a weather texture image corresponding to the second depth range. The second depth range may be a range deeper than the depth of the specific object in the image, and the weather texture image corresponding to the second depth range may include a weather object having a size smaller than a preset value. For example, when the depth range of the space in the image is 0 to 100 and the depth of the nearest object in the image is 40, the server 2000 may determine the depth range of 40 to 100 as the second depth range.
In operation S1130, the server 2000 may determine a criterion for simulating a weather texture image corresponding to the second depth range. The second depth range may be a depth range deeper than a depth of the particular object in the image, and the image segment obtained from the weather texture image corresponding to the second depth range may be simulated as if displayed behind the particular object. In this way, the server 2000 may determine the shape of the image segment to be cropped from the weather texture image corresponding to the second depth range such that the image segment does not overlap with the particular object in the image.
Alternatively, the server 2000 may control the transparency of a cropped image segment, or mask a portion of it, in such a manner that the cropped image segment does not overlap a specific object in the image. In this case, the server 2000 may determine the region of the cropped image segment to be made transparent or masked based on the region occupied by objects closer than the segment. The level of transparency can be controlled.
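The occlusion behavior described here can be sketched with a per-pixel depth map, under the assumptions that the segment carries an alpha channel and the scene depth map is aligned with the image:

    import numpy as np

    def occlude_segment(seg_rgba: np.ndarray, scene_depth: np.ndarray,
                        segment_depth: float) -> np.ndarray:
        """Zero the segment's alpha wherever the scene contains an object closer
        than the depth the segment represents, so weather in a far depth range
        appears behind near objects."""
        out = seg_rgba.copy()
        occluded = scene_depth < segment_depth  # nearer scene content masks weather
        out[..., 3][occluded] = 0
        return out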
For example, when the weather texture image corresponding to the second depth range is a rain texture image corresponding to rainy weather, the server 2000 may set a criterion for a portion of the rain texture image from which the image segment is to be obtained, based on at least one of the amount of precipitation, the wind speed, or the wind direction. The server 2000 may set the interval between image segments to be obtained based on at least one of precipitation, wind speed, or wind direction. The server 2000 may set a simulation period for sequentially simulating the image segments based on the depth range of the weather texture image. For example, the second depth range may be a deeper depth range than the first depth range, and image segments obtained from a weather texture image corresponding to the second depth range may be simulated in the image at a longer period than image segments obtained from a weather texture image corresponding to the first depth range.
Fig. 12 is a schematic diagram illustrating an example of positions and intervals between image segments to be cut out of a weather texture image according to an embodiment of the present invention.
Fig. 13 is a schematic diagram illustrating an example of positions and intervals between image segments to be cut out of a weather texture image according to an embodiment of the present invention.
Referring to FIG. 12, image segments 111, 112, and 113 may be cropped from the rain texture image 110, and referring to FIG. 13, image segments 114, 115, and 116 may be cropped from the rain texture image 110. The cropping direction of the image segments to be cropped from the rain texture image 110 and the interval between the image segments may be adjusted based on the amount of precipitation, the wind direction, and the wind speed. For example, when the amount of precipitation and the wind speed are low, the image segments 111, 112, and 113 may be selected at small intervals in a direction close to the vertical direction of the rain texture image 110. Otherwise, when the amount of precipitation and the wind speed are high, the image segments 114, 115, and 116 may be selected at large intervals in a diagonal direction of the rain texture image 110.
FIG. 14 is a diagram illustrating an example of selecting image segments from different locations of a weather texture image based on weather, according to an embodiment of the invention.
Referring to fig. 14, raindrops having different shapes may be placed in the rain texture image 130 at different densities. For example, raindrops may be placed vertically on the left side of the rain texture image 130 and raindrops may be placed diagonally on the right side of the rain texture image 130. For example, raindrops may be placed at a low density on top of the rain texture image 130 and raindrops may be placed at a high density on the bottom of the rain texture image 130.
Therefore, when the rainfall and wind speed are low, the selection criteria may be set to select the image segments 131, 132, and 133 from the upper left portion of the rain texture image 130. Conversely, when the rainfall and wind speed are high, the selection criteria may be set to select the image segments 134, 135, and 136 from the lower right portion of the rain texture image 130.
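As a minimal sketch of this selection criterion (the thresholds of 5 mm/h and 3 m/s and the quadrant split are assumptions, not values from the disclosure):

```python
def select_sampling_region(rainfall, wind_speed, tex_w, tex_h):
    """Pick which part of the rain texture image to crop segments from.

    Returns a (left, top, right, bottom) box in pixels.
    """
    if rainfall < 5 and wind_speed < 3:
        # Light, calm rain: upper-left area with sparse, vertical raindrops.
        return (0, 0, tex_w // 2, tex_h // 2)
    # Heavy, windy rain: lower-right area with dense, diagonal raindrops.
    return (tex_w // 2, tex_h // 2, tex_w, tex_h)
```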
Fig. 15 is a schematic diagram illustrating an example of sequentially simulating image segments at a specific period according to an embodiment of the present invention.
Referring to fig. 15, the image segments 141, 142, and 143 may be sequentially and repeatedly reproduced on the image 140. The period for overlaying the image segments 141, 142, and 143 may be preset based on the depth range of the image segments 141, 142, and 143. For example, the image segment 141 may be overlaid on the image 140 from 0 seconds to 0.1 seconds, and then the image segment 142 may be overlaid on the image 140 from 0.1 seconds to 0.2 seconds. The image segment 143 may be overlaid on the image 140 from 0.2 seconds to 0.3 seconds, and then the image segment 141 may be overlaid on the image 140 again from 0.3 seconds to 0.4 seconds. The transparency of the overlaid image segments 141, 142, and 143 may be controlled, for example, based on the depths of objects in the image.
A snow effect or another weather effect may be provided by repeatedly superimposing the image segments 141, 142, and 143 in turn on the image 140 at a certain period.
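A minimal sketch of this sequential overlay loop, assuming PIL RGBA images of equal size and a caller-supplied display callback:

```python
import time
from PIL import Image

def animate_segments(base_image, segments, period=0.1, show=None):
    """Sequentially and repeatedly overlay RGBA segments on the base image.

    segments: RGBA images the same size as base_image (e.g., the cropped
    segments 141, 142, and 143); period: seconds each segment stays visible.
    """
    base = base_image.convert("RGBA")
    while True:
        for seg in segments:
            frame = Image.alpha_composite(base, seg)  # transparent pixels keep the scene
            if show:
                show(frame)                           # hand the frame to the display
            time.sleep(period)                        # e.g., 0.1 s -> 0.3 s full cycle
```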
Fig. 16 is a diagram illustrating an example of simulating image segments at different periods based on depth ranges according to an embodiment of the present invention.
Referring to FIG. 16, one of the first image segments 1-1, 1-2, and 1-3 obtained from a weather texture image of a first depth range, one of the second image segments 2-1, 2-2, and 2-3 obtained from a weather texture image of a second depth range, and one of the third image segments 3-1, 3-2, and 3-3 obtained from a weather texture image of a third depth range may be overlaid together on the image 150.
The first depth range may be a shallower depth range than the second depth range, and the second depth range may be a shallower depth range than the third depth range.
First image segments 1-1, 1-2, and 1-3 obtained from the weather texture image of the first depth range may be sequentially overlaid on the image 150 at a period of 0.1 seconds. Second image segments 2-1, 2-2, and 2-3 obtained from the weather texture image of the second depth range may be sequentially overlaid on the image 150 at a period of 0.2 seconds. Third image segments 3-1, 3-2, and 3-3 obtained from the weather texture image of the third depth range may be sequentially overlaid on the image 150 at a period of 0.3 seconds.
In this case, at least portions of the first image segments 1-1, 1-2, and 1-3, the second image segments 2-1, 2-2, and 2-3, and the third image segments 3-1, 3-2, and 3-3 may be made transparent or masked based on the positions and depths of the objects in the image 150.
The first image segments 1-1, 1-2, and 1-3, the second image segments 2-1, 2-2, and 2-3, and the third image segments 3-1, 3-2, and 3-3 may be images on which weather objects of different sizes are displayed. The first image segments 1-1, 1-2, and 1-3, the second image segments 2-1, 2-2, and 2-3, and the third image segments 3-1, 3-2, and 3-3 may be overlaid at different periods, and thus a realistic 3D weather effect may be provided in the image 150.
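The per-layer periods may be combined into a single frame as sketched below; the helper is an assumption-level illustration, with layers ordered from the deepest to the nearest so that nearer weather objects are drawn last:

```python
from PIL import Image

def compose_frame(base_image, layers, t):
    """Build the frame at time t from depth layers simulated at different periods.

    layers: (segments, period) pairs ordered deepest -> nearest, e.g.
    [(third_segments, 0.3), (second_segments, 0.2), (first_segments, 0.1)];
    each segments list holds same-size RGBA images.
    """
    frame = base_image.convert("RGBA")
    for segments, period in layers:
        index = int(t / period) % len(segments)   # segment this layer shows at time t
        frame = Image.alpha_composite(frame, segments[index])
    return frame
```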
Fig. 17 is a diagram illustrating an example of clipping image segments in different shapes based on depth range according to an embodiment of the present invention.
Referring to fig. 17, the first depth range may be a depth range shallower than the depths of the first object 161 and the second object 162 in the image 160. For example, when the spatial depth range in the image 160 is 0 to 100, the first depth range may be a depth range of 0 to 40. An image segment 164 having the same size as the image 160 may be cropped from the weather texture image 163 corresponding to the first depth range.
The second depth range may be a depth range that is deeper than the depth of the first object 161 and shallower than the depth of the second object 162 in the image 160. For example, when the spatial depth range in the image 160 is 0 to 100, the second depth range may be a depth range of 40 to 70. An image segment 166 having a shape that does not overlap with the first object 161 may be cropped from the weather texture image 165 corresponding to the second depth range.
The third depth range may be a depth range deeper than the depths of the first object 161 and the second object 162 in the image 160. For example, when the spatial depth range in the image 160 is 0 to 100, the third depth range may be a depth range of 70 to 100. An image segment 168 having a shape that does not overlap with the first object 161 and the second object 162 may be cropped from the weather texture image 167 corresponding to the third depth range.
Fig. 18 is a schematic diagram illustrating an example of masking or adjusting transparency of portions of an image segment based on a depth range according to an embodiment of the present invention.
Referring to fig. 18, the image segments 170, 174, and 178 may be selected in a rectangular shape. However, the shape of the image segments 170, 174, and 178 is not limited thereto and may be any shape, including square, circular, and irregular shapes. The shape of an image segment may correspond to the shape of an object in the image on which the image segment is to be overlaid.
In this case, since the image segment 170 corresponds to the third depth range, the region 172 of the image segment 170 overlapping with an object having a depth shallower than the third depth range may be masked. Alternatively, the region 172 of the image segment 170 may be made transparent.
Because the image segment 174 corresponds to the second depth range, a region 176 of the image segment 174 that overlaps objects having a depth shallower than the second depth range may be masked. Alternatively, the region 176 of the image segment 174 may be made transparent.
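For illustration, this masking/transparency control may be expressed as an alpha operation against the depth map. The array layout and the 0-100 depth scale follow the example of fig. 17, and the function name is hypothetical:

```python
import numpy as np

def mask_by_depth(segment_rgba, depth_map, range_min):
    """Make the segment transparent wherever the scene is nearer than this layer.

    segment_rgba: H x W x 4 uint8 array cropped from a weather texture image
    depth_map:    H x W array of per-pixel scene depths (e.g., 0-100)
    range_min:    lower bound of the segment's depth range (e.g., 40 or 70)
    """
    out = segment_rgba.copy()
    nearer = depth_map < range_min       # pixels occupied by a closer object
    out[..., 3][nearer] = 0              # zero alpha = fully transparent there
    return out
```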
Fig. 19 is a table illustrating an example of information provided to the apparatus 1000 to simulate a weather effect according to an embodiment of the present disclosure.
Referring to fig. 19, object identification information, spatial information, criterion information, and the like may be provided to the apparatus 1000.
The request parameters are an example of criterion information required to simulate a weather effect, and may include an identifier of the weather effect (e.g., effect_id), an identifier of the image (e.g., image_num), and information indicating a criterion for simulating the weather effect (e.g., effect-information). The request parameters may be fixed values and may be contained in a JavaScript Object Notation (JSON)-type file.
The image may comprise an image to which a weather effect is to be applied. The image to which the weather effect is to be applied may include an original image, an image converted based on time, and an image converted based on season, but the image is not limited thereto.
The depth map is an example of spatial information indicating a depth of a space in an image, and may be a map file generated by depth prediction using a second artificial intelligence model. The depth map may be generated by recognizing a 3D space from a 2D image and representing distance values of pixels in the image in the form of a map.
The segmentation map is an example of object identification information indicating an object identified in an image, and may be a map file generated by semantic image segmentation using a first artificial intelligence model.
The texture may indicate a weather texture image for providing a weather effect.
The server 2000 may provide the device 1000 with the request parameters, the depth map, and the segmentation map, but may not provide the device 1000 with the image to which a weather effect is to be applied and the weather texture image. In this case, the server 2000 may provide the device 1000 with link information for downloading the image to which the weather effect is to be applied and link information for downloading the weather texture image. The server 2000 may transmit a compressed file generated by compressing at least a portion of the data of fig. 19 to the device 1000.
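For illustration, a JSON-type request-parameter payload such as the one described might look as follows (shown here as a Python dict). Only effect_id, image_num, and effect-information are named in the disclosure; the nested values are assumptions:

```python
request_parameters = {
    "effect_id": "rain_01",       # identifier of the weather effect
    "image_num": 12,              # identifier of the image
    "effect-information": {       # criteria for simulating the effect
        "speed": 0.7,             # speed of the weather objects
        "amount": 0.5,            # amount/density of the weather objects
        "angle": 15,              # slant in degrees, e.g., from wind direction
    },
}
```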
FIG. 20 is an image of a GUI 20 for setting criteria for simulating a weather effect according to an embodiment of the present disclosure.
Through the GUI 20 of FIG. 20, a user of the device 1000 or an operator of the server 2000 may set criteria for simulating a weather effect. The GUI 20 of fig. 20 may be provided to the device 1000 via a web-based service. Referring to fig. 20, the GUI 20 for setting criteria for simulating a weather effect may include an area 22 for selecting an image to which the weather effect is to be applied, an area 23 displaying a preview image to which the weather effect is applied, an area 24 for selecting a weather type, and an area 25 for setting parameters of the weather effect.
The list of images stored in the device 1000 and the list of images stored in the server 2000 may be displayed in the area 22 for selecting an image to which a weather effect is to be applied.
When the list of images stored in the server 2000 is displayed in the area 22 for selecting an image to which a weather effect is to be applied, the server 2000 may recommend an image related to the current weather. The server 2000 may classify images stored in the server 2000 based on weather. The server 2000 may receive weather information related to current weather from the weather service provider server and recommend an image corresponding to the current weather to the device 1000 based on the received weather information.
The server 2000 may recommend an image corresponding to the current weather to the device 1000 in consideration of the time and area where the device 1000 is located. In this case, the server 2000 may classify the image based on the location and time.
When a list of images stored in the device 1000 is displayed in the area 22 for selecting an image to which a weather effect is to be applied, the device 1000 may recommend an image related to the current weather. Device 1000 can classify images stored in device 1000 based on weather. The apparatus 1000 may receive weather information related to current weather from a weather service provider server and recommend an image corresponding to the current weather to a user based on the received weather information.
The GUI 20 may be provided to the device 1000 through a specific Application Programming Interface (API).
An identifier of a weather effect applicable to the image may be displayed in the area 24 for selecting a weather type. For example, identifiers indicating rain, snow, and fog effects may be displayed in area 24.
A value indicative of a characteristic of the weather effect may be displayed in area 25 for setting a parameter of the weather effect. For example, values for setting the speed, amount, and angle of the weather object may be displayed in the area 25.
A preview image reflecting the weather effect may be displayed in the area 23. A preview image of the simulation result of the weather effect selected in the area 24 may be displayed in real time in the area 23 in an image based on the characteristics selected in the area 25.
Fig. 21 is a flowchart of a method performed by the apparatus 1000 for simulating a weather effect in an image by using a weather texture image according to an embodiment of the present disclosure.
In operation S2100, the apparatus 1000 may obtain a plurality of image segments from a weather texture image corresponding to a first depth range. Device 1000 can obtain weather information indicating current weather and identify characteristics of the current weather. The apparatus 1000 may identify a location of the image segment and an interval between the image segments according to characteristics of current weather based on criterion information for simulating a weather effect. Device 1000 may obtain a plurality of image segments from a weather texture image based on the identified location and interval.
In operation S2110, the apparatus 1000 may identify a simulation period. The device 1000 may identify a simulation period based on the current weather and characteristics of the first depth range based on criteria information for simulating a weather effect.
In operation S2120, the apparatus 1000 may obtain a plurality of image segments from a weather texture image corresponding to a second depth range. The apparatus 1000 may identify positions of the image segments and intervals between the image segments according to the characteristics of the current weather, based on the criterion information for simulating the weather effect. The apparatus 1000 may select the plurality of image segments from the weather texture image based on the identified cropping positions and intervals. When an object exists in the image at a depth shallower than the second depth range, the apparatus 1000 may control the image segments not to overlap with the object in the image.
In operation S2130, the apparatus 1000 may identify a simulation period. The apparatus 1000 may identify the simulation period according to the characteristics of the current weather and the second depth range, based on the criterion information for simulating the weather effect.
In operation S2140, the apparatus 1000 may simulate an image segment corresponding to a first depth range and an image segment corresponding to a second depth range in an image. The device 1000 may superimpose one of the plurality of image segments corresponding to the first depth range and one of the plurality of image segments corresponding to the second depth range together on the image.
Fig. 22 is a flowchart of a method performed by the device 1000 of the system to simulate a weather effect in an image by using a weather texture image received from the server 2000, according to an embodiment of the present disclosure.
Operations S2200 through S2215 of fig. 22 correspond to operations S300 through S315 of fig. 3, and thus redundant description thereof is omitted.
In operation S2220, the server 2000 may obtain a plurality of weather texture images. The server 2000 may obtain a plurality of weather texture images, which are registered in advance according to weather conditions, from the database. The server 2000 may obtain, for example, a plurality of weather texture images indicating rainy weather, a plurality of weather texture images indicating snowy weather, and a plurality of weather texture images indicating fog weather. The plurality of weather texture images pre-registered by weather may respectively correspond to the depth ranges. The plurality of weather texture images previously registered by weather may be different from each other based on the depth range. The depth range may be a value for defining a depth range of a space in the image.
In operation S2225, the server 2000 may obtain criterion information for simulating a weather effect. For example, the server 2000 may determine a criterion for the weather texture image to be used based on weather, a criterion for a portion of the weather texture image from which image segments are to be cropped, a criterion for an interval between the image segments, and a criterion for a time to display the image segments based on a depth range. In operation S2230, the server 2000 may provide the criterion information regarding the determined criteria, the object identification information, the spatial information, and the plurality of weather texture images to the device 1000. Alternatively, the apparatus 1000 may pre-store a plurality of weather texture images related to various weather types, and select and use, from among the plurality of pre-stored weather texture images, at least one weather texture image suitable for the current weather together with the selected image.
In operation S2240, the device 1000 may simulate a weather effect in the image based on the criterion information. The device 1000 may select a weather texture image based on the current weather and the depths of objects and the space in the image. The device 1000 may obtain a plurality of image segments from the selected weather texture image and sequentially overlap the plurality of image segments on the image at a particular period.
Fig. 23 is a flow diagram of a method performed by a system to simulate a weather effect in an image from which a weather object was deleted in accordance with an embodiment of the present disclosure.
Operations S2300 to S2315 and S2325 correspond to operations S2200 to S2215 and S2225 of fig. 22, and thus redundant description thereof is omitted.
In operation S2320, the server 2000 may delete the weather object from the image by using the third artificial intelligence model. The server 2000 may obtain an image from which a weather object is deleted by inputting the image to the third artificial intelligence model for deleting the weather object in the image. The third artificial intelligence model may be a model pre-trained to remove weather objects in the image, and may be, for example, an artificial neural network-based model.
In operation S2330, the server 2000 may provide the apparatus 1000 with an image from which a weather object is deleted, object identification information, spatial information, and a plurality of weather texture images.
In operation S2340, the apparatus 1000 may simulate a weather effect in the image from which the weather object is deleted. The device 1000 may select a weather texture image based on the current weather and the depths of objects and the space in the image. The apparatus 1000 may obtain a plurality of image segments from the selected weather texture image and sequentially overlap the plurality of image segments, at a certain period, on the image from which the weather object is deleted.
Fig. 24 is a schematic diagram illustrating an example of an image from which a weather object is deleted according to an embodiment of the present disclosure.
Referring to fig. 24, a picture 190 taken in rainy weather includes raindrops. When the picture 190 including the raindrops is input to the third artificial intelligence model, a picture 192 from which the raindrops are deleted may be output from the third artificial intelligence model. For example, a picture 191 including empty regions in place of the raindrops may be generated by deleting the raindrops from the picture 190, and the picture 192 from which the raindrops are completely removed may be generated by filling in the empty regions.
Fig. 25 is a flowchart of a method performed by the system to simulate a weather effect in a reference image corresponding to a current time, according to an embodiment of the present disclosure.
Operations S2500 to S2520 correspond to operations S900 to S920 of fig. 9, and thus redundant description thereof is omitted.
In operation S2525, the server 2000 may generate a plurality of reference images corresponding to preset time periods. The server 2000 may generate, for example, a reference image corresponding to the morning, a reference image corresponding to the afternoon, a reference image corresponding to the evening, and a reference image corresponding to the night by changing the color of the image or of objects in the image, or features in the image indicating a certain time of day (such as brightness or illumination).
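A minimal sketch of such time-of-day conversion, assuming simple brightness/saturation shifts. The disclosure states only that color, brightness, illumination, and similar features are changed, so the factors below are illustrative:

```python
from PIL import ImageEnhance

def generate_reference_images(original):
    """Derive time-of-day reference images from a PIL image.

    The brightness/saturation factors are illustrative assumptions.
    """
    brightness = {"morning": 1.05, "afternoon": 1.15, "evening": 0.8, "night": 0.5}
    references = {}
    for period, factor in brightness.items():
        image = ImageEnhance.Brightness(original).enhance(factor)
        # Slightly desaturate the night image to mimic low-light color loss.
        saturation = 0.7 if period == "night" else 1.0
        references[period] = ImageEnhance.Color(image).enhance(saturation)
    return references
```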
In operation S2530, the server 2000 may provide the apparatus 1000 with a plurality of reference images, object identification information, spatial information, and criterion information.
In operation S2535, the apparatus 1000 may identify a current time.
In operation S2540, the apparatus 1000 may select a reference image corresponding to the current time. The apparatus 1000 may select the reference image corresponding to the current time from among the reference images received from the server 2000. For example, when the current time is 13:00, the apparatus 1000 may select the reference image corresponding to the afternoon.
In operation S2545, the apparatus 1000 may simulate a weather effect in the selected reference image.
Fig. 26 is a schematic diagram illustrating an example of a reference image corresponding to a preset time period according to an embodiment of the present invention.
Referring to fig. 26, a reference image corresponding to the morning, a reference image corresponding to the afternoon, a reference image corresponding to the evening, and a reference image corresponding to the night may be generated from the image.
Fig. 27 is a flowchart of a method performed by the system to change a color of a reference image based on a color change path and simulate a weather effect in the color-changed reference image, according to an embodiment of the present disclosure.
Operations S2700 to S2720 correspond to operations S900 to S920 of fig. 9, and thus redundant description thereof is omitted.
In operation S2725, the server 2000 may generate a plurality of reference images corresponding to preset time periods. For example, the server 2000 may generate a reference image corresponding to the morning, a reference image corresponding to the afternoon, a reference image corresponding to the evening, and a reference image corresponding to the night by changing the color of the image.
In operation S2730, the server 2000 may generate color change path information indicating a color change path between the plurality of reference images. The server 2000 may change the first reference image to the second reference image in such a manner that the color of the first reference image is smoothly changed to the color of the second reference image. To this end, the server 2000 may generate the color change path information by determining a path for changing the color of the specific area of the first reference image to the color of the specific area of the second reference image. The server 2000 may obtain color change path information indicating a color change path between the reference images by inputting a plurality of reference images to the fourth artificial intelligence model. The fourth artificial intelligence model may be a model that is pre-trained in consideration of characteristics of the reference images to naturally change colors between the reference images, and may be, for example, an artificial neural network-based model.
In operation S2735, the server 2000 may provide the apparatus 1000 with a plurality of reference images, color change path information, object identification information, spatial information, and criterion information.
In operation S2740, the apparatus 1000 may identify a current time.
In operation S2745, the apparatus 1000 may select a reference image corresponding to the current time. The apparatus 1000 may select the reference image corresponding to the current time from among the reference images received from the server 2000. The apparatus 1000 may also select the reference image that follows the selected reference image. For example, when the current time is 13:00, the apparatus 1000 may select the reference image corresponding to the afternoon and the reference image corresponding to the evening.
In operation S2750, the apparatus 1000 may change the color of the selected reference image based on the color change path information. For example, the apparatus 1000 may gradually change the color of the reference image corresponding to the afternoon based on the color change path information between the reference image corresponding to the afternoon and the reference image corresponding to the evening.
In operation S2755, the apparatus 1000 may simulate a weather effect in the color-changed reference image.
Fig. 28 is a schematic diagram illustrating an example of a color change path between reference images according to an embodiment of the present invention.
Referring to fig. 28, color change path information indicating how the color of the afternoon reference image 230 needs to be changed toward the evening reference image 231 may be generated. A color change path may be determined for each region, indicating how the color needs to change between corresponding regions of the reference images.
For example, a path 238 on how the color of the specific area 234 in the afternoon reference image 230 needs to be changed to the color of the specific area 235 in the evening reference image 231 may be determined on a specific color chart 237. The specific area 234 in the afternoon reference image 230 may be an area located at the same position as the specific area 235 in the evening reference image 231.
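For illustration, a straight-line interpolation in RGB space can stand in for the path 238 on the color chart 237; in the disclosure, the actual path is derived by the fourth artificial intelligence model:

```python
def interpolate_color(start_rgb, end_rgb, alpha):
    """Move a region's color a fraction alpha along a straight-line RGB path.

    A linear RGB path is an illustrative stand-in for the learned path 238.
    """
    return tuple(round(s + (e - s) * alpha) for s, e in zip(start_rgb, end_rgb))

# Halfway between an afternoon sky tone and an evening sky tone (example values):
# interpolate_color((135, 206, 235), (255, 140, 80), 0.5) -> (195, 173, 158)
```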
Although in the above description the server 2000 obtains the object identification information by using the first artificial intelligence model, obtains the spatial information by using the second artificial intelligence model, deletes the weather object from the image by using the third artificial intelligence model, and obtains the color change path information between the reference images by using the fourth artificial intelligence model, the present invention is not limited thereto.
The apparatus 1000 may store at least one of the first to fourth artificial intelligence models received from the server 2000 and obtain required data by using the at least one of the first to fourth artificial intelligence models. In this case, the first to fourth artificial intelligence models may be implemented as software.
Fig. 29 is a schematic diagram illustrating an example of changing a color pattern of an image according to an embodiment of the present invention.
Referring to fig. 29, an original image may be converted into images of various color patterns. In this case, a reference image related to a specific color pattern may be registered in advance, and when the registered reference image of the specific color pattern is selected, an original image may be converted based on the color pattern of the selected reference image.
Fig. 30 is a schematic diagram illustrating an example of an image reflecting a rain effect according to an embodiment of the present disclosure.
Referring to fig. 30, the device 1000 may provide a weather effect caused when a moving weather object, such as a raindrop, hits another object in the image, by using the spatial information of the image. For example, the device 1000 may provide an effect of raindrops splashing on the shoulders and arms of a person. The apparatus 1000 may recognize the position at which the person is displayed by using the spatial information of the image and provide the raindrop-splash effect based on the depth of the space in which the person is displayed. The device 1000 may sequentially overlap an image segment of a rain texture image and a segment of a weather texture image including splashed raindrops at the moment when the raindrops hit the person's body. In this case, the positions of the splashed raindrops may be determined using the spatial information of the image.
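For illustration, the hit positions for such a splash effect could be derived from the segmentation mask as follows; the helper and the vertical-fall assumption are illustrative only:

```python
import numpy as np

def splash_positions(person_mask):
    """Find where vertically falling raindrops would hit the person.

    person_mask: H x W boolean array derived from the segmentation map.
    Returns (row, col) of the topmost person pixel in each column, e.g.
    the shoulders and arms, where a splash segment could then be overlaid.
    """
    hits = []
    for col in range(person_mask.shape[1]):
        rows = np.nonzero(person_mask[:, col])[0]
        if rows.size > 0:
            hits.append((int(rows[0]), col))
    return hits
```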
The device 1000 may display a dense fog on a long-distance object and a light fog on a short-distance object. The apparatus 1000 may provide 3D image effects reflecting weather parameters (e.g., haze, fog, or fog spots), or provide 3D image effects related to fine dust or dust clouds.
Fig. 31 is a block diagram of a server 2000 according to an embodiment of the present disclosure.
Referring to fig. 31, a server 2000 according to an embodiment of the present disclosure may include a communication interface 2100, a Database (DB)2200, an artificial intelligence model 2300, and a processor 2400. The DB 2200 may include an object recognition information DB 2210, a spatial information DB 2220, and a criterion information DB 2230.
Communication interface 2100 may include one or more elements for communicating with device 1000. For example, communication interface 2100 may include short-range wireless communicators, mobile communicators, and wireless communicators. Communication interface 2100 can transmit to device 1000 or receive from device 1000 information needed to simulate a weather effect in an image.
The server 2000 may store, in a memory, a program for processing and controlling the operation of the processor 2400, and may store information required to simulate a weather effect in an image. The DB 2200 may store, for example, images, weather texture images, object identification information, spatial information, simulation criterion information, reference images, and color change path information. The object recognition information DB 2210 may store the object recognition information output from the first artificial intelligence model. The spatial information DB 2220 may store the spatial information output from the second artificial intelligence model. The criterion information DB 2230 may store information on various criteria for simulating a weather effect.
The artificial intelligence model 2300 can perform the operations necessary to simulate a weather effect in an image. For example, the artificial intelligence model 2300 can include a first artificial intelligence model for identifying objects in an image, a second artificial intelligence model for analyzing space in the image, and a third artificial intelligence model for deleting weather objects in the image. The artificial intelligence model 2300 can also include a fourth artificial intelligence model for generating color change path information indicative of color change paths between reference images.
A processor and a memory may be used to perform the functions of the present invention related to artificial intelligence. The processor 2400 may include one or more processors. In this case, the one or more processors may be general-purpose processors such as a central processing unit (CPU), an application processor (AP), or a digital signal processor (DSP), dedicated graphics processors such as a graphics processing unit (GPU) or a visual processing unit (VPU), or dedicated artificial intelligence processors such as a neural processing unit (NPU). The one or more processors control the processing of input data based on a predefined operation rule or artificial intelligence model stored in the memory. Alternatively, when the one or more processors are dedicated artificial intelligence processors, the dedicated artificial intelligence processors may be designed with a hardware structure specialized for processing a particular artificial intelligence model.
Predefined operation rules or artificial intelligence models are generated through training. Here, being generated through training means that a basic artificial intelligence model is trained on multiple pieces of training data by using a learning algorithm, thereby forming a predefined operation rule or an artificial intelligence model configured to achieve a desired characteristic (or purpose). The training may be performed by the device having the artificial intelligence function according to the present invention, or by a separate server and/or system. The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited thereto.
The artificial intelligence model may include a plurality of neural network layers. Each of the plurality of neural network layers has a plurality of weight values, and a neural network calculation is performed through a calculation between a calculation result of a previous layer and the plurality of weight values. The plurality of weight values of the plurality of neural network layers may be optimized based on training results of the artificial intelligence model. For example, the plurality of weight values may be modified to reduce or minimize a loss value or a cost value obtained by the artificial intelligence model during training. The artificial neural network may include, for example, a convolutional neural network (CNN), a deep neural network (DNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or a deep Q-network, but is not limited thereto.
The processor 2400 controls the overall operation of the server 2000. The processor 2400 may control the communication interface 2100, the DB 2200, and the artificial intelligence model 2300 by executing programs stored in a memory of the server 2000. The processor 2400 may perform the operations of the server 2000 described herein by controlling the communication interface 2100, the DB 2200, and the artificial intelligence model 2300.
Specifically, processor 2400 may obtain an image selected by device 1000 and identify an object in the image by using a first artificial intelligence model.
The processor 2400 may obtain spatial information regarding the depth of an object in the image by using a second artificial intelligence model.
Processor 2400 may obtain a plurality of weather texture images. The processor 2400 may obtain a plurality of weather texture images pre-registered by weather from the DB 2200. The processor 2400 may obtain, for example, a plurality of weather texture images indicating rainy weather, a plurality of weather texture images indicating snowy weather, and a plurality of weather texture images indicating foggy weather. The plurality of weather texture images pre-registered by weather may respectively correspond to the depth ranges. The plurality of weather texture images previously registered according to weather may be different from each other based on the depth range. The depth range may be a value for defining a depth range of a space in the image.
Processor 2400 may delete the weather object from the image by using a third artificial intelligence model.
Processor 2400 may determine criteria for simulating a plurality of weather texture images. The processor 2400 may, for example, determine criteria for a weather texture image to be used based on weather, criteria for a portion of the weather texture image from which the image segments are obtained, criteria for an interval between image segments, and criteria for a time to display an image segment based on a depth range.
The processor 2400 may generate a plurality of reference images corresponding to preset time periods. Each time period may correspond to a range of hours in a day. The processor 2400 may generate, for example, a reference image corresponding to the morning, a reference image corresponding to the afternoon, a reference image corresponding to the evening, and a reference image corresponding to the night by changing the colors of the image. The processor 2400 may generate color change path information indicating a color change path between the plurality of reference images.
The processor 2400 may provide the device 1000 with object identification information, spatial information, a plurality of weather texture images, simulation criterion information, an image from which a weather object is deleted, a plurality of reference images, and color change path information.
Fig. 32 is a block diagram of an apparatus 1000 according to an embodiment of the present invention.
As shown in fig. 32, the device 1000 may include a user inputter 1100, an outputter 1200, a processor 1300, a sensor 1400, a communicator 1500, an audio/video (A/V) inputter 1600, and a memory 1700.
The user inputter 1100 refers to a device via which a user inputs data for controlling the apparatus 1000. For example, the user inputter 1100 may include a keyboard, a dome switch, a touch pad (e.g., a capacitive overlay, resistive overlay, infrared beam, surface acoustic wave, integral strain gauge, or piezoelectric touch pad), a jog wheel, or a jog switch, but is not limited thereto.
The user inputter 1100 may receive a user input for simulating a weather effect in an image.
The outputter 1200 may output an audio signal, a video signal, or a vibration signal, and may include a display 1210, a sound outputter 1220, and a vibration motor 1230.
Display 1210 displays information processed by device 1000. For example, the display 1210 may display a GUI for simulating weather effects in an image.
When the display 1210 and the touch pad are layered to configure a touch screen, the display 1210 may be used as an output device and may also be used as an input device. The display 1210 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED), a flexible display, a 3D display, or an electrophoretic display.
The sound outputter 1220 outputs audio data received from the communicator 1500 or stored in the memory 1700. The vibration motor 1230 may output a vibration signal to generate a haptic effect.
The processor 1300 generally controls the overall operation of the device 1000. For example, the processor 1300 may control the user inputter 1100, the outputter 1200, the sensor 1400, the communicator 1500, and the A/V inputter 1600 by executing programs stored in the memory 1700. The processor 1300 may perform the operations of the device 1000 described herein by controlling the user inputter 1100, the outputter 1200, the sensor 1400, the communicator 1500, and the A/V inputter 1600.
In particular, the processor 1300 may select an image. The processor 1300 may display a GUI for selecting an image to be displayed on a screen in the environment mode, and select the image based on a user input received through the displayed GUI.
Processor 1300 may send the image to server 2000. The processor 1300 may provide the image to the server 2000 and request the server 2000 to provide data required to simulate a weather effect in the image.
The processor 1300 may receive object identification information, spatial information, a plurality of weather texture images, simulation criterion information, an image from which a weather object is deleted, a plurality of reference images, and color change path information from the server 2000.
The processor 1300 may identify current weather and obtain a weather texture image corresponding to the current weather. The processor 1300 may select a weather texture image corresponding to characteristics of current weather from among the plurality of weather texture images received from the server 2000. Processor 1300 may select a weather texture image based on characteristics of the current weather and spatial depth in the image. Processor 1300 may simulate a weather effect in the image indicating the current weather by using the selected weather texture image. Processor 1300 may obtain an image segment from the selected weather texture image. The processor 1300 may provide a weather effect in an image by sequentially and iteratively overlapping image segments on the image in a preset period.
The processor 1300 may simulate a weather effect in the image based on the simulation criterion information. The processor 1300 may identify positions of image segments and intervals between the image segments according to the characteristics of the current weather, based on the criterion information for simulating the weather effect. The processor 1300 may select a plurality of image segments from the weather texture image based on the identified cropping positions and intervals. The processor 1300 may identify a simulation period according to the characteristics of the current weather and the depth range, based on the criterion information for simulating the weather effect. The processor 1300 may simulate, in the image, an image segment corresponding to a first depth range and an image segment corresponding to a second depth range. The processor 1300 may overlay one of a plurality of image segments corresponding to the first depth range and one of a plurality of image segments corresponding to the second depth range together on the image.
Processor 1300 may simulate a weather effect in the image from which the weather object was removed.
Processor 1300 may simulate a weather effect in a reference image corresponding to the current time. The processor 1300 may select a reference image corresponding to the current time from the reference images received from the server 2000. Processor 1300 may simulate a weather effect in the selected reference image.
The processor 1300 may change the color of the selected reference image based on the color change path information. For example, the processor 1300 may gradually change the color of the reference image corresponding to the afternoon based on the color change path information between the reference image corresponding to the afternoon and the reference image corresponding to the evening. The processor 1300 may simulate a weather effect in the color changed reference image.
The sensor 1400 may detect a state of the device 1000 or a state near the device 1000 and transmit the detected information to the processor 1300.
The sensors 1400 may include at least one of a magnetic sensor 1410, an acceleration sensor 1420, a temperature/humidity sensor 1430, an infrared sensor 1440, a gyro sensor 1450, a position sensor (e.g., a GPS) 1460, an air pressure sensor 1470, a proximity sensor 1480, or an RGB sensor (or illuminance sensor) 1490, but are not limited thereto. The functions of the sensors can be understood by those skilled in the art from their names, and thus a detailed description thereof will not be provided herein.
Communicator 1500 may include one or more elements for communicating with server 2000. For example, the communicator 1500 may include a short range wireless communicator 1510, a mobile communicator 1520 and a broadcast receiver 1530.
The short-range wireless communicator 1510 may include, for example, a bluetooth communicator, a Bluetooth Low Energy (BLE) communicator, a near field communicator, a Wireless Local Area Network (WLAN) (or Wi-Fi) communicator, a Zigbee communicator, an infrared data association (IrDA) communicator, a Wi-Fi direct (WFD) communicator, an ultra-wideband (UWB) communicator, and an Ant + communicator, but is not limited thereto.
The mobile communicator 1520 transmits and receives a radio signal to and from at least one of a base station, an external terminal device, or a server in the mobile communication network. Here, the radio signal may include various types of data based on transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
The broadcast receiver 1530 receives a broadcast signal and/or broadcast-related information from an external source through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Depending on the implementation, the device 1000 may not include the broadcast receiver 1530.
The communicator 1500 may transmit and receive, to and from the server 2000, information required to simulate a weather effect in an image.
The A/V inputter 1600 is used to input an audio signal or a video signal, and may include a camera 1610 and a microphone 1620. The camera 1610 may obtain image frames such as still images or moving images through an image sensor in a video call mode or an image capturing mode. The image captured by the image sensor may be processed by the processor 1300 or a separate image processor.
The image frames processed by the camera 1610 may be stored in the memory 1700 or transmitted to the outside through the communicator 1500. Depending on the implementation of the device 1000, two or more cameras 1610 may be included.
The microphone 1620 receives an external sound signal and processes it into electrical voice data. For example, the microphone 1620 may receive a sound signal from an external device or a user. The microphone 1620 may use various noise canceling algorithms to cancel noise occurring when an external sound signal is received.
The memory 1700 may store programs for processing and controlling the operation of the processor 1300, and store data input to or output from the device 1000.
The memory 1700 may include at least one type of storage medium among a flash memory, a hard disk, a multimedia card micro, a memory card (e.g., a Secure Digital (SD) or eXtreme Digital (XD) card), random access memory (RAM), static RAM (SRAM), read-only memory (ROM), electrically erasable programmable ROM (EEPROM), programmable ROM (PROM), a magnetic memory, a magnetic disk, and an optical disk.
Programs stored in memory 1700 may be classified into modules based on their functions, for example, a User Interface (UI) module 1710, a touch screen module 1720, and a notification module 1730.
UI module 1710 may, for example, provide a dedicated UI or GUI for each application to connect to device 1000. The touch screen module 1720 may detect a touch gesture of a user on the touch screen and send information about the touch gesture to the processor 1300. Touch screen module 1720 according to embodiments of the present invention may recognize and analyze touch codes. Touch screen module 1720 may be implemented as standalone hardware including a controller.
The notification module 1730 may generate a signal for giving notification of an event of the device 1000.
The device 1000 may perform some or all of the functionality of the server 2000 described herein. For example, the apparatus 1000 may perform a function of generating spatial information based on an image, a function of generating object identification information based on an image, and a function of setting a criterion for simulating a weather effect. In addition, the apparatus 1000 may store various types of information required to simulate a weather effect, for example, spatial information, object recognition information, criterion information, and a weather texture image. In this case, the device 1000 may be a high performance device.
Fig. 33 is a schematic diagram illustrating an example in which the external device 3000 sets a criterion for simulating a weather effect and the device 1000 receives information on the set criterion through the external DB 4000 and provides the weather effect in an image, according to one embodiment of the present disclosure.
Referring to fig. 33, the external device 3000 may set a criterion for simulating a weather effect in an image through the server 2000, and the device 1000 may receive information required to simulate the weather effect from the external DB 4000 to simulate the weather effect in the image.
The external device 3000 may provide an image to which a weather effect is to be applied to the server 2000 through a specific GUI displayed on a screen of the external device 3000, and set criteria for simulating the weather effect. Alternatively, when setting the criteria for simulating the weather effect, the external device 3000 may not provide an image to the server 2000 and may instead select an image stored in the server 2000.
The server 2000 may generate information for simulating a weather effect based on a setting value input through the GUI by the external device 3000. The information for simulating the weather effect may include, for example, an image, a processed image, a weather texture image, spatial information, object recognition information, and criterion information, but the information for simulating the weather effect is not limited thereto. The information for simulating the weather effect may include the parameter information described above with respect to fig. 19.
The external device 3000 may provide the image to the server 2000 based on the web and receive information for simulating a weather effect in the image from the server 2000. The information for simulating the weather effect may be data in a metadata format.
The external device 3000 may provide the image and the information for simulating the weather effect received from the server 2000 to the external DB 4000, and the external DB 4000 may store the image and the information for simulating the weather effect.
The apparatus 1000 may provide an image ID to the external DB 4000, request the information for simulating the weather effect, and receive the image and the information for simulating the weather effect. The apparatus 1000 may then provide the weather effect in the image based on a preset criterion by using the information for simulating the weather effect.
The external device 3000 may be, for example, a smart phone, a tablet, a PC, a smart television, a mobile phone, a PDA, a notebook, a media player, a mini server, a GPS device, an e-book reader, a digital broadcast receiver, a navigation system, a kiosk, an MP3 player, a digital camera, a home appliance, or another mobile or non-mobile computing device, but is not limited thereto. The external device 3000 may include the elements of fig. 32, but is not limited thereto.
The external DB 4000 may be a server for storing and managing information for simulating weather effects.
Fig. 34 is a schematic diagram illustrating an example in which the external device 3000 sets a criterion for simulating a weather effect and the device 1000 receives information on the set criterion through the server 2000 and provides the weather effect in an image according to one embodiment of the present disclosure.
The external device 3000 may provide an image to which a weather effect is to be applied to the server 2000 through a specific GUI displayed on a screen of the external device 3000, and set criteria for simulating the weather effect. Alternatively, the external device 3000 may not provide an image to the server 2000 and may instead select an image stored in the server 2000.
The server 2000 may generate information for simulating a weather effect based on a setting value input through the GUI by the external device 3000. The information for simulating the weather effect may include, for example, an image, a processed image, a weather texture image, spatial information, object recognition information, and criterion information, but the information for simulating the weather effect is not limited thereto. The information for simulating the weather effect may include the parameter information described above with respect to fig. 19. The server 2000 may store images and information for simulating weather effects.
The device 1000 may provide an image ID to the server 2000, request the information for simulating the weather effect, and receive the image and the information for simulating the weather effect. The device 1000 may then provide the weather effect in the image based on a preset criterion by using the information for simulating the weather effect.
Fig. 35 is a schematic diagram illustrating an example in which the external device 3000 sets a criterion for simulating a weather effect through the server 2000 and the device 1000 receives information on the criterion through the external DB 4000 and provides the weather effect in an image, according to an embodiment of the present disclosure.
The external device 3000 may provide an image to which a weather effect is to be applied to the server 2000 through a specific GUI displayed on a screen of the external device 3000, and set criteria for simulating the weather effect. For example, the external device 3000 may set the criteria for simulating the weather effect through the GUI 20 shown in fig. 20. Alternatively, the external device 3000 may not provide an image to the server 2000 and may instead select an image stored in the server 2000.
The server 2000 may generate information for simulating a weather effect based on a setting value input through the GUI by the external device 3000. The information for simulating the weather effect may include, for example, an image, a processed image, a weather texture image, spatial information, object identification information, and criterion information, but is not limited thereto. The information for simulating the weather effect may include the parameter information described above with respect to fig. 19.
The server 2000 may provide the external DB 4000 with the image and the information for simulating the weather effect, and the external DB 4000 may store the image and the information for simulating the weather effect.
The apparatus 1000 may provide an image ID to the external DB 4000, request the information for simulating the weather effect, and receive the image and the information for simulating the weather effect from the external DB 4000. The apparatus 1000 may provide the weather effect in the image based on a preset criterion by using the information for simulating the weather effect.
Embodiments of the present invention may be implemented in the form of a computer-readable recording medium including instructions executable by a computer, such as program modules executed by the computer. The computer-readable recording medium may be any available medium that can be accessed by the computer, and examples thereof include all volatile, nonvolatile, removable, and non-removable media. The computer-readable recording medium may include computer storage media and communication media. Examples of computer storage media include all volatile, nonvolatile, removable, and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Communication media typically include computer-readable instructions, data structures, or other data, such as program modules, in a modulated data signal.
As used herein, the suffix "unit" or "-device" may indicate a hardware component, such as a processor or a circuit, and/or a software component that is executed by a hardware component, such as a processor.
Throughout the disclosure, "at least one of a, b, or c" indicates only a, only b, only c, both a and b, both a and c, both b and c, all of a, b, and c, or variations thereof.
The foregoing description of the invention has been provided for the purposes of illustration and it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the scope of the invention. Accordingly, it is to be understood that the disclosed embodiments described herein are to be considered in a descriptive sense only and not for purposes of limitation. For example, each component described as a single type may be implemented in a distributed manner, and likewise, components described as distributed may be implemented in a combined manner.
The scope of the invention is defined by the following claims, rather than the detailed description, and all modifications from the claims and equivalents thereof should be understood to be included within the scope of the invention.

Claims (15)

1. A method of a device for providing weather effects in an image, the method comprising:
obtaining an image to which a weather effect is to be applied;
obtaining at least one weather texture image showing a weather effect; and
providing the weather effect in the image based on the weather texture image by sequentially overlapping, on the image, a plurality of image segments obtained from the obtained weather texture image.
2. The method of claim 1, further comprising:
receiving object identification information regarding a shape of an object in an image and spatial information regarding a depth of the object in the image from a server,
wherein the object recognition information is generated by the server by applying the image as an input to the first artificial intelligence model, an
Wherein the spatial information is generated by the server by applying the image as input to the second artificial intelligence model.
3. The method of claim 1, wherein the weather texture image comprises a first weather texture image showing a weather effect corresponding to a first depth range and a second weather texture image showing a weather effect corresponding to a second depth range.
4. The method of claim 3, wherein providing a weather effect comprises:
obtaining a plurality of first image segments from the first weather texture image;
obtaining a plurality of second image segments from a second weather texture image; and
a first image segment selected from the plurality of first image segments and a second image segment selected from the plurality of second image segments are overlaid together on the image.
5. The method of claim 4, wherein overlapping comprises sequentially overlapping the plurality of first image segments on the image and sequentially overlapping the plurality of second image segments on the image.
6. The method of claim 5, wherein a first period during which the plurality of first image segments are overlapped is different from a second period during which the plurality of second image segments are overlapped.
7. The method of claim 4, wherein locations of the plurality of first image segments and the plurality of second image segments, and intervals between regions corresponding to the plurality of first image segments in the first weather texture image and regions corresponding to the plurality of second image segments in the second weather texture image, are determined based on the weather.
8. The method of claim 3, wherein a first size of weather objects indicating the weather effect in the first weather texture image is different from a second size of weather objects indicating the weather effect in the second weather texture image.
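A minimal sketch of claim 8 follows, again for illustration only: the same kind of weather object is rendered at two sizes, so the texture for the nearer depth range reads as close precipitation and the texture for the farther range as distant precipitation. Drop counts, streak lengths, and colors are assumptions of the sketch.

```python
# Generate two hypothetical rain textures whose weather objects differ in size.
import numpy as np

def make_rain_texture(height: int, width: int, n_drops: int,
                      drop_len: int, rng: np.random.Generator) -> np.ndarray:
    """Draw vertical RGBA streaks; longer, sparser streaks suggest nearness."""
    tex = np.zeros((height, width, 4), dtype=np.uint8)
    for _ in range(n_drops):
        x = int(rng.integers(0, width))
        y = int(rng.integers(0, height - drop_len))
        tex[y:y + drop_len, x] = (200, 200, 255, 180)  # pale, semi-transparent
    return tex

rng = np.random.default_rng(0)
near_texture = make_rain_texture(512, 512, n_drops=60, drop_len=24, rng=rng)  # large drops
far_texture = make_rain_texture(512, 512, n_drops=200, drop_len=6, rng=rng)   # small drops
```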
9. The method of claim 2, wherein obtaining the image comprises receiving, from the server, the image from which a weather object indicating the weather effect has been deleted by the server.
10. The method of claim 2, wherein obtaining the image comprises receiving, from the server, a reference image corresponding to a preset time period.
11. A device for providing a weather effect in an image, the device comprising:
a display;
a memory storing one or more instructions; and
a processor configured to execute the one or more instructions to: obtain an image to which a weather effect is to be applied, select at least one weather texture image showing the weather effect, and provide the weather effect in the image on the display, based on the at least one weather texture image, by sequentially overlapping, on the image, a plurality of image segments obtained from the at least one weather texture image.
12. The device of claim 11, further comprising:
a communication interface,
wherein the processor is further configured to receive, from a server via the communication interface, object identification information regarding a shape of an object in the image and spatial information regarding a depth of the object in the image,
wherein the object identification information is generated by the server by applying the image as an input to a first artificial intelligence model, and
wherein the spatial information is generated by the server by applying the image as an input to a second artificial intelligence model.
13. The device of claim 11, wherein the at least one weather texture image comprises a first weather texture image showing a weather effect corresponding to a first depth range and a second weather texture image showing a weather effect corresponding to a second depth range.
14. The device of claim 13, wherein the processor is further configured to execute one or more instructions to:
obtain a plurality of first image segments from the first weather texture image,
obtain a plurality of second image segments from the second weather texture image, and
overlay, together on the image, a first image segment selected from the plurality of first image segments and a second image segment selected from the plurality of second image segments.
15. A computer-readable recording medium having recorded thereon a computer program for executing the method of claim 1.
CN202080015016.3A 2019-02-18 2020-02-04 System and method for providing weather effects in images Pending CN113474822A (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
KR20190018823 2019-02-18
KR10-2019-0018823 2019-02-18
KR1020190075228A KR20200100515A (en) 2019-02-18 2019-06-24 System and method for providing image effect regarding weather
KR10-2019-0075228 2019-06-24
KR1020190122660A KR20200100519A (en) 2019-02-18 2019-10-02 System and method for providing image effect regarding weather
KR10-2019-0122660 2019-10-02
PCT/KR2020/001627 WO2020171425A1 (en) 2019-02-18 2020-02-04 System and method for providing weather effect in image

Publications (1)

Publication Number Publication Date
CN113474822A (en)

Family

ID=72242460

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080015016.3A Pending CN113474822A (en) 2019-02-18 2020-02-04 System and method for providing weather effects in images

Country Status (2)

Country Link
KR (2) KR20200100515A (en)
CN (1) CN113474822A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7043368B1 (en) * 2002-04-08 2006-05-09 Wsi Corporation Method and system for creating visualizations of future weather conditions
US20070250591A1 (en) * 2006-04-24 2007-10-25 Microsoft Corporation Personalized information communications
US20090186604A1 (en) * 2008-01-14 2009-07-23 Lg Electronics Inc. Mobile terminal capable of providing weather information and method of controlling the mobile terminal
CN103984553A * 2014-05-26 2014-08-13 Thunder Software Technology Co., Ltd. 3D (three dimensional) desktop display method and system
US20160048995A1 (en) * 2014-08-12 2016-02-18 Xiaomi Inc. Weather displaying method and device
US20180231871A1 (en) * 2016-06-27 2018-08-16 Zhejiang Gongshang University Depth estimation method for monocular image based on multi-scale CNN and continuous CRF
CN109241465A * 2018-07-19 2019-01-18 Huawei Technologies Co., Ltd. Interface display method, device, terminal and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023116653A1 * 2021-12-21 2023-06-29 Beijing Zitiao Network Technology Co., Ltd. Element display method and apparatus, and electronic device and storage medium
CN114327718A * 2021-12-27 2022-04-12 Beijing Baidu Netcom Science and Technology Co., Ltd. Interface display method and device, equipment and medium
CN114327718B (en) * 2021-12-27 2024-07-12 Beijing Baidu Netcom Science and Technology Co., Ltd. Interface display method, device, equipment and medium

Also Published As

Publication number Publication date
KR20200100515A (en) 2020-08-26
KR20200100519A (en) 2020-08-26

Similar Documents

Publication Publication Date Title
CN112805718B (en) Object recognition method for automatic driving device and automatic driving device
ES2925453T3 (en) Image acquisition device and control procedure thereof
CN110476405B (en) Method and system for providing recommendation information related to photographing
RU2609757C2 (en) Method and device for displaying weather
CN109155073B (en) Material aware three-dimensional scanning
KR101707095B1 (en) Non-static 3d map views
US10809091B2 (en) Street-level guidance via route path
US9641755B2 (en) Reimaging based on depthmap information
KR20200023702A (en) Method of providing image to vehicle, and electronic device therefor
US20150185825A1 (en) Assigning a virtual user interface to a physical object
KR101184876B1 (en) Apparatus and method for creating character's dynamic effect related to image contents
KR20150079387A (en) Illuminating a Virtual Environment With Camera Light Data
US11682168B1 (en) Method and system for virtual area visualization
CN113474822A (en) System and method for providing weather effects in images
CN113110731B (en) Method and device for generating media content
US11302040B2 (en) System and method for providing weather effect in image
EP3891707B1 (en) System and method for providing weather effect in image
KR20230171949A (en) Digital map animation using real-world signals
US11726740B2 (en) Immersive audio tours
US20230409265A1 (en) Program, mobile terminal control method, mobile terminal, and information processing device
US20240071008A1 (en) Generating immersive augmented reality experiences from existing images and videos
CN116108118A (en) Method and terminal equipment for generating thermal map
CN117670691A (en) Image processing method and device, computing device and storage medium
CN114745515A (en) Interactive information display method and device, mobile terminal and storage medium
KR20240090122A (en) Integration of 3D scenes and media content

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination