WO2023191793A1 - Color palettes of background images - Google Patents

Color palettes of background images

Info

Publication number
WO2023191793A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
color
electronic device
user
controller
Prior art date
Application number
PCT/US2022/022868
Other languages
French (fr)
Inventor
Rafael DAL ZOTTO
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2022/022868
Publication of WO2023191793A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration
    • H04N 21/4854 End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N 7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals

Definitions

  • in response to the user 104 enabling the option 128, the electronic device 100 includes a neutral color in the color palette displayed as the option 124. In other examples, in response to the user 104 enabling the option 128, the electronic device 100 replaces the color palette displayed as the option 124 with a different color palette that includes a number of neutral colors. In various examples, in response to the user 104 enabling the option 130, the electronic device 100 includes a contrast color in the color palette displayed as the option 124. In other examples, in response to the user 104 enabling the option 130, the electronic device 100 replaces the color palette displayed as the option 124 with a different color palette that includes a number of contrast colors.
  • in response to the user 104 selecting a background type from the option 126 and selecting the button 132, the electronic device 100 stores the color palette displayed as the option 124 and the background type selected from the option 126 to a meeting profile. In some examples, the electronic device 100 prompts the user 104 to enter a label for the meeting profile. In other examples, the electronic device 100 generates the label for the meeting profile using an identifier of a color of the color palette selected from the option 124, an identifier of the color palette displayed as the option 124, an identifier of the background type selected from the option 126, or other suitable meeting profile identifier.
  • storing meeting profiles enhances the user experience by providing the electronic device 100 context to use to quickly switch background images to reduce distractions.
  • the electronic device 100 selects a background image from a set of background images utilizing a color of the color palette. For example, in response to the user 104 selecting a color of the option 124 and selecting the button 134, the electronic device 100 searches the set of background images for images that include the color. In other examples, the electronic device 100 selects the background image from a subset of the set of background images by determining the context of the image 103 in response to the user 104 selecting a background type from the option 126.
  • the electronic device 100 selects the background image from the subset of the set of background images determined using the context of the image 103 in response to the user 104 selecting the background type from the option 126 and the user 104 selecting the color of the option 124.
  • the background images are stored to the electronic device 100, to a remote storage device communicatively coupled to the electronic device 100, or a combination thereof.
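A minimal sketch of the color-driven search over the stored background set described in the bullets above, assuming each stored background carries a precomputed list of its colors and a type label ("Office", "Gaming", ...); the record layout and function name are illustrative assumptions, not part of the disclosure.

```python
def search_backgrounds(background_set, selected_color, background_type=None):
    """Filter the stored background set for images that include the selected
    palette color, optionally restricted to one background type ("Office",
    "Gaming", ...). Items are assumed to be (path, colors, type) records."""
    return [path for path, colors, btype in background_set
            if selected_color in colors
            and (background_type is None or btype == background_type)]

# e.g., search_backgrounds(stored, (40, 70, 160), background_type="Office")
```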
  • FIGS. 2A and 2B are block diagrams of an electronic device 200 for determining color palettes of background images, in accordance with various examples.
  • the electronic device 200 is the electronic device 100, for example.
  • the electronic device 200 includes a display device 202.
  • the display device 202 is the display device 102, for example.
  • the electronic device 200 includes the display device 202 displaying an image 204A, in accordance with various examples.
  • the image 204A depicts a user 206 and a background 208A.
  • the user 206 is wearing a shirt having buttons that are a first color 210, as indicated by the black circles, and a body that is a second color 212, as indicated by the black-dotted white background.
  • the electronic device 200 includes the display device 202 displaying an image 204B, in accordance with various examples.
  • the image 204B is the image 204A after the electronic device 200 has adjusted the background image, for example.
  • the image 204B depicts the user 206 and a background 208B.
  • a color of the background 208B is equivalent to the second color 212, as indicated by the black-dotted white background.
  • the electronic device 200 includes an image sensor (not explicitly shown) that captures the image 204A.
  • the electronic device 200 analyzes the image 204A to determine whether the image 204A depicts the user 206.
  • the electronic device 200 analyzes clothing of the user 206 to determine a color palette.
  • the color palette includes the first color 210 and the second color 212.
  • the electronic device 200 obscures the background 208A using the second color 212 to generate the image 204B having the background 208B.
  • by automatically adjusting the background image, the electronic device 200 enhances the experience of the user 206 and the audience because the electronic device 200 is able to quickly switch background images to reduce distractions.
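A minimal sketch of the obscuring step that produces the background 208B, assuming a boolean segmentation mask separating the user 206 from the background is available; the helper name and the example fill color are hypothetical.

```python
import numpy as np

def obscure_background(frame, user_mask, fill_color):
    """Replace every pixel outside the user's mask with a single palette
    color (the second color 212 in the example above)."""
    out = frame.copy()
    out[~user_mask] = fill_color   # user_mask: HxW boolean array
    return out

# e.g., image_204b = obscure_background(image_204a, mask, (230, 230, 210))
```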
  • FIG. 3 is a flow diagram of a method 300 for an electronic device (e.g., the electronic device 100, 200) for determining color palettes of background images, in accordance with various examples.
  • the method 300 includes receiving an image signal (302).
  • the method 300 also includes isolating frames of the image signal (304). Additionally, the method 300 includes decomposing an image of a frame (306).
  • the method 300 includes determining whether a user is depicted in the image (308). In response to a determination that the user is not depicted in the frame, the method 300 includes releasing the frame (310). In response to a determination that the user is depicted in the image, the method 300 includes determining a color palette of the image (312).
  • the method 300 also includes locating a background image based on the color palette (314). Additionally, the method 300 includes applying the background image to the image of the frame (316). The method 300 includes releasing the frame (310).
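The per-frame flow of method 300 can be summarized as a single function. This is a sketch only: decompose, depicts_user, palette_of, locate_background, and apply_background are hypothetical helpers standing in for the techniques described elsewhere in this document.

```python
def process_frame(frame, backgrounds):
    """One pass of method 300 over one frame of the image signal."""
    image = decompose(frame)                              # (306)
    if not depicts_user(image):                           # (308)
        return frame                                      # release as-is (310)
    palette = palette_of(image)                           # (312)
    background = locate_background(palette, backgrounds)  # (314)
    return apply_background(frame, background)            # (316), then release (310)
```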
  • the method 300 also includes receiving a request by an application to access an image sensor. In response to receiving the request, the method 300 begins determining a color palette of a background image by receiving the image signal (302). In some examples, the method 300 includes decomposing an image (e.g., the image 103, the image 204A) of a frame of the image signal, determining whether the image depicts the user (e.g., the user 104, 206), or a combination thereof, using the techniques described above with respect to FIG. 1. In other examples, the method 300 determines that the image depicts the user by using a machine learning technique to perform object tracking.
  • the method 300 includes using a CNN to compare subsequent frames of the image signal to distinguish objects of the background (e.g., the background 106, 208A) from the user, elements (e.g., the chair 108, the headset 110, the collar 112, the body 114, the graphic designs 116, 118, 120, the buttons having the first color 210, the body having the second color 212) associated with the user, or a combination thereof.
  • the method 300 includes determining a color palette of the image utilizing the techniques described above with respect to FIG. 1, for example.
  • the method 300 includes determining the color palette of the image using machine learning techniques.
  • a CNN is trained with training sets that include multiple images, for example. The image of the frame of the image signal is used as an input into the CNN. The CNN outputs a color palette based on the image of the frame of the image signal.
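A minimal sketch of a CNN that outputs a color palette from a frame, written with PyTorch; the architecture, layer sizes, and palette size are illustrative assumptions, since the description does not specify the network.

```python
import torch
import torch.nn as nn

class PaletteCNN(nn.Module):
    """A tiny CNN that maps a frame to `palette_size` RGB colors."""
    def __init__(self, palette_size=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, palette_size * 3)

    def forward(self, x):                   # x: (B, 3, H, W), values in [0, 1]
        z = self.features(x).flatten(1)     # (B, 32)
        colors = torch.sigmoid(self.head(z))
        return colors.view(x.size(0), -1, 3)  # (B, palette_size, 3) in [0, 1]
```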
  • the method 300 also includes locating a background image based on the color palette using a knowledge graph.
  • the knowledge graph is a data structure that includes multiple background images, color palettes associated with the multiple background images, types of clothing associated with the multiple background images, types of meetings associated with the multiple background images, times of day associated with the multiple background images, multiple meeting profiles associated with the multiple background images, or other data that enables selection of a background image based on the color palette of the image of the frame of the image signal.
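A minimal sketch of the knowledge graph as a mapping from background images to the selection attributes listed above, with a lookup that could serve as the locate_background helper in the earlier method sketch; the file names, attribute keys, and values are all assumptions.

```python
# The knowledge graph as a mapping from background images to selection
# attributes; entries here are invented for illustration.
knowledge_graph = {
    "office_blue.png": {"palette": [(40, 70, 160)],
                        "clothing": ["collared-shirt", "suit jacket"],
                        "meeting": "work", "time_of_day": "morning"},
    "cafe_warm.png":   {"palette": [(180, 120, 60)],
                        "clothing": ["t-shirt"],
                        "meeting": "casual", "time_of_day": "afternoon"},
}

def locate_background(palette, meeting=None):
    """Return the first background whose stored palette shares a color with
    the palette of the current frame (and meeting type, when given)."""
    for image, attrs in knowledge_graph.items():
        if meeting is not None and attrs["meeting"] != meeting:
            continue
        if set(attrs["palette"]) & set(palette):
            return image
    return None
```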
  • the method 300 includes determining color palettes associated with the multiple background images utilizing the techniques described above with respect to FIG. 1.
  • a color palette, a type of clothing, a type of meeting, a time of day, a meeting profile, or a combination thereof is associated with a subset of the multiple background images.
  • Different color palettes, types of clothing, types of meetings, times of day, meeting profiles, or a combination thereof, are associated with different subsets of the multiple background images, for example.
  • Types of clothing include a t-shirt, a collared shirt, a sweater, a jacket, a suit jacket, a tie, a lapel pin, a hat, eyeglasses, a headset, or other items of clothing that provide context indicating a type of meeting.
  • the method 300 includes causing a display device (e.g., the display device 102, 202) to display multiple background images that include the color palette.
  • the method 300 includes causing the display device to display a first background image that is a color that is equivalent to a first color of the color palette, a second background image that is a color that is equivalent to a second color of the color palette, a third background image that includes a third and a fourth color of the color palette, and a fourth background image that includes each color of the color palette.
  • the method 300 includes applying the background image to the image of the frame of the image signal.
  • the method 300 includes releasing the frame to the application that requested access to the image sensor.
  • in response to determining that the image depicts the user, the method 300 includes determining whether a contrast ratio of the background image is less than a contrast threshold. In response to a determination that the contrast ratio is less than the contrast threshold, the method 300 includes releasing the frame (310). In response to a determination that the contrast ratio is equivalent to or greater than the contrast threshold, the method 300 includes determining the color palette (312). By adjusting the background image when the contrast ratio is equivalent to or greater than the contrast threshold, the method 300 enhances the user and audience experience by blocking audience distraction due to a dynamic nature of the background image.
  • the method 300 includes determining that a brightness of the background image of the frame is greater than a brightness threshold. In response to the determination that the brightness of the background image of the frame is greater than the brightness threshold, the method 300 also includes determining the color palette (312). In response to the determination that the brightness of the background image of the frame is equivalent to or less than the brightness threshold, the method 300 also includes releasing the frame (310). By automatically adjusting the background image based on the brightness, the method 300 blocks audience distraction due to a glare of the original background image, for example.
  • the method 300 includes determining whether the background image includes a total number of objects that is greater than a count threshold. In response to the determination that the total number of objects is equivalent to or less than the count threshold, the method 300 includes releasing the frame (310). In response to the determination that the total number of objects is greater than the count threshold, the method 300 includes determining the color palette (312). By automatically adjusting the background image based on the total number of objects in the background image, the method 300 blocks audience distraction due to a busyness of the background image, for example.
  • the method 300 includes determining whether the background image includes movement. To determine whether the background image includes movement, the method 300 includes comparing objects in the background of a first frame to the objects in the background of a second frame, for example. In response to a determination that objects are in movement behind the user, the method 300 determines the color palette.
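The contrast, brightness, object-count, and movement checks described in the preceding bullets can be sketched as one gating function. The statistics used as proxies (standard deviation for contrast, mean intensity for brightness, frame differencing for movement) and all threshold values are assumptions, not the patent's definitions.

```python
import cv2
import numpy as np

def background_needs_replacing(prev_bg, curr_bg, objects,
                               contrast_threshold=40.0,
                               brightness_threshold=180.0,
                               count_threshold=5,
                               motion_threshold=0.02):
    """Gate a frame through the checks above: replace the background when
    it is high-contrast, glaring, busy, or moving; otherwise release the
    frame unchanged."""
    gray = cv2.cvtColor(curr_bg, cv2.COLOR_BGR2GRAY)
    if gray.std() >= contrast_threshold:      # proxy for the contrast ratio
        return True
    if gray.mean() > brightness_threshold:    # glare from a bright background
        return True
    if len(objects) > count_threshold:        # busyness of the background
        return True
    prev_gray = cv2.cvtColor(prev_bg, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, gray)       # movement between two frames
    moving_fraction = np.count_nonzero(diff > 25) / diff.size
    return moving_fraction > motion_threshold
```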
  • the electronic device 400 of FIG. 4 is the electronic device 100, 200, for example.
  • the electronic device 400 includes a controller 402, an image sensor 404, and a storage device 406.
  • the controller 402 is a microcontroller, a microcomputer, a programmable integrated circuit, a programmable gate array, or other suitable device for managing operations of the electronic device 400 or a component or multiple components of the electronic device 400.
  • the controller 402 is a central processing unit (CPU), a graphics processing unit (GPU), or an embedded security controller (EpSC).
  • the controller 402 is an embedded artificial intelligence (eAI) of the image sensor 404.
  • the image sensor 404 is any suitable device that converts an optical image into an electronic signal (e.g., an image signal).
  • the storage device 406 is a hard drive, a solid-state drive (SSD), flash memory, random access memory (RAM), or other suitable memory for storing data or machine-readable instructions of the electronic device 400.
  • the electronic device 400 includes network interfaces, video adapters, sound cards, local buses, peripheral devices (e.g., a keyboard, a mouse, a touchpad, a speaker, a microphone, a display device), or a combination thereof.
  • while the image sensor 404 is shown as an integrated image sensor of the electronic device 400, in other examples the image sensor 404 is coupled to the electronic device 400 via a wired (e.g., Universal Serial Bus (USB)) or a wireless (e.g., WI-FI®, BLUETOOTH®) connection.
  • the network interfaces enable communication over a network.
  • the network interfaces may include a wired (e.g., Ethernet, USB) or a wireless (e.g., WI-FI®, BLUETOOTH®) connection, for example.
  • the network is a local area network (LAN), a wide area network (WAN), a virtual private network (VPN), a client/server network, an Internet (e.g., cloud), or any other suitable system for sharing processing and memory resources, for example.
  • the controller 402 is coupled to the image sensor 404 and the storage device 406.
  • the storage device 406 stores machine-readable instructions 408, 410, 412, 414, 416, which, when executed by the controller 402, cause the controller 402 to perform some or all of the actions attributed herein to the controller 402.
  • the machine-readable instructions 408, 410, 412, 414, 416 when executed by the controller 402, cause the controller 402 to perform some or all of the method 300, for example.
  • the machine-readable instructions 408, 410, 412, 414, 416, when executed by the controller 402, cause the controller 402 to determine color palettes of background images.
  • the machine-readable instruction 408, when executed by the controller 402, causes the controller 402 to receive an image from the image sensor 404.
  • the machine-readable instruction 410, when executed by the controller 402, causes the controller 402 to determine a color palette associated with the user.
  • the machine-readable instruction 412, when executed by the controller 402, causes the controller 402 to select a background image based on the color palette.
  • the machine-readable instruction 414, when executed by the controller 402, causes the controller 402 to modify the image to include the background image.
  • the machine-readable instruction 416, when executed by the controller 402, causes the controller 402 to cause display, transmission, or a combination thereof, of the modified image.
  • the controller 402 determines whether the image depicts the user using the techniques described above with respect to FIGS. 1 or 3. In response to the determination that the image depicts the user, the controller 402 utilizes the techniques described above with respect to FIGS. 1 or 3 to determine the color palette associated with the user. For example, the controller 402 determines a color of clothing of the user to determine the color palette associated with the user. In various examples, the color palette associated with the user includes the color of the clothing, a neutral color, a contrast color, or a combination thereof.
  • the controller 402 selects the background image using the techniques described above with respect to FIGS. 1, 2, or 3. In other examples, the controller 402 determines whether a color of the color palette is within a color range of a color of the background image. For example, the color of the color palette is a shade of red having a first color value. The color value is indicated by a Red Green Blue (RGB) color scale, for example. The controller 402 determines whether the first color value is within a color range of a shade of red of the background image having a second color value. In some examples, different color values are associated with different color ranges. For example, shades of red are associated with a first color range, shades of blue are associated with a second color range, and shades of green are associated with a third color range.
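A minimal sketch of the color-range comparison on the RGB scale described above; the single per-channel range constant is an assumption, and per-hue ranges (a wider range for reds than for blues, say) could replace it, as the bullet allows.

```python
def within_color_range(palette_color, background_color, color_range=30):
    """True when each RGB channel of the palette color falls within
    `color_range` of the corresponding channel of the background color."""
    return all(abs(p - b) <= color_range
               for p, b in zip(palette_color, background_color))

within_color_range((200, 30, 30), (185, 45, 20))  # True: both shades of red
```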
  • the background image is stored to the storage device 406 of the electronic device 400.
  • the user (e.g., the user 104, 206) uses a GUI (e.g., the GUI 122) to store a background image, a meeting profile, or a combination thereof, to the storage device 406 of the electronic device 400.
  • the controller 402 analyzes the new background image to determine a color palette for the background image.
  • the controller 402 stores the color palette, the background image, the meeting profile, or the combination thereof, to a knowledge graph stored to the storage device 406.
  • the background image is stored to a remote storage device (not explicitly shown).
  • the controller 402 transmits a signal to receive the background image from the remote storage device, for example.
  • the electronic device 500 of FIG. 5 is the electronic device 100, 200, 400, for example.
  • the electronic device 500 includes a controller 502, an image sensor 504, and a storage device 506.
  • the controller 502 is the controller 402, for example.
  • the image sensor 504 is the image sensor 404, for example.
  • the storage device 506 is the storage device 406, for example.
  • the controller 502 is coupled to the image sensor 504 and the storage device 506.
  • the storage device 506 stores machine-readable instructions 508, 510, 512, 514, 516, 518, which, when executed by the controller 502, cause the controller 502 to perform some or all of the actions attributed herein to the controller 502.
  • the machine-readable instruction 508, when executed by the controller 502, causes the controller 502 to receive an image from the image sensor 504.
  • the machine-readable instruction 510, when executed by the controller 502, causes the controller 502 to identify a color palette of an element depicted in the image.
  • the element is clothing of a user.
  • the machine-readable instruction 512, when executed by the controller 502, causes the controller 502 to select, by using the color palette, a background image from a first subset of images associated with the first meeting profile.
  • the machine-readable instruction 514, when executed by the controller 502, causes the controller 502 to select, by using the color palette, the background image from a second subset of images associated with the second meeting profile.
  • the machine-readable instruction 516, when executed by the controller 502, causes the controller 502 to modify the image to include the background image.
  • the machine-readable instruction 518, when executed by the controller 502, causes the controller 502 to cause display, transmission, or a combination thereof, of the modified image.
  • the controller 502 identifies the color palette of an element depicted in an image (e.g., the image 103, 204A) using the techniques described above with respect to FIGS. 1 or 3.
  • the controller 502 uses the techniques described above with respect to FIGS. 1 or 3 to determine that a specified portion of the user (e.g., the user 104, 206) is disposed within a furniture item (e.g., the chair 108) and identifies the furniture item as another element that is associated with the user.
  • the controller 502 enhances a boundary associated with the user. The enhanced boundary reduces distortions to the image that are generated by a user movement.
  • the user uses a GUI (e.g., the GUI 122) to generate the first and the second meeting profiles.
  • the first meeting profile and the second meeting profile are stored to the storage device 506 of the electronic device 500.
  • the first meeting profile and the second meeting profile are stored to a remote storage device (not explicitly shown).
  • the controller 502 causes transmission of a signal via a network interface (not explicitly shown) to the remote storage device.
  • the controller 502 receives another signal via the network interface.
  • the other signal includes the first meeting profile, the second meeting profile, or a combination thereof.
  • the first subset of images associated with the first meeting profile is different than the second subset of images associated with the second meeting profile.
  • the second subset of images includes an image of the first subset of images.
  • the first meeting profile and the second meeting profile are associated with a first user.
  • a third meeting profile and a fourth meeting profile are associated with a second user.
  • the controller 502 determines that a user depicted in the image is the second user.
  • the controller 502 identifies a color palette of an element depicted in the image.
  • the controller 502 selects, by using the color palette, a background image from a third subset of images of the third meeting profile.
  • the controller 502 selects, by using the color palette, the background image from a fourth subset of images of the fourth meeting profile.
  • the controller 502 modifies the image to include the background image.
  • the controller 502 causes display, transmission, or a combination thereof, of the modified image.
  • the GUI enables the user to specify a frequency with which the controller 502 adjusts the background image.
  • the frequency is a number of minutes, a number of hours, daily, weekly, per meeting, or other suitable time interval.
  • after 15 minutes, the controller 502 adjusts the background image from a first background image determined using the color palette to a second background image determined using the color palette, for example.
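A minimal sketch of rotating among palette-matched backgrounds at the user-specified frequency; the time-bucket approach and the 15-minute default are illustrative assumptions.

```python
import time

def rotate_background(candidates, interval_seconds=15 * 60):
    """Cycle deterministically through palette-matched backgrounds,
    advancing to the next one each interval (15 minutes by default)."""
    index = int(time.time() // interval_seconds)
    return candidates[index % len(candidates)]
```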
  • FIG. 6 is a block diagram depicting an electronic device 600 for determining color palettes of background images, in accordance with various examples.
  • the electronic device 600 is the electronic device 100, 200, 400, 500, for example.
  • the electronic device 600 includes a controller 602 and a non-transitory machine-readable medium 604.
  • the controller 602 is the controller 402, 502, for example.
  • the non-transitory machine-readable medium 604 is the storage device 406, 506, for example.
  • the controller 602 is coupled to the non-transitory machine-readable medium 604.
  • the non-transitory machine-readable medium 604 stores machine-readable instructions 606, 608, 610, 612, 614, which, when executed by the controller 602, cause the controller 602 to perform some or all of the actions attributed herein to the controller 602.
  • the machine-readable instructions 606, 608, 610, 612, 614, when executed by the controller 602, cause the controller 602 to determine color palettes of background images, for example.
  • the machine-readable instruction 606, when executed by the controller 602, causes the controller 602 to receive an image (e.g., the image 103, 204A) from an image sensor (e.g., the image sensor 404, 504).
  • the element includes clothing of the user, a type of the clothing, a color of the clothing, or a combination thereof.
  • the controller 602 compares the type of clothing, the color of the clothing, or the combination thereof, to a meeting profile to select the background image.
  • the meeting profiles are stored to the non-transitory machine-readable medium 604, for example.
  • the meeting profiles are generated using the techniques described above with respect to FIG. 1, for example.
  • the controller 602 compares the color of the clothing to a color palette to select the background image, as described above with respect to FIG. 1.
  • the element is the type of clothing having the color of the clothing.
  • the controller 602 selects the background image from a first subset of images of the first meeting profile based on the color of the clothing.
  • the controller 602 selects the background image from a second subset of images of the second meeting profile based on the color of the clothing.
  • some or all of the method 300 is performed by a controller (e.g., the controller 402, 502, 602) concurrently or in different sequences, by circuitry of an electronic device (e.g., the electronic device 400, 500, 600), by execution of machine-readable instructions of the electronic device, or a combination thereof.
  • the method 300 is implemented by machine-readable instructions stored to a storage device (e.g., the storage device 406, 506, the non-transitory machine-readable medium 604, or another storage device not explicitly shown of the electronic device), circuitry (some of which is not explicitly shown) of the electronic device, or a combination thereof.
  • the controller executes the machine-readable instructions to perform some or all of the method 300, for example.
  • a user (e.g., the user 104, 206) specifies the thresholds, the ranges, or a combination thereof, used by an electronic device (e.g., the electronic device 100, 200, 400, 500, 600) to determine color palettes of background images.
  • the user specifies a frequency threshold, a contrast threshold, a brightness threshold, a count threshold, a color threshold, or a combination thereof.
  • the GUI enables the user to specify whether to select background images from an integrated storage device (e.g., the storage device 406, 506, the non-transitory machine-readable medium 604) of the electronic device or a remote storage device via a network interface.
  • a manufacturer of the electronic device specifies the thresholds.
  • the electronic device uses machine learning techniques to determine the thresholds. The machine learning techniques use selections made by the user to adjust the thresholds specified by the manufacturer, for example.
  • the separate components are integrated in a single package.
  • the storage device 406, 506, is integrated with the controller 402, 502, respectively.
  • the single package may herein be referred to as an integrated circuit (IC) or an integrated chip (IC).
  • the term “comprising” is used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to ….”
  • the term “couple” or “couples” is intended to be broad enough to encompass both direct and indirect connections. Thus, if a first device couples to a second device, that connection may be through a direct connection or through an indirect connection via other devices, components, and connections.
  • the word “or” is used in an inclusive manner. For example, “A or B” means any of the following: “A” alone, “B” alone, or both “A” and “B.”

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

In some examples, an electronic device includes an image sensor, and a controller to receive an image from the image sensor. In response to a determination that the image depicts a user, the controller determines a color palette associated with the user. The controller selects a background image based on the color palette, modifies the image to include the background image, and causes display, transmission, or a combination thereof, of the modified image.

Description

COLOR PALETTES OF BACKGROUND IMAGES
BACKGROUND
[0001] Electronic devices such as desktops, laptops, notebooks, tablets, and smartphones include executable code that enables users to communicate with users of other electronic devices during meetings. A user of an electronic device comprising the executable code that enables the users to communicate during meetings (e.g., videoconferencing application) shares audio content, video content, or a combination thereof, of the electronic device. The video content may include a real-time video of the user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Various examples are described below referring to the following figures.
[0003] FIGS. 1A and 1B are block diagrams of electronic devices for determining color palettes of background images, in accordance with various examples.
[0004] FIGS. 2A and 2B are block diagrams of electronic devices for determining color palettes of background images, in accordance with various examples.
[0005] FIG. 3 is a flow diagram of a method for an electronic device for determining color palettes of background images, in accordance with various examples.
[0006] FIG. 4 is a block diagram of an electronic device for determining color palettes of background images, in accordance with various examples.
[0007] FIG. 5 is a block diagram of an electronic device for determining color palettes of background images, in accordance with various examples.
[0008] FIG. 6 is a block diagram of an electronic device for determining color palettes of background images, in accordance with various examples.
DETAILED DESCRIPTION
[0009] As described above, electronic devices include executable code that enables users to share content with other remote users during meetings. The other remote users are referred to herein as an audience. In some instances, the content includes a real-time video of a user. For privacy or aesthetic reasons, the user may not want to share the environment in which the electronic device is disposed. The real-time video is referred to herein as an image signal. The image signal includes multiple frames that are sequential in time to each other. A frame of the image signal includes an image. A background image, as used herein, refers to a portion of the image that is disposed behind or around the user and depicts the environment in which the electronic device is disposed. The user adjusts the background image by obscuring the portion of the image that does not include the user or by replacing the background image with another background image. In some instances, the adjusted background image does not suit a context of the meeting, distracts from the meeting, and reduces effectiveness of communication. The user taking action to manually adjust the background distracts the user from the meeting and reduces the experience of the user and the audience.
[0010] To automate adjusting the background image, an electronic device includes an image sensor that captures an image signal. The electronic device analyzes an image of the image signal to determine whether the image depicts a user. In response to a determination that the image includes the user, the electronic device analyzes an element of the image to determine a color palette. The element includes clothing of the user, a type of the clothing, a color of the clothing, or a combination thereof. The color palette includes the color of the clothing, a neutral color, a contrast color, or a combination thereof. A neutral color, as used herein, is a muted shade that is not on the color wheel. The color wheel includes primary colors such as red, orange, yellow, green, blue, purple, and shades thereof. Neutral colors include brown, gray, black, white, or shades thereof. A contrast color, as used herein, is a color that complements another color. Contrast colors are located opposite the color that they complement on the color wheel. The electronic device selects a background image from a set of background images utilizing a color of the color palette.
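As an illustration of these color definitions, a contrast color can be computed by rotating a color's hue halfway around the color wheel, and a neutral color recognized by its low saturation. The following Python sketch is illustrative only; the function names, the saturation threshold, and the use of the standard colorsys module are assumptions, not part of this disclosure.

```python
import colorsys

def contrast_color(rgb):
    """Return the complementary (contrast) color: the hue directly opposite
    the input color on the color wheel."""
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    h = (h + 0.5) % 1.0                     # rotate the hue 180 degrees
    r2, g2, b2 = colorsys.hls_to_rgb(h, l, s)
    return tuple(round(c * 255) for c in (r2, g2, b2))

def is_neutral(rgb, saturation_threshold=0.15):
    """Treat low-saturation (muted) shades as neutral, i.e., off the wheel."""
    r, g, b = (c / 255.0 for c in rgb)
    _, _, s = colorsys.rgb_to_hls(r, g, b)
    return s <= saturation_threshold

print(contrast_color((200, 30, 30)))        # a red maps to a cyan-like color
```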
[0011] In some examples, the electronic device selects the background image from the set of background images by determining a context of the image. To determine the context of the image, the electronic device utilizes the type of the clothing, a time of day, or a combination thereof. In various examples, the user utilizes a graphical user interface (GUI) to generate meeting profiles that include different contexts. A meeting profile includes a type of meeting (e.g., personal, work, interview, exercise, casual, formal), a color of a color palette utilized with the type of meeting (e.g., a color worn by user, a contrast color, a neutral color), a subset of the set of background images used with the type of meeting, or a combination thereof. The electronic device determines whether the context of the image is equivalent to a meeting profile of multiple meeting profiles. In response to a determination that the context of the image is equivalent to the meeting profile of the multiple meeting profiles, the electronic device selects a background image from the subset of the set of background images associated with the meeting profile.
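A minimal sketch of the profile-matching step described above, assuming a simple record type for meeting profiles; the field names and the matching rule are illustrative, not the patent's data model.

```python
from dataclasses import dataclass

@dataclass
class MeetingProfile:
    meeting_type: str      # e.g., "work", "interview", "casual"
    palette_colors: list   # colors used with this type of meeting
    backgrounds: list      # (image_path, image_colors) pairs for this type

def select_background(profiles, context_type, palette):
    """Pick a background from the profile whose meeting type matches the
    context inferred from clothing type, time of day, or both."""
    for profile in profiles:
        if profile.meeting_type != context_type:
            continue
        # Prefer a background that shares a color with the detected palette.
        for image_path, image_colors in profile.backgrounds:
            if set(image_colors) & set(palette):
                return image_path
        # Otherwise fall back to any background of the matching profile.
        return profile.backgrounds[0][0] if profile.backgrounds else None
    return None
```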
[0012] By automatically adjusting the background image, the electronic device enhances the user and the audience experience because the electronic device is able to quickly switch background images to reduce distractions. By utilizing the context to select the background image, the electronic device enhances the user and the audience experience because the electronic device is able to quickly switch background images in response to a switching of context.
[0013] In some examples in accordance with the present description, an electronic device is shown. The electronic device includes an image sensor, and a controller to receive an image from the image sensor. In response to a determination that the image depicts a user, the controller determines a color palette associated with the user. The controller selects a background image based on the color palette, modifies the image to include the background image, and causes display, transmission, or a combination thereof, of the modified image.
[0014] In other examples in accordance with the present description, an electronic device is shown. The electronic device includes an image sensor, and a controller to receive an image from the image sensor and identify a color palette of an element depicted in the image. The element is clothing of a user. In response to a determination that a first meeting profile includes the element and a second meeting profile does not include the element, the controller selects, by using the color palette, a background image from a first subset of images of the first meeting profile. In response to a determination that the second meeting profile includes the element and the first meeting profile does not include the element, the controller selects, by using the color palette, the background image from a second subset of images of the second meeting profile. The controller modifies the image to include the background image, and causes display, transmission, or a combination thereof, of the modified image.
[0015] In yet other examples in accordance with the present description, a non- transitory machine-readable medium is shown. The term “non-transitory,” as used herein, does not encompass transitory propagating signals. The non-transitory machine-readable medium stores machine-readable instructions which, when executed by a controller of an electronic device, cause the controller to receive an image from an image sensor and identify an element depicted in the image. The element includes a clothing of the user, a type of the clothing, a color of the clothing, or a combination thereof. The machine-readable instructions, when executed by the controller, cause the controller to select a background image based on the element, modify the image to include the background image, and cause display, transmission, or a combination thereof, of the modified image.
[0016] Referring now to FIGS. 1A and 1B, block diagrams of an electronic device 100 for determining color palettes of background images are shown, in accordance with various examples. The electronic device 100 is a desktop, laptop, notebook, tablet, smartphone, or other suitable computing device able to determine color palettes of background images, for example. The electronic device 100 includes display devices 102A, 102B. The display devices 102A, 102B are referred to collectively as a display device 102. The display device 102 is any suitable device for displaying data of the electronic device 100. The display device 102 is a liquid crystal display (LCD), a light-emitting diode (LED) display, a plasma display, or a quantum dot (QD) display, for example.
[0017] Referring now to FIG. 1A, the electronic device 100 includes the display device 102A, in accordance with various examples. The display device 102A displays an image 103. The image 103 depicts a user 104 and a background 106. The user 104 is sitting in a chair 108 and is wearing a headset 110. The chair 108 is a first color, as indicated by the dotted grid lines. The headset 110 is a second color, as indicated by the black-and-white-checkered pattern. The user 104 is wearing a shirt including a collar 112, a body 114, and graphic designs 116, 118, 120. The collar 112 has a third color, as indicated by the vertical stripes. The body 114 has a fourth color, as indicated by horizontal stripes. A graphic design 116 is a triangle having a fifth color, as indicated by the black-dotted white background. A graphic design 118 is a circle having a sixth color, as indicated by the white-dotted black background. A graphic design 120 is a rectangle having a seventh color, as indicated by the diagonal bricked pattern. The background 106 includes a window and a wall. A frame of the window has an eighth color, as indicated by the diagonal stripes.
[0018] Referring now to FIG. 1B, the electronic device 100 includes the display device 102B, in accordance with various examples. The display device 102B displays a GUI 122. The GUI 122 includes options 124, 126, 128, 130 and buttons 132, 134. An option 124 enables selection of a background color from a list of background colors. An option 126 enables selection of a background type from a list of background types. The list includes “Office,” “Coffee,” “Gaming,” “Family Chat,” and an option to scroll up or down to view more background types. An option 128 enables selection of a neutral color. An option 130 enables selection of a contrast color. A button 132 enables saving of a meeting profile. A button 134 enables a search using the context of the meeting profile selected via the options 124, 126, 128, 130.
[0019] Referring again to FIGS. 1A and 1B, as described above, to automate adjusting the background 106 of the image 103, the electronic device 100 includes an image sensor (not explicitly shown) that captures the image 103. In some examples, the electronic device 100 causes the display device 102A to display the image 103 in response to a request by an application to access the image sensor. The application, as used herein, is executable code, which when executed by the electronic device, causes the electronic device to perform a task, such as videoconferencing.
[0020] The electronic device 100 analyzes the image 103 to determine whether the image 103 depicts the user 104. In various examples, the electronic device 100 detects the user 104 utilizing facial detection techniques. For example, the electronic device 100 decomposes the image 103 utilizing a pre-processing technique. Decomposing, as used herein, reduces objects of the image 103 to edge-like structures. The pre-processing techniques include grayscaling, blurring, sharpening, thresholding, resizing, cropping, or a combination thereof, for example. The electronic device 100 utilizes the facial detection technique to determine whether low intensity regions of the decomposed image include facial features. The facial features include eyebrows, eyes, a nose, lips, a hairline, a jawline, or a combination thereof, for example. In another example, the electronic device 100 utilizes a machine learning technique to detect the facial features. The machine learning technique compares the facial features to multiple templates to determine that the features indicate a face, for example. In various examples, the electronic device 100 utilizes a machine learning technique that implements a convolutional neural network (CNN) to determine whether the image 103 includes a face. The CNN is trained with a training set that includes multiple images of multiple faces, for example. The multiple images include faces having different user profiles. Utilizing the trained CNN, the electronic device 100 identifies facial features of the user 104 depicted within the image 103.
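To make the detection flow concrete, the following is a minimal sketch in Python. The description names no library or detector, so OpenCV, the specific pre-processing steps, and the Haar-cascade classifier below are illustrative assumptions standing in for “a facial detection technique.”

```python
# Illustrative sketch only: OpenCV's pre-trained Haar cascade stands in
# for the unspecified facial detection technique.
import cv2

def depicts_user(image_bgr):
    # Pre-processing: grayscaling and blurring reduce the image toward
    # the edge-like structures the detector operates on.
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)

    # A pre-trained frontal-face classifier approximates the template
    # comparison described above.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0
```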
[0021] In other examples, the electronic device 100 detects the user 104 using a CNN to perform a segmentation technique. Using the segmentation technique, the electronic device 100 divides the image 103 into pixel groupings. The electronic device 100 uses the CNN to identify tangible objects of the pixel groupings, features of the tangible objects, boundaries of the tangible objects, or a combination thereof. The electronic device 100 uses the machine learning technique to distinguish the user 104 from the background 106. To distinguish the user 104 from the background 106, the electronic device 100 uses the CNN to determine whether the image 103 includes different body parts (e.g., head, left hand, right hand, front torso, back torso, left arm, right arm), for example.
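A segmentation-based variant might look like the following sketch. The pre-trained DeepLabV3 model, the normalization constants, and the “person” class index 15 are assumptions standing in for the unspecified CNN; the input is assumed to be a PIL image or HWC uint8 array.

```python
# Illustrative sketch only: any person-segmentation CNN would do; the
# torchvision DeepLabV3 model here is an assumption, not the patent's model.
import torch
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50

model = deeplabv3_resnet50(weights="DEFAULT").eval()
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def user_mask(image_rgb):
    # Divide the image into pixel groupings and label each pixel; for this
    # model, class 15 corresponds to "person".
    with torch.no_grad():
        out = model(preprocess(image_rgb).unsqueeze(0))["out"][0]
    return out.argmax(0) == 15  # True where the user is, False for background
```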
[0022] In response to a determination that the image 103 includes the user 104, the electronic device 100 analyzes an element of the image 103 to determine a color palette. In some examples, the element is the user 104. The electronic device 100 analyzes pixels associated with the user 104 to determine a set of colors that occur with frequencies that are greater than a frequency threshold. For example, in response to the frequency threshold being equivalent to 18.5%, the electronic device 100 determines that the color palette includes a first color that occurs within 20% of the pixels associated with the user 104 and a second color that occurs within 45% of the pixels associated with the user 104. In some examples, the electronic device 100 determines frequencies of colors of multiple elements of the image 103 to determine the color palette. The electronic device 100 determines the color palette is a first subset of the colors of the multiple elements that have frequencies that are greater than the frequencies of a second subset of the colors. For example, the electronic device 100 determines that six different colors are associated with six elements associated with the user 104. The electronic device 100 determines the color palette includes the five colors of the multiple elements that have frequencies that are greater than the frequency of the sixth color of the multiple elements.

[0023] As described above, in some examples, the user 104 utilizes the GUI 122 to generate meeting profiles that include different contexts. For example, the electronic device 100 determines the color palette using the elements of the image 103 that are items of clothing of the user 104. The electronic device 100 identifies the collar 112, the body 114, and the graphic designs 116, 118, 120 as the items of clothing associated with different body parts of the user 104, for example. The electronic device 100 analyzes pixels associated with the collar 112, the body 114, and the graphic designs 116, 118, 120 to determine the color palette. The electronic device 100 causes the display device 102B to display the color palette as the option 124, for example.
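The frequency-threshold step of paragraph [0022] could be sketched as follows, assuming the user's pixels have already been isolated by a mask such as the one produced above. The color-quantization step size is an illustrative assumption so that near-identical shades count as one color.

```python
# A sketch of the frequency-threshold palette step under stated assumptions.
import numpy as np

def color_palette(image_rgb, mask, frequency_threshold=0.185, step=32):
    user_pixels = image_rgb[mask]              # N x 3 array of RGB values
    quantized = (user_pixels // step) * step   # coarse color bins
    colors, counts = np.unique(quantized, axis=0, return_counts=True)
    freq = counts / counts.sum()
    # Keep the colors whose share of the user's pixels exceeds the
    # threshold, e.g. 18.5% as in the example above.
    return colors[freq > frequency_threshold]
```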
[0024] In some examples, in response to the user 104 enabling the option 128, the electronic device 100 includes a neutral color in the color palette displayed as the option 124. In other examples, in response to the user 104 enabling the option 128, the electronic device 100 replaces the color palette displayed as the option 124 with a different color palette that includes a number of neutral colors. In various examples, in response to the user 104 enabling the option 130, the electronic device 100 includes a contrast color in the color palette displayed as the option 124. In other examples, in response to the user 104 enabling the option 130, the electronic device 100 replaces the color palette displayed as the option 124 with a different color palette that includes a number of contrast colors.
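The description does not define how a neutral or contrast color is derived; one simple, hypothetical interpretation is sketched below, where the contrast color is the RGB complement of a palette color and the neutral colors are a fixed set of whites and grays.

```python
# Hypothetical interpretation only: the patent does not specify these rules.
def contrast_color(rgb):
    # The RGB complement, one common notion of a contrasting color.
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

# Illustrative whites and grays that could populate a neutral palette.
NEUTRAL_COLORS = [(255, 255, 255), (245, 245, 245),
                  (128, 128, 128), (64, 64, 64)]
```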
[0025] In various examples, in response to the user 104 selecting a background type from the option 126 and selecting the button 132, the electronic device 100 stores the color palette displayed as the option 124 and the background type selected from the option 126 to a meeting profile. In some examples, the electronic device 100 prompts the user 104 to enter a label for the meeting profile. In other examples, the electronic device 100 generates the label for the meeting profile using an identifier of a color of the color palette selected from the option 124, an identifier of the color palette displayed as the option 124, an identifier of the background type selected from the option 126, or other suitable meeting profile identifier. By enabling the user 104 to select from the options 124, 126, 128, 130 and save the selections using the button 132, the electronic device 100 enhances the user experience by providing the electronic device 100 context to use to quickly switch background images to reduce distractions.
[0026] As described above, in some examples, the electronic device 100 selects a background image from a set of background images utilizing a color of the color palette. For example, in response to the user 104 selecting a color of the option 124 and selecting the button 134, the electronic device 100 searches the set of background images for images that include the color. In other examples, the electronic device 100 selects the background image from a subset of the set of background images by determining the context of the image 103 in response to the user 104 selecting a background type from the option 126. In some examples, the electronic device 100 selects the background image from the subset of the set of background images determined using the context of the image 103 in response to the user 104 selecting the background type from the option 126 and the user 104 selecting the color of the option 124. The background images are stored to the electronic device 100, to a remote storage device communicatively coupled to the electronic device 100, or a combination thereof.
[0027] Referring now to FIGS. 2A and 2B, block diagrams of an electronic device 200 for determining color palettes of background images are shown, in accordance with various examples. The electronic device 200 is the electronic device 100, for example. The electronic device 200 includes a display device 202. The display device 202 is the display device 102, for example.
[0028] Referring now to FIG. 2A, the electronic device 200 includes the display device 202 displaying an image 204A, in accordance with various examples. The image 204A depicts a user 206 and a background 208A. The user 206 is wearing a shirt having buttons that are a first color 210, as indicated by the black circles, and a body that is a second color 212, as indicated by the black-dotted white background.
[0029] Referring now to FIG. 2B, the electronic device 200 includes the display device 202 displaying an image 204B, in accordance with various examples. The image 204B is the image 204A after the electronic device 200 has adjusted the background image, for example. The image 204B depicts the user 206 and a background 208B. A color of the background 208B is equivalent to the second color 212, as indicated by the black-dotted white background.
[0030] Referring again to FIGS. 2A and 2B, as described above, to automate adjusting the background 208A of the image 204A, the electronic device 200 includes an image sensor (not explicitly shown) that captures the image 204A. The electronic device 200 analyzes the image 204A to determine whether the image 204A depicts the user 206. In response to a determination that the image 204A includes the user 206, the electronic device 200 analyzes clothing of the user 206 to determine a color palette. The color palette includes the first color 210 and the second color 212. The electronic device 200 obscures the background 208A using the second color 212 to generate the image 204B having the background 208B. By automatically adjusting the background 208A, 208B, the electronic device 200 enhances the user 206 and the audience experience because the electronic device 200 is able to quickly switch background images to reduce distractions.
[0031] FIG. 3 is a flow diagram of a method 300 for an electronic device (e.g., the electronic device 100, 200) to determine color palettes of background images, in accordance with various examples. The method 300 includes receiving an image signal (302). The method 300 also includes isolating frames of the image signal (304). Additionally, the method 300 includes decomposing an image of a frame (306). The method 300 includes determining whether a user is depicted in the image (308). In response to a determination that the user is not depicted in the image, the method 300 includes releasing the frame (310). In response to a determination that the user is depicted in the image, the method 300 includes determining a color palette of the image (312). The method 300 also includes locating a background image based on the color palette (314). Additionally, the method 300 includes applying the background image to the image of the frame (316). The method 300 includes releasing the frame (310).
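A sketch of this control flow follows; the helper names (isolate_frames, decompose, depicts_user, determine_color_palette, locate_background, apply_background, release_frame) are hypothetical stand-ins for the numbered blocks of FIG. 3.

```python
# Hypothetical helpers stand in for the blocks of FIG. 3.
def process_image_signal(image_signal):                   # block 302
    for frame in isolate_frames(image_signal):            # block 304
        image = decompose(frame)                          # block 306
        if not depicts_user(image):                       # block 308
            release_frame(frame)                          # block 310
            continue
        palette = determine_color_palette(frame)          # block 312
        background = locate_background(palette)           # block 314
        frame = apply_background(frame, background)       # block 316
        release_frame(frame)                              # block 310
```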
[0032] In various examples, the method 300 also includes receiving a request by an application to access an image sensor. In response to receiving the request, the method 300 begins determining a color palette of a background image by receiving the image signal (302). In some examples, the method 300 includes decomposing an image (e.g., the image 103, the image 204A) of a frame of the image signal, determining whether the image depicts the user (e.g., the user 104, 206), or a combination thereof, using the techniques described above with respect to FIG. 1. In other examples, the method 300 determines that the image depicts the user by using a machine learning technique to perform object tracking. For example, the method 300 includes using a CNN to compare subsequent frames of the image signal to distinguish objects of the background (e.g., the background 106, 208A) from the user, elements (e.g., the chair 108, the headset 110, the collar 112, the body 114, the graphic designs 116, 118, 120, the buttons having the first color 210, the body having the second color 212) associated with the user, or a combination thereof.
[0033] In response to a determination that the user is depicted in the image, the method 300 includes determining a color palette of the image utilizing the techniques described above with respect to FIG. 1, for example. In another example, the method 300 includes determining the color palette of the image using machine learning techniques. A CNN is trained with training sets that include multiple images, for example. The image of the frame of the image signal is used as an input into the CNN. The CNN outputs a color palette based on the image of the frame of the image signal.
[0034] In various examples, the method 300 also includes locating a background image based on the color palette using a knowledge graph. The knowledge graph, as used herein, is a data structure that includes multiple background images, color palettes associated with the multiple background images, types of clothing associated with the multiple background images, types of meetings associated with the multiple background images, times of day associated with the multiple background images, multiple meeting profiles associated with the multiple background images, or other data that enables selection of a background image based on the color palette of the image of the frame of the image signal. In some examples, the method 300 includes determining color palettes associated with the multiple background images utilizing the techniques described above with respect to FIG. 1. In some examples, a color palette, a type of clothing, a type of meeting, a time of day, a meeting profile, or a combination thereof, is associated with a subset of the multiple background images. Different color palettes, types of clothing, types of meetings, times of day, meeting profiles, or a combination thereof, are associated with different subsets of the multiple background images, for example. Type of clothing, as used herein, includes a t-shirt, a collared-shirt, a sweater, a jacket, a suit jacket, a tie, a lapel pin, a hat, eyeglasses, a headset, or other items of clothing that provide context that indicates a type of meeting.
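One way such a knowledge graph might be represented is sketched below; the entry fields and the lookup order are assumptions mirroring the attributes listed above.

```python
# A minimal sketch of the knowledge graph as a lookup structure.
from dataclasses import dataclass, field

@dataclass
class BackgroundEntry:
    path: str
    palette: set                      # dominant colors, as (r, g, b) tuples
    clothing_types: set = field(default_factory=set)
    meeting_types: set = field(default_factory=set)

def locate_background(entries, palette, clothing_type=None, meeting_type=None):
    # Narrow to the subset associated with the current context ...
    candidates = [e for e in entries
                  if (clothing_type is None or clothing_type in e.clothing_types)
                  and (meeting_type is None or meeting_type in e.meeting_types)]
    # ... then prefer an entry whose palette shares a color with the user's.
    for entry in candidates:
        if entry.palette & {tuple(c) for c in palette}:
            return entry
    return candidates[0] if candidates else None
```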
[0035] In some examples, the method 300 includes causing a display device (e.g., the display device 102, 202) to display multiple background images that include the color palette. For example, the method 300 includes causing the display device to display a first background image that is a color that is equivalent to a first color of the color palette, a second background image that is a color that is equivalent to a second color of the color palette, a third background image that includes a third and a fourth color of the color palette, and a fourth background image that includes each color of the color palette. In response to a selection of a background image of the multiple background images, the method 300 includes applying the background image to the image of the frame of the image signal. In various examples, the method 300 includes releasing the frame to the application that requested access to the image sensor.
[0036] In various examples, in response to determining that the image depicts the user, the method 300 includes determining whether a contrast ratio of the background image is less than a contrast threshold. In response to a determination that the contrast ratio is less than the contrast threshold, the method 300 includes releasing the frame. In response to a determination that the contrast ratio is equivalent to or greater than the contrast threshold, the method 300 includes determining the color palette (312). By adjusting the background image when the contrast ratio is equivalent to or greater than the contrast threshold, the method 300 enhances the user and audience experience by blocking audience distraction due to a dynamic nature of the background image.
[0037] In some examples, the method 300 includes determining that a brightness of the background image of the frame is greater than a brightness threshold. In response to the determination that the brightness of the background image of the frame is greater than the brightness threshold, the method 300 also includes determining the color palette (312). In response to the determination that the brightness of the background image of the frame is equivalent to or less than the brightness threshold, the method 300 also includes releasing the frame (310). By automatically adjusting the background image based on the brightness, the method 300 blocks audience distraction due to a glare of the original background image, for example.
[0038] In other examples, the method 300 includes determining whether the background image includes a total number of objects that is greater than a count threshold. In response to the determination that the total number of objects is equivalent to or less than the count threshold, the method 300 includes releasing the frame (310). In response to the determination that the total number of objects is greater than the count threshold, the method 300 includes determining the color palette (312). By automatically adjusting the background image based on the total number of objects in the background image, the method 300 blocks audience distraction due to a busyness of the background image, for example.
[0039] In some examples, the method 300 includes determining whether the background image includes movement. To determine whether the background image includes movement, the method 300 includes comparing objects in the background of a first frame to the objects in the background of a second frame, for example. In response to a determination that objects are in movement behind the user, the method 300 determines the color palette.
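The replacement triggers of paragraphs [0036] through [0039] might be combined as in the following sketch. The threshold values, the standard-deviation proxy for contrast, and the contour-count proxy for an object count are illustrative assumptions; the inputs are assumed to be the background regions of two consecutive frames with the user masked out.

```python
# A sketch of the distraction triggers under stated assumptions.
import cv2

def should_replace_background(prev_bg, curr_bg, contrast_threshold=60.0,
                              brightness_threshold=200.0,
                              count_threshold=10, motion_threshold=5.0):
    gray = cv2.cvtColor(curr_bg, cv2.COLOR_BGR2GRAY)

    # [0036]: replace a high-contrast (visually dynamic) background;
    # the standard deviation is an RMS-contrast proxy for the contrast ratio.
    if gray.std() >= contrast_threshold:
        return True

    # [0037]: replace a glaring background (mean intensity, 0-255 scale).
    if gray.mean() > brightness_threshold:
        return True

    # [0038]: replace a busy background; contours approximate an object count.
    edges = cv2.Canny(gray, 100, 200)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if len(contours) > count_threshold:
        return True

    # [0039]: replace a background with movement, via frame differencing
    # between the background regions of two consecutive frames.
    prev_gray = cv2.cvtColor(prev_bg, cv2.COLOR_BGR2GRAY)
    return cv2.absdiff(gray, prev_gray).mean() > motion_threshold
```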
[0040] Referring now to FIG. 4, a block diagram of an electronic device 400 for determining color palettes of background images is shown, in accordance with various examples. The electronic device 400 is the electronic device 100, 200, for example. The electronic device 400 includes a controller 402, an image sensor 404, and a storage device 406. The controller 402 is a microcontroller, a microcomputer, a programmable integrated circuit, a programmable gate array, or other suitable device for managing operations of the electronic device 400 or a component or multiple components of the electronic device 400. For example, the controller 402 is a central processing unit (CPU), a graphics processing unit (GPU), or an embedded security controller (EpSC). In another example, the controller 402 is an embedded artificial intelligence (eAI) of the image sensor 404. The image sensor 404 is any suitable device that converts an optical image into an electronic signal (e.g., an image signal). The storage device 406 is a hard drive, a solid-state drive (SSD), flash memory, random access memory (RAM), or other suitable memory for storing data or machine-readable instructions of the electronic device 400.
[0041] While not explicitly shown, in some examples, the electronic device 400 includes network interfaces, video adapters, sound cards, local buses, peripheral devices (e.g., a keyboard, a mouse, a touchpad, a speaker, a microphone, a display device), or a combination thereof. While the image sensor 404 is shown as an integrated image sensor of the electronic device 400, in other examples, the image sensor 404 is coupled to the electronic device 400 via a wired (e.g., Universal Serial Bus (USB)) or a wireless (e.g., WI-FI®, BLUETOOTH®) connection. The network interfaces enable communication over a network. The network interfaces may include a wired (e.g., Ethernet, USB) or a wireless (e.g., WI-FI®, BLUETOOTH®) connection, for example. The network is a local area network (LAN), a wide area network (WAN), a virtual private network (VPN), a client/server network, an Internet (e.g., cloud), or any other suitable system for sharing processing and memory resources, for example. In various examples, the controller 402 is coupled to the image sensor 404 and the storage device 406.
[0042] In some examples, the storage device 406 stores machine-readable instructions 408, 410, 412, 414, 416, which, when executed by the controller 402, cause the controller 402 to perform some or all of the actions attributed herein to the controller 402. The machine-readable instructions 408, 410, 412, 414, 416, when executed by the controller 402, cause the controller 402 to perform some or all of the method 300, for example.

[0043] In various examples, the machine-readable instructions 408, 410, 412, 414, 416, when executed by the controller 402, cause the controller 402 to determine color palettes of background images. The machine-readable instruction 408, when executed by the controller 402, causes the controller 402 to receive an image from the image sensor 404. In response to a determination that the image depicts a user (e.g., the user 104, 206), the machine-readable instruction 410, when executed by the controller 402, causes the controller 402 to determine a color palette associated with the user. The machine-readable instruction 412, when executed by the controller 402, causes the controller 402 to select a background image based on the color palette. The machine-readable instruction 414, when executed by the controller 402, causes the controller 402 to modify the image to include the background image. The machine-readable instruction 416, when executed by the controller 402, causes the controller 402 to cause display, transmission, or a combination thereof, of the modified image.
[0044] In some examples, the controller 402 determines whether the image depicts the user using the techniques described above with respect to FIGS. 1 or 3. In response to the determination that the image depicts the user, the controller 402 utilizes the techniques described above with respect to FIGS. 1 or 3 to determine the color palette associated with the user. For example, the controller 402 determines a color of clothing of the user to determine the color palette associated with the user. In various examples, the color palette associated with the user includes the color of the clothing, a neutral color, a contrast color, or a combination thereof.
[0045] In various examples, the controller 402 selects the background image using the techniques described above with respect to FIGS. 1, 2, or 3. In other examples, the controller 402 determines whether a color of the color palette is within a color range of a color of the background image. For example, the color of the color palette is a shade of red having a first color value. The color value is indicated by a Red Green Blue (RGB) color scale, for example. The controller 402 determines whether the first color value is within a color range of a shade of red of the background image having a second color value. In some examples, different color values are associated with different color ranges. For example, shades of red are associated with a first color range, shades of blue are associated with a second color range, and shades of green are associated with a third color range.
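The per-shade color-range test could be sketched as follows; the range widths and the keying of the range on the color's dominant channel are assumptions, since the description does not fix either.

```python
# A sketch of the color-range comparison under stated assumptions.
RANGES = {"red": 40, "green": 30, "blue": 35}  # hypothetical per-shade ranges

def dominant_channel(rgb):
    # Which of R, G, B carries the largest value, e.g. red for a shade of red.
    return ("red", "green", "blue")[max(range(3), key=lambda i: rgb[i])]

def within_color_range(palette_color, background_color):
    # A palette color matches a background color when every RGB channel
    # differs by no more than the range associated with the palette color.
    limit = RANGES[dominant_channel(palette_color)]
    return all(abs(a - b) <= limit
               for a, b in zip(palette_color, background_color))
```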
[0046] In some examples, the background image is stored to the storage device 406 of the electronic device 400. In some examples, the user (e.g., the user 104, 206) uses a GUI (e.g., the GUI 122) to store a background image, a meeting profile, or a combination thereof, to the storage device 406 of the electronic device 400. The controller 402 analyzes the new background image to determine a color palette for the background image. In various examples, the controller 402 stores the color palette, the background image, the meeting profile, or the combination thereof, to a knowledge graph stored to the storage device 406. In other examples, the background image is stored to a remote storage device (not explicitly shown). The controller 402 transmits a signal to receive the background image from the remote storage device, for example.
[0047] Referring now to FIG. 5, a block diagram of an electronic device 500 for determining color palettes of background images is shown, in accordance with various examples. The electronic device 500 is the electronic device 100, 200, 400, for example. The electronic device 500 includes a controller 502, an image sensor 504, and a storage device 506. The controller 502 is the controller 402, for example. The image sensor 504 is the image sensor 404, for example. The storage device 506 is the storage device 406, for example. In various examples, the controller 502 is coupled to the image sensor 504 and the storage device 506.
[0048] In some examples, the storage device 506 stores machine-readable instructions 508, 510, 512, 514, 516, 518, which, when executed by the controller 502, cause the controller 502 to perform some or all of the actions attributed herein to the controller 502. The machine-readable instructions 508, 510, 512, 514, 516, 518, when executed by the controller 502, cause the controller 502 to determine color palettes of background images, for example. The machine-readable instruction 508, when executed by the controller 502, causes the controller 502 to receive an image from the image sensor 504. The machine-readable instruction 510, when executed by the controller 502, causes the controller 502 to identify a color palette of an element depicted in the image. The element is a clothing of a user. In response to a determination that a first meeting profile includes the element and a second meeting profile does not include the element, the machine-readable instruction 512, when executed by the controller 502, causes the controller 502 to select, by using the color palette, a background image from a first subset of images associated with the first meeting profile. In response to a determination that the second meeting profile includes the element and the first meeting profile does not include the element, the machine-readable instruction 514, when executed by the controller 502, causes the controller 502 to select, by using the color palette, the background image from a second subset of images associated with the second meeting profile. The machine-readable instruction 516, when executed by the controller 502, causes the controller 502 to modify the image to include the background image. The machine-readable instruction 518, when executed by the controller 502, causes the controller 502 to cause display, transmission, or a combination thereof, of the modified image.
[0049] In various examples, the controller 502 identifies the color palette of an element depicted in an image (e.g., the image 103, 204A) using the techniques described above with respect to FIGS. 1 or 3. In some examples, the controller 502 uses the techniques described above with respect to FIGS. 1 or 3 to determine that a specified portion of the user (e.g., the user 104, 206) is disposed within a furniture item (e.g., the chair 108) and identifies the furniture item as another element that is associated with the user. By associating the furniture item with the user, the controller 502 enhances a boundary associated with the user. The enhanced boundary reduces distortions to the image that are generated by a user movement.
[0050] As described above with respect to FIG. 1, the user uses a GUI (e.g., the GUI 122) to generate the first and the second meeting profiles. In some examples, the first meeting profile and the second meeting profile are stored to the storage device 506 of the electronic device 500. In other examples, the first meeting profile and the second meeting profile are stored to a remote storage device (not explicitly shown). The controller 502 causes transmission of a signal via a network interface (not explicitly shown) to the remote storage device. In response to the signal, the controller 502 receives another signal via the network interface. The other signal includes the first meeting profile, the second meeting profile, or a combination thereof. In some examples, the first subset of images associated with the first meeting profile is different than the second subset of images associated with the second meeting profile. In other examples, the second subset of images includes an image of the first subset of images.
[0051] In various examples, the first meeting profile and the second meeting profile are associated with a first user. A third meeting profile and a fourth meeting profile are associated with a second user. The controller 502 determines that a user depicted in the image is the second user. The controller 502 identifies a color palette of an element depicted in the image. In response to a determination that the third meeting profile includes the element and the fourth meeting profile does not include the element, the controller 502 selects, by using the color palette, a background image from a third subset of images of the third meeting profile. In response to a determination that the fourth meeting profile includes the element and the third meeting profile does not include the element, the controller 502 selects, by using the color palette, the background image from a fourth subset of images of the fourth meeting profile. The controller 502 modifies the image to include the background image. The controller 502 causes display, transmission, or a combination thereof, of the modified image.
[0052] In some examples, the GUI enables the user to specify a frequency with which the controller 502 adjusts the background image. The frequency is a number of minutes, a number of hours, daily, weekly, per meeting, or other suitable time interval. For example, during a meeting, the controller 502 adjusts the background image from a first background image determined using the color palette to a second background image determined using the color palette after 15 minutes.
[0053] Referring now to FIG. 6, a block diagram depicting an electronic device 600 for determining color palettes of background images is shown, in accordance with various examples. The electronic device 600 is the electronic device 100, 200, 400, 500, for example. The electronic device 600 includes a controller 602 and a non-transitory machine-readable medium 604. The controller 602 is the controller 402, 502, for example. The non-transitory machine-readable medium 604 is the storage device 406, 506, for example. In various examples, the controller 602 is coupled to the non-transitory machine-readable medium 604.

[0054] In some examples, the non-transitory machine-readable medium 604 stores machine-readable instructions 606, 608, 610, 612, 614, which, when executed by the controller 602, cause the controller 602 to perform some or all of the actions attributed herein to the controller 602. The machine-readable instructions 606, 608, 610, 612, 614, when executed by the controller 602, cause the controller 602 to determine color palettes of background images, for example.

[0055] In various examples, the machine-readable instruction 606, when executed by the controller 602, causes the controller 602 to receive an image (e.g., the image 103, 204A) from an image sensor (e.g., the image sensor 404, 504). The machine-readable instruction 608, when executed by the controller 602, causes the controller 602 to identify an element depicted in the image. The element includes clothing of the user, a type of the clothing, a color of the clothing, or a combination thereof. The machine-readable instruction 610, when executed by the controller 602, causes the controller 602 to select a background image based on the element. The machine-readable instruction 612, when executed by the controller 602, causes the controller 602 to modify the image to include the background image. The machine-readable instruction 614, when executed by the controller 602, causes the controller 602 to cause display, transmission, or a combination thereof, of the modified image.
[0056] In some examples, the controller 602 compares the type of clothing, the color of the clothing, or the combination thereof, to a meeting profile to select the background image. The meeting profiles are stored to the non-transitory machine-readable medium 604, for example. In various examples, the meeting profiles are generated using the techniques described above with respect to FIG. 1, for example. In some examples, the controller 602 compares the color of the clothing to a color palette to select the background image, as described above with respect to FIG. 1. In various examples, the element is the type of clothing having the color of the clothing. In response to a determination that the type of clothing indicates a first meeting profile, the controller 602 selects the background image from a first subset of images of the first meeting profile based on the color of the clothing. In response to a determination that the type of clothing indicates a second meeting profile, the controller 602 selects the background image from a second subset of images of the second meeting profile based on the color of the clothing.
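A sketch of this clothing-driven selection follows; the profile contents are hypothetical examples, and matches_color is a hypothetical helper (e.g., the within_color_range() test sketched earlier applied to an image's dominant colors).

```python
MEETING_PROFILES = {  # hypothetical profile contents
    "office": {"clothing_types": {"suit jacket", "collared-shirt", "tie"},
               "images": ["office_plain.png", "office_bookshelf.png"]},
    "gaming": {"clothing_types": {"t-shirt", "headset"},
               "images": ["gaming_neon.png", "gaming_dark.png"]},
}

def select_background(clothing_type, clothing_color):
    for profile in MEETING_PROFILES.values():
        if clothing_type not in profile["clothing_types"]:
            continue
        # Within the indicated profile, prefer the image that matches the
        # clothing color; matches_color is a hypothetical helper.
        for image in profile["images"]:
            if matches_color(image, clothing_color):
                return image
        return profile["images"][0]
    return None
```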
[0057] Unless infeasible, some or all of the method 300 is performed by a controller (e.g., the controller 402, 502, 602) concurrently or in different sequences, and by circuitry of an electronic device (e.g., the electronic device 400, 500, 600), execution of machine-readable instructions of the electronic device, or a combination thereof. For example, the method 300 is implemented by machine-readable instructions stored to a storage device (e.g., the storage device 406, 506, the non-transitory machine-readable medium 604, or another storage device not explicitly shown of the electronic device), circuitry (some of which is not explicitly shown) of the electronic device, or a combination thereof. The controller executes the machine-readable instructions to perform some or all of the method 300, for example.
[0058] In some examples, utilizing a GUI (e.g., the GUI 122), a user (e.g., the user 104, the user 206) specifies the thresholds, the ranges, or a combination thereof, used by an electronic device (e.g., the electronic device 100, 200, 400, 500, 600) to determine color palettes of background images. For example, the user specifies a frequency threshold, a contrast threshold, a brightness threshold, a count threshold, a color threshold, or a combination thereof. In various examples, the GUI enables the user to specify whether to select background images from an integrated storage device (e.g., the storage device 406, 506, the non-transitory machine-readable medium 604) of the electronic device or a remote storage device via a network interface. In other examples, a manufacturer of the electronic device specifies the thresholds. In various examples, the electronic device uses machine learning techniques to determine the thresholds. The machine learning techniques use selections made by the user to adjust the thresholds specified by the manufacturer, for example.
[0059] While some components are shown as separate components of the electronic device 400, 500, 600, in other examples, the separate components are integrated in a single package. For example, the storage device 406, 506, is integrated with the controller 402, 502, respectively. The single package may herein be referred to as an integrated circuit (IC) or an integrated chip.

[0060] The above description is meant to be illustrative of the principles and various examples of the present description. Numerous variations and modifications become apparent to those skilled in the art once the above description is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
[0061] In the figures, certain features and components disclosed herein are shown in exaggerated scale or in somewhat schematic form, and some details of certain elements are not shown in the interest of clarity and conciseness. In some of the figures, in order to improve clarity and conciseness, a component or an aspect of a component is omitted.
[0062] In the above description and in the claims, the term “comprising” is used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . .” Also, the term “couple” or “couples” is intended to be broad enough to encompass both direct and indirect connections. Thus, if a first device couples to a second device, that connection may be through a direct connection or through an indirect connection via other devices, components, and connections. Additionally, the word “or” is used in an inclusive manner. For example, “A or B” means any of the following: “A” alone, “B” alone, or both “A” and “B.”

CLAIMS

What is claimed is:
1. An electronic device, comprising:
    an image sensor; and
    a controller to:
        receive an image from the image sensor;
        in response to a determination that the image depicts a user, determine a color palette associated with the user;
        select a background image based on the color palette;
        modify the image to include the background image; and
        cause display, transmission, or a combination thereof, of the modified image.
2. The electronic device of claim 1, wherein the color palette associated with the user includes a neutral color, a contrast color, or a combination thereof.

3. The electronic device of claim 1, wherein the background image is stored to a storage device of the electronic device.

4. The electronic device of claim 1, wherein the background image is stored to a remote storage device, and wherein the controller is to transmit a signal to receive the background image from the remote storage device.

5. The electronic device of claim 1, wherein the controller is to determine a color of clothing of the user to determine the color palette associated with the user.
6. An electronic device, comprising:
    an image sensor; and
    a controller to:
        receive an image from the image sensor;
        identify a color palette of an element depicted in the image, the element being clothing of a user;
        in response to a determination that a first meeting profile includes the element and a second meeting profile does not include the element, select, by using the color palette, a background image from a first subset of images of the first meeting profile;
        in response to a determination that the second meeting profile includes the element and the first meeting profile does not include the element, select, by using the color palette, the background image from a second subset of images of the second meeting profile;
        modify the image to include the background image; and
        cause display, transmission, or a combination thereof, of the modified image.
7. The electronic device of claim 6, wherein the first meeting profile and the second meeting profile are stored to a storage device of the electronic device.
8. The electronic device of claim 6, wherein the first subset of images is different than the second subset of images.
9. The electronic device of claim 6, wherein the second subset of images includes an image of the first subset of images.
10. The electronic device of claim 6, wherein the first meeting profile and the second meeting profile are associated with a first user, wherein a third meeting profile and a fourth meeting profile are associated with a second user, and wherein the controller is to:
    determine that a user depicted in the image is the second user;
    in response to a determination that the third meeting profile includes the element and the fourth meeting profile does not include the element, select, by using the color palette, the background image from a third subset of images of the third meeting profile;
    in response to a determination that the fourth meeting profile includes the element and the third meeting profile does not include the element, select, by using the color palette, the background image from a fourth subset of images of the fourth meeting profile;
    modify the image to include the background image; and
    cause the display, the transmission, or the combination thereof, of the modified image.
11. A non-transitory machine-readable medium storing machine-readable instructions which, when executed by a controller of an electronic device, cause the controller to:
    receive an image from an image sensor;
    identify an element depicted in the image, the element including clothing of a user, a type of the clothing, a color of the clothing, or a combination thereof;
    select a background image based on the element;
    modify the image to include the background image; and
    cause display, transmission, or a combination thereof, of the modified image.
12. The non-transitory machine-readable medium of claim 11, wherein the controller is to compare the type of clothing, the color of the clothing, or the combination thereof, to a meeting profile to select the background image.

13. The non-transitory machine-readable medium of claim 11, wherein the controller is to compare the color of the clothing to a color palette to select the background image.
14. The non-transitory machine-readable medium of claim 13, wherein the color palette includes a neutral color, a contrast color, or a combination thereof.
15. The non-transitory machine-readable medium of claim 11, wherein the element is the type of clothing having the color of the clothing, and wherein the controller is to:
    in response to a determination that the type of the clothing indicates a first meeting profile, select the background image from a first subset of images of the first meeting profile based on the color of the clothing; and
    in response to a determination that the type of clothing indicates a second meeting profile, select the background image from a second subset of images of the second meeting profile based on the color of the clothing.