WO2015031111A1 - Systems, devices and methods for displaying pictures in a picture - Google Patents
- Publication number
- WO2015031111A1 (PCT application PCT/US2014/051732, US2014051732W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pip
- display
- video
- screen
- displaying
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
- H04N5/45—Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440263—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2352/00—Parallel handling of streams of display data
Definitions
- This disclosure relates generally to display systems and methods.
- features for capturing, displaying and/or recording two or more pictures in a picture (PIPs) with tracking capability on a display are disclosed.
- Some of the desired features in display devices include displaying a picture in a picture (PIP) on a display of the device.
- the ability to display a PIP allows a viewer to watch video on the main screen as well as other videos in the PIP. For example, one may watch a live football game as it is being played on the main screen while also watching video highlights of another game, or other information, in a smaller picture window.
- a single PIP does not allow for watching more than one other video at the same time as watching the main video.
- a viewer, while watching a football game on the main video screen, may want to watch more than one other football game in smaller picture windows.
- Conventional systems that display only one PIP therefore compromise the enjoyment and utility of the display.
- the method comprises displaying a video on a screen of the device; selecting a first object in the video; displaying a first PIP on the screen, wherein the first PIP comprises the first object; and displaying a second PIP on the screen, wherein the second PIP comprises a second object.
- the system comprises a selection module configured to recognize selection of a first object to track in a first video; and a screen configured to display the first video, a first PIP comprising the first object as it is being tracked, and a second PIP comprising a second object.
- embodiments of a system for displaying a plurality of PIPs on an electronic device comprise means for displaying a video on a screen of the device; means for selecting a first object in the video; means for tracking the first object; means for displaying a first PIP on the screen, wherein the first PIP displays the first object as it is being tracked; and means for displaying a second PIP on the screen, wherein the second PIP displays a second object.
- a non-transient computer readable medium configured to store instructions that when executed by a processor perform a method for displaying a plurality of PIPs on an electronic device.
- the method comprises displaying a video on a screen of the device; selecting a first object to track in the video; displaying a first PIP on the screen, wherein the first PIP displays the first object as it is being tracked; and displaying a second PIP on the screen, wherein the second PIP displays a second object.
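- as an illustrative sketch only (not part of the disclosure; the names PipWindow and display_plural_pips are hypothetical), the claimed arrangement of windows could be modeled as follows:

```python
from dataclasses import dataclass
from typing import List, Tuple

Box = Tuple[int, int, int, int]  # (x, y, width, height) in screen pixels

@dataclass
class PipWindow:
    position: Box        # where the PIP is drawn on the display
    source_region: Box   # region of the source video shown inside the PIP
    tracked: bool        # True if source_region follows a tracked object

def display_plural_pips(first_object_box: Box, second_object_box: Box) -> List[PipWindow]:
    """Model of the claimed method: the first PIP displays the selected first
    object as it is being tracked; the second PIP displays a second object."""
    first_pip = PipWindow(position=(20, 20, 320, 180),
                          source_region=first_object_box, tracked=True)
    second_pip = PipWindow(position=(20, 220, 320, 180),
                           source_region=second_object_box, tracked=False)
    return [first_pip, second_pip]
```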
- Figure 1 depicts a front view of an embodiment of a display having two pictures in a picture (PIPs) with one of the PIPs showing a tracked object on the display.
- Figure 3 depicts a front view of the display of Figure 1 with three PIPs, where two of the PIPs are displaying tracked objects.
- Figure 7 is a flow chart diagramming one embodiment of a method for displaying and recording two PIPs on a display, where the position of the first PIP moves if it is interfering with the tracked object it is displaying.
- Figure 8 is a flow chart diagramming one embodiment of a method for displaying and recording four PIPs on a display, where the position of the PIP moves if it is interfering with the respective tracked object it is displaying.
- one of the PIPs may display a view taken from a second imaging sensor on the display device.
- the objects in the scene captured from the second image sensor may be shown in their own PIP window on the same screen as the objects viewed by the first image sensor.
- the second image sensor may be viewing an entirely different field of view from the first image sensor.
- the first image sensor may be on the front of a cell phone, and the second image sensor may be on the back of the cell phone.
- multiple PIPs allow for objects from the first field of view to be displayed on the screen along with objects from the second field of view.
- the second imaging sensor may be on an entirely different device. For instance, a separate video source may provide the video to display in one or all of the PIPs.
- the first object 112 is in the same field of view as that shown by the display 110.
- one or both displayed objects 120, 124 being shown in the PIPs 116, 118 are in the same field of view as each other.
- the second object is in a different field of view from that shown in the main video 105 of the display 110.
- the second object is a user of the device 100, and the user's face is displayed as the second displayed object 120 in the second PIP 116.
- the two PIPs 116, 118 may be resized.
- the second PIP 116 may be expanded to the size shown by resized PIP 122.
- the second PIP 116 may thus have larger dimensions in four directions.
- the second PIP 116 may also have smaller dimensions, as well as different shapes, line weightings, etc.
- the boundary of the second PIP 116 may be dashed as shown by the boundary of the resized PIP 122.
- the location or locations on the display 110 to which the first PIP 118 moves when it interferes with the first object 112 may be controlled. For instance, when the first PIP 118 interferes with the first object 112, the first PIP 118 may move from its original location to a second location that is not interfering with other objects or specified areas on the display 110. In some embodiments, when the first PIP 118 interferes with the first object 112, the first PIP 118 may move to a location where the first PIP 118 will not interfere with the first object 112 or with the second PIP 116.
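- one possible realization of this interference-avoidance behavior, sketched here as an assumption rather than the patented implementation, is an axis-aligned overlap test followed by a search over candidate corner locations:

```python
def overlaps(a, b):
    """Axis-aligned rectangle intersection test; a and b are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def relocate_pip(pip_box, screen_size, avoid_boxes, margin=20):
    """Return a new position for the PIP that does not overlap any box in
    avoid_boxes (e.g. the tracked first object and the second PIP).
    Candidates are the four screen corners; the original position is kept
    when there is no interference or no free corner exists."""
    if not any(overlaps(pip_box, box) for box in avoid_boxes):
        return pip_box
    sw, sh = screen_size
    _, _, w, h = pip_box
    corners = [(margin, margin), (sw - w - margin, margin),
               (margin, sh - h - margin), (sw - w - margin, sh - h - margin)]
    for cx, cy in corners:
        candidate = (cx, cy, w, h)
        if not any(overlaps(candidate, box) for box in avoid_boxes):
            return candidate
    return pip_box  # no non-interfering corner found; leave the PIP in place
```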
- More than four PIPs may be useful, for example, when recording and/or viewing microscopic images, such as a cluster of microorganisms, or viewing many objects, such as a herd of many animals.
- the various PIPs may be used to individually track the many microorganisms or animals of interest.
- the first PIP 118, the second PIP 116, the third PIP 126 and a fourth PIP 134 may be shown on the display 110.
- each PIP may display separate objects.
- the first PIP 118 may display the first displayed object 124, shown as a star-shaped object
- the second PIP 116 may display the second displayed object 120, shown as a face
- the third PIP 126 may display a third displayed object 128, shown as a triangular-shaped object
- the fourth PIP 134 may display a fourth displayed object 136, shown as a circular-shaped object.
- Embodiments with four PIPs 116, 118, 126, 134 possess the same abilities and features as other embodiments with only two or three PIPs, as discussed above.
- the four PIPs 116, 118, 126, 134 can similarly be resized, manually or automatically relocated, have changed transparency properties, may show tracked objects, may show objects from the same or from different fields of view, may display objects from different video sources, etc.
- Any features discussed above with respect to embodiments with two or three PIPs, though not explicitly addressed with respect to embodiments with four PIPs, may also be implemented, mutatis mutandis, in embodiments with four or more PIPs.
- any features discussed with respect to embodiments with four PIPs, though not explicitly addressed with respect to embodiments with two or three PIPs, may also be implemented, mutatis mutandis, in embodiments with two, three or more than four PIPs.
- the various embodiments of the display 110 discussed with respect to Figures 1-4 may be embodied on a wide range of devices and systems.
- Figure 5 depicts an embodiment of one such device 500.
- the device 500 may incorporate any of the features of the various components as discussed above with respect to Figures 1-4, including tracking objects, resizing and/or relocating PIPs, changing PIP properties such as transparency, displaying the PIPs in different configurations, etc.
- the device 500 may be a mobile device, such as a tablet as shown, or a handheld, portable, laptop or desktop computer.
- the device 500 may be a small, handheld computing device, with a display screen with touch input and/or a miniature keyboard, weighing less than 2 pounds (0.91 kg).
- the device 500 may also be manufactured by companies such as Apple®, Nokia®, HTC®, LG®, BlackBerry®, and Motorola Mobility®.
- the device 500 has an operating system (OS), and can run various types of application software, such as apps or applications.
- the device may be equipped with Wi-Fi, Bluetooth, and GPS capabilities that can allow connections to the Internet and other Bluetooth-capable devices, such as an automobile or a microphone headset.
- One or more imaging sensors 512, which may be a camera or cameras, and a media player feature for video and/or music files may be on the device 500, along with a stable battery power source such as a lithium battery.
- the device 500 may be a smartphone. This may be a mobile phone built on a mobile operating system, with advanced computing and connectivity capabilities. It may combine the functions of a personal digital assistant (PDA), email functionality, and a mobile phone.
- the device 500 may also have the functionality of portable media players, low-end compact digital cameras, pocket video cameras, and GPS navigation units to form one multi-use device 500.
- the device 500 also includes a high-resolution touchscreen and web browsers that display standard web pages as well as mobile-optimized sites. High-speed data access may be provided by Wi-Fi, mobile broadband, near field communication (NFC) and/or Bluetooth.
- the first viewed object 504 may be in a field of view that is in a first direction 505 relative to the device 500, as depicted.
- the second viewed object 502 such as a user of the device 500, may be in a field of view that is in a second direction 507 relative to the device 500, as depicted.
- the directions 505, 507 of the two fields of view of the device 500 are opposite to each other.
- the field of view in the direction 507 is viewed by an imaging sensor 512, which may be a camera.
- the field of view in the opposite direction 505 is viewed by an imaging sensor (not shown), which may be another camera, on the opposite side of the device 500 from that of the imaging sensor 512. Therefore, the first displayed object 516 in the first PIP 506 is from a different field of view than the second displayed object 514 in the second PIP 508.
- the main video 503 may display objects from the field of view in the direction 505.
- the first PIP 506 displays the first viewed object 504 which is in that same field of view.
- the second PIP 508 displays the second viewed object 502 from the field of view in the direction 507.
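- a hedged illustration of this routing (the sensor and window names are assumptions, not taken from the disclosure) is a simple mapping from each window to the imaging sensor that feeds it:

```python
# Hypothetical feed routing for the dual-sensor case of Figure 5: the main
# video and the first PIP draw from the rear sensor (direction 505), while
# the second PIP draws from the front sensor (direction 507).
FEED_ROUTING = {
    "main_video": "rear_sensor",   # full field of view in direction 505
    "first_pip": "rear_sensor",    # cropped to the tracked first object 504
    "second_pip": "front_sensor",  # second object 502, e.g. the user's face
}

def frame_for(window_name, frames_by_sensor):
    """Look up the latest frame for a window given per-sensor frames,
    e.g. frames_by_sensor = {"rear_sensor": ..., "front_sensor": ...}."""
    return frames_by_sensor[FEED_ROUTING[window_name]]
```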
- the two PIPs 506, 508 are overlaid onto the main video 503.
- the PIPs 506, 508 completely obscure the parts of the main video 503 that are "behind" or "underneath" the PIPs 506, 508.
- the PIPs 506, 508 partially obscure the parts of the main video 503 that are "behind" or "underneath" the PIPs 506, 508.
- the PIPs 506, 508 are transparent to some percentage, such as 25%, 50%, or 75%, etc.
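- such partial transparency corresponds to an alpha blend of each PIP with the main video behind it; a minimal NumPy sketch, assuming 8-bit frames and a PIP that fits entirely on screen, follows:

```python
import numpy as np

def overlay_pip(main_frame, pip_image, x, y, transparency=0.5):
    """Blend pip_image onto main_frame at (x, y).
    transparency=0.0 fully obscures the main video behind the PIP;
    transparency=1.0 leaves the main video untouched (PIP invisible)."""
    out = main_frame.copy()
    h, w = pip_image.shape[:2]
    region = out[y:y + h, x:x + w].astype(np.float32)
    blended = (1.0 - transparency) * pip_image.astype(np.float32) + transparency * region
    out[y:y + h, x:x + w] = blended.astype(main_frame.dtype)
    return out
```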
- the various videos or objects being shown by the various PIPs by the devices, systems and methods disclosed herein may also be connected with other video sources or displays.
- the videos shown in the PIPs may originate from a device or source other than the device on which the PIPs are being shown. Further, in some embodiments, the videos shown in the PIPs on one device may also be shown on other devices and/or displays. This is discussed in further detail herein, for example with respect to Figure 6.
- Figure 6 depicts a block diagram of one such embodiment of a device 600 with a display 625.
- the device 600 may be a mobile device, such as those discussed above with respect to Figure 5.
- the device 600 may also be a cell phone, digital camera, personal digital assistant, or the like. It may also be a more stationary device such as a desktop personal computer, video conferencing station, or the like.
- the device 600 may comprise a recording system having modules for displaying, recording and/or managing multiple PIPs.
- the device 600 has a set of components including a processor 620 linked to an imaging sensor 615.
- a working memory 605, storage 610, electronic display 625, and memory 630 containing various modules are also in communication with the processor 620.
- the processor 620 may be a general purpose processing unit or a processor specially designed for imaging applications. As shown, the processor 620 is connected to the working memory 605 and the memory 630.
- the modules in memory 630 may be software, such as programs or applications.
- a plurality of modules may be in the device 600. These modules include instructions that configure the processor 620 to perform various image processing and device management tasks.
- the modules may include modules related to PIP, as well as traditional photographic applications, high dynamic range imaging, panoramic video, or stereoscopic imaging such as 3D images or 3D video.
- the memory 630 stores an imaging sensor control module 635, object of interest detection module 640, touch screen input module 655, settings management module 660, window display module 670, preview control module 675, operating system 680, selection module 685, recording module 690, and PIP display module 695.
- the working memory 605 may be used by processor 620 to store a working set of processor instructions contained in the modules of memory 630. Alternatively, working memory 605 may also be used by processor 620 to store dynamic data created during the operation of device 600.
- the processor 620 is configured by the several modules stored in the memory 630.
- the imaging sensor control module 635 may include instructions that configure the processor 620 to capture images with imaging sensor 615.
- the imaging sensor control module 635 may also include instructions that adjust the focus position of imaging sensor 615. Therefore, processor 620, along with imaging sensor control module 635, imaging sensor 615, and working memory 605, represent one means for capturing an image using an imaging sensor.
- Recording module 690 may include instructions that configure the processor 620 to record the captured images.
- Touch screen input module 655 may include instructions that configure the processor 620 to receive touch inputs from a touch screen display, for example, display 625.
- the settings management module 660 may include instructions to manage various parameter settings for device 600. For example, parameters related to the configuration of the various PIPs may be managed by module 660.
- the window display module 670 may include instructions to manage the layout of data, such as zoom level, within the PIPs generated on display 625 on device 600. For example, the display 625 may include more than one image "window," such as a PIP, within it. Some "windows" may display data at differing scales. Instructions within window display module 670 may configure the processor to translate data related to each of these sub windows into display commands for display 625.
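- displaying data at differing scales within each sub window amounts to resampling a window's source region to its on-screen size; one assumed (not patent-specified) nearest-neighbour version:

```python
import numpy as np

def scale_to_window(source_region, window_w, window_h):
    """Nearest-neighbour resize of a cropped source region (H x W x C array)
    to the pixel dimensions of the PIP window it will be drawn into."""
    src_h, src_w = source_region.shape[:2]
    row_idx = (np.arange(window_h) * src_h) // window_h
    col_idx = (np.arange(window_w) * src_w) // window_w
    return source_region[row_idx][:, col_idx]
```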
- the preview control module 675 includes instructions that configure the processor 620 to display a preview window on the display 625 according to the methods described above.
- the preview control module 675 may include instructions that call subroutines in the imaging control module 635 in order to configure the processor 620 to capture a first image using the imaging sensor 615.
- the preview control module 675 may then call the object of interest detection module 640 to detect objects of interest in a first image captured by the imaging sensor 615. Instructions in the preview control module 675 may then invoke the settings management module 660 to determine how the operator has configured the preview window to display on the display 625, for example, in a PIP.
- Window display module 670 may invoke instructions in operating system 680 to control the display and cause it to display the appropriate preview window configuration on electronic display 625.
- the preview control module 675 may also include instructions that configure the processor 620 to verify, or seek verification from a user, of selected and/or detected objects of interest on display 625.
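- the module interplay described above can be pictured as a short pipeline; the ordering and the stand-in objects below are assumptions for illustration, not the actual module interfaces:

```python
def run_preview(sensor, detector, settings, window_display, display):
    """Illustrative flow of the preview control path: capture a frame with the
    imaging sensor control module, detect objects of interest, read the
    operator's PIP configuration, then hand layout commands to the display."""
    frame = sensor.capture()                   # imaging sensor control module 635
    objects = detector.detect(frame)           # object of interest detection module 640
    layout = settings.preview_configuration()  # settings management module 660
    commands = window_display.layout_commands(frame, objects, layout)  # module 670
    display.render(commands)                   # operating system 680 drives display 625
```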
- the operating system module 680 is a mobile operating system and the device 600 is a mobile device.
- the mobile operating system used by the mobile device such as a smartphone, may be Google's Android, Apple's iOS, Symbian, Blackberry Ltd's BlackBerry 10, Samsung's Bada, Microsoft's Windows Phone, Hewlett-Packard's webOS, embedded Linux distributions such as Maemo and MeeGo, Mozilla's Firefox OS, Canonical Ltd.'s Ubuntu Phone, Tizen, or others.
- the mobile device can receive multiple operating system module 680 updates over its lifetime.
- the processor 620 may write data to the storage module 610. While the storage module 610 is represented graphically as a traditional disk device, those with skill in the art understand that multiple embodiments could include either a disk-based storage device or one of several other types of storage media, including a memory disk, USB drive, flash drive, remotely connected storage medium, virtual disk driver, or the like.
- Figure 6 depicts a device comprising separate components, including a processor, an imaging sensor, and memory.
- the memory components may be combined with processor components to save cost and improve performance.
- Figure 6 illustrates two memory components, including memory component 630 comprising several modules and a separate memory 605 comprising a working memory; however, one with skill in the art would recognize several embodiments utilizing different memory architectures.
- the device 600 may have a design that utilizes ROM or static RAM memory for the storage of processor instructions implementing the modules contained in memory 630.
- the object of interest detection module 640 may communicate data related to the first viewed object 504 to the processor 620 which may then display the first object 516 in the main video 503.
- the first object 516 may be selected on the display 501 by, for example, the touch screen input module 655.
- a user of device 600 may touch the area of the display 501 corresponding to the first object 516.
- the touch screen input module 655 may then take this input and communicate it to the processor 620 for further processing.
- the first PIP 506 may be moved or resized using the PIP display module 695 or the window display module 670.
- tracking of the first object 510 in the first PIP 506 may be indicated by the settings management module 660 and carried out by the imaging sensor control module 635 or the object of interest detection module 640.
- any of the functions that may be carried out with the modules of Figure 6 involving the first object 510 in the first PIP 506 may also be carried out on the second displayed object 514 in the second PIP 508.
- the various sets of instructions in the modules of Figure 6 to be carried out on the various features shown in Figure 5 may be carried out by various modules and need not be limited to just one module for one capability.
- the object of interest detection module 640 may be used to detect the first object 510, it is understood that another module, for instance the window display module 670 may instead or in addition provide this functionality. It is therefore understood that the functionalities recited of the various modules and components of Figure 6 as applied to the features of Figure 5 are not the sole modules or components capable of carrying out the aforementioned functions but are merely listed as examples of how the disclosed implementations may be implemented.
- in step 730, the first PIP is moved or otherwise altered on the display.
- the first PIP is moved or relocated to a different location on the display. In some embodiments, this movement is automatic such that the first PIP will move without any input from a user or viewer of the display.
- the location or locations on the display to which the first PIP moves may be repeated. For instance, the first PIP may always move to the same second location when it interferes with the first object. The first PIP may then move back to its original location if it interferes with the first object in this second location.
- the first PIP may move to a location where the first PIP will not interfere with the first object, with the second PIP, with the second object, or with any other indicated locations.
- Many other configurations of moving the first PIP are within the ordinary skill of the art and are thus within the scope of the present disclosure.
- the first PIP may alter its appearance. This may be done, for example, to facilitate viewing the first object on the display if the first object overlaps with the first PIP. In some embodiments, the first PIP may disappear when there is interference and then reappear. In some embodiments, the first PIP may become transparent. For example, the first PIP 118 may become fifty percent transparent. Further, combinations of relocating and/or altering the appearance of the first PIP may be implemented. Also, any of the features or capabilities related to interference and/or moving as discussed above with respect to Figures 1 and 2 may be implemented in the method 700.
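- a sketch of one way such appearance changes could be switched on interference (hide the PIP, or drop it to fifty percent opacity); the dictionary keys are illustrative only, not the patented implementation:

```python
def adjust_pip_on_interference(pip, interferes, mode="transparent"):
    """Alter the first PIP's appearance while it overlaps the tracked object.
    mode="hide" makes the PIP disappear until the interference ends;
    mode="transparent" renders it at 50% transparency instead."""
    if not interferes:
        pip["visible"] = True
        pip["transparency"] = 0.0   # fully opaque again
    elif mode == "hide":
        pip["visible"] = False
    else:
        pip["visible"] = True
        pip["transparency"] = 0.5   # e.g. fifty percent transparent
    return pip
```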
- the process 800 begins with step 805 wherein a video on a screen or display of a device is displayed.
- the video displayed in step 805 may comprise one or several objects of interest, which may be moving or stationary.
- This step 805 may further include receiving a selection of a first object in the video, detecting the first object, and/or detecting and tracking the first object.
- the process 800 then moves to step 810 where the first object is displayed in a first PIP on the display. Next, in step 815, it is determined whether the location of the first PIP on the display interferes with the location of the first object on the display.
- the process 800 moves to step 870 where the main video, the first PIP, the second PIP, the third PIP and the fourth PIP are recorded.
- the video is the main video on the display over which the first and second PIPs are situated.
- the video, the first PIP, the second PIP, the third PIP and the fourth PIP are all recorded in the same display.
- the recording may be of the main video with the four PIPs situated over the main video.
- the video, the first PIP, the second PIP, the third PIP and the fourth PIP are all recorded in different displays or otherwise in different files.
- the main video may be recorded on one display or file, the first PIP recorded in a first display or file, the second PIP recorded in a second display or file, etc.
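- in code terms the two recording options reduce to writing either the composited frame or one stream per window; the writer objects below are placeholders for whatever encoder a device actually uses, not a specific API:

```python
def record_frame(writers, main_frame, pip_frames, composited_frame, separate=False):
    """Record either a single composited stream (main video with all PIPs
    overlaid) or one stream per window, as the embodiments above describe.
    `writers` maps a stream name to an object with a write(frame) method."""
    if not separate:
        writers["composite"].write(composited_frame)
        return
    writers["main"].write(main_frame)
    for name, frame in pip_frames.items():   # e.g. "pip1" .. "pip4"
        writers[name].write(frame)
```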
- portions of process 800 are performed.
- process 800 may be performed with only two or three PIPs.
- process 800 is performed in different orders.
- the first PIP may be located first
- the third PIP may be located second.
- Other variations such as these to the process 800 are within the scope of the present disclosure.
- a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory storage medium known in the art.
- An exemplary computer-readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the computer-readable storage medium.
- the storage medium may be integral to the processor.
- the processor and the storage medium may reside in an ASIC.
- the ASIC may reside in a user terminal, camera, or other device.
- the processor and the storage medium may reside as discrete components in a user terminal, camera, or other device.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Marketing (AREA)
- Controls And Circuits For Display Device (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Studio Devices (AREA)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201480046888.0A CN105519097B (zh) | 2013-08-27 | 2014-08-19 | 用于显示画中画的系统、装置和方法 |
| EP14767172.1A EP3039858A1 (en) | 2013-08-27 | 2014-08-19 | Systems, devices and methods for displaying pictures in a picture |
| JP2016538968A JP2016538601A (ja) | 2013-08-27 | 2014-08-19 | ピクチャインピクチャを表示するためのシステム、デバイス、および方法 |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201361870732P | 2013-08-27 | 2013-08-27 | |
| US61/870,732 | 2013-08-27 | ||
| US14/185,683 | 2014-02-20 | ||
| US14/185,683 US9973722B2 (en) | 2013-08-27 | 2014-02-20 | Systems, devices and methods for displaying pictures in a picture |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015031111A1 (en) | 2015-03-05 |
Family
ID=52582732
Family Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2014/051732, WO2015031111A1 (en), Ceased | 2013-08-27 | 2014-08-19 | Systems, devices and methods for displaying pictures in a picture |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US9973722B2 (en) |
| EP (1) | EP3039858A1 (en) |
| JP (1) | JP2016538601A (en) |
| CN (1) | CN105519097B (en) |
| WO (1) | WO2015031111A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105163165A (zh) * | 2015-08-31 | 2015-12-16 | 广州酷狗计算机科技有限公司 | 基于画中画的多媒体内容展示方法和装置 |
Families Citing this family (35)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9392217B2 (en) * | 2014-03-20 | 2016-07-12 | Blackberry Limited | Automatically relocating picture-in-picture window in video calls |
| KR102396036B1 (ko) * | 2015-05-18 | 2022-05-10 | 엘지전자 주식회사 | 디스플레이 디바이스 및 그 제어 방법 |
| US20160381297A1 (en) | 2015-06-26 | 2016-12-29 | Jsc Yukon Advanced Optics Worldwide | Providing enhanced situational-awareness using magnified picture-in-picture within a wide field-of-view optical image |
| KR102393510B1 (ko) * | 2015-08-25 | 2022-05-03 | 엘지전자 주식회사 | 디스플레이 디바이스 및 그 제어 방법 |
| US9609230B1 (en) * | 2015-12-30 | 2017-03-28 | Google Inc. | Using a display as a light source |
| JP6479220B2 (ja) * | 2016-01-15 | 2019-03-06 | 楽天株式会社 | コンテンツ投影制御装置、コンテンツ投影制御方法及びプログラム |
| US20170213389A1 (en) * | 2016-01-22 | 2017-07-27 | Lg Electronics Inc. | Mobile terminal and operating method thereof |
| CN107396165B (zh) | 2016-05-16 | 2019-11-22 | 杭州海康威视数字技术股份有限公司 | 一种视频播放方法及装置 |
| CN106028137A (zh) * | 2016-06-22 | 2016-10-12 | 北京小米移动软件有限公司 | 直播处理方法及装置 |
| US11295706B2 (en) * | 2016-06-30 | 2022-04-05 | Microsoft Technology Licensing, Llc | Customizable compact overlay window |
| WO2018003939A1 (ja) * | 2016-06-30 | 2018-01-04 | アイキューブド研究所株式会社 | 画像出力装置、画像出力方法、およびプログラム |
| KR102745637B1 (ko) * | 2016-08-17 | 2024-12-24 | 삼성전자주식회사 | 다시점 영상 제어 방법 및 이를 지원하는 전자 장치 |
| CN106131651B (zh) * | 2016-08-23 | 2019-07-16 | 腾讯科技(深圳)有限公司 | 一种同屏直播方法及装置 |
| US11076200B2 (en) * | 2016-12-13 | 2021-07-27 | Rovi Guides, Inc. | Systems and methods for minimizing obstruction of a media asset by an overlay by predicting a path of movement of an object of interest of the media asset and avoiding placement of the overlay in the path of movement |
| US10586111B2 (en) | 2017-01-13 | 2020-03-10 | Google Llc | Using machine learning to detect which part of the screen includes embedded frames of an uploaded video |
| KR20180094340A (ko) | 2017-02-15 | 2018-08-23 | 엘지전자 주식회사 | 이동단말기 및 그 제어 방법 |
| WO2019046095A1 (en) * | 2017-08-30 | 2019-03-07 | Vid Scale, Inc. | VIDEO ZOOM FOLLOW |
| US10997760B2 (en) * | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
| CN111263207A (zh) * | 2018-11-30 | 2020-06-09 | 青岛海尔多媒体有限公司 | 用于视频播放设备的控制方法、装置及计算机存储介质 |
| US10893339B2 (en) | 2019-02-26 | 2021-01-12 | Capital One Services, Llc | Platform to provide supplemental media content based on content of a media stream and a user accessing the media stream |
| EP3940687A4 (en) | 2019-03-12 | 2022-05-04 | Sony Group Corporation | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM |
| CN109905749B (zh) * | 2019-04-11 | 2020-12-29 | 腾讯科技(深圳)有限公司 | 视频播放方法和装置、存储介质及电子装置 |
| WO2021083146A1 (zh) | 2019-10-30 | 2021-05-06 | 北京字节跳动网络技术有限公司 | 视频处理方法、装置、终端及存储介质 |
| CN110784674B (zh) | 2019-10-30 | 2022-03-15 | 北京字节跳动网络技术有限公司 | 视频处理的方法、装置、终端及存储介质 |
| CN112040249A (zh) * | 2020-08-11 | 2020-12-04 | 浙江大华技术股份有限公司 | 一种录播方法、装置及单相机 |
| CN113694529B (zh) * | 2021-09-23 | 2024-07-16 | 网易(杭州)网络有限公司 | 游戏画面的显示方法及装置、存储介质、电子设备 |
| CN116132790B (zh) * | 2022-05-25 | 2023-12-05 | 荣耀终端有限公司 | 录像方法和相关装置 |
| CN116095465B (zh) * | 2022-05-25 | 2023-10-20 | 荣耀终端有限公司 | 录像方法、装置及存储介质 |
| CN116095460B (zh) * | 2022-05-25 | 2023-11-21 | 荣耀终端有限公司 | 录像方法、装置及存储介质 |
| CN116112781B (zh) * | 2022-05-25 | 2023-12-01 | 荣耀终端有限公司 | 录像方法、装置及存储介质 |
| CN116112782B (zh) * | 2022-05-25 | 2024-04-02 | 荣耀终端有限公司 | 录像方法和相关装置 |
| CN116055861B (zh) * | 2022-05-30 | 2023-10-20 | 荣耀终端有限公司 | 一种视频编辑方法和电子设备 |
| CN117407596A (zh) | 2022-07-07 | 2024-01-16 | 抖音视界(北京)有限公司 | 用于内容呈现的方法、装置、设备和存储介质 |
| CN117425057A (zh) | 2022-07-07 | 2024-01-19 | 抖音视界(北京)有限公司 | 用于影像拍摄的方法、装置、设备和存储介质 |
| CN115334246B (zh) | 2022-09-06 | 2025-07-01 | 抖音视界有限公司 | 用于影像拍摄的方法、装置、设备和存储介质 |
Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1052849A1 (en) * | 1998-11-30 | 2000-11-15 | Sony Corporation | Information providing device and method |
| US6195497B1 (en) * | 1993-10-25 | 2001-02-27 | Hitachi, Ltd. | Associated image retrieving apparatus and method |
| EP1724695A1 (en) * | 2005-05-17 | 2006-11-22 | Sony Corporation | Image processing apparatus and image processing method |
| US20100218228A1 (en) * | 2009-02-20 | 2010-08-26 | Walter Edward A | System and method for processing image objects in video data |
| US20100239130A1 (en) | 2009-03-18 | 2010-09-23 | Industrial Technology Research Institute | System and method for performing rapid facial recognition |
| US20110047384A1 (en) | 2009-08-21 | 2011-02-24 | Qualcomm Incorporated | Establishing an ad hoc network using face recognition |
| US7916976B1 (en) | 2006-10-05 | 2011-03-29 | Kedikian Roland H | Facial based image organization and retrieval method |
| US8064685B2 (en) | 2004-08-19 | 2011-11-22 | Apple Inc. | 3D object recognition |
| US8249299B1 (en) | 2009-08-17 | 2012-08-21 | Adobe Systems Incorporated | Systems and methods of tracking objects in video |
| US8254699B1 (en) | 2009-02-02 | 2012-08-28 | Google Inc. | Automatic large scale video object recognition |
| US8452107B2 (en) | 2009-10-02 | 2013-05-28 | Qualcomm Incorporated | Methods and systems for occlusion tolerant face recognition |
| US20130174035A1 (en) * | 2011-12-30 | 2013-07-04 | United Video Properties, Inc. | Systems and methods for representing a content dependency list |
| US8483437B2 (en) | 2005-01-07 | 2013-07-09 | Qualcomm Incorporated | Detecting and tracking objects in images |
| US20130176203A1 (en) | 2012-01-11 | 2013-07-11 | Hae-Young Yun | Display apparatus and method of displaying three-dimensional image using the same |
| US20130258141A1 (en) | 2012-03-30 | 2013-10-03 | Qualcomm Incorporated | Method to reject false positives detecting and tracking image objects |
| US20130335575A1 (en) | 2012-06-14 | 2013-12-19 | Qualcomm Incorporated | Accelerated geometric shape detection and accurate pose tracking |
Family Cites Families (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11338449A (ja) * | 1998-05-21 | 1999-12-10 | Mitsubishi Electric Corp | 拡大表示装置 |
| US8250617B2 (en) * | 1999-10-29 | 2012-08-21 | Opentv, Inc. | System and method for providing multi-perspective instant replay |
| US20040128317A1 (en) * | 2000-07-24 | 2004-07-01 | Sanghoon Sull | Methods and apparatuses for viewing, browsing, navigating and bookmarking videos and displaying images |
| US7206029B2 (en) * | 2000-12-15 | 2007-04-17 | Koninklijke Philips Electronics N.V. | Picture-in-picture repositioning and/or resizing based on video content analysis |
| US6816626B1 (en) * | 2001-04-27 | 2004-11-09 | Cisco Technology, Inc. | Bandwidth conserving near-end picture-in-picture videotelephony |
| US8004574B2 (en) * | 2004-07-21 | 2011-08-23 | Nokia Corporation | Portable electronic devices with picture in picture capability |
| CN1921605A (zh) | 2005-08-22 | 2007-02-28 | 上海乐金广电电子有限公司 | 利用机顶盒进行移动通信终端机数码影像传输方法 |
| US7697024B2 (en) | 2005-11-03 | 2010-04-13 | Broadcom Corp. | Method and system of tracking and stabilizing an image transmitted using video telephony |
| US8004555B2 (en) | 2006-05-31 | 2011-08-23 | Motorola Mobility, Inc. | Methods and devices for simultaneous dual camera video telephony |
| JP2008048364A (ja) * | 2006-08-21 | 2008-02-28 | Toshiba Corp | 画像表示装置 |
| JP2008096868A (ja) * | 2006-10-16 | 2008-04-24 | Sony Corp | 撮像表示装置、撮像表示方法 |
| KR100836616B1 (ko) * | 2006-11-14 | 2008-06-10 | (주)케이티에프테크놀로지스 | 영상 합성 기능을 가지는 휴대용 단말기 및 휴대용단말기의 영상 합성 방법 |
| EP2479993A3 (en) * | 2006-12-04 | 2013-12-11 | Lynx System Developers, Inc. | Autonomous systems and methods for still and moving picture production |
| JP4915420B2 (ja) * | 2006-12-11 | 2012-04-11 | 株式会社ニコン | 電子カメラ |
| KR101333065B1 (ko) * | 2007-07-04 | 2013-11-27 | 삼성전자주식회사 | 화면 속 화면을 이용한 방송데이터 표시방법 및 장치 |
| KR101009881B1 (ko) * | 2008-07-30 | 2011-01-19 | 삼성전자주식회사 | 재생되는 영상의 타겟 영역을 확대 디스플레이하기 위한장치 및 방법 |
| US20100188579A1 (en) | 2009-01-29 | 2010-07-29 | At&T Intellectual Property I, L.P. | System and Method to Control and Present a Picture-In-Picture (PIP) Window Based on Movement Data |
| US9264659B2 (en) | 2010-04-07 | 2016-02-16 | Apple Inc. | Video conference network management for a mobile device |
| KR101680684B1 (ko) | 2010-10-19 | 2016-11-29 | 삼성전자주식회사 | 영상 처리 방법 및 이를 적용한 영상 촬영 장치 |
| CA2798298C (en) * | 2011-12-09 | 2016-08-23 | W-Ideas Network Inc. | Systems and methods for video processing |
| US20130155308A1 (en) * | 2011-12-20 | 2013-06-20 | Qualcomm Incorporated | Method and apparatus to enhance details in an image |
-
2014
- 2014-02-20 US US14/185,683 patent/US9973722B2/en not_active Expired - Fee Related
- 2014-08-19 JP JP2016538968A patent/JP2016538601A/ja active Pending
- 2014-08-19 CN CN201480046888.0A patent/CN105519097B/zh not_active Expired - Fee Related
- 2014-08-19 EP EP14767172.1A patent/EP3039858A1/en not_active Ceased
- 2014-08-19 WO PCT/US2014/051732 patent/WO2015031111A1/en not_active Ceased
Patent Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6195497B1 (en) * | 1993-10-25 | 2001-02-27 | Hitachi, Ltd. | Associated image retrieving apparatus and method |
| EP1052849A1 (en) * | 1998-11-30 | 2000-11-15 | Sony Corporation | Information providing device and method |
| US8064685B2 (en) | 2004-08-19 | 2011-11-22 | Apple Inc. | 3D object recognition |
| US8483437B2 (en) | 2005-01-07 | 2013-07-09 | Qualcomm Incorporated | Detecting and tracking objects in images |
| EP1724695A1 (en) * | 2005-05-17 | 2006-11-22 | Sony Corporation | Image processing apparatus and image processing method |
| US7916976B1 (en) | 2006-10-05 | 2011-03-29 | Kedikian Roland H | Facial based image organization and retrieval method |
| US8254699B1 (en) | 2009-02-02 | 2012-08-28 | Google Inc. | Automatic large scale video object recognition |
| US20100218228A1 (en) * | 2009-02-20 | 2010-08-26 | Walter Edward A | System and method for processing image objects in video data |
| US20100239130A1 (en) | 2009-03-18 | 2010-09-23 | Industrial Technology Research Institute | System and method for performing rapid facial recognition |
| US8249299B1 (en) | 2009-08-17 | 2012-08-21 | Adobe Systems Incorporated | Systems and methods of tracking objects in video |
| US20110047384A1 (en) | 2009-08-21 | 2011-02-24 | Qualcomm Incorporated | Establishing an ad hoc network using face recognition |
| US8452107B2 (en) | 2009-10-02 | 2013-05-28 | Qualcomm Incorporated | Methods and systems for occlusion tolerant face recognition |
| US20130174035A1 (en) * | 2011-12-30 | 2013-07-04 | United Video Properties, Inc. | Systems and methods for representing a content dependency list |
| US20130176203A1 (en) | 2012-01-11 | 2013-07-11 | Hae-Young Yun | Display apparatus and method of displaying three-dimensional image using the same |
| US20130258141A1 (en) | 2012-03-30 | 2013-10-03 | Qualcomm Incorporated | Method to reject false positives detecting and tracking image objects |
| US20130335575A1 (en) | 2012-06-14 | 2013-12-19 | Qualcomm Incorporated | Accelerated geometric shape detection and accurate pose tracking |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105163165A (zh) * | 2015-08-31 | 2015-12-16 | 广州酷狗计算机科技有限公司 | 基于画中画的多媒体内容展示方法和装置 |
| CN105163165B (zh) * | 2015-08-31 | 2018-09-04 | 广州酷狗计算机科技有限公司 | 基于画中画的多媒体内容展示方法和装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2016538601A (ja) | 2016-12-08 |
| US9973722B2 (en) | 2018-05-15 |
| CN105519097B (zh) | 2018-06-22 |
| CN105519097A (zh) | 2016-04-20 |
| EP3039858A1 (en) | 2016-07-06 |
| US20150062434A1 (en) | 2015-03-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9973722B2 (en) | Systems, devices and methods for displaying pictures in a picture | |
| US9392165B2 (en) | Array camera, mobile terminal, and methods for operating the same | |
| US10614855B2 (en) | Spherical video editing | |
| US10922530B2 (en) | Display device and operating method thereof with adjustments related to an image display according to bending motion of the display device | |
| US10686990B2 (en) | Mobile terminal and method of controlling the same | |
| KR102018887B1 (ko) | 신체 부위 검출을 이용한 이미지 프리뷰 | |
| CN106688227B (zh) | 多摄像装置、多摄像方法 | |
| KR102384054B1 (ko) | 이동 단말기 및 그 제어 방법 | |
| CN104995558B (zh) | 一种获取全景图像的方法及终端 | |
| EP3226537A1 (en) | Mobile terminal and method for controlling the same | |
| KR102313755B1 (ko) | 이동 단말기 및 그 제어 방법 | |
| KR102240639B1 (ko) | 글래스 타입 단말기 및 그것의 제어 방법 | |
| CN108619721A (zh) | 虚拟场景中的距离信息显示方法、装置及计算机设备 | |
| CN108537845A (zh) | 位姿确定方法、装置及存储介质 | |
| CN107343081A (zh) | 移动终端及其控制方法 | |
| JP6400293B2 (ja) | 電子装置でコンテンツを制御する装置及びその方法 | |
| CN108322644A (zh) | 一种图像处理方法、移动终端以及计算机可读存储介质 | |
| CN109829937A (zh) | 使用遮挡来检测并跟踪三维对象 | |
| JP2014531644A (ja) | 撮像されるオブジェクトの特徴に基づく拡張現実 | |
| WO2017032336A1 (en) | System and method for capturing and displaying images | |
| KR20160127708A (ko) | 재생 제어를 위한 방법, 장치 및 전자 기기 | |
| US12482145B2 (en) | Remote landmark rendering for extended reality interfaces | |
| CN108776822B (zh) | 目标区域检测方法、装置、终端及存储介质 | |
| EP3236336A1 (en) | Virtual reality causal summary content | |
| CN109002248A (zh) | Vr场景截图方法、设备及存储介质 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14767172; Country of ref document: EP; Kind code of ref document: A1 |
| | DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
| | REEP | Request for entry into the european phase | Ref document number: 2014767172; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2014767172; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 2016538968; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |