US20100026721A1 - Apparatus and method for displaying an enlarged target region of a reproduced image - Google Patents

Apparatus and method for displaying an enlarged target region of a reproduced image

Info

Publication number
US20100026721A1
US 2010/0026721 A1 (Application No. US 12/476,620)
Authority
US
United States
Prior art keywords
window
image
target region
target object
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/476,620
Inventor
Hyun-Hee Park
Jong-man Kim
Min-Woo Lee
Yun-Je Oh
Sung-Dae Cho
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHO, SUNG-DAE, KIM, JONG-MAN, LEE, MIN-WOO, OH, YUN-JE, PARK, HYUN-HEE
Publication of US 2010/0026721 A1
Continuation application US 14/249,993 (published as US 9,648,269 B2) claims priority to this application

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/445Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
    • H04N5/45Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440263Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4728End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/74Circuits for processing colour signals for obtaining special effects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04805Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/10Special adaptations of display systems for operation with variable images
    • G09G2320/103Detection of image changes, e.g. determination of an index representative of the image change
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0407Resolution change, inclusive of the use of different resolutions for different screen areas
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/045Zooming at least part of an image, i.e. enlarging it or shrinking it
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0464Positioning
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/12Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39Control of the bit-mapped memory
    • G09G5/395Arrangements specially adapted for transferring the contents of the bit-mapped memory to the screen
    • G09G5/397Arrangements specially adapted for transferring the contents of two or more bit-mapped memories to the screen simultaneously, e.g. for mixing or overlay


Abstract

An apparatus and a method for selecting and displaying a particular region in a displayed image. When a user selects a target region in an image displayed on an image reproduction apparatus, an image of the selected target region is enlarged and displayed in a window, for which the user can adjust a generation position, a size, a zoom magnification, and a resolution. The image reproduction apparatus tracks a target object in the target region, and enables the user to continuously view the target object in the window.

Description

    PRIORITY
  • This application claims priority under 35 U.S.C. §119(a) to a Korean Patent Application filed in the Korean Intellectual Property Office on Jul. 30, 2008 and assigned Serial No. 10-2008-0074623, the contents of which are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates generally to an apparatus and a method for enlarging a particular region of an image, and more particularly to an apparatus and a method for enlarging and displaying a particular region of an image while reproducing the image.
  • 2. Description of the Related Art
  • Portable devices, such as Moving Picture Experts Group Audio Layer-3 (MP3) players, mobile terminals, Portable Multimedia Players (PMPs), etc., are constantly increasing in popularity. Consequently, portable devices are constantly being improved to provide more user features. At an early stage, these improvements were limited to listening to music or viewing images. However, the improvements now extend into various fields, such as the Internet, games, image communication, etc.
  • A portable device, such as a mobile terminal capable of displaying still images and video, provides various screen display methods. For example, the various screen display methods can include Picture In Picture (PIP), On-Screen Display (OSD), etc.
  • A PIP function provides a small screen on a larger screen, and can be used to display different images, e.g., from two different channels, on the two screens, respectively. The PIP function only has to display signals, which are output from two video decks, in a predetermined part of an image. Accordingly, the PIP function reproduces a second image in a fixed portion of the larger screen displaying a first image. Similarly, a PIP-type display is also used to simultaneously display a calling and a called party on one screen for a videophone call. This technology can also be implemented by adding a hardware video codec.
  • An OSD function reconfigures two image sources into one image. For example, the OSD function is used when a mobile terminal simultaneously displays a user interface related to a menu with a broadcast screen.
  • The conventional PIP configuration displays images from two sources on an output screen simply by adding a hardware codec. In the PIP function, a user has difficulty adjusting the size and resolution of an image. Also, the user cannot change the position of a PIP window on a main screen.
  • The OSD function simply provides an overlay on a main screen. Accordingly, the OSD function cannot change the overlaid contents and, like the PIP function, is limited in position adjustment.
  • SUMMARY OF THE INVENTION
  • Accordingly, the present invention has been designed to solve the above-mentioned problems occurring in the prior art, and provides an apparatus and a method, wherein an image source is more conveniently displayed in a window set by a user in a large-sized reproduction device, such as a TV, and also in a small-sized display device, such as a mobile terminal.
  • Also, the present invention provides an apparatus and a method, wherein a particular region of a moving image is enlarged, thereby enabling a user to continuously view the enlarged particular region while reproducing the moving image.
  • Further, the present invention provides an apparatus and a method enabling a user to easily check user data on a small screen of a mobile terminal.
  • In accordance with an aspect of the present invention, there is provided a method for displaying an enlarged target region of an image reproduced by an image reproduction apparatus. The method includes setting generation information on a window displayed on the reproduced image in such a manner that the window overlaps the reproduced image; designating a target region in the reproduced image; generating a window using the generation information on the window; and displaying an image of the designated target region in the generated window.
  • In accordance with another aspect of the present invention, there is provided an apparatus for displaying an enlarged target region of a reproduced image. The apparatus includes a window information generation unit for setting generation information on a window displayed on the reproduced image in such a manner that the window overlaps the reproduced image; a target region selection unit for receiving a target region that a user designates in the reproduced image; a window generation unit for generating a window using the generation information on the window; and an image processing unit for displaying an image of the designated target region in the generated window.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other features, aspects, and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a block diagram illustrating an internal configuration of an image reproduction apparatus according to an embodiment of the present invention;
  • FIG. 2 is a block diagram illustrating a detailed configuration for window processing as illustrated in FIG. 1;
  • FIG. 3 is a flowchart illustrating an operation for window processing in an image reproduction apparatus according to an embodiment of the present invention;
  • FIG. 4 illustrates a screen of an apparatus when a window is generated according to an embodiment of the present invention;
  • FIG. 5 illustrates a screen on which a target object is viewed in a window according to an embodiment of the present invention; and
  • FIG. 6 illustrates a window display screen for improving document readability according to an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS OF THE INVENTION
  • Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. It should be noted that the same reference numerals will denote the same configuration elements in the accompanying drawings, although they may be shown in different drawings. Also, a detailed description of known functions and configurations will be omitted when it may unnecessarily obscure the subject matter of the present invention.
  • In accordance with an embodiment of the present invention, a particular region in a displayed image is selected and displayed in an enlarged form. More specifically, when a user selects a target region in an image displayed by an image reproduction apparatus, an image of the selected target region is displayed in a window that is adjustable by the user. The image reproduction apparatus tracks a target object in the target region, and enables the user to continuously view the target object in the window. Particularly, the user can adjust a generation position, a window size, a zoom magnification, and a resolution of the window. As described above, the user can freely adjust the window in which the target object is displayed, thereby increasing the convenience of the user.
  • FIG. 1 is a block diagram illustrating an internal configuration of an image reproduction apparatus according to an embodiment of the present invention. Referring to FIG. 1, the image reproduction apparatus includes a window information generation unit 100, a target region selection unit 110, a window generation unit 120, an image processing unit 130, and an exceptional situation processing unit 140.
  • The window information generation unit 100 receives, as input, generation information for a window to be displayed on a reproduced image in such a manner that the window overlaps the reproduced image. Namely, the window information generation unit 100 receives, as input, setting information for window generation from a user. For example, the user may previously input the above setting information by using a setting menu. Alternatively, the image reproduction apparatus may also include a touch-screen function, which may receive, as input, the setting information from the user through a touch input means (e.g., a touch pen or finger) while reproducing images. When using the touch input means as described above, the user may directly determine the generation parameters (i.e., position, size, etc.) of the window using a predetermined method (e.g., dragging or drawing).
  • The window generation information includes, for example, information on whether a window display function is turned on or off, setting information on the generation, size, zoom magnification, and resolution of the window, and information on the selection of an automatic or a manual operation of a target region. Herein, the selection of an automatic or a manual operation of a target region implies that a target region including a target object, that the user intends to track, is automatically designated when the user selects the target object in advance. Also, the selection of a manual operation of a target region implies that the user directly designates the target region in an image reproduced through keypad input or touch input.
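  • By way of illustration, the window generation information described above could be collected into a single settings structure such as the sketch below; the field names are hypothetical and chosen only to mirror the items listed in this paragraph, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class WindowGenerationInfo:
    """User-supplied settings for the overlay window (illustrative names only)."""
    display_on: bool = True                    # window display function on/off
    position: Tuple[int, int] = (20, 20)       # top-left corner of the window on screen
    size: Tuple[int, int] = (160, 120)         # window width and height in pixels
    zoom: float = 2.0                          # zoom magnification applied to the target region
    resolution: Tuple[int, int] = (160, 120)   # output resolution of the window contents
    auto_target: bool = False                  # True: region derived automatically from a selected object
                                               # False: user designates the region manually
```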
  • The target region selection unit 110 receives a target object, which is to be displayed in the window. That is, the target region selection unit 110 receives a target region that the user designates in the displayed image. For example, the selection of the target region indicates a target object (e.g., an object, a human being, or a word) that the user intends to enlarge in a moving image or on the web. When the target object is designated, the target region selection unit 110 continuously tracks the target object while reproducing the image. Namely, the target region selection unit 110 continuously extracts the target object from the reproduced image. For example, face recognition technology can be used when a human being is the target object of the reproduced image. In the same manner, when a character is designated to be extracted, character recognition technology can be used. Such extraction technology is publicly known, and therefore, the extraction of the target object will not be described in further detail in the present application.
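  • A minimal per-frame tracking sketch is shown below; OpenCV's bundled Haar-cascade face detector stands in here for whatever recognition technology the apparatus actually uses, and picking the detection nearest to the previous target position is an assumed tracking rule, not the patent's algorithm.

```python
import cv2

# Haar-cascade face detector shipped with OpenCV; stands in for the
# face/character recognition technology mentioned in the text.
_face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def track_target(frame, previous_center):
    """Return the bounding box (x, y, w, h) of the detection closest to the
    previously known target center, or None if nothing is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detections = _face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(detections) == 0:
        return None
    px, py = previous_center
    # Keep the detection whose center is nearest to the previous target center.
    return min(detections,
               key=lambda d: (d[0] + d[2] / 2 - px) ** 2 + (d[1] + d[3] / 2 - py) ** 2)
```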
  • Although the target region, as described above, is determined by keypad input or touch input on a touch-screen, a means for designating the target region is not limited to these examples.
  • The window generation unit 120 generates the window, which displays the target object, by using the window generation information input by the user. Accordingly, a window with the size and resolution set by the user is generated at the requested position.
  • The image processing unit 130 includes a scaler, and displays an image of the target region in the generated window. Namely, the image processing unit 130 processes an image in such a manner that the target object may be displayed in the window according to the zoom magnification and resolution set by the user. Accordingly, the image processing unit 130 extracts and tracks the target object in the target region, and continuously displays the tracked target object in the generated window at the preset magnification.
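  • One way to realize this scaling step in software is sketched below; it assumes, as a simplification rather than a statement of the patent's design, that the displayed target region is the window size divided by the zoom magnification, centered on the tracked object, and that the crop is then resampled to the window size.

```python
import cv2

def render_window_contents(frame, target_center, window_size, zoom):
    """Crop a region around the target and scale it to the window size.
    The crop extent is window_size / zoom, so a larger zoom magnifies a
    smaller region of the original image (an assumed model)."""
    win_w, win_h = window_size
    crop_w, crop_h = int(win_w / zoom), int(win_h / zoom)
    cx, cy = target_center
    h, w = frame.shape[:2]
    # Clamp the crop rectangle to the frame boundaries.
    x0 = max(0, min(w - crop_w, cx - crop_w // 2))
    y0 = max(0, min(h - crop_h, cy - crop_h // 2))
    crop = frame[y0:y0 + crop_h, x0:x0 + crop_w]
    return cv2.resize(crop, (win_w, win_h), interpolation=cv2.INTER_LINEAR)
```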
  • The exceptional situation processing unit 140 determines if the position of the window overlaps the target region in the image while reproducing the image. When the overlapping situation occurs, it is preferable that the position of the window should be readjusted. Therefore, the exceptional situation processing unit 140 moves the window to an adjacent position and displays the moved window in such a manner that the window may not overlap the target region. The adjacent position, for example, is a position having a predetermined distance from the target region. Preferably, the size, zoom magnification, and resolution of the window are maintained even when the window is moved.
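  • A minimal sketch of such overlap handling is given below, assuming axis-aligned rectangles for both the window and the target region; the choice of four candidate positions (left, right, above, below the target) is an assumption, since the text only requires an adjacent, non-overlapping position at a predetermined distance.

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test for rectangles given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def reposition_window(window, target, screen, margin=10):
    """If the window covers the target region, move it to the closest of four
    candidate positions beside the target, keeping its size unchanged."""
    if not rects_overlap(window, target):
        return window
    wx, wy, ww, wh = window
    tx, ty, tw, th = target
    sw, sh = screen
    candidates = [
        (tx - ww - margin, wy), (tx + tw + margin, wy),   # left / right of target
        (wx, ty - wh - margin), (wx, ty + th + margin),   # above / below target
    ]
    # Keep only candidates that stay on screen and clear the target,
    # then pick the one requiring the smallest move.
    valid = [(x, y) for x, y in candidates
             if 0 <= x <= sw - ww and 0 <= y <= sh - wh
             and not rects_overlap((x, y, ww, wh), target)]
    if not valid:
        return window  # nowhere better to go; leave the window where it is
    nx, ny = min(valid, key=lambda p: (p[0] - wx) ** 2 + (p[1] - wy) ** 2)
    return (nx, ny, ww, wh)
```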
  • FIG. 2 is a block diagram illustrating window processing. More specifically, FIG. 2 illustrates an image reproduction processing scheme when image reproduction starts.
  • Referring to FIG. 2, when an image to be reproduced is input, a parser 200 parses information. More specifically, the parser 200 parses information on a target region in the input image. Through this parsing, a first memory 210 temporarily stores the original image, and a second memory 220 temporarily stores the information on the target region. The first memory 210 and second memory 220 act as buffers, which temporarily store an image and image information, respectively, before the image is output. Herein, the “original image” is the reproduced moving image, and the “information on the target region” refers to the target objects (such as a Region Of Interest (ROI), an object, a character, a human being, etc., designated by the user) that are to be extracted and tracked in the original image.
  • The scaler 230 scales the target image according to a set zoom magnification from among the window generation information so that the target object may be displayed in the window.
  • A multiplexer (MUX) 240 combines the original image from the first memory 210 with the target image scaled by the scaler 230. The target image is combined with the original image within the window, which is generated with the size and at the position designated by the user. Accordingly, the target image is displayed in the window having the designated size and position in such a manner that the target image overlaps the original image.
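  • In software terms, the combining performed by the MUX amounts to pasting the scaled target image over the original frame at the window position; a minimal NumPy sketch of that overlay step follows (an illustration only, not the hardware MUX described above).

```python
import numpy as np

def overlay_window(original, window_image, position):
    """Paste the scaled target image into the original frame at the window's
    top-left position, clipping at the frame edges. Assumes a non-negative
    position; a simple software stand-in for the MUX stage described above."""
    out = original.copy()
    x, y = position
    h, w = window_image.shape[:2]
    fh, fw = out.shape[:2]
    w = min(w, fw - x)
    h = min(h, fh - y)
    if w > 0 and h > 0:
        out[y:y + h, x:x + w] = window_image[:h, :w]
    return out
```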
  • A position comparator 250 compares the position of the target region with that of the window so that the window will not obscure the target object in the original image. When the result of the comparison shows that the position of the window overlaps the target region, the position comparator 250 adjusts the position of the window to a position adjacent to the target region, i.e., the nearest position to the target region at which the window does not overlap the target region. When the window would overlap the target region, the distance between the target region and the window may have a predetermined value. Accordingly, the window is continuously displayed without covering the target object even when the designated window position would have overlapped the target object. Preferably, the window only changes its position, and keeps the same size, zoom magnification, and resolution as set by the user.
  • The apparatus described above does not simply display an image, such as in a PIP scheme where an image of a predetermined size is displayed at a predetermined position, but continuously tracks and displays the target object selected by the user in a window whose position, size, zoom magnification, and resolution are also designated by the user.
  • The function described above is referred to as a Smart Zoom Window (SZW) function in the present application. The SZW function extracts and tracks a region, an object, or a human being designated by a user from an image, by applying technology that has been proposed for the reproduction of Digital Multimedia Broadcasting (DMB) and Video On Demand (VOD), which are among the functions that users frequently use in mobile terminals. The SZW function reconfigures the tracked target based on a position, size, and resolution set by the user, and displays the reconfigured target on a screen.
  • Besides the operations described above, the SZW function, for example, continuously counts an appearance frequency of a target object extracted from the image, and displays the appearance frequency.
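  • The appearance-frequency counting mentioned here could be realized, for example, with a simple per-frame counter like the sketch below; interpreting "appearance frequency" as the number of frames in which the target was detected is an assumption, since the text does not define the metric.

```python
from collections import Counter

class AppearanceCounter:
    """Counts, per target label, the number of frames in which the target
    was detected (an assumed reading of 'appearance frequency')."""
    def __init__(self):
        self.counts = Counter()

    def update(self, label, detected):
        """Call once per frame with whether the labeled target was found."""
        if detected:
            self.counts[label] += 1

    def frequency(self, label):
        return self.counts[label]
```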
  • FIG. 3 is a flowchart illustrating window processing in an image reproduction apparatus according to an embodiment of the present invention. In the following description, although the description will be made with reference to illustrative views of FIGS. 4 and 5, a window display method according to the present invention is not limited to this example.
  • Referring to FIG. 3, when moving image reproduction starts in step 300, the image reproduction apparatus extracts window generation information in step 305. In step 310, the image reproduction apparatus determines if a target object or region is selected. The step of selecting the target region corresponds to the step of selecting a target object to be tracked, in a reproduced moving image. While the moving image is reproduced, the user can select a target object (e.g., an ROI, a human being, or a particular object). When browsing the web, for example, the user can select a target object (e.g., a character). When the target object is selected, the image reproduction apparatus generates a window by using the extracted window generation information in step 315.
  • A description is made referring to FIG. 3 on the assumption that the user previously sets, through a menu, stored window setting information including an initial generation position, a size, a zoom magnification, a resolution, etc., of a window. However, the user may also adjust the window according to a desired position and size from among the window generation information by, for example, dragging a touch pen while the moving image is being reproduced. Also, the zoom magnification and resolution of the window may be directly determined in real time in the same manner while the moving image is being reproduced. Further, the user may easily determine the target region by using a touch pen. When using a keypad, the user may determine a starting position and an end position of a target region by using a direction key, for example.
  • The image reproduction apparatus displays an image of the target region in the generated window in step 320.
  • FIG. 4 illustrates windows of various shapes, which are denoted by reference numerals 410, 420 and 430, and are displayed when a target object 400 is designated in a moving image. The user can set a desired position, size, zoom magnification, and resolution for each of the windows as described above. Therefore, the windows denoted by reference numerals 410, 420 and 430 can have various shapes.
  • In particular, FIG. 4 illustrates the size of a target region displayed in each of the windows 410, 420, and 430 being different when sizes of the windows 410, 420, and 430 are different, although target objects are the same. Namely, the target object always exists in the center of each of the windows 410, 420 and 430. However, the size of the target region including the target object changes depending on the size, zoom magnification, etc., of the window.
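  • As a purely illustrative calculation (using the crop model assumed in the earlier sketch, not figures from the patent): a 200×150-pixel window at a zoom magnification of 2 would cover a 100×75-pixel target region of the original image, while the same window at a magnification of 5 would cover only a 40×30-pixel region, consistent with the displayed target region shrinking as the magnification grows.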
  • The target object 400 is continuously extracted and tracked in the moving image. Accordingly, an image of the target region with the target object 400 as reference is continuously displayed in each window. If recognition technology is integrated with extraction and tracking technology when the target object is extracted and tracked in the moving image, it is possible to count the appearance frequency of the target object.
  • It is also possible to designate more than one target object, as illustrated in FIG. 5.
  • Referring to FIG. 5, target objects 510 are displayed in windows 520, 530, 540 and 550, respectively. More specifically, FIG. 5 illustrates windows 520, 530, 540, and 550, which have different sizes and also different zoom magnifications, respectively. For example, when only a person's face is designated as the target region in a preview image of an image reproduction apparatus (e.g., a mobile terminal) equipped with a camera, the person's face is displayed in the window in an enlarged and more detailed form. Accordingly, the user can easily photograph the person.
  • While displaying a target object in a window in such a manner that the window may overlap the target in a reproduced image, as illustrated in FIGS. 4 and 5, the image reproduction apparatus determines in step 325 if the window overlaps the target region. When the window overlaps the target region, the image reproduction apparatus readjusts the position of the window in step 330. Then, the image reproduction apparatus displays an image of the target region in the window having the adjusted position in step 335. This operation prevents a window from obscuring the target object while the image is reproduced. Accordingly, the image reproduction apparatus moves the window to a position adjacent to the target region, and displays the moved window.
  • In step 340, the image reproduction apparatus continuously extracts and tracks the target object in the moving image, and displays the target object in the window, as long as the reproduction of the moving image is not completed. In accordance with an embodiment of the present invention, a target object can be actively and intelligently displayed in a window.
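  • Pulling the earlier sketches together, the overall reproduction loop corresponding to the flow of FIG. 3 could look roughly as follows; this is a hedged sketch reusing the hypothetical helpers introduced above, not the patent's implementation.

```python
def reproduce_with_smart_zoom(frames, screen_size, info, target_region):
    """Illustrative main loop: generate the window from the stored settings,
    then, for every frame, track the target, render the window contents,
    move the window if it would cover the target, and composite the result.
    track_target, render_window_contents, reposition_window and
    overlay_window are the earlier illustrative sketches."""
    window = (*info.position, *info.size)               # window as (x, y, w, h), step 315
    tx, ty, tw, th = target_region
    center = (tx + tw // 2, ty + th // 2)
    for frame in frames:                                 # step 340: until reproduction ends
        box = track_target(frame, center)                # keep following the target object
        if box is not None:
            tx, ty, tw, th = box
            target_region = (tx, ty, tw, th)
            center = (tx + tw // 2, ty + th // 2)
        contents = render_window_contents(frame, center, info.size, info.zoom)
        window = reposition_window(window, target_region, screen_size)   # steps 325-330
        yield overlay_window(frame, contents, window[:2])                # steps 320/335
```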
  • In accordance with an embodiment of the present invention, an image reproduction apparatus recognizes a target object and displays the recognized target object in a window while a moving image is reproduced. However, the present invention may also be applied when a target object (e.g., text characters) designated by a user is displayed in a desired window 610 on a character output screen, as illustrated in FIG. 6.
  • Referring to FIG. 6, when the user designates a word as the target object, the image reproduction apparatus recognizes the relevant word and displays the recognized word in the window, which is set by the user. Additionally, the number of times that the recognized word appears can be tracked. For example, when the user must read user data (e.g., web pages or text from web surfing) on a small display mounted on a mobile terminal, the invention eliminates the inconvenience of having to read small characters. Accordingly, the enlarged view in the window can improve the readability of the user data.
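  • For the text case, counting how often the designated word appears could be as simple as the following sketch; the case-insensitive, whole-word matching rule is an assumption, since the text does not specify one.

```python
import re

def count_word_appearances(displayed_text, target_word):
    """Count how many times the designated word appears in the currently
    displayed text (case-insensitive, whole-word matches only)."""
    pattern = r"\b" + re.escape(target_word) + r"\b"
    return len(re.findall(pattern, displayed_text, flags=re.IGNORECASE))
```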
  • The various embodiments of the present invention enable a user to continuously view an image of a particular region, which has been selected from a displayed image, in an image reproduction apparatus (e.g., a TV or a mobile terminal), according to a desired position, size, and resolution.
  • While the present invention has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the appended claims and equivalents of the appended claims.

Claims (10)

1. A method for displaying an enlarged target region of an image reproduced by an image reproduction apparatus, the method comprising:
setting generation information on a window to be displayed on top of the reproduced image;
designating a target region in the reproduced image;
generating the window using the generation information on the window; and
displaying an enlarged image of the designated target region in the generated window.
2. The method as claimed in claim 1, wherein the generation information on the window includes at least one of a generation position, a size, a zoom magnification, and a resolution of the window.
3. The method as claimed in claim 1, further comprising:
determining if the window overlaps the designated target region; and
changing a position of the window, when the window overlaps the designated target region.
4. The method as claimed in claim 1, wherein displaying the enlarged image of the designated target region comprises:
extracting a target object in the designated target region;
tracking the extracted target object; and
continuously displaying the tracked target object in the window.
5. The method as claimed in claim 1, wherein a user sets the generation information on the window using a setting menu or a touch pen.
6. An apparatus for displaying an enlarged target region of a reproduced image, the apparatus comprising:
a window information generation unit for setting generation information on a window to be displayed on top of the reproduced image;
a target region selection unit for receiving, from a user, a target region in the reproduced image;
a window generation unit for generating the window using the generation information on the window; and
an image processing unit for displaying an image of the target region in the generated window.
7. The apparatus as claimed in claim 6, wherein the generation information on the window comprises at least one of a generation position, a size, a zoom magnification, and a resolution of the window.
8. The apparatus as claimed in claim 6, further comprising:
an exceptional situation processing unit for determining if the window overlaps the target region, and changing a position of the window when the window overlaps the target region.
9. The apparatus as claimed in claim 6, wherein the image processing unit extracts a target object in the target region, and tracks the extracted target object to continuously display the tracked target object in the window.
10. The apparatus as claimed in claim 6, wherein a user sets the generation information on the window using a setting menu or a touch pen.
US12/476,620 2008-07-30 2009-06-02 Apparatus and method for displaying an enlarged target region of a reproduced image Abandoned US20100026721A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/249,993 US9648269B2 (en) 2008-07-30 2014-04-10 Apparatus and method for displaying an enlarged target region of a reproduced image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2008-0074623 2008-07-30
KR1020080074623A KR101009881B1 (en) 2008-07-30 2008-07-30 Apparatus and method for zoom display of target area from reproducing image

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/249,993 Continuation US9648269B2 (en) 2008-07-30 2014-04-10 Apparatus and method for displaying an enlarged target region of a reproduced image

Publications (1)

Publication Number Publication Date
US20100026721A1 true US20100026721A1 (en) 2010-02-04

Family

ID=41607878

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/476,620 Abandoned US20100026721A1 (en) 2008-07-30 2009-06-02 Apparatus and method for displaying an enlarged target region of a reproduced image
US14/249,993 Active US9648269B2 (en) 2008-07-30 2014-04-10 Apparatus and method for displaying an enlarged target region of a reproduced image

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/249,993 Active US9648269B2 (en) 2008-07-30 2014-04-10 Apparatus and method for displaying an enlarged target region of a reproduced image

Country Status (5)

Country Link
US (2) US20100026721A1 (en)
EP (1) EP2314069A4 (en)
KR (1) KR101009881B1 (en)
CN (1) CN102106145A (en)
WO (1) WO2010013892A2 (en)

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110078566A1 (en) * 2009-09-30 2011-03-31 Konica Minolta Systems Laboratory, Inc. Systems, methods, tools, and user interface for previewing simulated print output
US20110241991A1 (en) * 2009-10-07 2011-10-06 Yasunobu Ogura Tracking object selection apparatus, method, program and circuit
WO2012050561A1 (en) * 2010-10-11 2012-04-19 Hewlett-Packard Development Company, L.P. A first image and a second image on a display
US20120092529A1 (en) * 2010-10-19 2012-04-19 Samsung Electronics Co., Ltd. Method for processing an image and an image photographing apparatus applying the same
US20130076944A1 (en) * 2011-09-26 2013-03-28 Sony Mobile Communications Japan, Inc. Image photography apparatus
JP2013186733A (en) * 2012-03-08 2013-09-19 Sony Corp Display control device, display control method, and computer-readable recording medium
US20130249952A1 (en) * 2012-03-23 2013-09-26 Canon Kabushiki Kaisha Drawing data generation apparatus, drawing data generation method, program, and drawing data generation system
EP2646946A1 (en) * 2010-12-01 2013-10-09 EchoStar Technologies L.L.C. User control of the display of matrix codes
CN103365612A (en) * 2012-03-26 2013-10-23 联想(北京)有限公司 Information processing method and electronic instrument
US20130332958A1 (en) * 2012-06-12 2013-12-12 Electronics And Telecommunications Research Institute Method and system for displaying user selectable picture
US20140047340A1 (en) * 2012-08-09 2014-02-13 Lg Electronics Inc. Mobile terminal and control method therefor
US20140059457A1 (en) * 2012-08-27 2014-02-27 Samsung Electronics Co., Ltd. Zooming display method and apparatus
US8717500B1 (en) * 2012-10-15 2014-05-06 At&T Intellectual Property I, L.P. Relational display of images
CN103959369A (en) * 2011-11-30 2014-07-30 株式会社理光 Display control apparatus, display control system, display control method, and computer program product
US20140253693A1 (en) * 2011-11-14 2014-09-11 Sony Corporation Information processing apparatus, method, and non-transitory computer-readable medium
US20140282670A1 (en) * 2012-12-28 2014-09-18 Turner Broadcasting System, Inc. Method and system for detecting and resolving conflicts in an automatic content recognition based system
WO2014197331A1 (en) * 2013-06-04 2014-12-11 Hrl Laboratories, Llc A system for detecting an object of interest in a scene
US20150062434A1 (en) * 2013-08-27 2015-03-05 Qualcomm Incorporated Systems, devices and methods for displaying pictures in a picture
US8988467B2 (en) 2011-10-13 2015-03-24 Microsoft Technology Licensing, Llc Touchscreen selection visual feedback
CN104754223A (en) * 2015-03-12 2015-07-01 广东欧珀移动通信有限公司 Method for generating thumbnail and shooting terminal
US9148686B2 (en) 2010-12-20 2015-09-29 Echostar Technologies, Llc Matrix code-based user interface
US9280515B2 (en) 2010-12-03 2016-03-08 Echostar Technologies L.L.C. Provision of alternate content in response to QR code
US9329966B2 (en) 2010-11-23 2016-05-03 Echostar Technologies L.L.C. Facilitating user support of electronic devices using matrix codes
US9367669B2 (en) 2011-02-25 2016-06-14 Echostar Technologies L.L.C. Content source identification using matrix barcode
US20160173940A1 (en) * 2012-06-29 2016-06-16 Toyota Jidosha Kabushiki Kaisha Image information provision device, image information provision system, and image information provision method
US9571888B2 (en) 2011-02-15 2017-02-14 Echostar Technologies L.L.C. Selection graphics overlay of matrix code
US9596500B2 (en) 2010-12-17 2017-03-14 Echostar Technologies L.L.C. Accessing content via a matrix code
US9652108B2 (en) 2011-05-20 2017-05-16 Echostar Uk Holdings Limited Progress bar
US9652180B2 (en) 2013-01-28 2017-05-16 Samsung Electronics Co., Ltd. Memory device, memory system, and control method performed by the memory system
US9686584B2 (en) 2011-02-28 2017-06-20 Echostar Technologies L.L.C. Facilitating placeshifting using matrix codes
US9706106B2 (en) 2014-09-05 2017-07-11 Lg Electronics Inc. Mobile terminal and method of controlling the same
US9736469B2 (en) 2011-02-28 2017-08-15 Echostar Technologies L.L.C. Set top box health and configuration
WO2017164656A2 (en) 2016-03-24 2017-09-28 Lg Electronics Inc. Display device and operating method thereof
US9781465B2 (en) 2010-11-24 2017-10-03 Echostar Technologies L.L.C. Tracking user interaction from a receiving device
US9792612B2 (en) 2010-11-23 2017-10-17 Echostar Technologies L.L.C. Facilitating user support of electronic devices using dynamic matrix code generation
US20170347153A1 (en) * 2015-04-16 2017-11-30 Tencent Technology (Shenzhen) Company Limited Method of zooming video images and mobile terminal
EP3120228A4 (en) * 2014-03-21 2018-02-28 Amazon Technologies, Inc. Object tracking in zoomed video
CN107852531A (en) * 2015-08-25 2018-03-27 Lg电子株式会社 Display device and its control method
US10002430B1 (en) 2013-06-04 2018-06-19 Hrl Laboratories, Llc Training system for infield training of a vision-based object detector
US10075673B2 (en) 2012-07-17 2018-09-11 Samsung Electronics Co., Ltd. System and method for providing image
EP3311582A4 (en) * 2015-06-17 2018-10-31 LG Electronics Inc. Display device and operating method thereof
EP3316592A4 (en) * 2015-06-29 2018-12-05 LG Electronics Inc. Display device and control method therefor
US10185976B2 (en) * 2014-07-23 2019-01-22 Target Brands Inc. Shopping systems, user interfaces and methods
US20190221184A1 (en) * 2016-07-29 2019-07-18 Mitsubishi Electric Corporation Display device, display control device, and display control method
EP3471424A4 (en) * 2016-06-13 2020-07-01 LG Electronics Inc. -1- Display device and display system including same
US11030420B2 (en) * 2011-10-19 2021-06-08 Microsoft Technology Licensing, Llc Translating language characters in media content
US11150787B2 (en) * 2015-11-20 2021-10-19 Samsung Electronics Co., Ltd. Image display device and operating method for enlarging an image displayed in a region of a display and displaying the enlarged image variously
US11404021B2 (en) 2015-06-18 2022-08-02 Samsung Electronics Co., Ltd. Electronic device and method of processing notification in electronic device
US20230010078A1 (en) * 2021-07-12 2023-01-12 Avago Technologies International Sales Pte. Limited Object or region of interest video processing system and method
US20230082451A1 (en) * 2021-09-10 2023-03-16 Acer Incorporated Intelligent zooming method and electronic device using the same
WO2023057040A1 (en) * 2021-10-04 2023-04-13 Jsc Yukon Advanced Optics Worldwide Enhanced picture-in-picture
US11884155B2 (en) * 2019-04-25 2024-01-30 Motional Ad Llc Graphical user interface for display of autonomous vehicle behaviors
US11908340B2 (en) * 2019-07-24 2024-02-20 Arris Enterprises Llc Magnification enhancement of video for visually impaired viewers

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102789777A (en) * 2011-05-17 2012-11-21 宏碁股份有限公司 Method and device for displaying of multiple displayers
CN102300067B (en) * 2011-09-09 2014-02-05 冠捷显示科技(厦门)有限公司 System capable of demonstrating television function information and demonstration method
US20130106888A1 (en) * 2011-11-02 2013-05-02 Microsoft Corporation Interactively zooming content during a presentation
KR20130119129A (en) * 2012-04-23 2013-10-31 삼성전자주식회사 Method for controlling window size and an electronic device thereof
WO2014100966A1 (en) * 2012-12-25 2014-07-03 华为技术有限公司 Video play method, terminal and system
CN104123520B (en) * 2013-04-28 2017-09-29 腾讯科技(深圳)有限公司 Two-dimensional code scanning method and device
KR102019128B1 (en) * 2013-05-10 2019-09-06 엘지전자 주식회사 Mobile terminal and controlling method thereof
CN104378545A (en) * 2013-08-16 2015-02-25 中兴通讯股份有限公司 Photographing processing method and system
JP6419421B2 (en) * 2013-10-31 2018-11-07 株式会社東芝 Image display device, image display method, and program
CN104333689A (en) * 2014-03-05 2015-02-04 广州三星通信技术研究有限公司 Method and device for displaying preview image during shooting
WO2015157584A1 (en) * 2014-04-10 2015-10-15 Fract, Inc. Systems and methods for identifying a region of interest on a map
US10091411B2 (en) * 2014-06-17 2018-10-02 Lg Electronics Inc. Mobile terminal and controlling method thereof for continuously tracking object included in video
CN107077823B (en) * 2014-09-16 2019-05-14 株式会社理光 Display device, display system and display control program
KR102304305B1 (en) 2015-01-21 2021-09-23 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20160381297A1 (en) 2015-06-26 2016-12-29 Jsc Yukon Advanced Optics Worldwide Providing enhanced situational-awareness using magnified picture-in-picture within a wide field-of-view optical image
KR102384521B1 (en) * 2015-07-20 2022-04-08 엘지전자 주식회사 Display device and controlling method thereof
KR102384520B1 (en) * 2015-06-29 2022-04-08 엘지전자 주식회사 Display device and controlling method thereof
KR102369588B1 (en) * 2015-08-19 2022-03-03 엘지전자 주식회사 Digital device and method of processing data the same
CN105872816A (en) * 2015-12-18 2016-08-17 乐视网信息技术(北京)股份有限公司 Method and device for amplifying video image
CN105824422B (en) * 2016-03-30 2019-07-26 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN107547727B (en) * 2016-06-24 2020-12-29 中兴通讯股份有限公司 Image preview method and device
CN106412431A (en) * 2016-09-30 2017-02-15 珠海市魅族科技有限公司 Image display method and device
KR102561305B1 (en) 2016-11-03 2023-07-31 한화비전 주식회사 Apparatus for Providing Image and Method Thereof
CN106791684A (en) * 2016-12-31 2017-05-31 深圳市乐信兴业科技有限公司 Electronic equipment tracking and relevant apparatus and system and mobile terminal
EP3677042A1 (en) * 2017-08-30 2020-07-08 Vid Scale, Inc. Tracked video zooming
CN108268200A (en) * 2018-01-22 2018-07-10 北京市商汤科技开发有限公司 Image processing method and device, electronic equipment, computer program and storage medium
CN111630486B (en) * 2018-01-30 2023-07-07 富士胶片株式会社 Electronic album device, method of operating electronic album device, and recording medium
US10796157B2 (en) * 2018-03-13 2020-10-06 Mediatek Inc. Hierarchical object detection and selection
CN108259838B (en) * 2018-03-19 2024-01-19 杭州度康科技有限公司 Electronic vision aid and image browsing method for same
CN110324690A (en) * 2018-03-30 2019-10-11 青岛海信电器股份有限公司 A kind of method of displaying target object, display device and display equipment
US11039196B2 (en) 2018-09-27 2021-06-15 Hisense Visual Technology Co., Ltd. Method and device for displaying a screen shot
CN109922363A (en) * 2019-03-15 2019-06-21 青岛海信电器股份有限公司 A kind of graphical user interface method and display equipment of display screen shot
CN110278481B (en) * 2019-06-25 2022-06-10 努比亚技术有限公司 Picture-in-picture implementation method, terminal and computer readable storage medium
CN113012028B (en) * 2019-12-19 2023-03-24 浙江宇视科技有限公司 Image processing method, device, equipment and storage medium
CN111107418B (en) * 2019-12-19 2022-07-12 北京奇艺世纪科技有限公司 Video data processing method, device, computer equipment and storage medium
CN112308780A (en) * 2020-10-30 2021-02-02 北京字跳网络技术有限公司 Image processing method, device, equipment and storage medium
CN113259743A (en) * 2020-12-28 2021-08-13 维沃移动通信有限公司 Video playing method and device and electronic equipment
CN113739927A (en) * 2021-07-26 2021-12-03 武汉高德智感科技有限公司 Infrared image display method and system
CN113805773A (en) * 2021-08-24 2021-12-17 上海联影医疗科技股份有限公司 Image display method, image display device, computer equipment and storage medium
KR20230034064A (en) * 2021-09-02 2023-03-09 삼성전자주식회사 Display apparatus and operating method thereof
CN113852757B (en) * 2021-09-03 2023-05-26 维沃移动通信(杭州)有限公司 Video processing method, device, equipment and storage medium

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100301024B1 (en) * 1999-07-15 2001-11-01 윤종용 Apparatus for enlarging a particular part of main picture for digital television receiver
US20040189804A1 (en) * 2000-02-16 2004-09-30 Borden George R. Method of selecting targets and generating feedback in object tracking systems
US6597736B1 (en) * 2000-03-29 2003-07-22 Cisco Technology, Inc. Throughput enhanced video communication
KR20020056976A (en) * 2000-12-30 2002-07-11 윤종용 Method for displaying magnification window on screen of display in portable terminal
KR20040075517A (en) * 2003-02-21 2004-08-30 엘지전자 주식회사 apparatus and method for zoom of display device
US20060045381A1 (en) * 2004-08-31 2006-03-02 Sanyo Electric Co., Ltd. Image processing apparatus, shooting apparatus and image display apparatus
JP4880292B2 (en) * 2005-11-24 2012-02-22 富士フイルム株式会社 Image processing method, image processing program, and image processing apparatus
JP4826316B2 (en) * 2006-03-31 2011-11-30 ソニー株式会社 Image processing apparatus and method, program, and recording medium
JP4719641B2 (en) * 2006-07-27 2011-07-06 ソニー株式会社 A moving image data providing method, a moving image data providing method program, a recording medium recording the moving image data providing method program, a moving image data providing apparatus, and a moving image data providing system.
CN100397411C (en) 2006-08-21 2008-06-25 北京中星微电子有限公司 People face track display method and system for real-time robust
KR20080040414A (en) 2006-11-03 2008-05-08 삼성전자주식회사 Apparatus and method for displaying screen in portable terminal
US8169495B2 (en) * 2006-12-01 2012-05-01 Broadcom Corporation Method and apparatus for dynamic panoramic capturing
KR101338977B1 (en) 2006-12-08 2013-12-09 삼성전자주식회사 Method and Apparatus for simultaneously viewing pictures of Terrestrial Digital Multimedia Broadcast
US8264545B2 (en) * 2006-12-11 2012-09-11 Nikon Corporation Electronic camera

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6388684B1 (en) * 1989-07-14 2002-05-14 Hitachi, Ltd. Method and apparatus for displaying a target region and an enlarged image
US6958749B1 (en) * 1999-11-04 2005-10-25 Sony Corporation Apparatus and method for manipulating a touch-sensitive display panel
US20020075407A1 (en) * 2000-12-15 2002-06-20 Philips Electronics North America Corporation Picture-in-picture repositioning and/or resizing based on video content analysis
US6970181B1 (en) * 2001-04-27 2005-11-29 Cisco Technology, Inc. Bandwidth conserving near-end picture-in-picture video applications
US20060072811A1 (en) * 2002-11-29 2006-04-06 Porter Robert Mark S Face detection
US20040258152A1 (en) * 2003-06-19 2004-12-23 Herz William S. System and method for using motion vectors for object tracking
US20070008338A1 (en) * 2005-05-28 2007-01-11 Young-Chan Kim Display system, display apparatus, and method of controlling video source and display apparatus
US20070040898A1 (en) * 2005-08-19 2007-02-22 Yen-Chi Lee Picture-in-picture processing for video telephony
US20070098229A1 (en) * 2005-10-27 2007-05-03 Quen-Zong Wu Method and device for human face detection and recognition used in a preset environment
US20070109324A1 (en) * 2005-11-16 2007-05-17 Qian Lin Interactive viewing of video
US20090009531A1 (en) * 2007-07-03 2009-01-08 Canon Kabushiki Kaisha Image display control apparatus and method

Cited By (90)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110078566A1 (en) * 2009-09-30 2011-03-31 Konica Minolta Systems Laboratory, Inc. Systems, methods, tools, and user interface for previewing simulated print output
US20110241991A1 (en) * 2009-10-07 2011-10-06 Yasunobu Ogura Tracking object selection apparatus, method, program and circuit
US8432357B2 (en) * 2009-10-07 2013-04-30 Panasonic Corporation Tracking object selection apparatus, method, program and circuit
US9324130B2 (en) 2010-10-11 2016-04-26 Hewlett-Packard Development Company, L.P. First image and a second image on a display
WO2012050561A1 (en) * 2010-10-11 2012-04-19 Hewlett-Packard Development Company, L.P. A first image and a second image on a display
GB2497878A (en) * 2010-10-11 2013-06-26 Hewlett Packard Development Co A first image and a second image on a display
US20120092529A1 (en) * 2010-10-19 2012-04-19 Samsung Electronics Co., Ltd. Method for processing an image and an image photographing apparatus applying the same
CN102572261A (en) * 2010-10-19 2012-07-11 三星电子株式会社 Method for processing an image and an image photographing apparatus applying the same
US9792612B2 (en) 2010-11-23 2017-10-17 Echostar Technologies L.L.C. Facilitating user support of electronic devices using dynamic matrix code generation
US9329966B2 (en) 2010-11-23 2016-05-03 Echostar Technologies L.L.C. Facilitating user support of electronic devices using matrix codes
US10382807B2 (en) 2010-11-24 2019-08-13 DISH Technologies L.L.C. Tracking user interaction from a receiving device
US9781465B2 (en) 2010-11-24 2017-10-03 Echostar Technologies L.L.C. Tracking user interaction from a receiving device
EP2646946A4 (en) * 2010-12-01 2015-04-01 Echostar Technologies Llc User control of the display of matrix codes
EP2646946A1 (en) * 2010-12-01 2013-10-09 EchoStar Technologies L.L.C. User control of the display of matrix codes
US9280515B2 (en) 2010-12-03 2016-03-08 Echostar Technologies L.L.C. Provision of alternate content in response to QR code
US9596500B2 (en) 2010-12-17 2017-03-14 Echostar Technologies L.L.C. Accessing content via a matrix code
US10015550B2 (en) 2010-12-20 2018-07-03 DISH Technologies L.L.C. Matrix code-based user interface
US9148686B2 (en) 2010-12-20 2015-09-29 Echostar Technologies, Llc Matrix code-based user interface
US9571888B2 (en) 2011-02-15 2017-02-14 Echostar Technologies L.L.C. Selection graphics overlay of matrix code
US9367669B2 (en) 2011-02-25 2016-06-14 Echostar Technologies L.L.C. Content source identification using matrix barcode
US9736469B2 (en) 2011-02-28 2017-08-15 Echostar Technologies L.L.C. Set top box health and configuration
US9686584B2 (en) 2011-02-28 2017-06-20 Echostar Technologies L.L.C. Facilitating placeshifting using matrix codes
US10165321B2 (en) 2011-02-28 2018-12-25 DISH Technologies L.L.C. Facilitating placeshifting using matrix codes
US10015483B2 (en) 2011-02-28 2018-07-03 DISH Technologies LLC. Set top box health and configuration
US9652108B2 (en) 2011-05-20 2017-05-16 Echostar Uk Holdings Limited Progress bar
US20150350559A1 (en) * 2011-09-26 2015-12-03 Sony Corporation Image photography apparatus
US10771703B2 (en) * 2011-09-26 2020-09-08 Sony Corporation Image photography apparatus
US20130076944A1 (en) * 2011-09-26 2013-03-28 Sony Mobile Communications Japan, Inc. Image photography apparatus
US9137444B2 (en) * 2011-09-26 2015-09-15 Sony Corporation Image photography apparatus for clipping an image region
US11252332B2 (en) * 2011-09-26 2022-02-15 Sony Corporation Image photography apparatus
US8988467B2 (en) 2011-10-13 2015-03-24 Microsoft Technology Licensing, Llc Touchscreen selection visual feedback
US11030420B2 (en) * 2011-10-19 2021-06-08 Microsoft Technology Licensing, Llc Translating language characters in media content
US10469767B2 (en) * 2011-11-14 2019-11-05 Sony Corporation Information processing apparatus, method, and non-transitory computer-readable medium
US20140253693A1 (en) * 2011-11-14 2014-09-11 Sony Corporation Information processing apparatus, method, and non-transitory computer-readable medium
CN103959369A (en) * 2011-11-30 2014-07-30 株式会社理光 Display control apparatus, display control system, display control method, and computer program product
EP2786368A1 (en) * 2011-11-30 2014-10-08 Ricoh Company, Ltd. Display control apparatus, display control system, display control method, and computer program product
EP2786368A4 (en) * 2011-11-30 2014-10-08 Ricoh Co Ltd Display control apparatus, display control system, display control method, and computer program product
JP2013186733A (en) * 2012-03-08 2013-09-19 Sony Corp Display control device, display control method, and computer-readable recording medium
US9159117B2 (en) * 2012-03-23 2015-10-13 Canon Kabushiki Kaisha Drawing data generation apparatus, drawing data generation method, program, and drawing data generation system for changing magnification of displayed images
US20130249952A1 (en) * 2012-03-23 2013-09-26 Canon Kabushiki Kaisha Drawing data generation apparatus, drawing data generation method, program, and drawing data generation system
CN103365612A (en) * 2012-03-26 2013-10-23 联想(北京)有限公司 Information processing method and electronic instrument
US20130332958A1 (en) * 2012-06-12 2013-12-12 Electronics And Telecommunications Research Institute Method and system for displaying user selectable picture
US9693108B2 (en) * 2012-06-12 2017-06-27 Electronics And Telecommunications Research Institute Method and system for displaying user selectable picture
US20160173940A1 (en) * 2012-06-29 2016-06-16 Toyota Jidosha Kabushiki Kaisha Image information provision device, image information provision system, and image information provision method
US10075673B2 (en) 2012-07-17 2018-09-11 Samsung Electronics Co., Ltd. System and method for providing image
KR101917695B1 (en) * 2012-08-09 2018-11-13 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
US20140047340A1 (en) * 2012-08-09 2014-02-13 Lg Electronics Inc. Mobile terminal and control method therefor
EP2696278A3 (en) * 2012-08-09 2015-04-15 LG Electronics Inc. Mobile terminal and control method therefor
US20140059457A1 (en) * 2012-08-27 2014-02-27 Samsung Electronics Co., Ltd. Zooming display method and apparatus
US8902361B2 (en) 2012-10-15 2014-12-02 At&T Intellectual Property I, L.P. Relational display of images
US8717500B1 (en) * 2012-10-15 2014-05-06 At&T Intellectual Property I, L.P. Relational display of images
US9100616B2 (en) 2012-10-15 2015-08-04 At&T Intellectual Property I, L.P. Relational display of images
US9538116B2 (en) 2012-10-15 2017-01-03 At&T Intellectual Property I, L.P. Relational display of images
US9313444B2 (en) 2012-10-15 2016-04-12 At&T Intellectual Property I, Lp Relational display of images
US20140282670A1 (en) * 2012-12-28 2014-09-18 Turner Broadcasting System, Inc. Method and system for detecting and resolving conflicts in an automatic content recognition based system
US9154841B2 (en) * 2012-12-28 2015-10-06 Turner Broadcasting System, Inc. Method and system for detecting and resolving conflicts in an automatic content recognition based system
US9652180B2 (en) 2013-01-28 2017-05-16 Samsung Electronics Co., Ltd. Memory device, memory system, and control method performed by the memory system
US10002430B1 (en) 2013-06-04 2018-06-19 Hrl Laboratories, Llc Training system for infield training of a vision-based object detector
US9378420B1 (en) 2013-06-04 2016-06-28 Hrl Laboratories, Llc System for detecting an object of interest in a scene
WO2014197331A1 (en) * 2013-06-04 2014-12-11 Hrl Laboratories, Llc A system for detecting an object of interest in a scene
EP3039858A1 (en) * 2013-08-27 2016-07-06 Qualcomm Incorporated Systems, devices and methods for displaying pictures in a picture
JP2016538601A (en) * 2013-08-27 2016-12-08 クゥアルコム・インコーポレイテッドQualcomm Incorporated System, device, and method for displaying picture-in-picture
US20150062434A1 (en) * 2013-08-27 2015-03-05 Qualcomm Incorporated Systems, devices and methods for displaying pictures in a picture
US9973722B2 (en) * 2013-08-27 2018-05-15 Qualcomm Incorporated Systems, devices and methods for displaying pictures in a picture
EP3120228A4 (en) * 2014-03-21 2018-02-28 Amazon Technologies, Inc. Object tracking in zoomed video
US10664140B2 (en) 2014-03-21 2020-05-26 Amazon Technologies, Inc. Object tracking in zoomed video
US10185976B2 (en) * 2014-07-23 2019-01-22 Target Brands Inc. Shopping systems, user interfaces and methods
US9706106B2 (en) 2014-09-05 2017-07-11 Lg Electronics Inc. Mobile terminal and method of controlling the same
EP3190496A4 (en) * 2014-09-05 2018-01-24 LG Electronics Inc. Mobile terminal and control method therefor
CN104754223A (en) * 2015-03-12 2015-07-01 广东欧珀移动通信有限公司 Method for generating thumbnail and shooting terminal
US20170347153A1 (en) * 2015-04-16 2017-11-30 Tencent Technology (Shenzhen) Company Limited Method of zooming video images and mobile terminal
US10397649B2 (en) * 2015-04-16 2019-08-27 Tencent Technology (Shenzhen) Company Limited Method of zooming video images and mobile display terminal
EP3311582A4 (en) * 2015-06-17 2018-10-31 LG Electronics Inc. Display device and operating method thereof
US11404021B2 (en) 2015-06-18 2022-08-02 Samsung Electronics Co., Ltd. Electronic device and method of processing notification in electronic device
EP3316592A4 (en) * 2015-06-29 2018-12-05 LG Electronics Inc. Display device and control method therefor
US10194112B2 (en) 2015-06-29 2019-01-29 Lg Electronics Inc. Display device and control method therefor
CN107852531A (en) * 2015-08-25 2018-03-27 Lg电子株式会社 Display device and its control method
EP3343939A4 (en) * 2015-08-25 2019-01-23 LG Electronics Inc. Display device and control method therefor
US11150787B2 (en) * 2015-11-20 2021-10-19 Samsung Electronics Co., Ltd. Image display device and operating method for enlarging an image displayed in a region of a display and displaying the enlarged image variously
EP3434005A4 (en) * 2016-03-24 2020-01-15 LG Electronics Inc. -1- Display device and operating method thereof
WO2017164656A2 (en) 2016-03-24 2017-09-28 Lg Electronics Inc. Display device and operating method thereof
US10162432B2 (en) 2016-03-24 2018-12-25 Lg Electronics Inc. Display device and operating method thereof
EP3471424A4 (en) * 2016-06-13 2020-07-01 LG Electronics Inc. -1- Display device and display system including same
US10983745B2 (en) 2016-06-13 2021-04-20 Lg Electronics Inc. Display device and display system including same
US20190221184A1 (en) * 2016-07-29 2019-07-18 Mitsubishi Electric Corporation Display device, display control device, and display control method
US11884155B2 (en) * 2019-04-25 2024-01-30 Motional Ad Llc Graphical user interface for display of autonomous vehicle behaviors
US11908340B2 (en) * 2019-07-24 2024-02-20 Arris Enterprises Llc Magnification enhancement of video for visually impaired viewers
US20230010078A1 (en) * 2021-07-12 2023-01-12 Avago Technologies International Sales Pte. Limited Object or region of interest video processing system and method
US20230082451A1 (en) * 2021-09-10 2023-03-16 Acer Incorporated Intelligent zooming method and electronic device using the same
WO2023057040A1 (en) * 2021-10-04 2023-04-13 Jsc Yukon Advanced Optics Worldwide Enhanced picture-in-picture

Also Published As

Publication number Publication date
EP2314069A2 (en) 2011-04-27
US9648269B2 (en) 2017-05-09
US20140218611A1 (en) 2014-08-07
WO2010013892A3 (en) 2010-03-25
EP2314069A4 (en) 2011-08-31
WO2010013892A2 (en) 2010-02-04
KR101009881B1 (en) 2011-01-19
KR20100013106A (en) 2010-02-09
CN102106145A (en) 2011-06-22

Similar Documents

Publication Publication Date Title
US9648269B2 (en) Apparatus and method for displaying an enlarged target region of a reproduced image
US7876978B2 (en) Regions of interest in video frames
CN107770627B (en) Image display apparatus and method of operating the same
US20080307458A1 (en) Multichannel display method and system for a digital broadcast-enabled mobile terminal
JP4956216B2 (en) Digital broadcast program display device and digital broadcast program display program
KR101851630B1 (en) Mobile terminal and image converting method thereof
EP1793309A1 (en) Display apparatus and searching method
US20200186887A1 (en) Real-time broadcast editing system and method
CN110929054B (en) Multimedia information application interface display method and device, terminal and medium
US11716300B2 (en) Techniques for optimizing the display of videos
EP1825472B1 (en) Method and apparatus for video editing on small screen with minimal input device
JP2012520052A (en) Interactive access to media or other content related to the program you are currently viewing
US20120060187A1 (en) Method for providing channel list and display apparatus applying the same
US20120301030A1 (en) Image processing apparatus, image processing method and recording medium
CN101242474A (en) A dynamic video browse method for phone on small-size screen
KR102029604B1 (en) Editing system and editing method for real-time broadcasting
US20070083893A1 (en) Display apparatus and control method thereof
KR101481563B1 (en) Terminal and method of controlling broadcasting therein
JP2005080020A (en) Mobile information device
JP2015037290A (en) Video control device, video display device, video control system, and method
KR101893146B1 (en) Mobile terminal and method for controlling the same
WO2014122798A1 (en) Image processing device, and image processing method
JP2007104394A (en) Image displaying device and image recording device
US20070080942A1 (en) Method for searching data in a wireless terminal
KR100787573B1 (en) Method for processing of channel guide in mobile communication terminal and mobile communication terminal therefor

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD.,KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PARK, HYUN-HEE;KIM, JONG-MAN;LEE, MIN-WOO;AND OTHERS;REEL/FRAME:022807/0985

Effective date: 20090526

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION