CN105007531A - Image display device and control method thereof - Google Patents
- Publication number
- CN105007531A CN105007531A CN201510196687.5A CN201510196687A CN105007531A CN 105007531 A CN105007531 A CN 105007531A CN 201510196687 A CN201510196687 A CN 201510196687A CN 105007531 A CN105007531 A CN 105007531A
- Authority
- CN
- China
- Prior art keywords
- progress bar
- video
- fragment
- image display
- playback
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/47217—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/36—Monitoring, i.e. supervising the progress of recording or reproducing
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B33/00—Constructional parts, details or accessories not provided for in the other groups of this subclass
- G11B33/12—Disposition of constructional parts in the apparatus, e.g. of power supply, of modules
- G11B33/121—Disposition of constructional parts in the apparatus, e.g. of power supply, of modules the apparatus comprising a single recording/reproducing device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/93—Regeneration of the television signal or of selected parts thereof
Abstract
The invention relates to an image display device and a control method thereof. The image display device includes a display unit for playing video, and the method may include: selecting at least one of the characters contained in a video; searching, using at least one frame of the video, for at least one region of the frame that contains the selected character; extracting a major playback section containing the selected character based on at least one of the size and the location of the retrieved region; and playing the extracted major playback section.
Description
Technical field
The present disclosure relates to an image display device that includes a display unit for playing video, and to a control method thereof.
Background technology
Terminals may be classified into two types based on their mobility, namely mobile/portable terminals and stationary terminals. Mobile terminals may further be classified into handheld terminals and vehicle-mounted terminals according to whether they can be carried directly by a user.
As terminals have become multifunctional (for example, capable of capturing still or moving images, playing music or video files, playing games, receiving broadcasts, and the like), they have come to be implemented as integrated multimedia players. For example, a terminal may be implemented as an image display device for receiving and displaying video and audio.
Various new attempts have been made, in hardware and in software, to realize such complex functions in a multimedia player. For example, a user interface environment is provided for video playback operations such as fast-forwarding or rewinding a video.
In addition, functional changes and enhancements that allow users to operate the image display device more easily are required. An image display device that plays a main playback section, together with a user interface associated with the main playback section, may be considered one such functional change or enhancement.
Summary of the invention
One aspect of the present disclosure is to provide an image display device, and a control method thereof, capable of selectively playing a main playback section associated with at least one of the characters shown in a video.
Another aspect of the present disclosure is to provide an image display device, and a control method thereof, capable of using a distinct progress bar to indicate the playback progress of the main playback section.
Still another aspect of the present disclosure is to provide an image display device capable of indicating characteristics of a character on the progress bar that shows the playback progress of the video.
To achieve the foregoing objects, a method for controlling a display device is provided. The method may include: identifying parts of a video that contain a person selected by a user; defining a main playback section of the video as one or more of the identified parts that satisfy a preset criterion; and displaying the main playback section of the video on a display of the display device.
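The identify-and-define steps above can be sketched in Python. This is a minimal illustration under stated assumptions, not the patented implementation: the per-frame detection results, the frame rate, and the two-second length criterion are all invented for the example.

```python
# Frames in which a detector has found the selected person are grouped
# into contiguous parts; parts meeting a preset criterion (here, a
# minimum length) form the main playback section.

FPS = 30
MIN_PART_FRAMES = 2 * FPS  # assumed preset criterion: at least 2 seconds

def group_into_parts(frames_with_person):
    """Group sorted frame indices into contiguous [start, end] parts."""
    parts = []
    for f in sorted(frames_with_person):
        if parts and f == parts[-1][1] + 1:
            parts[-1][1] = f            # extend the current part
        else:
            parts.append([f, f])        # start a new part
    return parts

def main_playback_section(frames_with_person):
    """Keep only the parts that satisfy the preset length criterion."""
    return [p for p in group_into_parts(frames_with_person)
            if p[1] - p[0] + 1 >= MIN_PART_FRAMES]

# Example: two long appearances and one brief one.
detected = list(range(0, 90)) + list(range(200, 205)) + list(range(400, 520))
print(main_playback_section(detected))   # → [[0, 89], [400, 519]]
```

Note that the brief five-frame appearance is excluded, which matches the claim that only identified parts meeting the preset criterion become part of the main playback section.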
According to an embodiment, the method further includes displaying, while the video is displayed, a progress bar indicating the playback state of the video, wherein the progress bar includes a first progress bar corresponding to the entire playback section of the video and a second progress bar corresponding to the main playback section.
According to an embodiment, the preset criterion includes at least one of the length of a scene or part of the video containing the selected person, the position of the selected person in the scene or part, and the proportion of the scene or part occupied by the selected person; and the main playback section includes a plurality of segments spaced apart from one another.
According to an embodiment, the method further includes: displaying the first progress bar and the second progress bar so as to overlap; and displaying the portion of the second progress bar that overlaps the first progress bar in a distinguishable manner, so that, within the first progress bar, the second progress bar is distinguishable from the portion of the first progress bar representing segments of the video that do not correspond to the main playback section.
According to an embodiment, the first progress bar and the second progress bar are displayed independently, so that the first progress bar indicates the current playback position within the entire playback section of the video, the second progress bar indicates the current playback position within the main playback section, and each of the plurality of segments corresponds to a portion of the second progress bar.
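The independent mapping described in this embodiment can be illustrated as follows. The video length, the segment boundaries, and the use of seconds as units are assumptions made for the sketch.

```python
# The first bar maps the current time over the whole video; the second
# maps it over the main playback section only, as if its spaced-apart
# segments were concatenated.

VIDEO_LEN = 600.0                          # assumed 10-minute video
SEGMENTS = [(30.0, 90.0), (300.0, 360.0)]  # assumed main playback section

def first_bar_pos(t):
    """Position on the first bar: fraction of the whole video elapsed."""
    return t / VIDEO_LEN

def second_bar_pos(t):
    """Position on the second bar: fraction of the main section elapsed."""
    total = sum(e - s for s, e in SEGMENTS)
    elapsed = 0.0
    for s, e in SEGMENTS:
        if t >= e:
            elapsed += e - s               # segment fully played
        elif t > s:
            elapsed += t - s               # currently inside this segment
    return elapsed / total

print(first_bar_pos(330.0))                # → 0.55
print(second_bar_pos(330.0))               # → 0.75  (60 s + 30 s of 120 s)
```

At t = 330 s the two bars disagree, which is exactly the point of displaying them independently: the viewer sees both overall progress and progress through the selected person's segments.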
According to an embodiment, the method further includes: displaying a plurality of indicators, each indicator corresponding to one of the plurality of segments indicated by the second progress bar and indicating importance information of the corresponding segment with respect to the selected person; and displaying the plurality of indicators in various sizes according to the importance information of each segment.
According to an embodiment, the importance information is determined based on at least the position or the size of the selected person in the scene or part of the video.
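One plausible way to score such importance from the size and position of the selected person's region is sketched below. The particular weighting, the frame dimensions, and the (x, y, w, h) region format are all assumptions; the patent only names size and position as inputs.

```python
# A segment is scored as more important when the selected person's
# region is larger relative to the frame and closer to the frame center.

FRAME_W, FRAME_H = 1920, 1080   # assumed frame dimensions

def importance(region):
    """Score a face region (x, y, w, h) by relative size and centrality."""
    x, y, w, h = region
    size_ratio = (w * h) / (FRAME_W * FRAME_H)
    cx, cy = x + w / 2, y + h / 2
    # Normalized distance from frame center: 0 at center, 1 at a corner.
    dx = abs(cx - FRAME_W / 2) / (FRAME_W / 2)
    dy = abs(cy - FRAME_H / 2) / (FRAME_H / 2)
    centrality = 1 - (dx + dy) / 2
    return size_ratio * centrality

centered = (760, 340, 400, 400)   # large face near the center
corner = (0, 0, 100, 100)         # small face in a corner
assert importance(centered) > importance(corner)
```

Any monotone combination of size and centrality would satisfy the claim language; the product used here is merely one simple choice.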
According to an embodiment, displaying the main playback section includes continuously displaying the frames included in the plurality of segments.
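Continuous display of frames drawn from spaced-apart segments can be sketched as a generator that concatenates the segments' frame ranges; decoding and frame-rate handling are omitted, and the segment bounds are illustrative.

```python
# Walk each segment in order and yield frame indices back to back,
# skipping everything in between, so playback appears continuous.

def main_section_frames(segments):
    """Yield frame indices of sorted, inclusive (start, end) segments."""
    for start, end in segments:
        for frame in range(start, end + 1):
            yield frame

segments = [(0, 2), (10, 12)]
print(list(main_section_frames(segments)))   # → [0, 1, 2, 10, 11, 12]
```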
According to an embodiment, the method further includes editing the main playback section in response to a user input received via the progress bar.
According to an embodiment, selecting the person includes: recognizing a person, or the face of a person, in a paused scene of the video whose playback has been suspended; displaying, on the paused screen, a graphic object for selecting the recognized person; and selecting the recognized person in response to a touch applied to the graphic object.
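The pause-and-select flow above can be sketched as a hit test of the touch position against graphic objects overlaid on recognized faces. The face boxes, names, and coordinates here are illustrative assumptions rather than the output of an actual recognizer.

```python
# Each recognized face gets a graphic object (here, just a bounding
# box); a touch is hit-tested against those boxes to pick a person.

def hit_test(touch, objects):
    """Return the name attached to the first box containing the touch."""
    tx, ty = touch
    for name, (x, y, w, h) in objects.items():
        if x <= tx <= x + w and y <= ty <= y + h:
            return name
    return None

graphic_objects = {"person_a": (100, 100, 80, 80),
                   "person_b": (400, 120, 80, 80)}
print(hit_test((430, 150), graphic_objects))   # → person_b
```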
In addition, to achieve the foregoing objects, an image display device according to an embodiment of the present disclosure may include: a display configured to display video; and a controller configured to identify parts of the video that contain a person selected by a user, define a main playback section of the video as one or more of the identified parts that satisfy a preset criterion, and cause the display to display the main playback section of the video.
According to an embodiment, the controller is further configured to cause the display to show, while the video is displayed, a progress bar indicating the playback state of the video, wherein the progress bar includes a first progress bar corresponding to the entire playback section of the video and a second progress bar corresponding to the main playback section.
According to an embodiment, the preset criterion includes at least one of the length of a scene or part of the video containing the selected person, the position of the selected person in the scene or part, and the proportion of the scene or part occupied by the selected person; and the main playback section includes a plurality of segments spaced apart from one another.
According to an embodiment, the controller is further configured to: cause the display to show the first progress bar and the second progress bar so as to overlap, and to display the portion of the second progress bar that overlaps the first progress bar in a distinguishable manner, so that, within the first progress bar, the second progress bar is distinguishable from the portion of the first progress bar representing segments of the video that do not correspond to the main playback section.
According to an embodiment, the first progress bar and the second progress bar are displayed independently by the display, so that the first progress bar indicates the current playback position within the entire playback section of the video, the second progress bar indicates the current playback position within the main playback section, and each of the plurality of segments corresponds to a portion of the second progress bar.
According to an embodiment, the controller is further configured to: cause the display to show a plurality of indicators, each indicator corresponding to one of the plurality of segments indicated by the second progress bar and indicating importance information of the corresponding segment with respect to the selected person; and cause the display to show the plurality of indicators in various sizes according to the importance information of each segment.
According to an embodiment, the importance information is determined based on at least the position or the size of the selected person in the scene or part of the video.
According to an embodiment, the controller is further configured to cause the display to continuously show the frames included in the plurality of segments.
According to an embodiment, the controller is further configured to edit the main playback section in response to a user input received via the progress bar.
According to an embodiment, the controller is further configured to: recognize a person, or the face of a person, in a paused scene of the video whose playback has been suspended; display, on the paused screen, a graphic object for selecting the recognized person; and select the recognized person in response to a touch applied to the graphic object.
In addition, an image display device according to an embodiment of the present disclosure can, while displaying the progress bar, calculate a character importance based on at least one of the size and the position of the region containing the character, and display different images according to the calculated character importance. Because a relatively larger image is displayed as the importance increases, the user can check the weight of a specific character, or its relation to other characters, using the different images displayed adjacent to the progress bar.
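The size-by-importance behavior described here can be sketched as a linear scaling of each indicator between assumed minimum and maximum pixel sizes; the pixel bounds and the linear mapping are illustrative choices, not details taken from the patent.

```python
# Each segment's thumbnail indicator is scaled in proportion to its
# importance, so more important segments get larger images next to the
# progress bar.

MIN_PX, MAX_PX = 40, 120   # assumed indicator size bounds

def indicator_sizes(importances):
    """Map importance scores to pixel sizes between MIN_PX and MAX_PX."""
    lo, hi = min(importances), max(importances)
    span = hi - lo or 1.0              # avoid division by zero
    return [round(MIN_PX + (i - lo) / span * (MAX_PX - MIN_PX))
            for i in importances]

print(indicator_sizes([0.1, 0.4, 0.9]))   # → [40, 70, 120]
```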
Accompanying drawing explanation
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
In the accompanying drawings:
Figure 1A is a block diagram illustrating a mobile terminal associated with the present disclosure;
Figures 1B and 1C are conceptual views of one example of the mobile terminal associated with the present disclosure, viewed from different directions;
Fig. 2 is a conceptual view for explaining a modified example of a mobile terminal 200 according to the present disclosure;
Fig. 3 is a perspective view illustrating an example of a watch-type mobile terminal 300 associated with another embodiment of the present disclosure;
Fig. 4 is a perspective view illustrating an example of an image display device 400 of a fixed-terminal type associated with another embodiment of the present disclosure;
Fig. 5 is a block diagram specifically illustrating the external input device 500 shown in Fig. 2;
Fig. 6 is a flow chart representatively illustrating a control method of the present disclosure;
Figs. 7A, 7B and 7C are views for specifically explaining a method of extracting a main playback section in the control method of Fig. 6;
Figs. 8A, 8B, 8C, 8D and 8E are conceptual views illustrating an example of an operation of selecting a specific character, realized by the control method of Fig. 6;
Figs. 9A, 9B and 9C are conceptual views illustrating various embodiments for displaying a progress bar;
Figs. 10A and 10B are conceptual views illustrating control of a progress bar corresponding to a main playback section;
Figs. 11A, 11B and 11C are conceptual views showing a method of displaying a main playback section while a progress bar is displayed;
Figs. 12A, 12B and 12C are conceptual views illustrating a method of editing a main playback section on a progress bar;
Figs. 13A, 13B and 13C are views for explaining a method of executing a specific-character playback mode during playback of a video in an image display device according to an embodiment of the invention;
Figs. 14, 15A, 15B and 15C are views illustrating a method of changing a progress bar in an image display device according to an embodiment of the present disclosure; and
Fig. 16 is a conceptual view illustrating a method of receiving, from a user, a criterion for extracting a main playback section in an image display device according to an embodiment of the present disclosure.
Embodiment
Description will now be given in detail of the exemplary embodiments disclosed herein, with reference to the accompanying drawings. For the sake of brief description with reference to the drawings, the same or equivalent components will be provided with the same reference numbers, and description thereof will not be repeated. The suffixes "module" and "unit" for the elements used in the following description are given merely for ease of description, and the suffixes themselves do not carry any special meaning or function. In describing the present disclosure, where a detailed explanation of a related known function or construction is considered to unnecessarily divert the gist of the present disclosure, such explanation has been omitted but would be understood by those skilled in the art. The accompanying drawings are used to help easily understand the technical idea of the present disclosure, and it should be understood that the idea of the present disclosure is not limited by the accompanying drawings. The idea of the present disclosure should be construed to extend to any alterations, equivalents and substitutes besides the accompanying drawings.
It will be understood that although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another.
It will be understood that when an element is referred to as being "connected with" another element, the element can be connected with the other element, or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected with" another element, there are no intervening elements present.
A singular representation may include a plural representation unless it represents a definitely different meaning from the context.
Terms such as "include" or "has" are used herein and should be understood to indicate the existence of several components or steps disclosed in this specification, and it should also be understood that some of the components or steps may not be included, or that additional components or steps may further be included.
Image display devices described herein may include cellular phones, smart phones, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, slate PCs, tablet PCs, ultrabooks, wearable devices (for example, smart watches, smart glasses, head-mounted displays (HMDs)), and the like.
However, those skilled in the art will readily appreciate that the configuration according to the exemplary embodiments of this specification can also be applied to stationary terminals such as digital TVs, desktop computers and the like, except for cases applicable only to image display devices.
Referring to Figs. 1A to 1C, Fig. 1A is a block diagram of an image display device 100 according to an embodiment of the present disclosure, and Figs. 1B and 1C are conceptual views of one example of the image display device, viewed from different directions.
The image display device 100 may include components such as a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, a power supply unit 190 and the like. Fig. 1A illustrates the image display device having various components, but it is understood that implementing all of the illustrated components is not a requirement; greater or fewer components may alternatively be implemented.
In more detail, the wireless communication unit 110 among those components may typically include one or more modules which permit wireless communications between the image display device 100 and a wireless communication system, between the image display device 100 and another image display device 100, or between the image display device 100 and a network in which another image display device 100 (or an external server) is located.
For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, a location information module 115 and the like.
The input unit 120 may include a camera 121 for inputting an image signal, a microphone 122 or an audio input module for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push key (or a mechanical key), and the like) for allowing a user to input information. Audio data or image data collected by the input unit 120 may be analyzed and processed into a user's control command.
The sensing unit 140 may include at least one sensor which senses at least one of internal information of the image display device, a surrounding environment of the image display device, and user information. For example, the sensing unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, refer to the camera 121), a microphone 122, a battery gauge, an environment sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor, etc.), and a chemical sensor (for example, an electronic nose, a health care sensor, a biometric sensor, etc.). On the other hand, the image display device disclosed herein may utilize information obtained by combining information sensed by at least two of those sensors.
The output unit 150 may be configured to output an audio signal, a video signal or a tactile signal. The output unit 150 may include a display unit 151, an audio output module 152, a haptic module 153, an optical output module 154 and the like. The display unit 151 may have an inter-layered structure or an integrated structure with a touch sensor so as to implement a touch screen. The touch screen may provide an output interface between the image display device 100 and a user, and also function as the user input unit 123 which provides an input interface between the image display device 100 and the user.
The interface unit 160 may serve as an interface with various types of external devices connected with the image display device 100. For example, the interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In response to an external device being connected to the interface unit 160, the image display device 100 may execute an appropriate control associated with the connected external device.
The memory 170 may store application programs (or applications) executed in the image display device 100, data for operations of the image display device 100, instruction words, and the like. At least some of those application programs may be downloaded from an external server via wireless communication. Others of those application programs may be installed in the image display device 100 at the time of shipping, for basic functions of the image display device 100 (for example, receiving a call, placing a call, receiving a message, sending a message, etc.). On the other hand, the application programs may be stored in the memory 170, installed in the image display device 100, and executed by the controller 180 to perform an operation (or a function) of the image display device 100.
The controller 180 may typically control an overall operation of the image display device 100, in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are input or output by the aforementioned components, or by activating the application programs stored in the memory 170.
The controller 180 may control at least part of the components illustrated in FIG. 1 in order to drive the application programs stored in the memory 170. In addition, the controller 180 may drive the application programs by combining at least two of the components included in the image display device 100 for operation.
The power supply unit 190 may receive external power or internal power and supply appropriate power required for operating the respective elements and components included in the image display device 100 under the control of the controller 180. The power supply unit 190 may include a battery, and the battery may be an embedded battery or a replaceable battery.
At least part of those elements and components may be combined to implement the operation and control of the image display device, or a control method of the image display device, according to the various exemplary embodiments described herein. Also, the control method of the image display device may be implemented in the image display device by activating at least one application program stored in the memory 170.
Hereinafter, each of the aforementioned components will be described in more detail with reference to FIG. 1A, prior to explaining the various exemplary embodiments implemented by the image display device 100 having this configuration.
First, the wireless communication unit 110 will be described. The broadcast receiving module 111 of the wireless communication unit 110 may receive broadcast signals and/or broadcast associated information from an external broadcast managing entity via a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. At least two broadcast receiving modules 111 may be provided in the image display device 100 to simultaneously receive at least two broadcast channels or to switch between broadcast channels.
The mobile communication module 112 may transmit wireless signals to and/or receive wireless signals from at least one network entity, for example, a base station, an external image display device, a server, and the like, on a mobile communication network which is constructed according to technical standards or transmission methods for mobile communications (for example, Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), Long Term Evolution (LTE), etc.).
Here, the wireless signals may include an audio call signal, a video (telephony) call signal, or various formats of data according to transmission/reception of text/multimedia messages.
The wireless Internet module 113 denotes a module for wireless Internet access. This module may be internally or externally coupled to the image display device 100. The wireless Internet module 113 may transmit and/or receive wireless signals via communication networks according to wireless Internet technologies.
Examples of such wireless Internet access may include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi) Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and the like. The wireless Internet module 113 may transmit/receive data according to at least one wireless Internet technology, including Internet technologies not mentioned above.
From the perspective that wireless Internet access according to WiBro, HSDPA, GSM, CDMA, WCDMA, LTE and the like is executed via a mobile communication network, the wireless Internet module 113 which performs the wireless Internet access via the mobile communication network may be understood as a type of the mobile communication module 112.
The short-range communication module 114 denotes a module for short-range communications. Technologies suitable for implementing such short-range communications may include BLUETOOTH™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, and the like. The short-range communication module 114 may support, via wireless area networks, communications between the image display device 100 and a wireless communication system, communications between the image display device 100 and another image display device 100, or communications between the image display device and a network where another image display device 100 (or an external server) is located.
Here, the other image display device 100 may be a wearable device, for example, a smart watch, smart glasses or a head mounted display (HMD), which is able to exchange data with the image display device 100 (or to cooperate with the image display device 100). The short-range communication module 114 may sense (recognize) a wearable device, which is able to communicate with the image display device, near the image display device 100. In addition, when the sensed wearable device is a device which is authenticated to communicate with the image display device 100 according to an embodiment of the present disclosure, the controller 180 may transmit at least part of data processed in the image display device 100 to the wearable device via the short-range communication module 114. Hence, a user of the wearable device may use the data processed in the image display device 100 on the wearable device. For example, when a call is received in the image display device 100, the user may answer the call using the wearable device. Also, when a message is received in the image display device 100, the user may check the received message using the wearable device.
The location information module 115 denotes a module for detecting or calculating a position of the image display device. Examples of the location information module 115 may include a Global Position System (GPS) module or a Wi-Fi module. For example, when the image display device uses the GPS module, the position of the image display device may be acquired using a signal sent from a GPS satellite. As another example, when the image display device uses the Wi-Fi module, the position of the image display device may be acquired based on information related to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module.
Hereinafter, the input unit 120 will be described in more detail. The input unit 120 may be configured to provide an audio or video signal (or information) input to the image display device, or information input by a user, to the image display device. For the input of image information, the image display device 100 may include one or a plurality of cameras 121. The camera 121 may process image frames of still pictures or video obtained by image sensors in a video call mode or a capture mode. The processed image frames may be displayed on the display unit 151. On the other hand, the plurality of cameras 121 disposed in the image display device 100 may be arranged in a matrix configuration. By use of the cameras 121 having the matrix configuration, a plurality of image information having various angles or focal points may be input into the image display device 100. Also, the plurality of cameras 121 may be arranged in a stereoscopic structure to acquire a left image and a right image for implementing a stereoscopic image.
The microphone 122 may process an external audio signal into electric audio data. The processed audio data may be utilized in various manners according to a function being executed in the image display device 100 (or an application program being executed). On the other hand, the microphone 122 may include assorted noise removing algorithms to remove noise generated in the course of receiving the external audio signal.
The user input unit 123 may receive information input by a user. When information is input through the user input unit 123, the controller 180 may control an operation of the image display device 100 to correspond to the input information. The user input unit 123 may include a mechanical input element (or a mechanical key, for example, a button located on a front/rear surface or a side surface of the image display device 100, a dome switch, a jog wheel, a jog switch, etc.) and a touch-sensitive input means. As one example, the touch-sensitive input means may be a virtual key, a soft key or a visual key, which is displayed on a touch screen through software processing, or a touch key which is disposed on a portion other than the touch screen. On the other hand, the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, graphics, text, icons, video, or a combination thereof.
The sensing unit 140 may sense at least one of internal information of the image display device, surrounding environment information of the image display device, and user information, and generate a sensing signal corresponding thereto. The controller 180 may control an operation of the image display device 100, or execute data processing, a function or an operation associated with an application program installed in the image display device, based on the sensing signal. Hereinafter, representative sensors among the various sensors which may be included in the sensing unit 140 will be described in more detail.
First, the proximity sensor 141 refers to a sensor which senses the presence or absence of an object approaching a surface to be sensed, or an object disposed near that surface, by using an electromagnetic field or infrared rays without a mechanical contact. The proximity sensor 141 may be arranged at an inner region of the image display device covered by the touch screen, or near the touch screen. The proximity sensor 141 may have a longer lifespan and a more enhanced utility than a contact sensor.
The proximity sensor 141 may include, for example, a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 may sense the proximity of a pointer to the touch screen by changes of an electromagnetic field, which is responsive to an approach of an object with conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor.
Hereinafter, for the sake of brief explanation, a state in which the pointer is positioned to be proximate to the touch screen without contact will be referred to as a "proximity touch," whereas a state in which the pointer substantially comes into contact with the touch screen will be referred to as a "contact touch." The position on the touch screen corresponding to a proximity touch of the pointer corresponds to the position at which the pointer faces the touch screen perpendicularly upon the proximity touch. The proximity sensor 141 may sense a proximity touch, and a proximity touch pattern (for example, distance, direction, speed, time, position, moving state, etc.). On the other hand, the controller 180 may process data (or information) corresponding to the proximity touches and the proximity touch patterns sensed by the proximity sensor 141, and output visual information corresponding to the processed data on the touch screen. In addition, the controller 180 may control the image display device 100 to execute different operations or process different data (or information) according to whether a touch with respect to the same point on the touch screen is a proximity touch or a contact touch.
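The distinction above can be illustrated with a minimal sketch, assuming a hypothetical event dispatcher rather than the patent's actual implementation: the controller routes a proximity touch and a contact touch at the same point to different operations. The event names and handler behaviors here are illustrative assumptions.

```python
def handle_touch(event_type, x, y):
    """Route a touch event to a different operation depending on whether it
    is a proximity touch or a contact touch at the point (x, y)."""
    if event_type == "proximity":
        # Hypothetical behavior: show a preview without committing an action.
        return f"preview at ({x}, {y})"
    elif event_type == "contact":
        # Hypothetical behavior: activate the control under the point.
        return f"activate at ({x}, {y})"
    raise ValueError(f"unknown touch type: {event_type}")

print(handle_touch("proximity", 10, 20))  # preview at (10, 20)
print(handle_touch("contact", 10, 20))    # activate at (10, 20)
```

The same (x, y) thus yields two different operations purely according to the touch type, mirroring the controller behavior described above.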
The touch sensor may sense a touch (or a touch input) applied to the touch screen (or the display unit 151) using at least one of various types of touch methods, such as a resistive type, a capacitive type, an infrared type, a magnetic field type, and the like.
As one example, the touch sensor may be configured to convert changes of pressure applied to a specific part of the display unit 151, or a capacitance occurring at a specific part of the display unit 151, into electric input signals. Also, the touch sensor may be configured to sense not only a touched position and a touched area, but also touch pressure. Here, a touch object is an object which applies a touch input to the touch sensor. Examples of the touch object may include a finger, a touch pen, a stylus pen, a pointer, or the like.
When a touch input is sensed by the touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals, and then transmit corresponding data to the controller 180. Accordingly, the controller 180 may sense which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, or the controller 180 itself.
On the other hand, the controller 180 may execute a different control or the same control according to a type of the object which touches the touch screen (or a touch key provided in addition to the touch screen). Whether to execute the different control or the same control according to the object which provides a touch input may be decided based on a current operating state of the image display device 100 or a currently executed application program.
Meanwhile, the touch sensor and the proximity sensor may be operated individually or in combination, to sense various types of touches, such as a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like.
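As an illustrative sketch (not part of the patent), some of the touch types listed above can be distinguished from raw touch-down and touch-up events by comparing duration, travel distance, and speed. The threshold values here are assumptions chosen for illustration only.

```python
import math

LONG_TOUCH_SEC = 0.5      # assumed duration threshold for a long touch
MOVE_THRESHOLD_PX = 10.0  # assumed travel distance separating taps from drags
FLICK_SPEED_PX_S = 500.0  # assumed speed separating drag touches from flicks

def classify_touch(down, up):
    """Classify one touch from (time, x, y) tuples at touch-down and touch-up."""
    dt = up[0] - down[0]
    dist = math.hypot(up[1] - down[1], up[2] - down[2])
    if dist < MOVE_THRESHOLD_PX:
        # Little movement: distinguish short vs. long by duration.
        return "long touch" if dt >= LONG_TOUCH_SEC else "short touch"
    # Significant movement: distinguish drag vs. flick by speed.
    speed = dist / dt if dt > 0 else float("inf")
    return "flick touch" if speed >= FLICK_SPEED_PX_S else "drag touch"

print(classify_touch((0.0, 0, 0), (0.1, 2, 2)))    # short touch
print(classify_touch((0.0, 0, 0), (0.1, 100, 0)))  # flick touch
```

Multi-touch, pinch-in and pinch-out gestures would additionally require tracking two or more simultaneous pointers, which is omitted here for brevity.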
The ultrasonic sensor may be configured to recognize position information relating to a sensing object by using ultrasonic waves. For example, the controller 180 may calculate a position of a wave generation source based on information sensed by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for which light reaches the optical sensor may be much shorter than the time for which an ultrasonic wave reaches the ultrasonic sensor. The position of the wave generation source may be calculated using this fact. In more detail, the position of the wave generation source may be calculated using the time difference of the ultrasonic wave's arrival, with the light serving as a reference signal.
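The principle above can be sketched numerically, under stated assumptions rather than as the patent's implementation: light arrives effectively instantly and serves as the time reference, so each ultrasonic sensor's arrival delay gives a distance, and distances from three sensors locate the wave generation source in a plane by trilateration.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air (approximate room-temperature value)

def distances_from_delays(light_time, ultra_times):
    """Distance to each ultrasonic sensor from arrival-time differences,
    taking the (near-instant) light arrival as the reference signal."""
    return [SPEED_OF_SOUND * (t - light_time) for t in ultra_times]

def trilaterate(sensors, dists):
    """Solve for the 2D source position from three sensor positions and
    the corresponding distances."""
    (x1, y1), (x2, y2), (x3, y3) = sensors
    d1, d2, d3 = dists
    # Subtracting pairs of circle equations yields a linear 2x2 system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

With sensors at (0, 0), (1, 0) and (0, 1) metres and a source at (0.3, 0.4), the measured delays reproduce the source position to within floating-point error.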
The camera 121, which constitutes part of the input unit 120, may be a type of camera sensor. The camera sensor may include at least one of a photo sensor and a laser sensor.
The camera 121 and the laser sensor may be combined to detect a touch of a sensing object with respect to a 3D stereoscopic image. The photo sensor may be laminated on the display device. The photo sensor may be configured to scan the movement of a sensing object in proximity to the touch screen. In more detail, the photo sensor may include photo diodes and transistors at rows and columns to scan content placed on the photo sensor by using an electrical signal which changes according to the quantity of applied light. Namely, the photo sensor may calculate the coordinates of the sensing object according to the variation of light, and thereby obtain position information of the sensing object.
The display unit 151 may output information processed in the image display device 100. For example, the display unit 151 may display execution screen information of an application program driven in the image display device 100, or user interface (UI) and graphic user interface (GUI) information in response to the execution screen information.
The display unit 151 may also be implemented as a stereoscopic display unit for displaying stereoscopic images.
The stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an auto-stereoscopic scheme (a glasses-free scheme), a projection scheme (a holographic scheme), or the like.
The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170, in a call signal reception mode, a call mode, a record mode, a voice recognition mode, a broadcast reception mode, and the like. Also, the audio output module 152 may provide audible output signals related to a particular function performed by the image display device 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, or the like.
The haptic module 153 may generate various tactile effects that a user can feel. A typical example of the tactile effect generated by the haptic module 153 is vibration. The strength, pattern and the like of the vibration generated by the haptic module 153 may be controllable by a user selection or a setting of the controller. For example, the haptic module 153 may output different vibrations in a combining manner or a sequential manner.
Besides vibration, the haptic module 153 may generate various other tactile effects, including an effect by stimulation such as a pin arrangement vertically moving with respect to a contacted skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch on the skin, a contact of an electrode, electrostatic force, and the like, as well as an effect of reproducing the sense of cold and warmth using an element that can absorb or generate heat.
The haptic module 153 may be implemented to allow the user to feel a tactile effect through a muscle sensation, such as via the user's fingers or arm, as well as to transfer the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the configuration of the image display device 100.
The optical output module 154 may output a signal for indicating an event generation using light of a light source. Examples of events generated in the image display device 100 may include a message reception, a call signal reception, a missed call, an alarm, a schedule notice, an email reception, an information reception through an application, and the like.
A signal output by the optical output module 154 may be implemented in such a manner that the image display device emits monochromatic light or light with a plurality of colors. The signal output may be terminated when the image display device senses that the user has checked the event.
The interface unit 160 may serve as an interface with every external device connected to the image display device 100. For example, the interface unit 160 may receive data transmitted from an external device, receive power to transfer to each element within the image display device 100, or transmit internal data of the image display device 100 to an external device. For example, the interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like.
The identification module may be a chip that stores various information for authenticating the authority to use the image display device 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (referred to hereinafter as an "identifying device") may take the form of a smart card. Accordingly, the identifying device may be connected with the terminal 100 via the interface unit 160.
When the image display device 100 is connected with an external cradle, the interface unit 160 may serve as a passage to allow power from the cradle to be supplied to the image display device 100, or as a passage to allow various command signals input by the user from the cradle to be transferred to the image display device. Various command signals or power input from the cradle may operate as signals for recognizing that the image display device is properly mounted on the cradle.
The memory 170 may store programs for operations of the controller 180, and temporarily store input/output data (for example, a phonebook, messages, still images, videos, etc.). The memory 170 may store data related to various patterns of vibrations and audio which are output in response to touch inputs on the touch screen.
The memory 170 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card micro type, a card-type memory (for example, SD or XD memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. Also, the image display device 100 may be operated in relation to a web storage device that performs the storage function of the memory 170 over the Internet.
As aforementioned, the controller 180 may typically control the general operations of the image display device 100. For example, the controller 180 may set or release a lock state for restricting a user from inputting a control command with respect to applications when a state of the image display device meets a preset condition.
The controller 180 may also perform the controlling and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 may control one or a combination of those components in order to implement the various exemplary embodiments disclosed herein on the image display device 100.
The power supply unit 190 may receive external power or internal power and supply appropriate power required for operating the respective elements and components included in the image display device 100 under the control of the controller 180. The power supply unit 190 may include a battery. The battery is typically an embedded battery which is rechargeable, or may be detachably coupled to the terminal body for charging.
The power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160, to which an external charger for supplying power to recharge the battery is electrically connected.
As another example, the power supply unit 190 may be configured to recharge the battery in a wireless manner without use of the connection port. Here, the power supply unit 190 may receive power transferred from an external wireless power transmitter using an inductive coupling method based on magnetic induction, or an electromagnetic resonance coupling method based on electromagnetic resonance.
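The resonance coupling method mentioned above depends on the transmitter and receiver coils being tuned to the same resonant frequency, given by the standard LC-tank relation f = 1 / (2π√(LC)). The sketch below illustrates this relation numerically; the component values are assumptions for illustration, not values from the patent.

```python
import math

def resonant_frequency(inductance_h, capacitance_f):
    """Resonant frequency in Hz of an LC tank circuit: f = 1/(2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Example (assumed values): a 1 uH coil with a ~551 pF capacitor resonates
# near 6.78 MHz, a band commonly used for resonant wireless power transfer.
f = resonant_frequency(1e-6, 551e-12)
print(f"{f / 1e6:.2f} MHz")
```

Tuning both sides to a common frequency in this way is what allows power to be transferred efficiently over a somewhat larger gap than plain magnetic induction.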
The various embodiments described herein may be implemented in a computer-readable medium or a similar medium using, for example, software, hardware, or any combination thereof.
Referring to FIGS. 1B and 1C, the image display device 100 disclosed herein may be provided with a bar-type terminal body. However, the present disclosure is not limited to this, and may also be applicable to various structures such as a watch type, a clip type, a glasses type, or a folder type, flip type, slide type, swing type, swivel type, and the like, in which two or more bodies are combined with each other in a relatively movable manner.
Here, the terminal body may be understood as a conception which indicates the image display device 100 as at least one assembly.
The image display device 100 may include a case (casing, housing, cover, etc.) forming the appearance of the terminal. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components may be incorporated into a space formed between the front case 101 and the rear case 102. At least one middle case may be additionally disposed between the front case 101 and the rear case 102.
The display unit 151 may be disposed on a front surface of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted to the front case 101 so as to form the front surface of the terminal body together with the front case 101.
In some cases, electronic components may also be mounted to the rear case 102. Examples of such electronic components mounted to the rear case 102 may include a detachable battery, an identification module, a memory card, and the like. Here, a rear cover 103 for covering the mounted electronic components may be detachably coupled to the rear case 102. Therefore, when the rear cover 103 is detached from the rear case 102, the electronic components mounted to the rear case 102 may be externally exposed.
As illustrated, when the rear cover 103 is coupled to the rear case 102, a side surface of the rear case 102 may be partially exposed. In some cases, upon the coupling, the rear case 102 may also be completely shielded by the rear cover 103. On the other hand, the rear cover 103 may include an opening for externally exposing a camera 121b or an audio output module 152b.
The cases 101, 102 and 103 may be formed by injection-molding a synthetic resin, or may be formed of a metal, for example, stainless steel (STS), titanium (Ti), or the like.
Unlike the example in which a plurality of cases form an inner space for accommodating the various components, the image display device 100 may be configured such that one case forms the inner space. In this example, the image display device 100 having a uni-body, formed in such a manner that a synthetic resin or metal extends from a side surface to a rear surface, may also be implemented.
On the other hand, the image display device 100 may include a waterproofing unit (not shown) for preventing an introduction of water into the terminal body. For example, the waterproofing unit may include a waterproofing member located between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, to hermetically seal the inner space when those cases are coupled.
The image display device 100 may include a display unit 151, first and second audio output modules 152a and 152b, a proximity sensor 141, an illumination sensor 142, an optical output module 154, first and second cameras 121a and 121b, first and second manipulation units 123a and 123b, a microphone 122, an interface unit 160, and the like.
Hereinafter, description will be given, with reference to FIGS. 1B and 1C, of an exemplary image display device 100 in which the display unit 151, the first audio output module 152a, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first camera 121a and the first manipulation unit 123a are disposed on the front surface of the terminal body, the second manipulation unit 123b, the microphone 122 and the interface unit 160 are disposed on a side surface of the terminal body, and the second audio output module 152b and the second camera 121b are disposed on a rear surface of the terminal body.
Here, those components are not limited to this arrangement, but may be excluded or arranged on other surfaces if necessary. For example, the first manipulation unit 123a may not be disposed on the front surface of the terminal body, and the second audio output module 152b may be disposed on a side surface rather than the rear surface of the terminal body.
The display unit 151 may output information processed in the image display device 100. For example, the display unit 151 may display execution screen information of an application program driven in the image display device 100, or user interface (UI) and graphic user interface (GUI) information in response to the execution screen information.
The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light emitting diode (OLED), a flexible display, a 3-dimensional (3D) display, and an e-ink display.
The display unit 151 may be implemented as two or more in number according to the configured aspect of the image display device 100. For instance, a plurality of display units 151 may be arranged on one surface to be spaced apart from or integrated with each other, or may be arranged on different surfaces.
The display unit 151 may include a touch sensor which senses a touch on the display unit, so as to receive a control command in a touching manner. When a touch is input to the display unit 151, the touch sensor may be configured to sense this touch, and the controller 180 may generate a control command corresponding to the touch. The content input in a touching manner may be a text or numerical value, or a menu item which can be indicated or designated in various modes.
The touch sensor may be configured in the form of a film having a touch pattern, disposed between the window 151a and a display (not shown) on a rear surface of the window 151a, or as a metal wire which is patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display.
The display unit 151 may form a touch screen together with the touch sensor. Here, the touch screen may serve as the user input unit 123 (see FIG. 1A). Therefore, the touch screen may replace at least some of the functions of the first manipulation unit 123a.
The first audio output module 152a may be implemented in the form of a receiver for transferring voice sounds to the user's ear, or a loud speaker for outputting various alarm sounds or multimedia reproduction sounds.
The window 151a of the display unit 151 may include a sound hole for emitting sounds generated from the first audio output module 152a. Here, the present disclosure is not limited to this. It may also be configured such that the sounds are released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front case 101). In this case, a hole independently formed to output audio sounds may not be seen, or may otherwise be hidden in terms of appearance, thereby further simplifying the appearance of the image display device 100.
The optical output module 154 may output light for indicating an event generation. Examples of the event generated in the image display device 100 include a message reception, a call signal reception, a missed call, an alarm, a schedule notice, an email reception, an information reception through an application, and the like. When a user's event checking is sensed, the controller may control the optical output module 154 to stop the output of the light.
The first camera 121a may process video frames, such as still or moving images, obtained by the image sensor in a video call mode or a capture mode. The processed video frames may be displayed on the display unit 151 or stored in the memory 170.
The first and second manipulation units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to input a command for controlling the operation of the image display device 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any method, provided that it is a tactile manner allowing the user to perform manipulation with a tactile feeling such as touching, pushing, scrolling, or the like.
The drawings illustrate the first manipulation unit 123a as being a touch key, but the present disclosure is not necessarily limited to this. For example, the first manipulation unit 123a may be configured with a mechanical key, or a combination of a touch key and a push key.
The content received by the first and second manipulation units 123a and 123b may be set in various ways. For example, the user may use the first manipulation unit 123a to input a command such as menu, home key, cancel, search, or the like, and may use the second manipulation unit 123b to input a command, such as controlling a volume level being output from the first or second audio output module 152a or 152b, switching into a touch recognition mode of the display unit 151, or the like.
On the other hand, as another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. The user may manipulate the rear input unit to input a command for controlling an operation of the image display device 100. The content of the input may be set in various ways. For example, the user may use the rear input unit to input a command such as power on/off, start, end, scroll or the like, controlling a volume level being output from the first or second audio output module 152a or 152b, or switching into a touch recognition mode of the display unit 151. The rear input unit may be implemented in a form allowing a touch input, a push input, or a combination thereof.
The rear input unit may be disposed to overlap the display unit 151 of the front surface in a thickness direction of the terminal body. As one example, the rear input unit may be disposed on an upper end portion of the rear surface of the terminal body, such that the user can easily manipulate it using a forefinger when gripping the terminal body with one hand. However, the present disclosure is not limited to this, and the position of the rear input unit may be changeable.
When the rear input unit is disposed on the rear surface of the terminal body, a new user interface may be implemented using the rear input unit. Also, the aforementioned touch screen or the rear input unit may substitute for at least part of the functions of the first manipulation unit 123a located on the front surface of the terminal body. Therefore, when the first manipulation unit 123a is not disposed on the front surface of the terminal, the display unit 151 may be implemented to have a larger screen.
On the other hand, the image display device 100 may include a finger scan sensor which scans a user's fingerprint. The controller 180 can use fingerprint information sensed by the finger scan sensor as an authentication means. The finger scan sensor may be installed in the display unit 151 or the user input unit 123.
The microphone 122 may be formed to receive the user's voice and other sounds. The microphone 122 may be provided at a plurality of places, and configured to receive stereo sounds.
The interface unit 160 may serve as a path allowing the image display device 100 to exchange data with external devices. For example, the interface unit 160 may be at least one of a connection terminal for connecting to another device (for example, an earphone or an external speaker), a port for near field communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port, and the like), or a power supply terminal for supplying power to the image display device 100. The interface unit 160 may be implemented in the form of a socket for accommodating an external card, such as a Subscriber Identification Module (SIM), a User Identity Module (UIM), or a memory card for information storage.
A second camera 121b may be further mounted to the rear surface of the terminal body. The second camera 121b may have an image capturing direction which is substantially opposite to that of the first camera unit 121a.
The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such a camera may be referred to as an "array camera." When the second camera 121b is implemented as an array camera, images may be captured in various manners using the plurality of lenses, and images of better quality may be obtained.
A flash 124 may be disposed adjacent to the second camera 121b. When an image of a subject is captured by the camera 121b, the flash 124 may illuminate the subject.
The second audio output module 152b may be further disposed on the terminal body. The second audio output module 152b may implement a stereo function in conjunction with the first audio output module 152a (refer to FIG. 1A), and may also be used to implement a speakerphone mode for call communication.
At least one antenna for wireless communication may be disposed on the terminal body. The antenna may be installed in the terminal body or formed on the case. For example, an antenna which configures a part of the broadcast receiving module 111 (see FIG. 1A) may be retractable into the terminal body. Alternatively, an antenna may be formed in the form of a film attached to an inner surface of the rear cover 103, or a case including a conductive material may serve as an antenna.
A power supply unit 190 for supplying power to the image display device 100 may be disposed on the terminal body. The power supply unit 190 may include a battery 191, which is mounted in the terminal body or detachably coupled to the outside of the terminal body.
The battery 191 may receive power via a power source cable connected to the interface unit 160. Also, the battery 191 may be (re)chargeable in a wireless manner using a wireless charger. Wireless charging may be implemented by magnetic induction or electromagnetic resonance.
On the other hand, the drawings illustrate that the rear cover 103 is coupled to the rear case 102 for shielding the battery 191, so as to prevent separation of the battery 191 and to protect the battery 191 from external impacts or foreign materials. When the battery 191 is detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear case 102.
An accessory for protecting the appearance or assisting or extending the functions of the image display device 100 may further be provided on the image display device 100. As one example of the accessory, a cover or pouch for covering or accommodating at least one surface of the image display device 100 may be provided. The cover or pouch may cooperate with the display unit 151 to extend the function of the image display device 100. Another example of the accessory may be a touch pen for assisting or extending a touch input onto a touch screen.
Hereinafter, a communication system operable with the image display device 100 according to the present disclosure will be described.
First, such communication systems utilize different air interfaces and/or physical layers. Examples of such air interfaces utilized by the communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile Communications (GSM), and the like.
By way of non-limiting example only, further description will relate to a CDMA communication system, but such teachings apply equally to other system types including a CDMA wireless communication system.
A CDMA wireless communication system includes one or more image display devices 100, one or more base stations (BSs), one or more base station controllers (BSCs), and a mobile switching center (MSC). The MSC is configured to interface with a conventional Public Switched Telephone Network (PSTN) and the BSCs. The BSCs are coupled to the base stations via backhaul lines. The backhaul lines may be configured in accordance with any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. Hence, a plurality of BSCs can be included in the CDMA wireless communication system.
Each base station may include one or more sectors, each sector having an omni-directional antenna or an antenna pointed in a particular direction radially away from the base station. Alternatively, each sector may include two or more different antennas. Each base station may be configured to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The base stations may also be referred to as Base Station Transceiver Subsystems (BTSs). In some cases, the term "base station" may be used to refer collectively to a BSC and one or more base stations. The base stations may also be denoted as "cell sites." Alternatively, individual sectors of a given base station may be referred to as cell sites.
A broadcasting transmitter (BT) transmits a broadcast signal to the image display devices 100 operating within the system. The broadcast receiving module 111 of FIG. 1A is typically configured inside the image display device 100 to receive the broadcast signals transmitted by the BT.
Global Positioning System (GPS) satellites for locating the position of the image display device 100 may cooperate with the CDMA wireless communication system. Such satellites 300 facilitate locating the position of the image display device 100. Useful position information may be obtained with more or fewer than two satellites. It is to be appreciated that other types of position detection technology (i.e., location technology that may be used in addition to or instead of GPS location technology) may alternatively be implemented. If desired, at least one of the GPS satellites may alternatively or additionally be configured to provide satellite DMB transmissions.
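The patent does not specify a positioning algorithm. As an illustrative sketch only, the geometric idea behind satellite- or anchor-based positioning can be shown with textbook two-dimensional trilateration: subtracting pairs of range-circle equations yields a linear system in the unknown position. All names here are hypothetical and the two-dimensional simplification is an assumption (real GPS solves in three dimensions plus a clock-bias term).

```python
import math

def trilaterate_2d(anchors, ranges):
    """Estimate a 2-D position from three known anchors and measured
    distances, by subtracting the circle equations pairwise to obtain
    a linear system in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    # A*x + B*y = C  (circle 1 minus circle 2)
    A = 2 * (x2 - x1)
    B = 2 * (y2 - y1)
    C = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    # D*x + E*y = F  (circle 2 minus circle 3)
    D = 2 * (x3 - x2)
    E = 2 * (y3 - y2)
    F = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = A * E - B * D  # zero if the anchors are collinear
    x = (C * E - B * F) / det
    y = (A * F - C * D) / det
    return x, y

# A device at (1, 1); exact ranges to three anchors recover the position.
anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
ranges = [math.dist((1, 1), a) for a in anchors]
print(trilaterate_2d(anchors, ranges))  # -> approximately (1.0, 1.0)
```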
The various embodiments described herein may be implemented in a computer-readable medium using, for example, software, hardware, or any combination thereof.
For a hardware implementation, the embodiments described herein may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a selective combination thereof. Such embodiments may also be implemented by the controller 180.
For a software implementation, the embodiments such as procedures and functions may be implemented together with separate software modules, each of which performs at least one of the functions and operations. Software code can be implemented by a software application written in any suitable programming language. The software code may be stored in the memory 170 and executed by the controller 180.
Meanwhile, according to the present disclosure, information processed in the image display device may be displayed using a flexible display. Hereinafter, a detailed description thereof will be given with reference to the accompanying drawings.
FIG. 2 is a conceptual view illustrating an exemplary variation of an image display device 200 according to the present disclosure.
As illustrated in FIG. 2, the display unit 251 may be deformable by an external force. The deformation may be at least one of curving, bending, folding, twisting, and rolling of the display unit 251. The deformable display unit 251 may be referred to as a "flexible display unit." Here, the flexible display unit 251 may include both a general flexible display and electronic paper.
The general flexible display denotes a light, non-fragile display which still exhibits the characteristics of a conventional flat-panel display but is fabricated on a flexible substrate that can be curved, bent, folded, twisted, or rolled.
Electronic paper is a display technology employing the characteristics of general ink, and differs from a conventional flat-panel display in that it uses reflected light. Electronic paper may change displayed information by using twist balls or by electrophoresis using capsules.
In a state where the flexible display unit 251 is not deformed (for example, a state with an infinite radius of curvature, hereinafter referred to as a first state), the display region of the flexible display unit 251 becomes a flat surface. In a state where the flexible display unit 251 is deformed from the first state by an external force (for example, a state with a finite radius of curvature, hereinafter referred to as a second state), the display region may become a curved surface. As illustrated, information displayed in the second state may be visual information output on the curved surface. The visual information may be realized by independently controlling the emission of each unit pixel (sub-pixel) arranged in a matrix configuration. The unit pixel denotes an elementary unit for representing one color.
In the first state, the flexible display unit 251 may be placed in a curved state (for example, a state of being curved from top to bottom or from right to left), instead of a flat state. In this case, when an external force is applied to the flexible display unit 251, the flexible display unit 251 may change into the flat state or a more curved state.
On the other hand, the flexible display unit 251 may implement a flexible touch screen in combination with a touch sensor. When a touch is input onto the flexible touch screen, the controller 180 (see FIG. 1A) can execute a control corresponding to the touch input. The flexible touch screen may be configured to sense touch inputs in both the first and second states.
The image display device 200 according to the exemplary variation may include a deformation sensor which senses deformation of the flexible display unit 251. The deformation sensor may be included in the sensing unit 140 (see FIG. 1A).
The deformation sensor may be disposed in the flexible display unit 251 or the case 201 to sense information related to the deformation of the flexible display unit 251. Here, the information related to the deformation of the flexible display unit 251 may be a direction of deformation, a degree of deformation, a position of deformation, a time of deformation, an acceleration at which the deformed flexible display unit 251 is restored, and the like. In addition, such information may be various information that can be sensed in response to bending of the flexible display unit 251.
Also, based on the information related to the deformation of the flexible display unit 251 sensed by the deformation sensor, the controller 180 may change the information displayed on the flexible display unit 251, or generate a control signal for controlling a function of the image display device 200.
The image display device 200 according to the exemplary embodiment may include a case 201 for accommodating the flexible display unit 251. In consideration of the characteristics of the flexible display unit 251, the case 201 may be deformable together with the flexible display unit 251.
Also in consideration of the characteristics of the flexible display unit 251, a battery (not shown) disposed in the image display device 200 may be deformable together with the flexible display unit 251. To implement such a battery, a stack-and-folding method of stacking battery cells may be employed.
On the other hand, the image display device may extend to a wearable device which can be worn on a human body, going beyond the typical usage in which a user grips the image display device in a hand. Examples of the wearable device may include a smart watch, smart glasses, a head mounted display (HMD), and the like. Hereinafter, examples of the image display device extended to a wearable device will be described.
A wearable device can exchange data with (or cooperate with) another image display device 100. The short-range communication module 114 can sense (recognize) a wearable device capable of communicating with the image display device, located near the image display device 100. In addition, when the sensed wearable device is a device authenticated to communicate with the image display device 100 according to the present disclosure, the controller 180 may transmit at least part of the data processed in the image display device 100 to the wearable device via the short-range communication module 114. Accordingly, a user of the wearable device can use, on the wearable device, the data processed in the image display device 100. For example, when a call is received in the image display device 100, the user can answer the call using the wearable device. Also, when a message is received in the image display device 100, the user can check the received message using the wearable device.
FIG. 3 is a perspective view illustrating one example of a watch-type mobile terminal 300 in accordance with another exemplary embodiment.
As illustrated in FIG. 3, the watch-type mobile terminal 300 includes a main body 301 with a display unit 351, and a band 302 connected to the main body 301 so as to be wearable on a wrist.
The main body 301 may include a case defining an appearance. As illustrated, the case may include a first case 301a and a second case 301b which cooperatively define an inner space for accommodating various electronic components. However, the present disclosure is not limited thereto. One case may alternatively be configured to define the inner space, thereby implementing a mobile terminal 300 with a uni-body.
The watch-type mobile terminal 300 may be allowed to perform wireless communication, and an antenna for the wireless communication may be installed in the main body 301. The antenna may extend its function using the case. For example, a case including a conductive material may be electrically connected to the antenna to extend a ground area or a radiation area.
The display unit 351 may be disposed on the front surface of the main body 301 to output information thereon. The display unit 351 may be provided with a touch sensor so as to implement a touch screen. As illustrated, a window 351a of the display unit 351 may be mounted to the first case 301a to form the front surface of the terminal body together with the first case 301a.
An audio output module 352, a camera 321, a microphone 322, a user input unit 323, and the like may be disposed on the main body 301. When the display unit 351 is implemented as a touch screen, it may serve as the user input unit 323, which may allow a separate key to be excluded from the main body 301.
The band 302 may be worn on the wrist in a surrounding manner. The band 302 may be made of a flexible material to facilitate wearing. As one example, the band 302 may be made of fur, rubber, silicone, synthetic resin, or the like. The band 302 may also be configured to be detachable from the main body 301. Accordingly, the band 302 may be replaceable with various types of bands according to a user's preference.
On the other hand, the band 302 may be used to extend the performance of the antenna. For example, the band may include therein a ground extending portion (not shown) electrically connected to the antenna to extend a ground area.
The band 302 may be provided with a fastener 302a. The fastener 302a may be implemented as a buckle type, a snap-fit hook structure, a Velcro type, or the like, and may include a flexible section or material. The drawing illustrates an example in which the fastener 302a is implemented as a buckle type.
On the other hand, in this specification, the image display device as a fixed terminal may include a device for receiving and displaying broadcast data, a device for recording and playing moving images, and a device for recording and reproducing audio data. Hereinafter, a TV will be described as an example of such an image display device.
FIG. 4 is a block diagram illustrating a fixed-terminal-type image display device 400 and an external input device 500 according to the present disclosure. The image display device 400 may include a tuner 410, a decoder 420, a signal input/output unit 430, an interface unit 440, a controller 450, a storage unit 460, an output unit 470, and an A/V (audio/video) input unit 480. The external input device 500 may be a device separate from the image display device 400, or may be a component of the image display device 400.
With reference to FIG. 4, the tuner 410 can select a radio frequency (RF) broadcast signal corresponding to a channel selected by a user from among the RF broadcast signals received via an antenna, and convert the selected RF broadcast signal into an intermediate frequency signal or a baseband image (video)/audio signal. For example, when the RF broadcast signal is a digital broadcast signal, the tuner 410 can convert the RF broadcast signal into a digital IF signal (DIF). On the other hand, when the RF broadcast signal is an analog broadcast signal, the tuner 410 can convert the RF broadcast signal into an analog baseband video/audio signal (CVBS/SIF). The tuner 410 may thus be a hybrid tuner capable of processing both digital and analog broadcast signals.
The digital IF signal (DIF) output from the tuner 410 may be input to the decoder 420, whereas the analog baseband video/audio signal (CVBS/SIF) output from the tuner 410 may be input to the controller 450. The tuner 410 can receive a single-carrier RF broadcast signal according to the Advanced Television Systems Committee (ATSC) standard, or a multi-carrier RF broadcast signal according to the Digital Video Broadcasting (DVB) standard.
Although the drawing illustrates one tuner 410, the present disclosure is not limited thereto. The image display device 400 may include a plurality of tuners, for example, first and second tuners. In this case, the first tuner may receive a first RF broadcast signal corresponding to a broadcast channel selected by the user, and the second tuner may receive a second RF broadcast signal corresponding to a pre-stored broadcast channel in a sequential or periodic manner. Similar to the first tuner, the second tuner can convert an RF broadcast signal into a digital IF signal (DIF) or an analog baseband video or audio signal (CVBS/SIF).
The decoder 420 can receive the digital IF signal (DIF) converted by the tuner 410 and demodulate the received signal. For example, when the DIF output from the tuner 410 is a signal according to the ATSC standard, the decoder 420 can perform 8-vestigial sideband (8-VSB) demodulation. Here, the decoder 420 may also perform channel decoding such as trellis decoding, de-interleaving, and Reed-Solomon decoding. To this end, the decoder 420 may include a trellis decoder, a de-interleaver, a Reed-Solomon decoder, and the like.
As another example, when the digital IF signal (DIF) output from the tuner 410 is a signal according to the DVB standard, the decoder 420 can perform Coded Orthogonal Frequency Division Multiplexing (COFDM) demodulation. Here, the decoder 420 may also perform convolution decoding, de-interleaving, Reed-Solomon decoding, and the like. To this end, the decoder 420 may include a convolution decoder, a de-interleaver, a Reed-Solomon decoder, and the like.
The signal input/output unit 430 can perform signal input and output operations by being connected to an external device. To this end, the signal input/output unit 430 may include an A/V input/output unit (not shown) and a wireless communication unit (not shown).
The A/V input/output unit may include an Ethernet terminal, a USB terminal, a Composite Video Baseband Signal (CVBS) terminal, a component terminal, an S-video terminal (analog), a Digital Visual Interface (DVI) terminal, a High Definition Multimedia Interface (HDMI) terminal, a Mobile High-definition Link (MHL) terminal, an RGB terminal, a D-SUB terminal, an IEEE 1394 terminal, a Liquid HD terminal, and the like. Digital signals input through those terminals may be forwarded to the controller 450. Here, analog signals input through the CVBS terminal and the S-video terminal may be forwarded to the controller after being converted into digital signals by an analog-to-digital converter (not shown).
The wireless communication unit can perform wireless Internet access. For example, the wireless communication unit can perform wireless Internet access using Wireless LAN (WLAN) (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like. The wireless communication unit can also perform short-range wireless communication with other electronic devices. For example, the wireless communication unit can perform short-range wireless communication using Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like.
The signal input/output unit 430 can transfer to the controller 450 video signals, audio signals, and data signals provided from external devices, such as a Digital Versatile Disk (DVD) player, a Blu-ray player, a game player, a camcorder, a computer (notebook computer), a portable device, a smart phone, and the like. Also, the signal input/output unit 430 can transfer to the controller 450 video signals, audio signals, and data signals of various media files stored in an external storage device, such as a memory device, a hard disk, and the like. In addition, the signal input/output unit 430 can output video signals, audio signals, and data signals processed by the controller 450 to other external devices.
The signal input/output unit 430 can perform signal input and output operations by being connected to a set-top box, for example, an Internet Protocol TV (IPTV) set-top box, via at least one of those various terminals. For instance, the signal input/output unit 430 can transfer video, audio, and data signals processed by the IPTV set-top box to the controller 450 so as to enable bidirectional communication, and can also transfer signals processed by the controller 450 to the IPTV set-top box. Here, the IPTV may include ADSL-TV, VDSL-TV, FTTH-TV, and the like, which are divided according to transmission networks.
The digital signals output from the decoder 420 and the signal input/output unit 430 may include a stream signal (TS). The stream signal (TS) may be a signal in which a video signal, an audio signal, and a data signal are multiplexed. For example, the stream signal (TS) may be an MPEG-2 Transport Stream (TS) signal obtained by multiplexing an MPEG-2 video signal, a Dolby AC-3 audio signal, and the like. An MPEG-2 TS packet may include a 4-byte header and a 184-byte payload.
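The 4-byte header/184-byte payload split described above follows the MPEG-2 Systems standard, where each 188-byte packet begins with the sync byte 0x47 and carries a 13-bit packet identifier (PID). As a minimal sketch (not part of the patent itself), the header fields can be decoded as follows; the field names are illustrative:

```python
def parse_ts_header(packet: bytes) -> dict:
    """Decode the 4-byte header of a 188-byte MPEG-2 TS packet."""
    if len(packet) != 188 or packet[0] != 0x47:  # 0x47 is the sync byte
        raise ValueError("not a valid MPEG-2 TS packet")
    b1, b2, b3 = packet[1], packet[2], packet[3]
    return {
        "transport_error": bool(b1 & 0x80),   # TEI flag
        "payload_start": bool(b1 & 0x40),     # PUSI flag
        "pid": ((b1 & 0x1F) << 8) | b2,       # 13-bit packet identifier
        "scrambling": (b3 >> 6) & 0x03,
        "adaptation_field": (b3 >> 4) & 0x03, # 01 = payload only
        "continuity": b3 & 0x0F,
    }

# A packet carrying PID 0x100 with the payload-start flag set:
pkt = bytes([0x47, 0x41, 0x00, 0x10]) + bytes(184)
print(parse_ts_header(pkt)["pid"])  # -> 256
```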
The interface unit 440 can receive an input signal for power control, channel selection, screen setting, and the like from the external input device 500, or transmit a signal processed by the controller 450 to the external input device 500. The interface unit 440 and the external input device 500 may be connected to each other in a wired or wireless manner.
As an example of the interface unit 440, a sensor unit may be provided. The sensor unit may be implemented to sense an input signal from a remote controller.
A network interface unit (not shown) provides an interface for connecting the image display device 400 with a wired/wireless network including the Internet. The network interface unit may be provided with an Ethernet terminal and the like for connection with a wired network. For connection with a wireless network, Wireless LAN (WLAN) (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like may be used.
The network interface unit (not shown) can access a predetermined web page through the network. That is, the network interface unit can transceive (transmit and receive) data with a corresponding server. In addition, the network interface unit can receive content or data provided from a content provider or a network operator. That is, the network interface unit can receive, via the network, content such as movies, advertisements, games, VOD, and broadcast signals, as well as information related thereto, provided from a content provider or a network operator. The network interface unit can also receive update information and update files provided by the network operator, and can transmit data to an Internet provider, a content provider, or a network operator.
The network interface unit (not shown) can select and receive a desired application among applications open to the public, through the network.
The controller 450 can control an overall operation of the image display device 400. More specifically, the controller 450 can control generation and output of images. For example, the controller 450 can control the tuner 410 to tune an RF broadcast signal corresponding to a channel selected by the user or a pre-stored channel. Although not illustrated, the controller 450 may include a demultiplexer, a video processor, an audio processor, a data processor, an on-screen display (OSD) generator, and the like. In terms of hardware, the controller 450 may include a CPU, peripheral devices, and the like.
The controller 450 can demultiplex a stream signal (TS), for example, an MPEG-2 TS signal, into a video signal, an audio signal, and a data signal.
The controller 450 can perform image processing, for example decoding, on the demultiplexed video signal. More specifically, the controller 450 can decode a video signal encoded according to the MPEG-2 standard using an MPEG-2 decoder, and decode a video signal encoded according to the H.264 standard of the Digital Multimedia Broadcasting (DMB) standard or the Digital Video Broadcasting-Handheld (DVB-H) standard using an H.264 decoder. In addition, the controller 450 can perform image processing in such a manner as to adjust the brightness, tint, color, and the like of the video signal. The video signal image-processed by the controller 450 may be transferred to the display unit 472, or transferred to an external output device (not shown) through an external output port.
The controller 450 can perform audio processing, for example decoding, on the demultiplexed audio signal. More specifically, the controller 450 can decode an audio signal encoded according to the MPEG-2 standard using an MPEG-2 decoder, decode an audio signal encoded according to the MPEG-4 Bit Sliced Arithmetic Coding (BSAC) standard of the DMB standard using an MPEG-4 decoder, and decode an audio signal encoded according to the MPEG-2 Advanced Audio Coding (AAC) standard of the satellite DMB standard or the Digital Video Broadcasting-Handheld (DVB-H) standard using an AAC decoder. In addition, the controller 450 can perform bass processing, treble processing, volume processing, and the like. The audio signal processed in this way by the controller 450 may be transferred to the audio output unit 471, for example a speaker, or transferred to an external output device.
The controller 450 can process an analog baseband video/audio signal (CVBS/SIF). Here, the analog baseband video/audio signal (CVBS/SIF) input to the controller 450 may be an analog baseband video/audio signal output from the tuner 410 or the signal input/output unit 430. The processed video signal is displayed on the display unit 472, and the processed audio signal is output through the audio output unit 471.
The controller 450 can perform data processing, for example decoding, on the demultiplexed data signal. Here, the data signal may include Electronic Program Guide (EPG) information, and the EPG information may include broadcast information related to the broadcast programs aired on each channel, such as a start time, an end time, and the like. The EPG information may include, for example, ATSC Program and System Information Protocol (ATSC-PSIP) information and DVB Service Information (DVB-SI) information. The ATSC-PSIP information or DVB-SI information may be contained in an MPEG-2 TS header (4 bytes).
The controller 450 can perform on-screen display (OSD) processing. In more detail, the controller 450 can generate an OSD signal for displaying various information in the form of graphics or text, based on at least one of a video signal and a data signal, or on an input signal received from the external input device 500. The OSD signal may include various data, such as user interface (UI) screens, various menu screens, widgets, icons, and the like of the image display device 400.
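Overlaying an OSD graphic on the decoded video is, at its core, per-pixel alpha compositing. The patent does not detail the compositing step, so the following is only a minimal sketch of the standard "source-over" blend for a single pixel; the function name and the floating-point alpha representation are assumptions:

```python
def blend_osd(video_px, osd_px, alpha):
    """Composite one OSD pixel over one video pixel.
    alpha = 0.0 shows only the video, alpha = 1.0 only the OSD graphic."""
    return tuple(
        int(round(alpha * o + (1.0 - alpha) * v))
        for v, o in zip(video_px, osd_px)
    )

# A half-transparent white menu pixel over a black video pixel:
print(blend_osd((0, 0, 0), (255, 255, 255), 0.5))  # -> (128, 128, 128)
```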
The storage unit 460 can store various programs for signal processing and control by the controller 450, and can also store the processed video, audio, and data signals. The storage unit 460 may include at least one of a flash memory-type storage medium, a hard disk-type storage medium, a multimedia card micro-type storage medium, a card-type memory (e.g., SD or XD memory), a Random Access Memory (RAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a magnetic memory, a magnetic disk, and an optical disk.
The output unit 470 is configured to generate an output of data in a visual manner, an audible manner, and the like. The output unit 470 includes an audio output unit 471 and a display unit 472.
The audio output unit 471 outputs an audio signal processed by the controller 450, for example, a stereo signal or a 5.1-channel signal. The audio output unit 471 may be implemented as various types of speakers.
The display unit 472 can convert a processed video signal, a processed data signal, an OSD signal, and the like provided by the controller 450 into RGB signals, thereby generating drive signals. With this configuration, the display unit 472 outputs images. The display unit 472 may be implemented in various forms, such as a plasma display panel, a liquid crystal display (LCD), a thin film transistor-LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, an e-ink display, and the like. The display unit 472 may also be implemented as a touch screen and thereby serve as an input device.
The A/V input unit 480 is configured to receive audio or video signals, and may include a camera 481, a microphone 482, and the like. The camera 481 processes image frames of still images or moving images obtained by an image sensor in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 472.
The image frames processed by the camera 481 may be stored in the storage unit 460 or transmitted via a wireless communication unit (not shown). Two or more cameras 481 may be provided according to the use environment.
The camera 481 can also capture images of a user. The camera 481 may be implemented as a single camera, or alternatively as a plurality of cameras. Information on the images captured by the camera 481 is input to the controller 450.
To sense a user's gesture, a sensing unit (not shown) having at least one of a touch sensor, a sound sensor, a position sensor, and a motion sensor may further be provided in the image display device 400. A signal sensed by the sensing unit (not shown) may be transferred to the controller 450 through the interface unit 440.
The controller 450 can sense a user's gesture based on the images captured by the camera 481, based on the signals sensed by the sensing unit (not shown), or by a combination thereof.
The microphone 482 receives an external sound signal in a phone call mode, a recording mode, a voice recognition mode, and the like, and processes the sound signal into electrical voice data. The microphone 482 may implement various types of noise canceling (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
Referring again to Fig. 4, a power supply unit (not shown) supplies power to the image display device 400. In particular, the power supply unit supplies power to the controller 450, which may be implemented in the form of a system-on-chip (SOC), to the display unit 472 for displaying images, and to the audio output unit 471 for outputting audio.
To this end, the power supply unit (not shown) may include a converter (not shown) that converts AC power into DC power. For example, if the display unit 472 is implemented as a liquid crystal panel including a plurality of backlights, the power supply unit may further include an inverter (not shown) in which a PWM operation is possible for brightness changes and dimming driving.
In addition, the power supply unit (not shown) may include a battery (or cell) for converting energy obtained by a chemical or physical reaction into electric energy. The power supply unit may charge the battery while power is being supplied from the outside. Furthermore, when power is not supplied from the outside, the power supply unit may supply power to the image display device 400 using the energy stored in the battery. For example, when the plug of the image display device 400 is removed from an outlet or the external power supply is temporarily interrupted, the power supply unit may operate the image display device 400 using the power stored in the battery.
The battery may be configured to be integrated into the main body of the image display device 400, or to be directly detachable from the outside of the main body.
Meanwhile, the image display device 400 according to an embodiment of the present disclosure may further include a power-off sensing unit 451 for sensing whether power-off occurs. The power-off sensing unit 451 may sense whether the power supplied from the outside has been cut off, using a signal generated from the power supply unit (not shown). For example, the power-off sensing unit 451 may determine whether power-off has occurred because the user switched the power off or because the plug was removed from the supply socket.
The external input device 500 is connected to the interface unit 440 by a cable or wirelessly, and transmits an input signal generated according to a user input to the interface unit 440. The external input device 500 may include a remote controller, a mouse, a keyboard, and the like. The remote controller may transmit an input signal to the interface unit 440 using Bluetooth communication, RF communication, IR communication, ultra-wideband (UWB) communication, ZigBee communication, and the like. The remote controller may be implemented as a spatial remote controller. The spatial remote controller may generate an input signal by detecting movement of its main body.
The image display device 400 may be implemented as a fixed digital broadcast receiver for receiving at least one of ATSC-type (8-VSB-type) digital broadcasting, DVB-T-type (COFDM-type) digital broadcasting, and ISDB-T-type (BST-OFDM-type) digital broadcasting. Alternatively, the image display device 400 may be implemented as a mobile digital broadcast receiver for receiving at least one of terrestrial DMB-type digital broadcasting, satellite DMB-type digital broadcasting, ATSC-M/H-type digital broadcasting, DVB-H-type (COFDM-type) digital broadcasting, and Media Forward Link Only-type digital broadcasting. Alternatively, the image display device 400 may be implemented as a digital broadcast receiver for cable, satellite communication or IPTV.
Fig. 5 is a block diagram illustrating in detail the external input device 500 of Fig. 4. The external input device 500 includes a wireless communication unit 510, a user input unit 520, a sensing unit 530, an output unit 540, a power supply unit 550, a storage unit 560 and a controller 570.
Referring to Fig. 5, the wireless communication unit 510 may transmit signals to or receive signals from the image display device 400. To this end, the wireless communication unit 510 includes an RF module 511 and an IR module 512. The RF module 511 may transmit and receive signals according to an RF communication standard, by being connected to the interface unit 440 of the image display device 400. The IR module 512 may transmit and receive signals according to an IR communication standard, by being connected to the interface unit 440 of the image display device 400.
The user input unit 520 may include a keypad, key buttons, a scroll key, a jog key and the like as input elements. The user may manipulate the user input unit 520 to input (key in) a command (instruction) related to the image display device 400. For example, a command may be keyed in when the user presses a hard key button of the user input unit 520.
The sensing unit 530 may include a gyro sensor 531 and an acceleration sensor 532. The gyro sensor 531 may sense spatial movement of the external input device 500 with respect to the X, Y and Z axes. The acceleration sensor 532 may sense the moving speed of the external input device 500, and the like.
The output unit 540 may output information in response to manipulation of the user input unit 520 and information corresponding to a signal transmitted by the image display device 400. Therefore, the user may recognize the manipulation state of the user input unit 520 or the control state of the image display device 400 through the output unit 540. For example, the output unit 540 may include an LED module 541 that is switched on and off, a vibration module 542 that generates vibration, an audio output module 543 that outputs sound, and a display module 544 that outputs an image in response to manipulation of the user input unit 520 or to transmission and reception of a signal through the wireless communication unit 510.
The power supply unit 550 may supply power to the various electronic components of the external input device 500. When the external input device 500 has not moved for a predetermined time, the power supply of the power supply unit 550 may be stopped, thereby reducing power consumption. When a predetermined key of the external input device 500 is manipulated, the power supply unit 550 may resume the supply of power.
The storage unit 560 may store various programs, applications, frequency band information and the like related to the control and operation of the external input device 500. The controller 570 may perform the overall control operations of the external input device 500.
The external input device 500 may transmit signals to or receive signals from the image display device 400 according to an RF communication standard. A control menu may be displayed on the screen of the image display device 400 according to a control signal of the external input device 500. The external input device 500 may be provided with a plurality of buttons, and may generate an external input signal according to the user's manipulation of the buttons.
Hereinafter, operations implemented in an image display device having at least one of the aforementioned elements will be described in more detail with reference to the accompanying drawings. For convenience of explanation, the terminal illustrated in Fig. 1 is described as an example, but the present disclosure is not necessarily limited to this. In other words, the present invention may also be applied to wearable devices that display images, such as a watch or glasses, as well as to mobile terminals and fixed terminals.
A specific-character playback mode may be performed by an image display device according to the present disclosure. Here, the specific-character playback mode denotes a mode in which a specific fragment containing at least one character is selectively played back, rather than the whole playback fragment of a video. More specifically, the controller searches the whole playback fragment of the video for playback fragments containing a specific character, and extracts a main playback fragment from the retrieved playback fragments based on a preset criterion. Then, the controller may play the extracted main playback fragment, or display a progress bar associated with the extracted main playback fragment on the display unit.
Hereinafter, the flow of a method of selecting a specific character, extracting a main playback fragment when the specific-character playback is performed, and controlling functions associated with the extracted main playback fragment will be described.
Fig. 6 is a flowchart typically illustrating a control method of the present disclosure, and Figs. 7A, 7B and 7C are views for explaining in detail a method of extracting a main playback fragment in the control method of Fig. 6.
First, when the specific-character playback mode is performed, the controller 180 selects at least one of the characters appearing in a video (S610).
Here, the video denotes moving-image content that is stored in an external device such as a server and for which a total playback time is determined. The controller 180 selects at least one video, and selects at least one of the characters shown in the selected video.
At this time, the controller 180 may automatically select at least one character according to a preset criterion, or select one based on a user input.
For example, the preset criterion may be set to the person with the longest total screen time in the whole playback fragment of the video. For another example, the preset criterion may be set to the person corresponding to the leading character of the video. When metadata specifying the main character is included in the video, the controller 180 may select at least one character using the metadata.
In addition, the controller 180 may select at least one character based on a user input. For example, the controller 180 may generate a list of the characters shown in the video using biometric technology or using metadata included in the video. The controller 180 may then display the generated character list on the display unit and allow the user to select at least one character. For another example, the controller 180 may allow the user to key in a character's name to select at least one character. For still another example, the controller 180 may allow the user to select at least one of the characters included in the screen. Methods of selecting a character based on a user input will be described in detail below with reference to Fig. 8.
Meanwhile, when at least one character is selected, the controller 180 searches the whole range of frames for at least one region containing the selected character (S630). Here, "the selected character" denotes at least one character selected according to the preset criterion or according to the user input.
The controller 180 scans the whole playback fragment of the video to search for fragments containing the selected character. More specifically, the controller 180 searches for fragments in which the selected character is shown continuously for longer than a reference time period.
The retrieved fragments may be spaced apart from one another in time. In addition, the retrieved fragments may correspond to different playback time points in the whole playback fragment of the video, and may have different total playback times. For example, the retrieved fragments may include a first fragment including frames from a first playback time point to a second playback time point, and a second fragment including frames from a third playback time point to a fourth playback time point.
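The fragment search described above can be illustrated with a short sketch. The disclosure gives no code, so the following Python is an assumption introduced purely for illustration: the function name, the per-frame boolean detection list, and the `fps`/`min_duration` parameters are all hypothetical. Given a detection result per frame, it groups consecutive detections into fragments and keeps only those longer than the reference time period.

```python
def find_appearance_fragments(detected, fps=30.0, min_duration=1.0):
    """Group consecutive frames in which the character is detected into
    fragments, keeping only fragments longer than min_duration seconds.
    detected: list of booleans, one per frame (hypothetical input format)."""
    fragments = []
    start = None
    for i, present in enumerate(detected):
        if present and start is None:
            start = i                      # a fragment begins here
        elif not present and start is not None:
            fragments.append((start, i - 1))
            start = None
    if start is not None:
        fragments.append((start, len(detected) - 1))
    # keep only fragments whose duration exceeds the reference time period
    min_frames = int(min_duration * fps)
    return [(s, e) for (s, e) in fragments if (e - s + 1) >= min_frames]
```

Each returned pair is a (first frame, last frame) fragment, so two fragments separated by an absence of the character remain spaced apart in time, as in the first/second fragment example above.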
Meanwhile, the controller 180 searches for at least one region containing the selected character within the whole region of a frame, using at least one of the frames. For example, when ten frames are provided in the video and the selected character is included in the first and second frames, the controller 180 searches the first and second frames for at least one region containing the selected character.
At this time, the controller 180 searches for the region containing the selected character using biometric technology. For example, the biometric technology may include iris recognition and face recognition. Hereinafter, for ease of explanation, the present disclosure will be described using an example in which a facial region containing the eyes, nose and mouth is searched for using a face recognition function. Techniques for searching an image for a facial region are well known, and a detailed description thereof will therefore be omitted.
When a facial region is retrieved from the frame at a specific playback time point, the controller 180 may build a database of frame information for the frame and region information about the size and position of the retrieved facial region.
Meanwhile, for faster computation, the controller 180 may handle the retrieved region containing the character in units of fragments instead of in units of frames. In this case, the controller 180 extracts appearance fragments in which the selected character is shown during the whole playback fragment of the video. More specifically, the controller 180 extracts, as one appearance fragment, connected fragments in which the selected character is shown at intervals shorter than a threshold value. Then, using the frames included in each extracted appearance fragment, the controller 180 calculates, for each appearance fragment, the mean size and mean position of the region containing the selected character. The region information about the calculated mean size and mean position, together with the fragment information, may be built into a database.
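The per-fragment statistics described above might be computed as in the following sketch. This is not part of the disclosure; the function name, the `(x, y, w, h)` box representation and the dictionary-based "database" are assumptions made for illustration.

```python
def fragment_region_stats(regions_by_frame, fragments):
    """Build per-fragment region information.
    regions_by_frame: frame index -> (x, y, w, h) of the retrieved face
    region (hypothetical format). fragments: list of (start, end) frame
    ranges. Returns fragment -> mean region size and mean centre position."""
    stats = {}
    for (start, end) in fragments:
        boxes = [regions_by_frame[f] for f in range(start, end + 1)
                 if f in regions_by_frame]
        if not boxes:
            continue
        mean_area = sum(w * h for (_, _, w, h) in boxes) / len(boxes)
        mean_cx = sum(x + w / 2 for (x, _, w, _) in boxes) / len(boxes)
        mean_cy = sum(y + h / 2 for (_, y, _, h) in boxes) / len(boxes)
        stats[(start, end)] = {"mean_area": mean_area,
                               "mean_center": (mean_cx, mean_cy)}
    return stats
```

Storing one mean size and position per appearance fragment, rather than one entry per frame, is what makes the fragment-level calculation faster than the frame-level one.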
Next, a main playback fragment containing the selected character is extracted from the whole playback fragment of the video (S650).
The controller 180 extracts the main playback fragment based on the preset criterion. More specifically, the controller 180 adds frames that meet the preset criterion to the main playback fragment, and deletes frames that do not meet the preset criterion from the main playback fragment. In other words, the controller 180 extracts the frames meeting a preset condition from among the frames included in the whole playback fragment of the video, and generates the main playback fragment using the extracted frames. As a result, the main playback fragment may include a plurality of fragments spaced apart in time over the whole playback fragment of the video.
Meanwhile, the preset condition is associated with at least one of the size and the position, within the whole region of a frame, of the region containing the selected character. For example, when the size of the region containing the selected character is greater than a reference size, the relevant frame may be added to the main playback fragment. For another example, when a plurality of characters are included in a frame and the size of the region containing the selected character is smaller, by a predetermined ratio, than the region containing another character, the relevant frame may be deleted from the main playback fragment.
The controller 180 may extract the main playback fragment based on features such as the absolute size of the region containing the selected character, the position at which the selected character is displayed (or the distance from the center of the frame), the amount of change in the size of the region containing the selected character (for example, when the change in the size of the region in a second frame, compared with a first frame of consecutive first and second frames, is greater than a reference value), and the screen time of the selected character. In addition, the controller 180 may extract the main playback fragment based on features such as the number of characters included in a frame, the relative size of the selected character compared with the other characters, the positions at which the other characters are displayed, and the screen time of the other characters.
The controller 180 may also extract the main playback fragment by using the size and position of the region containing the selected character collectively together with the voice of the selected character. For example, the selected character may be contained in a region of a frame with a size greater than the reference value, but the voice of the selected character may not be recognized in the audio at the same playback time point. In this case, the relevant frame may be deleted from the main playback fragment. Conversely, the selected character may not be included in a frame while only the voice of the selected character is recognized in the audio at the same playback time point. In other words, when only the voice of the selected character is recognized, the relevant frame may be added to the main playback fragment.
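The size- and ratio-based conditions in the preceding paragraphs can be expressed as a small predicate over one frame. The following Python sketch is hypothetical: the thresholds `min_area` and `min_ratio`, the `(w, h)` box format, and the function name are assumptions, and the voice cue discussed above is deliberately omitted for brevity.

```python
def passes_criteria(selected_box, other_boxes, min_area=2500, min_ratio=0.5):
    """Decide whether one frame belongs in the main playback fragment.
    selected_box: (w, h) of the selected character's region, or None if
    the character is not shown. other_boxes: (w, h) regions of the other
    characters in the same frame."""
    if selected_box is None:
        return False
    area = selected_box[0] * selected_box[1]
    if area < min_area:            # region smaller than the reference size
        return False
    # delete the frame if the selected character's region is smaller than
    # another character's region by the predetermined ratio
    for (w, h) in other_boxes:
        if area < min_ratio * w * h:
            return False
    return True
```

Frames for which the predicate returns `True` would be added to the main playback fragment, and the rest deleted, matching the add/delete description above.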
Meanwhile, the controller 180 may extract one main playback fragment, or extract a plurality of main playback fragments. The controller 180 may calculate a character importance for each frame based on at least one of the size and the position of the character included in the frame.
The character importance may be given in steps, from the first level downward in order from highest to lowest importance.
For example, the character importance may be given based on the position and size of the character included in the frame. More specifically, a first condition corresponding to a first character importance, a second condition corresponding to a second character importance, and so on, are stored in the memory, and the controller 180 may determine the character importance of a character based on the conditions stored in the memory. In other words, the first character importance is given when the position and size of the character meet the first condition, and the second character importance is given when the position and size of the character meet the second condition.
For another example, the character importance may be given based on the number of characters included in the frame and the position and size of each character. More specifically, when only a first character is included in a frame, the first character importance is given to the first character. In addition, when first and second characters are included in a frame, and the first character is larger than the second character and positioned closer to the center of the frame, the first character importance may be given to the first character and the second character importance to the second character.
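One way to realize the two-character example above is to rank the characters in a frame by region area and nearness to the frame centre. The sketch below is an assumption of the author of this edit, not the disclosure's own method (which stores explicit first/second conditions in memory rather than ranking), but it reproduces the example's outcome; all names and the box format are hypothetical.

```python
def assign_importance(frame_w, frame_h, boxes):
    """boxes: character name -> (x, y, w, h) region in the frame.
    Returns character name -> importance level (1 = highest), ranked by
    region area first and nearness to the frame centre second."""
    cx, cy = frame_w / 2, frame_h / 2

    def score(box):
        x, y, w, h = box
        bx, by = x + w / 2, y + h / 2
        dist = ((bx - cx) ** 2 + (by - cy) ** 2) ** 0.5
        return (w * h, -dist)      # larger and more central scores higher

    ranked = sorted(boxes, key=lambda c: score(boxes[c]), reverse=True)
    return {c: level for level, c in enumerate(ranked, start=1)}
```

With a single character in the frame, the ranking trivially gives it the first importance; with two, the larger and more central character is ranked first, as in the example.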
Referring to Fig. 7, nine video frames containing the selected character are illustrated as an example associated with the present disclosure. The controller 180 may search for the region containing the selected character using face recognition technology, and the retrieved region may be indicated. For example, when a frame is displayed on the display unit, the region containing the selected character may be indicated by a rectangle, or the specific region may be highlighted so as to be distinguished from the other regions. Hereinafter, the region containing the selected character is referred to as "the retrieved region".
The controller 180 may give a character importance for the selected character based on at least one of the position and the size of the retrieved region within the frame associated with the retrieved region. For example, when the retrieved region meets a first condition, as in the frame illustrated in Fig. 7A, a first character importance corresponding to the first condition may be given to that frame. Alternatively, when the retrieved region meets a second condition, as in the frame illustrated in Fig. 7B, a second character importance corresponding to the second condition may be given to that frame. Meanwhile, when the selected character is retrieved but the retrieved region meets neither the first nor the second condition, as in the frame illustrated in Fig. 7C, a third character importance corresponding to a third condition may be given to that frame. According to embodiments, the conditions for giving the character importance may be set in various ways.
Subsequently, the controller 180 may extract different main playback fragments based on the character importance given to the frames. For example, a first main playback fragment may be composed of the frames having the first character importance, and a second main playback fragment may be composed of the frames having the first and second character importances. In the same manner, a third main playback fragment may be composed of the frames having the first to third character importances. In other words, the third main playback fragment may include all frames containing the selected character. Any one of the first to third main playback fragments may be selectively played based on a user input.
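The tiered construction just described — the k-th main playback fragment accumulating all frames of importance levels 1 through k — can be sketched directly. This Python illustration is hypothetical (the mapping format and function name are assumptions), but the nesting it produces matches the first/second/third main playback fragments above.

```python
def build_main_fragments(frame_importance, max_level=3):
    """frame_importance: frame index -> importance level of the selected
    character in that frame (1 = highest); frames without the character
    are simply absent. Returns a list whose (k-1)-th entry holds the
    frames of the k-th main playback fragment (all frames of level <= k)."""
    fragments = []
    for k in range(1, max_level + 1):
        fragments.append(sorted(f for f, lvl in frame_importance.items()
                                if lvl <= k))
    return fragments
```

Note that each fragment is a superset of the previous one, so the third main playback fragment contains every frame in which the selected character appears.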
As described above, the character importance may be used to indicate a feature of a character or the relation between characters in a video, and is also used to extract different main playback fragments. For example, when a progress bar indicating the playback state of the video is displayed on the display unit, the character importance of the selected character may be displayed adjacent to the progress bar. More specifically, the progress bar may serve as the x-axis, with time as the variable, of a graph displayed adjacent to the progress bar with the character importance as the other variable.
In addition, an image reflecting the character importance at one position may be displayed adjacent to that position of the progress bar. Different images may be displayed at different positions of the progress bar. In other words, different images may be displayed according to the position. More specifically, the size and position of the region in which the selected character is retrieved change according to the playback time point of the frame. Therefore, the character importance of the selected character may change according to the playback time point of the video. The controller 180 may display different images based on the character importance calculated at each position. Here, different images mean images having different shapes, lengths, sizes, colors, and the like. An embodiment using the character importance to indicate a feature of a character will be described in further detail with reference to Fig. 11.
Next, a process of continuously playing the main playback fragment is performed (S670).
The main playback fragment may include a plurality of fragments spaced apart in time within the whole playback fragment of the video. When the main playback fragment is played, the controller 180 continuously plays the frames included in the plurality of fragments. In other words, only the frames included in the main playback fragment are played. As a result, the user can selectively watch the main playback fragment extracted by the controller 180 as the main scenes.
Meanwhile, when playing the main playback fragment, the controller 180 may display on the display unit at least one of a main progress bar corresponding to the whole playback fragment of the video and a sub progress bar corresponding to the main playback fragment. When a plurality of main playback fragments are extracted, the sub progress bar may include a plurality of sub progress bars respectively corresponding to the main playback fragments. Hereinafter, "progress bar" is used in a sense that includes both the main progress bar and the sub progress bar.
Meanwhile, the controller may change the progress bar displayed on the display unit based on a user input applied to the progress bar. For example, when a long touch on a first progress bar is sensed in a state in which the first progress bar is displayed, the controller 180 may display the first and second progress bars. For another example, when a touch input sensed in a state in which the first progress bar is displayed moves continuously from a first position to a second position, the second progress bar may be displayed in place of the first progress bar. In addition, when a drag input for moving any one progress bar downward is sensed in a state in which a plurality of progress bars are displayed, the controller 180 may select that progress bar and another progress bar, and exchange the display positions of the two progress bars.
When any one progress bar is selected by a user input, the playback subject may change. Here, the playback subject may include the whole playback fragment of the video and the main playback fragment extracted by the controller 180, and a plurality of main playback fragments may be extracted.
In response to the selection of any one progress bar, the controller 180 may play the playback subject corresponding to the selected progress bar. For example, when the main progress bar is selected, the whole playback fragment is played continuously, and when any one of the sub progress bars is selected, the main playback fragment corresponding to the selected progress bar is played continuously. In other words, the controller 180 selects any one progress bar based on a user input, and plays the playback subject corresponding to the selected progress bar.
Meanwhile, the main playback fragment may be displayed or played in a state in which the main progress bar is selected. In this case, the fragments included in the main playback fragment are highlighted so as to be distinguished from the other fragments (or the fragments not included in the main playback fragment). Because only the highlighted portions of the whole fragment of the main progress bar are played, the user can recognize that the main playback fragment is being played.
In addition, when the progress bar is displayed, the controller 180 may display an image associated with the character importance at a position adjacent to the progress bar. The image may be an image indicating the character, for example, a photograph (or thumbnail image) representing the character. In addition, information associated with the character (for example, information about the actor or actress, such as his or her name and details, and information about the character, such as the character's name) may be displayed on the image.
Subsequently, when displaying the progress bar, the controller 180 may display different images according to the playback position. Here, different images may mean images having different shapes, lengths, sizes, colors, and the like.
The different images may be associated with the character importance of the selected character, calculated from the plurality of fragments spaced apart in time. More specifically, the controller 180 may calculate the character importance of the character from each of the plurality of fragments, and display images of different sizes based on the calculated character importance.
For example, when the character importance of the selected character is a first level during a first fragment among the plurality of fragments and a second level during a second fragment, the controller 180 may display an image having a first size at the position corresponding to the first fragment, and display an image having a second size different from the first size at the position corresponding to the second fragment. In other words, the image is the same, but it may be displayed in different sizes according to the character importance.
In addition, when a different character is shown together with the selected character during the first fragment, the controller 180 may calculate the character importance of the different character and display an image associated with the different character. For example, when, during a specific fragment, the character importance of the selected character is the first level and the character importance of the different character is the second level, the controller 180 may display a first image corresponding to the first character in a first size at the position corresponding to the specific fragment, and display a second image corresponding to the second character in a second size different from the first size.
As described above, the character importance of the selected character may be displayed on the display unit together with the progress bar, as a graph or an image. Therefore, the user can easily know the importance of the selected character at each playback time point of the video. In other words, the user can easily know which person is the main character at each time point of the video. In addition, the relation between the selected character and another character can be checked using the different images displayed around the progress bar.
Meanwhile, a plurality of progress bars may be displayed on the display unit. The plurality of progress bars correspond to different playback times. For example, the main progress bar corresponds to the total playback time of the whole playback fragment of the video, and the sub progress bar corresponds to the total playback time of the main playback fragment.
The plurality of progress bars may include different marks used to indicate their respective playback positions. The plurality of progress bars may have different total playback times, and the different marks may therefore be displayed at different positions according to the total playback time of each progress bar. For example, when the video is played in a state in which the total playback time of the first progress bar is 3 minutes and the total playback time of the second progress bar is 2 minutes, a first mark may be displayed at the position corresponding to one minute on the first progress bar, and a second mark may be displayed at the position corresponding to one minute 30 seconds on the second progress bar.
Meanwhile, when a user input for moving the second mark to a position on the second progress bar is sensed while the main playback fragment is played, the controller 180 displays the frame of the playback position corresponding to the moved position on the display unit. At this time, the controller 180 moves the first mark to the position on the first progress bar corresponding to that playback position.
Conversely, when a user input for moving the first mark to a position on the first progress bar is sensed, the controller 180 displays the frame of the playback position corresponding to the moved position. When the frame of the playback position is not included in the main playback fragment, the controller 180 controls the second progress bar not to be displayed, and when the frame of the playback position is included in the main playback fragment, the controller 180 moves the second mark to the position on the second progress bar corresponding to that playback position.
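Keeping the two marks synchronized requires mapping a time point on the main progress bar (whole video) onto the sub progress bar (main playback fragment only). A minimal sketch of this mapping, assuming the main playback fragment is represented as a list of `(start, end)` intervals in video time — a representation and function name introduced here for illustration, not given by the disclosure:

```python
def video_time_to_sub_time(t, fragments):
    """Map time point t on the main progress bar to the corresponding
    point on the sub progress bar. fragments: ordered, non-overlapping
    (start, end) intervals of the main playback fragment in video time.
    Returns None when t falls outside the main playback fragment, in
    which case the sub progress bar would not be displayed."""
    elapsed = 0.0
    for (start, end) in fragments:
        if t < start:
            return None                    # t lies in a skipped gap
        if t <= end:
            return elapsed + (t - start)   # inside this interval
        elapsed += end - start             # accumulate sub-bar time
    return None
```

Under this mapping, a position one third of the way along a 3-minute main bar can legitimately land three quarters of the way along a 2-minute sub bar, as in the 1:00 / 1:30 example above, since the correspondence depends on where the skipped gaps fall rather than on simple proportion.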
Meanwhile, when a long touch on the sub progress bar is sensed in a state in which the main progress bar and the sub progress bar are displayed at the same time, the controller 180 controls the sub progress bar to be movable. In other words, when the touch starts to move after the long touch, the controller 180 moves the display position of the sub progress bar according to the touch. As a result, the user can position the sub progress bar at a position he or she desires.
In this way, in the specific-character playback mode, the main playback fragment may be extracted and played according to at least one of the size and the position of the region containing the selected character. In addition, it is possible to display at least one of the sub progress bar corresponding to the main playback fragment and the main progress bar corresponding to the whole playback fragment. As a result, the terminal can display video in a new form.
The aforementioned control method associated with the present disclosure may be embodied in various forms, as illustrated in Figs. 8 to 16. In the specific embodiments described below, the same or similar reference numerals designate configurations that are the same as or similar to those of the aforementioned example, and the earlier description applies in their place.
Fig. 8 is a conceptual view illustrating a method of selecting a specific character, as an example of an operation realized by the control method of Fig. 6.
As the example of this method, with reference to figure 8A, when performing specific role's playback mode, the character list for allowing user to select at least one role can be shown on display unit 151.Under these circumstances, the menu for performing specific role's playback mode can be preset.When being selected the menu for performing specific role's playback mode by touch input, outside input unit etc., display character list.
More specifically, the controller 180 may scan the frames provided in the video to generate a character list in which the characters are arranged in the order in which they appear most frequently. As an example, the drawing illustrates a case in which characters 1 through 4 are included in the character list. In other words, the character list may include different character icons corresponding to characters 1 through 4.

Each character icon may be a graphic object representing the character. Each graphic object may include information associated with the character, and a photographic image associated with the character may be displayed as its background image.
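The frequency-ordered scan described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the per-frame detection results (the hypothetical `frame_characters` input) are assumed to come from an upstream face-recognition step.

```python
from collections import Counter

def build_character_list(frame_characters):
    """Order characters by how many frames they appear in, most frequent first.

    frame_characters: list where each element is the set of character IDs
    detected in one frame of the video.
    """
    counts = Counter()
    for chars in frame_characters:
        counts.update(chars)
    # Most frequently shown characters come first in the list.
    return [char for char, _ in counts.most_common()]

# Example: character "A" appears in 3 frames, "B" in 2, "C" in 1.
frames = [{"A", "B"}, {"A"}, {"A", "C"}, {"B"}]
print(build_character_list(frames))  # ['A', 'B', 'C']
```

The resulting list would back the character icons shown on the display unit 151.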
Next, at least one character icon may be selected by a touch input. The image display device disclosed herein is formed so that the display unit can sense touch input. However, the disclosure is not necessarily limited to this; at least one character may be selected through various input schemes such as an external input device (for example, a remote control), a voice command, and the like.
When a character is selected, the controller 180 extracts the main playback segment associated with the selected character, and plays the extracted main playback segment.
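The extraction step — collecting the video sections in which the selected character appears — might look like the sketch below. This is illustrative only; the per-frame boolean `appears` input is a placeholder for whatever detection the device actually performs.

```python
def extract_main_segments(appears, fps=30.0):
    """Group consecutive frames that contain the selected character into
    (start_time, end_time) segments, in seconds.

    appears: list of booleans, one per frame, True when the selected
    character is detected in that frame.
    """
    segments = []
    start = None
    for i, present in enumerate(appears):
        if present and start is None:
            start = i                      # a segment opens at this frame
        elif not present and start is not None:
            segments.append((start / fps, i / fps))
            start = None
    if start is not None:                  # segment runs to the last frame
        segments.append((start / fps, len(appears) / fps))
    return segments

appears = [False, True, True, False, False, True, True, True]
print(extract_main_segments(appears, fps=1.0))  # [(1.0, 3.0), (5.0, 8.0)]
```

The returned segments correspond to the highlighted sections on the main progress bar.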
As another example, referring to Fig. 8B, the controller 180 may receive a character's name from the user. More specifically, when the specific-character playback mode is executed, an input window and a virtual keyboard formed to receive the name of a specific character may be displayed on the display unit 151. After receiving a name through the virtual keyboard, the controller 180 searches for characters matching the received name. At this time, the controller 180 may search for matching characters using metadata included in the video, or request matching-character information from a server connected over the Internet.

The controller 180 then displays a character list including the retrieved characters on the display unit 151. In addition, the controller 180 may select at least one character based on a touch input.
As another example, referring to Fig. 8C, when playback of the video is paused, the controller 180 may display, on the paused screen, a main progress bar 810 corresponding to the entire playback segment of the video. At this time, the controller 180 may display graphic objects for executing the specific-character playback mode. More specifically, when the video is paused, the controller 180 may search for the characters contained in the paused screen, and display graphic objects formed to select the retrieved characters. For example, when first and second persons are contained in the paused screen, first and second person icons 820, 822 corresponding to them may be displayed. Then, when a touch input on the first person icon 820 is sensed, the controller 180 executes the specific-character playback mode corresponding to the first person icon.
As another example, referring to Fig. 8D, the controller 180 may sense, while playback of the video is paused, a user input specifying a partial region. For example, as illustrated in Fig. 8D, a touch input may move continuously from one position and then return to that position. The controller 180 searches for the characters contained in the region formed by the touch trajectory, and executes the specific-character playback mode for the retrieved characters.
As yet another example, referring to Fig. 8E, when playback of the video is paused, the controller 180 may extract a main playback segment for each character included in the paused screen, and display, on the paused screen, a progress bar corresponding to each main playback segment. More specifically, when playback of the video is paused, the controller 180 searches for the characters contained in the paused screen. In addition, the controller 180 extracts a main playback segment for each retrieved character, and generates a sub progress bar corresponding to each extracted main playback segment. For example, as illustrated in Fig. 8E, when first and second persons are contained in the paused screen, a first sub progress bar 830 associated with the first person and a second sub progress bar 832 associated with the second person are displayed on the display unit 151. In addition, each progress bar may include an indicator for guiding the paused playback position and a playback icon. When a touch input is sensed on the playback icon included in the first sub progress bar 830, the controller 180 executes the specific-character playback mode for the first person.
On the other hand, when playing the main playback segment, the controller 180 may display at least one of the main progress bar corresponding to the entire playback segment of the video and the sub progress bar corresponding to the main playback segment. Fig. 9 is a conceptual view illustrating various embodiments for displaying progress bars.

Referring to Fig. 9A, a main progress bar 910 corresponding to the entire playback segment of the video may be displayed on the display unit 151. The main progress bar 910 may include a first indicator 912 for showing the playback position.
On the other hand, the segments 930a, 930b included in the main playback segment are highlighted so as to be distinguished from the segments not included in the main playback segment.

When the main playback segment is played, the last frame of the first segment 930a and the first frame of the second segment 930b are joined continuously. In other words, the first frame of the second segment 930b is played immediately after the last frame of the first segment 930a.
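The continuous joining of segments can be illustrated with a small generator that plays the segments back to back, skipping the frames between them. The segment boundaries below are hypothetical frame indices, not values from the disclosure.

```python
def playback_order(segments):
    """Yield frame indices of the main playback segment in order, so the
    last frame of one segment is followed immediately by the first frame
    of the next (gaps between segments are skipped).

    segments: list of (start_frame, end_frame) pairs, end exclusive,
    sorted by start_frame.
    """
    for start, end in segments:
        for frame in range(start, end):
            yield frame

# Segments 930a (frames 2-4) and 930b (frames 7-9): frame 7 plays right after 4.
frames = list(playback_order([(2, 5), (7, 10)]))
print(frames)  # [2, 3, 4, 7, 8, 9]
```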
On the other hand, the main progress bar 910 and the sub progress bar 920 may be displayed simultaneously as illustrated in Fig. 9B, or only the sub progress bar 920 may be displayed as illustrated in Fig. 9C.

When the main progress bar and the sub progress bar are displayed simultaneously, as illustrated in Fig. 9B, the sub progress bar 920 may be displayed at a position adjacent to the main progress bar 910. Alternatively, although not shown in the drawings, the sub progress bar may be displayed around the position where the character associated with the sub progress bar is displayed. In other words, the display position of the sub progress bar may change according to the position where the character is displayed. By checking where the sub progress bar is displayed, the user can easily find where the character is displayed.
On the other hand, the controller 180 can perform operations associated with the sub progress bar based on user inputs applied to the sub progress bar. Figs. 10A and 10B are conceptual views illustrating control of the sub progress bar corresponding to the main playback segment.

Referring to Fig. 10A, the controller 180 may sense a drag input on the second indicator 922, which indicates the playback position on the sub progress bar 920. The playback position may be changed according to the drag input, and the screen corresponding to the changed playback position may be displayed on the display unit 151. At the same time, the display position of the first indicator 912, which indicates the playback position on the main progress bar 910, may also be changed according to the change of the playback position.

In addition, referring to Fig. 10B, the controller 180 may sense a drag input on the sub progress bar 920 itself. The controller 180 changes the position where the sub progress bar is displayed according to the drag input (920 → 920'). As a result, the user can move the sub progress bar to his or her desired position.
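Keeping the first indicator 912 in sync with the second indicator 922 requires converting a position on the sub progress bar (time within the main playback segment) into absolute video time. A sketch of that conversion, assuming the main playback segment is given as sorted (start, end) times in seconds; the weighting of the example data is invented:

```python
def sub_to_main_time(sub_pos, segments):
    """Convert a position on the sub progress bar (seconds of accumulated
    main-segment playback) to absolute video time, so the indicator on
    the main progress bar can be moved to match.

    segments: sorted list of (start, end) times making up the main
    playback segment.
    """
    remaining = sub_pos
    for start, end in segments:
        length = end - start
        if remaining <= length:
            return start + remaining
        remaining -= length
    return segments[-1][1]  # clamp to the end of the last segment

segments = [(10.0, 20.0), (50.0, 55.0)]
print(sub_to_main_time(3.0, segments))   # 13.0 (inside the first segment)
print(sub_to_main_time(12.0, segments))  # 52.0 (2 s into the second segment)
```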
Fig. 11 is a conceptual view illustrating a method of displaying the main playback segment when the main progress bar is displayed. The drawing illustrates an example in which the first through fourth segments 1110a to 1110d corresponding to the main playback segment are highlighted. In addition, the main progress bar 910 may include the first indicator 912 for guiding the playback position.
When the main progress bar 910 is displayed, the controller 180 may display different images associated with the main playback segment at positions adjacent to the main progress bar 910. An image may be a profile photograph representing a character or a thumbnail image containing the character, and different images may be displayed according to the playback position. Here, different images denote images having different shapes, lengths, sizes, colors, and so on.

In addition, information associated with the character may be displayed on the image. For example, the information may include the name of the character, the character's role, the name of the actor playing the character, the actor's filmography, and so on.

In the drawing, the images are illustrated as rectangles, and the numerals inside the rectangles indicate the information associated with the characters. For example, a rectangle marked with the numeral 1 is an image associated with the first character, and a rectangle marked with the numeral 2 is an image associated with the second character.
On the other hand, the controller 180 calculates a character importance for each of the temporally separated segments 1110a to 1110d. According to an embodiment, the controller 180 may calculate the character importance based on at least one of the size and the position of the region in which the character is displayed. In addition, the controller 180 may display images having different sizes according to the calculated importance. For example, as illustrated in Fig. 11A, when the character importance of the first person is a first level during the first segment 1110a and a second level during the second segment 1110b, the controller 180 may display an image having a first size at the position corresponding to the first segment, and an image having a second size different from the first size at the position corresponding to the second segment. Because a relatively larger image is displayed as character importance increases, the character's importance (or weight) in each of the segments included in the main playback segment can be shown to the user.
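One plausible way to compute such a character importance from the size and position of the character's region is sketched below. The 50/50 weighting and the exact scoring are assumptions for illustration; the disclosure only states that importance is based on at least one of size and position.

```python
def character_importance(region, frame_w, frame_h):
    """Score a character's importance in a frame from the size of the
    region showing the character and how close it is to the frame center.

    region: (x, y, width, height) of the character's bounding region.
    Returns a score in [0, 1]; larger means more important.
    """
    x, y, w, h = region
    size_score = (w * h) / (frame_w * frame_h)          # fraction of frame
    cx, cy = x + w / 2, y + h / 2
    # Normalised distance of the region center from the frame center.
    dx = abs(cx - frame_w / 2) / (frame_w / 2)
    dy = abs(cy - frame_h / 2) / (frame_h / 2)
    position_score = 1.0 - min(1.0, (dx + dy) / 2)
    return 0.5 * size_score + 0.5 * position_score

big_centered = character_importance((860, 440, 200, 200), 1920, 1080)
small_corner = character_importance((0, 0, 100, 100), 1920, 1080)
print(big_centered > small_corner)  # True
```

A per-segment importance could then be an average of this score over the segment's frames.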
As another example, the controller 180 may calculate the character importance at each playback position over the entire playback segment of the video. In addition, as illustrated in Fig. 11B, an importance graph 1130 corresponding to the entire playback segment may be displayed according to the calculated importance. For the importance graph 1130, the controller 180 may display a graph in which time is the variable, the horizontal direction of the main progress bar 910 is set as the x-axis, and the character importance varies along the y-axis.
In addition, the controller 180 may calculate the character importance for multiple characters included in a specific segment, and display different images corresponding to each character around the main progress bar 910. For example, as illustrated in Fig. 11C, when the character importances of the first and second persons are identical, the controller 180 may display different images having the same size. When only the first character is included in the third segment 1110c, the controller 180 may display only the image corresponding to the first character. In addition, when, during the fourth segment 1110d, the character importance of the first character is a first level, that of the third character is a second level, and that of the fourth character is a third level, the controller 180 may display different images with sizes corresponding to the respective character importances.
In addition, the controller 180 may calculate the total screen time of each character during a specific segment, displaying the image at a position adjacent to the main progress bar 910 as the total screen time increases, and at a position farther from the main progress bar as the total screen time decreases. In other words, the distance (blank space) between the main progress bar 910 and the image increases as the total screen time decreases.

As a result, the user can use the different images displayed around the main progress bar to examine the relationship between the selected character and the other characters.
On the other hand, the main playback segment may be edited by user input. Figs. 12A, 12B, and 12C are conceptual views illustrating a method of editing the main playback segment on the main progress bar of Figs. 11A, 11B, and 11C.

When a long touch is applied to the main progress bar 910 while the segments 1110a to 1110d included in the main playback segment are highlighted, an edit mode for the main playback segment is executed.
When the edit mode is executed, as illustrated in Fig. 12A, deletion icons (for example, an "x" mark) for deleting the segments 1110a to 1110d are displayed. When a touch input is sensed on any one of the deletion icons, the segment corresponding to that deletion icon is deleted from the main playback segment, as illustrated in Fig. 12B.
In addition, while the edit mode is executed, when a drag input moves left or right from either end of one of the segments, the segment may be extended or shortened according to the displacement of the drag input.

For example, when a drag input moving leftward from the right end of the second segment 1110b is sensed, as illustrated in Fig. 12B, the second segment may be shortened (1110b → 1110b').

Conversely, for example, when a drag input moves rightward from the right end of the second segment 1110b, the second segment may be extended. At this time, when the drag input approaches the left end of the fourth segment 1110d, the second segment 1110b and the fourth segment 1110d may be merged into one segment.
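The three edit operations — deleting a segment via its icon, shortening or extending it by dragging an end, and merging it with the next segment when the dragged end reaches that segment — can be sketched as list operations on (start, end) pairs. The data shapes and merge condition below are assumptions for illustration.

```python
def delete_segment(segments, index):
    """Remove one segment from the main playback segment (the 'x' icon)."""
    return segments[:index] + segments[index + 1:]

def drag_right_end(segments, index, delta):
    """Extend (delta > 0) or shorten (delta < 0) a segment by dragging its
    right end; if the new end reaches the next segment, merge the two."""
    segments = list(segments)
    start, end = segments[index]
    new_end = max(start, end + delta)
    if index + 1 < len(segments) and new_end >= segments[index + 1][0]:
        # The 1110b/1110d case: the dragged end meets the next segment.
        segments[index] = (start, segments[index + 1][1])
        del segments[index + 1]
    else:
        segments[index] = (start, new_end)
    return segments

segs = [(0, 10), (20, 30), (40, 50)]
print(delete_segment(segs, 1))      # [(0, 10), (40, 50)]
print(drag_right_end(segs, 1, -5))  # [(0, 10), (20, 25), (40, 50)]
print(drag_right_end(segs, 1, 12))  # [(0, 10), (20, 50)]  (merged)
```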
Figs. 13A, 13B, and 13C are views for explaining a method of executing the specific-character playback mode while a video is playing, in an image display device according to an embodiment of the present disclosure.

Referring to Fig. 13A, the controller 180 may select at least one character during video playback, based on a user input applied to the display unit 151.
For example, when a long touch is applied to a position on the display unit 151 during video playback, as illustrated in Fig. 13A, the controller 180 searches for a character displayed in the region containing the position where the long touch was applied. When a character is retrieved, as illustrated in Fig. 13B, the controller 180 may display a graphic object 1320 corresponding to the recognized character on the display unit 151.

When, while the graphic object 1320 is displayed, the long touch moves continuously from its starting position into the region where the main progress bar 1310 is displayed, the controller 180 selects the recognized character and executes the specific-character playback mode for the selected character. In other words, the controller 180 extracts the main playback segment associated with the selected character, and plays the extracted main playback segment.
When the specific-character playback mode is executed, the segments included in the main playback segment on the main progress bar are highlighted so as to be distinguished from the segments not included in the main playback segment (1310 → 1310').

On the other hand, when the specific-character playback mode is executed, the display unit 151 displays at least one of the main progress bar and the sub progress bar. The controller 180 selects either progress bar based on user input, and plays the playback segment corresponding to the selected progress bar. For example, the entire playback segment of the video is played continuously when the main progress bar is selected, and the main playback segment is played continuously when the sub progress bar is selected.
Hereinafter, a method of switching the progress bar (or selecting one of the progress bars) will be described in further detail with reference to Figs. 14, 15A, and 15C, which illustrate methods of switching the progress bar.
The controller 180 may display the main progress bar 1410 and the sub progress bar 1420 on the display unit 151 simultaneously. For example, as illustrated in Fig. 14, the controller 180 may display the main progress bar 1410 on one surface of a hexahedron 1400, and display the sub progress bar 1420 on another surface different from that surface.

At this time, the controller 180 may control the progress bar corresponding to the playback segment currently being played to be positioned on the front face of the display unit 151. For example, as illustrated in Fig. 14, while the main playback segment is being played, the sub progress bar 1420 is positioned on the front face of the hexahedron 1400 (that is, the face oriented toward the front of the display unit 151), and the main progress bar 1410 may be positioned on a lateral face of the hexahedron 1400.
Various types of polyhedron other than the hexahedron mentioned above can be used to display multiple progress bars. When one progress bar is positioned on the front face of the display unit 151, the other progress bars can be positioned in different directions according to the type of polyhedron.

On the other hand, a gesture for rotating the hexahedron 1400 can be applied. More specifically, when a touch applied to the hexahedron 1400 moves along a curved path, the controller 180 can rotate the hexahedron 1400 based on the direction of movement of the touch. For example, when a touch applied to the upper surface of the hexahedron 1400 moves counterclockwise, the hexahedron 1400 rotates as illustrated in Fig. 14. According to the rotation of the hexahedron 1400, the progress bar positioned on the front face of the display unit 151 changes from the sub progress bar 1420 to the main progress bar 1410. In other words, the main progress bar 1410 is selected, and the entire playback segment of the video is played.
On the other hand, when the specific-character playback mode is executed, the controller 180 may extract one main playback segment or multiple main playback segments. More specifically, the controller 180 can calculate a character importance based on at least one of the size and the position of the character contained in a frame, and extract different main playback segments based on the character importance.

Hereinafter, a method of selecting one of multiple progress bars will be described in detail, using an example in which first through third main playback segments are extracted. The first main playback segment may include frames with a first character importance, the second main playback segment may include frames with a second character importance, and the third main playback segment may include frames with a third character importance. Accordingly, as the level of character importance decreases, the total playback time increases.
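The tiering by character importance can be sketched as threshold filtering: each level keeps the frames whose importance meets that level's threshold, so stricter (lower-numbered) levels are subsets of looser ones and total playback time grows as the level number increases. The scores and thresholds below are invented for illustration.

```python
def tiered_segments(frame_importance, thresholds):
    """Build one main playback segment (as a list of frame indices) per
    importance level.

    frame_importance: list of per-frame importance scores.
    thresholds: descending score thresholds; level 1 keeps only frames at
    or above thresholds[0], level 2 those at or above thresholds[1], etc.
    """
    tiers = {}
    for level, threshold in enumerate(thresholds, start=1):
        tiers[level] = [i for i, s in enumerate(frame_importance)
                        if s >= threshold]
    return tiers

scores = [0.9, 0.2, 0.7, 0.5, 0.95, 0.1]
tiers = tiered_segments(scores, thresholds=[0.8, 0.6, 0.4])
print(tiers[1])  # [0, 4]
print(tiers[2])  # [0, 2, 4]
print(tiers[3])  # [0, 2, 3, 4]
```

Each tier would back one sub progress bar (1510, 1520, 1530 in Figs. 15A and 15B).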
Referring to Fig. 15A, while the second sub progress bar 1510 corresponding to the second main playback segment is displayed, a pinch-out gesture on the second sub progress bar 1510 may be sensed.

Here, a pinch-out gesture denotes a gesture of widening two fingers in contact with the screen, and a pinch-in gesture, the opposite of a pinch-out gesture, denotes a gesture of narrowing two fingers in contact with the screen.
In response to the pinch-out gesture, the controller 180 may display the first sub progress bar 1520 corresponding to the first main playback segment, which has a higher importance level than the second main playback segment being displayed. In addition, the controller 180 continuously plays the first main playback segment instead of the second main playback segment. In other words, the pinch-out gesture causes the frames with relatively higher importance to be played.

Conversely, as illustrated in Fig. 15B, while the second sub progress bar 1510 is displayed, a pinch-in gesture on the second sub progress bar 1510 may be sensed. In response to the pinch-in gesture, the controller 180 may display the third sub progress bar 1530 corresponding to the third main playback segment, which has a lower importance level than the second main playback segment being displayed. In addition, the controller 180 continuously plays the third main playback segment instead of the second main playback segment.
On the other hand, referring to Fig. 15C, the controller 180 may display a segment adjustment bar 1550 with which the material to be played can be changed. More specifically, the controller 180 may display the segment adjustment bar 1550, which is formed to adjust the importance of the characters to be shown.

The left end of the segment adjustment bar 1550 corresponds to the minimum level (or minimum stage), and its right end corresponds to the maximum level (or maximum stage). In other words, when the left end of the segment adjustment bar 1550 is selected, the main playback segment corresponding to the minimum level is played; when its right end is selected, the main playback segment corresponding to the maximum level is played.

Accordingly, the segments included in the main playback segment displayed on the sub progress bar 1510 (for example, the highlighted segments on the main progress bar) change. For example, the segments included in the main playback segment decrease as the bar moves closer to the maximum level, and increase as it moves closer to the minimum level. The user can use the segment adjustment bar 1550 to continuously select the desired stage (or level) of what is shown.
On the other hand, the criteria for extracting the main playback segment are preset as factory defaults. However, the criteria can be changed by the user. Fig. 16 is a conceptual view illustrating a method of receiving, from the user, the criteria for extracting the main playback segment in an image display device according to an embodiment of the present disclosure.
The controller 180 may display, on the display unit 151, the criteria for extracting the main playback segment before extracting it. More specifically, the number of characters contained in a frame, the length of time a character is shown continuously, the size of a character's face, and so on may be displayed on the display unit 151 as criteria for extracting the main playback segment. For example, when the number of characters is set to "one or fewer", frames containing two or more characters are excluded from the main playback segment. In addition, when the screen time is set to "greater than one minute", only frames in which the character is shown for more than one minute are added to the main playback segment. In addition, when the face size is set to "greater than 16 square centimeters", only frames in which the character's face area is greater than 16 square centimeters are added to the main playback segment.
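These user-settable criteria amount to per-frame and per-segment filters. A sketch under assumed field names (`num_people`, `face_cm2`), using the example thresholds from the text; the disclosure does not specify how the checks are combined.

```python
def frame_passes(frame, max_people=1, min_face_cm2=16.0):
    """Decide whether a frame qualifies for the main playback segment
    under user-set criteria: at most `max_people` characters in the
    frame and a face area above `min_face_cm2` (cm^2 on screen).

    frame: dict with 'num_people' and 'face_cm2' (assumed field names).
    """
    return (frame["num_people"] <= max_people
            and frame["face_cm2"] > min_face_cm2)

def segment_passes(frames, min_seconds=60.0, fps=30.0):
    """Keep a candidate segment only if every frame passes and the
    character stays on screen for more than `min_seconds`."""
    return (all(frame_passes(f) for f in frames)
            and len(frames) / fps > min_seconds)

good = {"num_people": 1, "face_cm2": 20.0}
crowded = {"num_people": 3, "face_cm2": 20.0}
print(frame_passes(good))     # True
print(frame_passes(crowded))  # False
print(segment_passes([good] * 2000, fps=30.0))  # True (about 67 s > 60 s)
```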
In addition, the controller 180 may allow the user to use the segment adjustment bar to select the importance of the characters to be shown. As a result, a user-friendly user interface can be provided.

Although not shown in the drawings, when a touch input is applied outside the region where the main progress bar 910 is displayed, the edit mode being executed is terminated.
On the other hand, the controller 180 can store the main playback segment being played as a separate file distinct from the original video. In addition, the video corresponding to the main playback segment can be sent to an external server to be shared with other devices.
As described above, the disclosure can play back via a progress bar associated with a specific character, providing a user convenience never before offered. In addition, it can satisfy the demand of users who wish to play only the parts featuring a specific person, by focusing on the relevant person. More specifically, all the segments in which a specific person is shown can be extracted, based on the size and position of the region containing the character, to selectively edit or play the video. For example, using an image display device according to the present disclosure, a user can extract, store, and play, from an entire video recording of a school play, only the segments in which his or her child appears on stage. As another example, using an image display device according to the present disclosure, a user can extract, store, and play, from an entire video recording of a baseball game, only the segments in which a specific player is shown.
The foregoing invention may be implemented as computer-readable code on a program-recorded medium. The computer-readable medium may include all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid-state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and also include a device implemented in the form of a carrier wave (for example, transmission over the Internet). In addition, the computer may include the controller 180 of the mobile terminal. Accordingly, the detailed description should not be construed as limiting in all respects, but should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes falling within the equivalent scope of the present invention are included in the scope of the present invention.
Claims (20)
1. A method for controlling an image display device, the method comprising:
identifying video sections that include a person selected by a user;
defining a main playback segment of the video as one or more of the identified video sections that satisfy a preset criterion; and
displaying the main playback segment of the video on a display of the image display device.
2. The method of claim 1, further comprising:
displaying a progress bar indicating a playback state of the video while the video is displayed,
wherein the progress bar comprises a first progress bar corresponding to an entire playback section of the video and a second progress bar corresponding to the main playback segment.
3. The method of claim 2, wherein:
the preset criterion comprises at least one of a length of a scene or portion of the video that includes the selected person, a position of the selected person within the scene or portion of the video, or a proportion of the scene or portion of the video occupied by the selected person; and
the main playback segment comprises a plurality of sections spaced apart from each other.
4. The method of claim 3, further comprising:
displaying the first progress bar and the second progress bar in an overlapping manner; and
displaying the second progress bar distinguishably over a first portion of the first progress bar, such that the second progress bar is distinguishable from a second portion of the first progress bar that represents video sections not corresponding to the main playback segment.
5. The method of claim 3, wherein:
the first progress bar and the second progress bar are displayed separately, such that the first progress bar indicates a current playback position within the entire playback section of the video, and the second progress bar indicates a current playback position within the main playback segment; and
the second progress bar corresponds to each of the plurality of sections.
6. The method of claim 3, further comprising:
displaying a plurality of indicators, each indicator corresponding to one of the plurality of sections indicated by the second progress bar and indicating respective importance information of that section relative to the selected person; and
displaying the plurality of indicators at various sizes according to the respective importance information of each of the plurality of sections.
7. The method of claim 6, wherein the importance information is determined based on at least a position or a size of the selected person within the scene or portion of the video.
8. The method of claim 3, wherein displaying the main playback segment comprises continuously displaying frames included in the plurality of sections.
9. The method of claim 3, further comprising:
editing the main playback segment in response to a user input received via the progress bar.
10. The method of claim 1, wherein the selection of the person comprises:
identifying a person, or a face of a person, included in a paused scene of the video at which playback of the video is paused;
displaying, on the paused screen, a graphic object for selecting the identified person; and
selecting the identified person in response to a touch applied to the graphic object.
11. An image display device, comprising:
a display configured to display a video; and
a controller configured to:
identify video sections that include a person selected by a user;
define a main playback segment of the video as one or more of the identified video sections that satisfy a preset criterion; and
cause the display to display the main playback segment of the video.
12. The image display device of claim 11, wherein:
the controller is further configured to cause the display to display, while the video is displayed, a progress bar indicating a playback state of the video; and
the progress bar comprises a first progress bar corresponding to an entire playback section of the video and a second progress bar corresponding to the main playback segment.
13. The image display device of claim 12, wherein:
the preset criterion comprises at least one of a length of a scene or portion of the video that includes the selected person, a position of the selected person within the scene or portion of the video, or a proportion of the scene or portion of the video occupied by the selected person; and
the main playback segment comprises a plurality of sections spaced apart from each other.
14. The image display device of claim 13, wherein the controller is further configured to:
display the first progress bar and the second progress bar in an overlapping manner; and
display the second progress bar distinguishably over a first portion of the first progress bar, such that the second progress bar is distinguishable from a second portion of the first progress bar that represents video sections not corresponding to the main playback segment.
15. The image display device of claim 13, wherein:
the first progress bar and the second progress bar are displayed separately, such that the first progress bar indicates a current playback position within the entire playback section of the video, and the second progress bar indicates a current playback position within the main playback segment; and
the second progress bar corresponds to each of the plurality of sections.
16. The image display device of claim 13, wherein the controller is further configured to cause the display to:
display a plurality of indicators, each indicator corresponding to one of the plurality of sections indicated by the second progress bar and indicating respective importance information of that section relative to the selected person; and
display the plurality of indicators at various sizes according to the respective importance information of each of the plurality of sections.
17. image display devices according to claim 16, wherein, at least determine described importance information based on the position of the people selected by the scene or part of described video or size.
18. The image display device according to claim 13, wherein the controller is further configured to cause the display to sequentially display frames included in the plurality of fragments.
19. The image display device according to claim 13, wherein the controller is further configured to edit the main playback fragment in response to a user input received via the progress bar.
20. The image display device according to claim 11, wherein the controller is further configured to:
Identify a person, or a face of a person, included in a paused scene of the video at which playback of the video is paused;
Cause the display to display, on the screen on which playback is paused, a graphic object for selecting the identified person; and
Select the identified person in response to a touch applied to the graphic object.
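The dual-progress-bar arrangement of claims 13, 15, and 16 can be illustrated with a short sketch. Nothing below appears in the patent: the fragment boundaries, importance scores, and pixel sizes are hypothetical, and this is only one possible way to map a playback position onto a second progress bar that spans a main playback fragment made of several spaced segments.

```python
# Illustrative sketch (not from the patent). The "second progress bar" covers
# only the main playback fragment, which consists of several segments spaced
# apart in the full video (claim 13); indicators are sized by per-fragment
# importance (claim 16). All values below are hypothetical.

def position_on_second_bar(t, fragments):
    """Map time t (seconds into the full video) to a 0..1 position on the
    second progress bar, which spans only the main-playback fragments."""
    total = sum(end - start for start, end in fragments)
    elapsed = 0.0
    for start, end in fragments:
        if t >= end:          # this fragment has been fully played
            elapsed += end - start
        elif t > start:       # playback is currently inside this fragment
            elapsed += t - start
    return elapsed / total

def indicator_sizes(importances, min_px=8, max_px=24):
    """Linearly scale per-fragment indicator sizes by importance."""
    lo, hi = min(importances), max(importances)
    span = (hi - lo) or 1.0   # avoid division by zero when all equal
    return [min_px + (v - lo) / span * (max_px - min_px) for v in importances]

# A main playback fragment of three spaced segments, as (start, end) seconds.
fragments = [(10.0, 20.0), (50.0, 70.0), (100.0, 110.0)]
print(position_on_second_bar(60.0, fragments))   # → 0.5 (halfway through)
print(indicator_sizes([0.2, 0.9, 0.5]))
```

At t = 60 s the first segment (10 s) is fully played and playback is 10 s into the 20 s second segment, so 20 of the 40 total fragment-seconds have elapsed, i.e. the second bar reads 0.5 even though the first bar would read 60/duration of the whole video.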
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2014-0048866 | 2014-04-23 | ||
KR1020140048866A KR20150122510A (en) | 2014-04-23 | 2014-04-23 | Image display device and control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105007531A true CN105007531A (en) | 2015-10-28 |
CN105007531B CN105007531B (en) | 2018-12-25 |
Family
ID=52875437
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510196687.5A Expired - Fee Related CN105007531B (en) | 2014-04-23 | 2015-04-23 | Image display device and its control method |
Country Status (4)
Country | Link |
---|---|
US (1) | US9484066B2 (en) |
EP (1) | EP2937864A1 (en) |
KR (1) | KR20150122510A (en) |
CN (1) | CN105007531B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017084353A1 (en) * | 2015-11-18 | 2017-05-26 | Le Holdings (Beijing) Co., Ltd. | Video clip quick search method, device, system, and computer readable medium |
WO2017166480A1 (en) * | 2016-03-31 | 2017-10-05 | Le Holdings (Beijing) Co., Ltd. | Playing progress adjusting method and device |
CN109615953A (en) * | 2019-01-30 | 2019-04-12 | Beijing Roobo Technology Co., Ltd. | Interaction method and apparatus for educational robot, robot, and storage medium |
CN110337009A (en) * | 2019-07-01 | 2019-10-15 | Baidu Online Network Technology (Beijing) Co., Ltd. | Video playback control method, apparatus, device, and storage medium |
CN110446093A (en) * | 2019-08-15 | 2019-11-12 | Tianmai Juyuan (Hangzhou) Media Technology Co., Ltd. | Video progress bar display method, apparatus, and storage medium |
CN110741652A (en) * | 2018-05-21 | 2020-01-31 | Qingdao Hisense Electric Co., Ltd. | Display device with intelligent user interface |
JP6882584B1 (en) * | 2020-09-02 | 2021-06-02 | KDDI Corporation | Content playback device and program |
CN113906498A (en) * | 2019-06-19 | 2022-01-07 | Samsung Electronics Co., Ltd. | Display device and control method thereof |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017030212A1 (en) * | 2015-08-18 | 2017-02-23 | LG Electronics Inc. | Mobile terminal and method for controlling same |
US9871546B2 (en) * | 2015-09-11 | 2018-01-16 | Panasonic Intellectual Property Corporation Of America | Wearable terminal mountable on part of body of user |
JP2017102429A (en) * | 2015-11-19 | 2017-06-08 | Panasonic Intellectual Property Corporation of America | Wearable terminal and control method |
US20170285924A1 (en) * | 2016-03-31 | 2017-10-05 | Le Holdings (Beijing) Co., Ltd. | Method for adjusting play progress and electronic device |
KR20180018017A (en) * | 2016-08-12 | 2018-02-21 | LG Electronics Inc. | Mobile terminal and operating method thereof |
US10535371B2 (en) * | 2016-09-13 | 2020-01-14 | Intel Corporation | Speaker segmentation and clustering for video summarization |
US10449440B2 (en) | 2017-06-30 | 2019-10-22 | Electronic Arts Inc. | Interactive voice-controlled companion application for a video game |
US10621317B1 (en) | 2017-09-14 | 2020-04-14 | Electronic Arts Inc. | Audio-based device authentication system |
US20190087060A1 (en) * | 2017-09-19 | 2019-03-21 | Sling Media Inc. | Dynamic adjustment of media thumbnail image size based on touchscreen pressure |
CN110494835A (en) * | 2017-12-20 | 2019-11-22 | Huawei Technologies Co., Ltd. | Control method and apparatus |
CN108882024B (en) * | 2018-08-01 | 2021-08-20 | Beijing QIYI Century Science & Technology Co., Ltd. | Video playing method and device and electronic equipment |
KR102179591B1 (en) * | 2018-12-12 | 2020-11-17 | Inha University Research and Business Foundation | Apparatus of character area extraction in video |
KR102179590B1 (en) * | 2018-12-12 | 2020-11-17 | Inha University Research and Business Foundation | Extraction apparatus of character conflict information in video |
US10926173B2 (en) * | 2019-06-10 | 2021-02-23 | Electronic Arts Inc. | Custom voice control of video game character |
JP2022088890A (en) * | 2020-12-03 | 2022-06-15 | Fujifilm Business Innovation Corp. | Information processing device and program |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101552898A (en) * | 2008-03-31 | 2009-10-07 | LG Electronics Inc. | Methods and device for reproducing images |
CN103258557A (en) * | 2012-02-20 | 2013-08-21 | Sony Corporation | Display control device and display control method |
JP2013171599A (en) * | 2012-02-20 | 2013-09-02 | Sony Corp | Display control device and display control method |
CN103455241A (en) * | 2012-05-29 | 2013-12-18 | Samsung Electronics Co., Ltd. | Method and apparatus for playing video in portable terminal |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004505363A (en) * | 2000-07-25 | 2004-02-19 | America Online, Inc. | Video messaging |
US20060013556A1 (en) * | 2004-07-01 | 2006-01-19 | Thomas Poslinski | Commercial information and guide |
JP2006309867A (en) * | 2005-04-28 | 2006-11-09 | Hitachi Ltd | Image recording and reproducing device |
JP4894252B2 (en) * | 2005-12-09 | 2012-03-14 | Sony Corporation | Data display device, data display method, and data display program |
JP2007281680A (en) * | 2006-04-04 | 2007-10-25 | Sony Corp | Image processor and image display method |
JP2008017042A (en) * | 2006-07-04 | 2008-01-24 | Sony Corp | Information processing apparatus and method, and program |
US9773525B2 (en) * | 2007-08-16 | 2017-09-26 | Adobe Systems Incorporated | Timeline management |
JP2009181216A (en) * | 2008-01-29 | 2009-08-13 | Toshiba Corp | Electronic apparatus and image processing method |
WO2010035160A1 (en) | 2008-09-23 | 2010-04-01 | Koninklijke Philips Electronics N.V. | Method and apparatus for displaying a progress indicator for a content item |
US8666223B2 (en) * | 2008-09-25 | 2014-03-04 | Kabushiki Kaisha Toshiba | Electronic apparatus and image data management method |
JP2011239075A (en) * | 2010-05-07 | 2011-11-24 | Sony Corp | Display device, display method and program |
US8649573B1 (en) | 2010-06-14 | 2014-02-11 | Adobe Systems Incorporated | Method and apparatus for summarizing video data |
US20130275411A1 (en) * | 2012-04-13 | 2013-10-17 | Lg Electronics Inc. | Image search method and digital device for the same |
JP2014068290A (en) * | 2012-09-27 | 2014-04-17 | Sony Corp | Image processing apparatus, image processing method, and program |
KR102049855B1 (en) * | 2013-01-31 | 2019-11-28 | LG Electronics Inc. | Mobile terminal and controlling method thereof |
EP2767896B1 (en) * | 2013-02-14 | 2019-01-16 | LG Electronics Inc. | Mobile terminal and method of controlling the mobile terminal |
-
2014
- 2014-04-23 KR KR1020140048866A patent/KR20150122510A/en not_active Application Discontinuation
-
2015
- 2015-03-11 US US14/644,990 patent/US9484066B2/en active Active
- 2015-03-19 EP EP15159819.0A patent/EP2937864A1/en not_active Withdrawn
- 2015-04-23 CN CN201510196687.5A patent/CN105007531B/en not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101552898A (en) * | 2008-03-31 | 2009-10-07 | LG Electronics Inc. | Methods and device for reproducing images |
CN103258557A (en) * | 2012-02-20 | 2013-08-21 | Sony Corporation | Display control device and display control method |
JP2013171599A (en) * | 2012-02-20 | 2013-09-02 | Sony Corp | Display control device and display control method |
CN103455241A (en) * | 2012-05-29 | 2013-12-18 | Samsung Electronics Co., Ltd. | Method and apparatus for playing video in portable terminal |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017084353A1 (en) * | 2015-11-18 | 2017-05-26 | Le Holdings (Beijing) Co., Ltd. | Video clip quick search method, device, system, and computer readable medium |
WO2017166480A1 (en) * | 2016-03-31 | 2017-10-05 | Le Holdings (Beijing) Co., Ltd. | Playing progress adjusting method and device |
CN110741652A (en) * | 2018-05-21 | 2020-01-31 | Qingdao Hisense Electric Co., Ltd. | Display device with intelligent user interface |
CN109615953A (en) * | 2019-01-30 | 2019-04-12 | Beijing Roobo Technology Co., Ltd. | Interaction method and apparatus for educational robot, robot, and storage medium |
CN113906498A (en) * | 2019-06-19 | 2022-01-07 | Samsung Electronics Co., Ltd. | Display device and control method thereof |
CN113906498B (en) * | 2019-06-19 | 2024-05-28 | Samsung Electronics Co., Ltd. | Display device and control method thereof |
CN110337009A (en) * | 2019-07-01 | 2019-10-15 | Baidu Online Network Technology (Beijing) Co., Ltd. | Video playback control method, apparatus, device, and storage medium |
CN110446093A (en) * | 2019-08-15 | 2019-11-12 | Tianmai Juyuan (Hangzhou) Media Technology Co., Ltd. | Video progress bar display method, apparatus, and storage medium |
JP6882584B1 (en) * | 2020-09-02 | 2021-06-02 | KDDI Corporation | Content playback device and program |
JP2022042186A (en) * | 2020-09-02 | 2022-03-14 | KDDI Corporation | Content reproduction device and program |
Also Published As
Publication number | Publication date |
---|---|
US9484066B2 (en) | 2016-11-01 |
EP2937864A1 (en) | 2015-10-28 |
US20150310897A1 (en) | 2015-10-29 |
CN105007531B (en) | 2018-12-25 |
KR20150122510A (en) | 2015-11-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105007531B (en) | Image display device and its control method | |
US10095787B2 (en) | Mobile terminal and method for controlling the same | |
CN105393522B (en) | Mobile terminal and its control method | |
US9955064B2 (en) | Mobile terminal and control method for changing camera angles with respect to the fixed body | |
CN106412221B (en) | Mobile terminal and control method thereof | |
CN106559686A (en) | Mobile terminal and its control method | |
US20150253862A1 (en) | Glass type mobile terminal | |
CN108632444A (en) | Mobile terminal and its control method | |
CN105450848A (en) | Mobile terminal and controlling method thereof | |
CN104767871A (en) | Mobile terminal and controlling method thereof | |
CN105511601A (en) | Mobile terminal and controlling method thereof | |
CN105282316A (en) | Mobile terminal and controlling method thereof | |
EP3024206B1 (en) | Mobile terminal and control method thereof | |
EP3364405A1 (en) | Mobile terminal and controlling method thereof | |
US10264209B2 (en) | Mobile terminal and controlling method thereof | |
US9788074B2 (en) | Mobile terminal and method for controlling the same | |
US10631054B2 (en) | Terminal and method for controlling the same | |
US20160259622A1 (en) | Mobile terminal and method for controlling the same | |
US10149016B2 (en) | Mobile terminal and method for controlling the same | |
CN105474159A (en) | Mobile terminal | |
US10621749B2 (en) | Terminal detecting and displaying object position | |
CN107924284A (en) | Mobile terminal and its control method | |
US20170006235A1 (en) | Mobile terminal and method for controlling the same | |
CN106534474A (en) | Mobile terminal and method for controlling the same | |
KR20170059693A (en) | Mobile device and, the method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 20181225 |