CN1921632B - Display apparatus and control method - Google Patents


Info

Publication number
CN1921632B
CN1921632B (application CN200610092547A)
Authority
CN
China
Prior art keywords
unit
data
auxiliary unit
view data
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 200610092547
Other languages
Chinese (zh)
Other versions
CN1921632A (en)
Inventor
近藤哲二郎 (Tetsujiro Kondo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2001399867A (published as JP2003199092A)
Application filed by Sony Corp
Publication of CN1921632A
Application granted
Publication of CN1921632B
Anticipated expiration
Legal status: Expired - Fee Related

Abstract

When a large number of television receivers are connected and used together, a higher level of functionality is achieved than when a television receiver is used alone. A main unit (television receiver) 1 displays a television broadcast program and also detects its features. From these features, the main unit 1 detects a scene change and controls a subsidiary unit (television receiver) 2₁₁ so that it displays the image data of the frame immediately after the scene change. The subsidiary unit 2₁₁ displays that frame. When another scene change occurs in the image data displayed on the main unit 1, the image data of the frame immediately after that scene change is displayed on the subsidiary unit 2₁₁ in place of the image data displayed so far. The present invention can be applied, for example, to television receivers.
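As a concrete illustration of the scene-change detection that drives this behavior, the sketch below flags a frame as "the frame immediately after a scene change" when its mean absolute pixel difference from the previous frame exceeds a threshold. The choice of feature (frame difference) and the threshold value are assumptions for illustration only; the patent does not fix a particular detection method.

```python
from typing import Iterable, List


def detect_scene_changes(frames: Iterable[List[int]], threshold: float = 40.0) -> List[int]:
    """Return indices of frames that begin a new scene.

    A frame is flagged when its mean absolute pixel difference from the
    previous frame exceeds `threshold` (a hypothetical tuning value).
    """
    changes = []
    prev = None
    for i, frame in enumerate(frames):
        if prev is not None:
            diff = sum(abs(a - b) for a, b in zip(frame, prev)) / len(frame)
            if diff > threshold:
                # frame i is the frame immediately after the scene change;
                # it would be sent to the subsidiary unit for display
                changes.append(i)
        prev = frame
    return changes


# Toy 4-pixel frames: a flat dark scene followed by a flat bright scene.
frames = [[10, 10, 10, 10], [12, 11, 10, 9], [200, 200, 200, 200], [198, 201, 199, 200]]
print(detect_scene_changes(frames))  # → [2]
```

In a real receiver the same comparison would run per broadcast frame, with the flagged frame forwarded to a subsidiary unit instead of printed.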

Description

Display device and control method
This application is a divisional of Chinese invention patent application No. 02807430.0, filed December 12, 2002, entitled "Display device and control method".
Technical field
The present invention relates to a display device and a control method, and more particularly to a display device and a control method that, for example, provide higher functionality when a large number of display devices are connected and used together than when a display device is used alone, and that allow existing television receivers to be used effectively.
Background technology
For example, a television receiver receives a television broadcast signal, displays the images of a television program, and outputs the sound accompanying the images.
Conventional television receivers are manufactured to work alone. Therefore, when a user buys a new television receiver, the old one is no longer needed and, even though it is still usable, is in many cases simply discarded.
If connecting a large number of television receivers provided higher functionality than a single television receiver, usable television receivers would not be thrown away, which would contribute to the efficient use of resources.
For example, in a household with a baby, a small child, or an elderly person, it is difficult to watch over them at all times, and a caregiver often checks on them regularly or irregularly.
In such cases, however, the caregiver's workload is heavy, so a means of reducing that workload is needed.
One such means is a monitoring camera system, in which the baby, child, or elderly person is filmed in one room and the image is shown on a display unit in another room.
In a conventional monitoring camera system, however, the image sent from the camera must still be checked regularly or occasionally; although the workload placed on the caregiver is reduced to some extent, it remains considerable.
Television receivers are used in many rooms. When a monitoring camera system is used, a dedicated display unit must be installed for the system in addition to the television receiver already in use, which takes up room space.
The image sent from the camera of the monitoring system could instead be shown on the television receiver already in the room.
When that television receiver shows the image sent from the camera, however, the caregiver cannot watch a television program on it.
The caregiver could switch between the television program and the image sent from the camera of the monitoring camera system. This switching operation is troublesome, however; moreover, the caregiver may forget to switch and, in the worst case, fail to notice an emergency.
Summary of the invention
The present invention has been made in view of the foregoing. An object of the present invention is to provide higher functionality when a large number of display devices such as television receivers are connected and used together than when a display device is used alone, and to let the user easily recognize an emergency while making effective use of existing television receivers.
A first display device of the present invention is characterized by comprising: feature detection means for detecting a feature of input data that includes at least image data; and control means for performing control, according to the result of the feature detection performed by the feature detection means, so that an image included in the input data is displayed on at least one of display means and another display device.
A first control method of the present invention is characterized by comprising: a feature detection step of detecting a feature of input data that includes at least image data; and a control step of performing control, according to the result of the feature detection performed in the feature detection step, so that an image included in the input data is displayed on at least one of display means and another display device.
A first program of the present invention is characterized by comprising: a feature detection step of detecting a feature of input data that includes at least image data; and a control step of performing control, according to the result of the feature detection performed in the feature detection step, so that an image included in the input data is displayed on at least one of display means and another display device.
A first storage medium of the present invention is characterized by storing a program that comprises: a feature detection step of detecting a feature of input data that includes at least image data; and a control step of performing control, according to the result of the feature detection performed in the feature detection step, so that an image included in the input data is displayed on at least one of display means and another display device.
A second display device of the present invention is characterized by comprising: storage means for storing one scene of the image data included in input data that includes at least image data, according to control by another display device; and display means for displaying the image data stored in the storage means.
A second control method of the present invention is characterized by comprising: a storage step of storing one scene of the image data included in input data that includes at least image data, according to control by another display device; and a display step of displaying the image data stored in the storage step.
A second program of the present invention is characterized by comprising: a storage step of storing one scene of the image data included in input data that includes at least image data, according to control by another display device; and a display step of displaying the image data stored in the storage step.
A second storage medium of the present invention is characterized by storing a program that comprises: a storage step of storing one scene of the image data included in input data that includes at least image data, according to control by another display device; and a display step of displaying the image data stored in the storage step.
A display system of the present invention is characterized in that a first display device comprises: first display means for displaying image data; feature detection means for detecting a feature of input data that includes at least image data; and control means for performing control, according to the result of the feature detection on the input data, so that the image data included in the input data is displayed on at least one of the first display device and one or more second display devices; and the one or more second display devices each comprise: storage means for storing one scene of the image data in the input data according to control by the first display device; and second display means for displaying the image data stored in the storage means.
In the first display device, first control method, first program, and first storage medium of the present invention, a feature of input data that includes at least image data is detected, and control is performed according to the result of the feature detection so that an image included in the input data is displayed on at least one of display means and another display device.
In the second display device, second control method, second program, and second storage medium of the present invention, one scene of the image data included in input data that includes at least image data is stored according to control by another display device, and the stored image is displayed.
In the display system of the present invention, in the first display device, a feature of input data that includes at least image data is detected, and control is performed according to the detection result so that the image data included in the input data is displayed on at least one of the first display device and one or more second display devices. In the one or more second display devices, one scene of the image data in the input data is stored according to control by the first display device, and the stored image data is displayed.
According to one aspect of the present invention, there is provided a display device connected to another display device, comprising: storage means for storing one scene of the image data included in input data that includes at least image data, according to control by the other display device; display means for displaying the image data stored in the storage means; and authentication means for performing authentication with the other display device, wherein, when the authentication succeeds, the storage means stores one scene of the image data included in the input data according to control by the other display device, and the display means displays the image data stored in the storage means.
According to another aspect of the present invention, there is provided a control method for a display device connected to another display device, the control method comprising: a storage step of storing one scene of the image data included in input data that includes at least image data, according to control by the other display device; a display step of displaying the image data stored in the storage step; and an authentication step of performing authentication with the other display device, wherein, when the authentication succeeds, one scene of the image data included in the input data is stored according to control by the other display device, and the image data stored in the storage step is displayed.
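The authentication-gated behavior described above — an auxiliary unit accepts scene-storage control only after authentication with the master unit succeeds — can be sketched as follows. The patent does not specify an authentication scheme, so this minimal sketch assumes a pre-shared key with an HMAC challenge-response; the key, the `AuxiliaryUnit` class, and all method names are hypothetical illustrations, not the patent's implementation.

```python
import hashlib
import hmac
import os

SHARED_KEY = b"example-preshared-key"  # hypothetical; scheme not specified by the patent


def respond(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Master unit's side: answer a challenge with an HMAC over it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()


class AuxiliaryUnit:
    """Stores one scene only after authenticating the controlling master unit."""

    def __init__(self):
        self.authenticated = False
        self.stored_scene = None

    def authenticate(self, master_respond) -> bool:
        # Send a random challenge and check the master's HMAC response.
        challenge = os.urandom(16)
        expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
        self.authenticated = hmac.compare_digest(expected, master_respond(challenge))
        return self.authenticated

    def store_scene(self, frame) -> bool:
        # Control is refused until authentication has succeeded.
        if not self.authenticated:
            return False
        self.stored_scene = frame
        return True


aux = AuxiliaryUnit()
print(aux.store_scene("frame-after-scene-change"))  # → False (not yet authenticated)
print(aux.authenticate(respond))                    # → True
print(aux.store_scene("frame-after-scene-change"))  # → True
```

The design point carried over from the text is only the ordering: storage and display commands from the other device take effect solely after the authentication step succeeds.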
Brief description of the drawings
Fig. 1A is a perspective view showing an example structure of an embodiment of a scalable television system to which the present invention is applied;
Fig. 1B is a perspective view showing an example structure of an embodiment of a scalable television system to which the present invention is applied;
Fig. 2 is a perspective view showing an example external structure of the master unit 1;
Fig. 3A is a front view showing the example external structure of the master unit 1;
Fig. 3B is a top view showing the example external structure of the master unit 1;
Fig. 3C is a right-side view showing the example external structure of the master unit 1;
Fig. 3D is a left-side view showing the example external structure of the master unit 1;
Fig. 3E is a bottom view showing the example external structure of the master unit 1;
Fig. 3F is a rear view showing the example external structure of the master unit 1;
Fig. 4 is a perspective view showing an example external structure of the auxiliary unit 2;
Fig. 5A is a front view showing the example external structure of the auxiliary unit 2;
Fig. 5B is a top view showing the example external structure of the auxiliary unit 2;
Fig. 5C is a right-side view showing the example external structure of the auxiliary unit 2;
Fig. 5D is a left-side view showing the example external structure of the auxiliary unit 2;
Fig. 5E is a bottom view showing the example external structure of the auxiliary unit 2;
Fig. 5F is a rear view showing the example external structure of the auxiliary unit 2;
Fig. 6 is a perspective view showing an example external structure of a special cabinet for housing the master unit 1 and auxiliary units 2 constituting the scalable television system;
Fig. 7 is a plan view showing an example external structure of the controller 15;
Fig. 8 is a plan view showing an example external structure of the controller 35;
Fig. 9 is a plan view showing another example external structure of the controller 15;
Fig. 10 is a block diagram showing an example electrical structure of the master unit 1;
Fig. 11 is a block diagram showing an example electrical structure of the auxiliary unit 2;
Fig. 12 is a diagram showing the layered structure of the IEEE-1394 communication protocol;
Fig. 13 is a diagram showing the address space of the CSR architecture;
Fig. 14 is a diagram showing the offset addresses, names, and functions of CSRs;
Fig. 15 is a diagram showing the general ROM format;
Fig. 16 is a diagram showing details of the bus info block, root directory, and unit directory;
Fig. 17 is a diagram showing the PCR structure;
Fig. 18A is a diagram showing the oMPR structure;
Fig. 18B is a diagram showing the oPCR structure;
Fig. 18C is a diagram showing the iMPR structure;
Fig. 18D is a diagram showing the iPCR structure;
Fig. 19 is a diagram showing the data structure of a packet transmitted in asynchronous transfer mode for AV/C commands;
Fig. 20A is a diagram showing the command type/response (ctype/response) field of an AV/C command;
Fig. 20B is a diagram showing the subunit type field of an AV/C command;
Fig. 20C is a diagram showing the opcode field of an AV/C command;
Fig. 21A is a diagram showing example specific commands among the AV/C commands;
Fig. 21B is a diagram showing example specific responses among the AV/C commands;
Fig. 22 is a block diagram showing an example electrical structure of the controller 15;
Fig. 23 is a diagram showing the format of the frame data transmitted and received by the controller 15;
Fig. 24 is a block diagram showing an example electrical structure of the controller 35;
Fig. 25 is a block diagram showing an example electrical structure of the IR interface 135;
Fig. 26 is a flowchart explaining the processing of the controller 15;
Fig. 27 is a flowchart explaining the processing of the IR interface 135;
Fig. 28 is a flowchart explaining the processing of the master unit 1;
Fig. 29 is a flowchart explaining the authentication processing of the master unit 1;
Fig. 30 is a flowchart explaining the processing of the auxiliary unit 2;
Fig. 31 is a flowchart explaining the authentication processing of the auxiliary unit 2;
Fig. 32 is a block diagram showing a first example structure of the signal processing unit 127;
Fig. 33 is a flowchart explaining the first virtual multi-viewpoint display processing of the master unit, performed by the signal processing unit 127;
Fig. 34 is a block diagram showing a first example structure of the signal processing unit 147;
Fig. 35 is a flowchart explaining the first virtual multi-viewpoint display processing of the auxiliary unit, performed by the signal processing unit 147;
Fig. 36A is a diagram showing an example display on the master unit 1 and auxiliary units 2 constituting the scalable television system;
Fig. 36B is a diagram showing an example display on the master unit 1 and auxiliary units 2 constituting the scalable television system;
Fig. 36C is a diagram showing an example display on the master unit 1 and auxiliary units 2 constituting the scalable television system;
Fig. 37A is a diagram showing an example display on the master unit 1 and auxiliary units 2 constituting the scalable television system;
Fig. 37B is a diagram showing an example display on the master unit 1 and auxiliary units 2 constituting the scalable television system;
Fig. 37C is a diagram showing an example display on the master unit 1 and auxiliary units 2 constituting the scalable television system;
Fig. 38 is a block diagram showing a second example structure of the signal processing unit 127;
Fig. 39 is a flowchart explaining the second virtual multi-viewpoint display processing of the master unit, performed by the signal processing unit 127;
Fig. 40 is a block diagram showing a third example structure of the signal processing unit 127;
Fig. 41 is a flowchart explaining the third virtual multi-viewpoint display processing of the master unit, performed by the signal processing unit 127;
Fig. 42 is a block diagram showing a second example structure of the signal processing unit 147;
Fig. 43 is a flowchart explaining the second virtual multi-viewpoint display processing of the auxiliary unit, performed by the signal processing unit 147;
Fig. 44 is a block diagram showing a fourth example structure of the signal processing unit 127;
Fig. 45 is a flowchart explaining the fourth virtual multi-viewpoint display processing of the master unit, performed by the signal processing unit 127;
Fig. 46 is a diagram showing command transfer by infrared communication from the master unit 1 to an auxiliary unit 2;
Fig. 47 is a flowchart explaining the processing performed by the controller 15 when a command is transferred from the master unit 1 to an auxiliary unit 2;
Fig. 48 is a block diagram showing a fifth example structure of the signal processing unit 127;
Fig. 49 is a flowchart explaining the fifth virtual multi-viewpoint display processing of the master unit, performed by the signal processing unit 127;
Fig. 50 is a block diagram showing a sixth example structure of the signal processing unit 127;
Fig. 51 is a flowchart explaining the sixth virtual multi-viewpoint display processing of the master unit, performed by the signal processing unit 127;
Figure 52 A illustrates composition can expand the master unit 1 of television system and the figure that the example in auxiliary unit 2 shows;
Figure 52 B illustrates composition can expand the master unit 1 of television system and the figure that the example in auxiliary unit 2 shows;
Figure 52 C illustrates composition can expand the master unit 1 of television system and the figure that the example in auxiliary unit 2 shows;
Figure 53 illustrates composition can expand the master unit 1 of television system and the figure that the example in auxiliary unit 2 shows;
Figure 54 illustrates composition can expand the master unit 1 of television system and the figure that the example in auxiliary unit 2 shows;
Figure 55 is the block diagram that the 7th example structure of signal processing 127 is shown;
Figure 56 is that explanation is by the flow chart of the 7th virtual many viewpoints Graphics Processing of the master unit of signal processing unit 127 execution;
Figure 57 is the block diagram that the 8th example structure of signal processing 127 is shown;
Figure 58 is that explanation is by the flow chart of the 8th virtual many viewpoints Graphics Processing of the master unit of signal processing unit 127 execution;
Figure 59 is the block diagram that the 9th example structure of signal processing 127 is shown;
Figure 60 is that explanation is by the flow chart of the 9th virtual many viewpoints Graphics Processing of the master unit of signal processing unit 127 execution;
Figure 61 A illustrates composition can expand the master unit 1 of television system and the figure that the example in auxiliary unit 2 shows;
Figure 61 B illustrates composition can expand the master unit 1 of television system and the figure that the example in auxiliary unit 2 shows;
Figure 61 C illustrates composition can expand the master unit 1 of television system and the figure that the example in auxiliary unit 2 shows;
Fig. 62 is a block diagram showing another example electrical structure of the master unit 1;
Fig. 63A is a perspective view showing an example structure of another embodiment of a scalable television system to which the present invention is applied;
Fig. 63B is a perspective view showing an example structure of another embodiment of a scalable television system to which the present invention is applied;
Fig. 64 is a block diagram showing an example electrical structure of the master unit 1;
Fig. 65 is a block diagram showing an example electrical structure of the auxiliary unit 2;
Fig. 66 is a block diagram showing an example structure of an embodiment of a security system to which the present invention is applied;
Fig. 67 is a block diagram showing an example structure of the video camera 1162;
Fig. 68 is a flowchart explaining the processing of the master unit 1;
Fig. 69 is a flowchart explaining the authentication processing of the master unit 1;
Fig. 70 is a flowchart explaining the processing of the auxiliary unit 2;
Fig. 71 is a flowchart explaining the authentication processing of the auxiliary unit 2;
Fig. 72A is a diagram showing an example display on the master unit 1 of the scalable television system 1161;
Fig. 72B is a diagram showing an example full-screen display on the scalable television system 1161;
Fig. 73 is a block diagram showing an example structure of the data processing section 1137C;
Fig. 74A is a flowchart explaining the image data processing of the data processing section 1137C;
Fig. 74B is a flowchart explaining the audio data processing of the data processing section 1137C;
Fig. 75 is a block diagram showing an example structure of the warning processing section 1137D;
Fig. 76 is a flowchart explaining the abnormal-event judgment/warning processing of the warning processing section 1137D;
Fig. 77A is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 77B is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 77C is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 77D is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 78A is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 78B is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 78C is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 78D is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 78E is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 78F is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 78G is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 78H is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 78I is a diagram explaining the processing of the abnormal-event judgment section 1213;
Fig. 79A is a diagram explaining the warning processing of the security controller 1137A;
Fig. 79B is a diagram explaining the warning processing of the security controller 1137A;
Fig. 80A is a diagram explaining the warning processing of the security controller 1137A;
Fig. 80B is a diagram explaining the warning processing of the security controller 1137A;
Fig. 81 is a block diagram showing another example electrical structure of the master unit 1;
Fig. 82A is a diagram explaining the warning processing of the security controller 1137A;
Fig. 82B is a diagram explaining the warning processing of the security controller 1137A;
Fig. 83 is a block diagram showing another example electrical structure of the master unit 1;
Fig. 84 is a block diagram showing an example structure of an embodiment of a computer to which the present invention is applied.
Embodiment
Figs. 1A and 1B are perspective views showing example structures of an embodiment of a scalable television (TV) system to which the present invention is applied. (Here, a "system" refers to an entity in which a plurality of apparatuses are logically assembled, regardless of whether the apparatuses are housed in the same cabinet.)
In the embodiment shown in Fig. 1A, the scalable television system is formed of nine television receivers: 1, 2₁₁, 2₁₂, 2₁₃, 2₂₁, 2₂₃, 2₃₁, 2₃₂, and 2₃₃. In the embodiment shown in Fig. 1B, the scalable television system is formed of 25 television receivers: 1, 2₁₁, 2₁₂, 2₁₃, 2₁₄, 2₁₅, 2₂₁, 2₂₂, 2₂₃, 2₂₄, 2₂₅, 2₃₁, 2₃₂, 2₃₄, 2₃₅, 2₄₁, 2₄₂, 2₄₃, 2₄₄, 2₄₅, 2₅₁, 2₅₂, 2₅₃, 2₅₄, and 2₅₅.
The number of television receivers constituting the scalable television system is not limited to 9 or 25; the system can be formed of any plurality of television receivers. Nor is their arrangement limited to the 3 × 3 or 5 × 5 grids shown in Fig. 1A and Fig. 1B: other grids, such as 1 × 2, 2 × 1, or 2 × 3, are possible. Furthermore, the layout of the television receivers constituting the scalable television system is not limited to the matrix shape shown in Figs. 1A and 1B and can, for example, be pyramid-shaped.
Because the scalable television system can thus be formed of any number of television receivers arranged in any grid, it can be called a "scalable" system.
Among the television receivers constituting the scalable television system there are two kinds: main television receivers (hereinafter called master units where necessary), which can control other television receivers; and auxiliary television receivers (called auxiliary units where necessary), which can be controlled by another television receiver but cannot control other television receivers.
To perform the various types of processing described later, the scalable television system needs to consist of television receivers compatible with the scalable television system (hereinafter called scalable-use units), and at least one of them must be a master unit. Accordingly, in each of the embodiments shown in Figs. 1A and 1B, the television receiver located at the center, for example, serves as the master unit 1.
For the above reasons, when a television receiver that is not a scalable-use unit is included among the television receivers forming the scalable television system, that television receiver cannot use the functions of the scalable television system. Moreover, even when all the television receivers forming the scalable television system are scalable-use units, if all of them are auxiliary units they cannot use the functions of the scalable television system.
Therefore, to use the functions of the scalable television system, the user needs to buy one or more master units, or a master unit and one or more auxiliary units.
Since a master unit can also serve as an auxiliary unit, a plurality of master units may be used among the television receivers forming the scalable television system.
In the embodiment shown in Fig. 1A, the television receiver 1 located at the center of the 3 × 3 television receivers (second from the left, second from the top) serves as the master unit, and the other eight television receivers 2₁₁, 2₁₂, 2₁₃, 2₂₁, 2₂₃, 2₃₁, 2₃₂, and 2₃₃ serve as auxiliary units. In the embodiment shown in Fig. 1B, the television receiver 1 located at the center of the 5 × 5 television receivers (third from the left, third from the top) serves as the master unit, and the other 24 television receivers 2₁₁, 2₁₂, 2₁₃, 2₁₄, 2₁₅, 2₂₁, 2₂₂, 2₂₃, 2₂₄, 2₂₅, 2₃₁, 2₃₂, 2₃₄, 2₃₅, 2₄₁, 2₄₂, 2₄₃, 2₄₄, 2₄₅, 2₅₁, 2₅₂, 2₅₃, 2₅₄, and 2₅₅ serve as auxiliary units.
Thus, in each of the embodiments shown in Figs. 1A and 1B, the master unit 1 is located at the center of the television receivers constituting the scalable television system. The position of the master unit 1, however, is not limited to the center; it can be at any position, such as the upper-left corner or the lower-right corner.
Even when the master unit 1 is located at an arbitrary position in the scalable television system, each process described later is performed with the television receiver located at the center regarded as the master unit.
To simplify the following description, it is assumed that the scalable television system is formed of 3 × 3 television receivers as shown in Fig. 1A, and that the master unit 1 is located at the center of the television receivers constituting the scalable television system.
The suffixes i and j of an auxiliary unit 2ᵢⱼ forming the scalable television system indicate that the auxiliary unit 2ᵢⱼ is located at the i-th row and j-th column of the scalable television system (the i-th row from the top and the j-th column from the left).
Below, if there is no need to distinguish auxiliary unit 2 ij, their general designations are made auxiliary unit 2.
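The row-and-column naming convention above can be sketched in code. This is an illustrative model only; the class and variable names are assumptions, not part of the patent.

```python
# Hypothetical sketch of the subsidiary-unit naming convention: a subsidiary
# unit 2_ij sits at row i, column j of the grid (row 1 is the top row,
# column 1 is the leftmost column).

class SubsidiaryUnit:
    def __init__(self, row, col):
        self.row = row   # i: row counted from the top, 1-based
        self.col = col   # j: column counted from the left, 1-based

    @property
    def name(self):
        return f"2_{self.row}{self.col}"

# A 3x3 scalable system: the center (2, 2) is the main unit, the rest are
# subsidiary units.
grid = {(i, j): SubsidiaryUnit(i, j)
        for i in range(1, 4) for j in range(1, 4)
        if (i, j) != (2, 2)}

print(sorted(u.name for u in grid.values()))
```

Running this lists the eight subsidiary-unit names 2_11 through 2_33 used in the 3 × 3 embodiment.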
Fig. 2 is a perspective view showing an example structure of the television receiver serving as the main unit 1.
The main unit 1 is, for example, a television receiver having a display screen size of 14 or 15 inches, with a CRT (cathode-ray tube) 11 for displaying images at the center of its front face, and speaker units 12L and 12R for outputting sound at the left-hand and right-hand ends of the front face.
An image included in a television broadcast signal received by an antenna (not shown) is displayed on the CRT 11, and the L (left) channel sound and R (right) channel sound accompanying the image are output from the speaker units 12L and 12R.
The main unit 1 is associated with a remote commander (hereinafter called a controller where necessary) 15 that emits infrared rays IR. By operating the controller 15, the user can change the receiving channel and the volume, and send various commands to the main unit 1.
The controller 15 is not limited to a remote commander that performs infrared communication. A remote commander that performs radio communication, such as Bluetooth(TM), may also be used.
The controller 15 can control not only the main unit 1 but also the subsidiary units 2.
Next, Figs. 3A to 3F show the six faces of an example structure of the main unit 1 shown in Fig. 2.
Fig. 3A shows the front face of the main unit 1, Fig. 3B the top face, Fig. 3C the bottom face, Fig. 3D the left-hand side face, Fig. 3E the right-hand side face, and Fig. 3F the rear face.
Fixing mechanisms are mounted on the top face (Fig. 3B), the bottom face (Fig. 3C), the left-hand side face (Fig. 3D), and the right-hand side face (Fig. 3E) of the main unit 1. As described later, similar fixing mechanisms are mounted on the top, bottom, left-hand side, and right-hand side faces of the television receivers serving as subsidiary units. When a subsidiary unit 2 or another main unit is placed at the top-face side, bottom-face side, left-hand-side side, or right-hand-side side of the main unit 1, the fixing mechanism mounted on the corresponding face of the main unit 1 engages, for example, with the fixing mechanism mounted on the opposing face of the subsidiary unit 2 or the other main unit, fixing the main unit 1 to the subsidiary unit 2 or the other main unit so that they are not easily separated. This prevents the television receivers forming the scalable television system from being displaced.
The fixing mechanisms may be formed of mechanical structures. They may also be formed of, for example, magnets.
On the rear face of the main unit 1, as shown in Fig. 3F, a terminal panel 21, an antenna terminal 22, an input terminal 23, and an output terminal 24 are provided.
The terminal panel 21 has eight IEEE (Institute of Electrical and Electronics Engineers) 1394 terminals 21_11, 21_12, 21_13, 21_21, 21_23, 21_31, 21_32, and 21_33 for electrically connecting the main unit 1 to the eight subsidiary units 2_11, 2_12, 2_13, 2_21, 2_23, 2_31, 2_32, and 2_33 forming the scalable television system shown in Fig. 1A.
In the embodiment shown in Fig. 3F, the IEEE-1394 terminal 21_ij to be connected to the subsidiary unit 2_ij is provided on the terminal panel 21 at a position corresponding to the position of the subsidiary unit 2_ij in the scalable television system shown in Fig. 1A, as viewed from the rear side by the user assembling the scalable television system, so that the main unit 1 knows the position of the subsidiary unit 2_ij in the scalable television system shown in Fig. 1A.
Therefore, in the scalable television system shown in Fig. 1A, the user is required to connect the main unit 1 to the subsidiary unit 2_11 through the IEEE-1394 terminal 21_11, to the subsidiary unit 2_12 through the IEEE-1394 terminal 21_12, to the subsidiary unit 2_13 through the IEEE-1394 terminal 21_13, to the subsidiary unit 2_21 through the IEEE-1394 terminal 21_21, to the subsidiary unit 2_23 through the IEEE-1394 terminal 21_23, to the subsidiary unit 2_31 through the IEEE-1394 terminal 21_31, to the subsidiary unit 2_32 through the IEEE-1394 terminal 21_32, and to the subsidiary unit 2_33 through the IEEE-1394 terminal 21_33.
In the scalable television system shown in Fig. 1A, which IEEE-1394 terminal of the terminal panel 21 a subsidiary unit 2_ij is connected to is not restricted. When a subsidiary unit 2_ij is connected to an IEEE-1394 terminal other than the terminal 21_ij, however, it is necessary to specify in the main unit 1 (to require the user to specify) that the subsidiary unit 2_ij is located at the i-th row and j-th column of the scalable television system shown in Fig. 1A.
In the embodiment shown in Fig. 3F, the terminal panel 21 is provided with the eight IEEE-1394 terminals 21_11 to 21_33, and the main unit 1 is connected to the eight subsidiary units 2_11 to 2_33 in parallel. The main unit 1 may instead be connected to the eight subsidiary units 2_11 to 2_33 in series. In that case too, it is necessary to specify in the main unit 1 that each subsidiary unit 2_ij is located at the i-th row and j-th column of the scalable television system shown in Fig. 1A. Accordingly, the number of IEEE-1394 terminals provided on the terminal panel 21 is not limited to eight.
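The terminal-to-position rule just described can be sketched as follows. This is a minimal sketch under assumed names: the patent does not give an implementation, only the behavior that terminal 21_ij implies position (i, j) unless the user specifies otherwise.

```python
# Default mapping: IEEE-1394 terminal 21_ij corresponds to grid position
# (row i, column j); the center (2, 2) is the main unit itself, so it has
# no terminal.
DEFAULT_POSITION = {f"21_{i}{j}": (i, j)
                    for i in range(1, 4) for j in range(1, 4)
                    if (i, j) != (2, 2)}

def unit_position(terminal, user_override=None):
    """Return (row, col) of the subsidiary unit plugged into `terminal`.

    When the unit is connected to a terminal other than its matching one,
    the user must supply the real position as `user_override`.
    """
    if user_override is not None:
        return user_override
    return DEFAULT_POSITION[terminal]

print(unit_position("21_13"))          # plugged into its matching terminal
print(unit_position("21_13", (3, 1)))  # user says it is really at row 3, col 1
```

The same override path would also cover the series (daisy-chain) wiring mentioned above, where the terminal alone no longer identifies the position.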
Furthermore, the electrical connection between the television receivers in the scalable television system is not limited to IEEE 1394; a LAN (IEEE 802) or the like may also be used. The electrical connection between the television receivers in the scalable television system may also be made wirelessly rather than by wire.
The antenna terminal 22 is connected to a cable leading from an antenna (not shown). In this way, a television broadcast signal received by the antenna is input to the main unit 1. Image data and audio data output from, for example, a VTR (video tape recorder) are input to the input terminal 23. From the output terminal 24, for example, the image data and audio data of the television broadcast signal being received by the main unit 1 are output.
Fig. 4 is a perspective view showing an example structure of a television receiver serving as a subsidiary unit 2.
The subsidiary unit 2 is a television receiver having the same display screen size as the main unit 1 shown in Fig. 2, with a CRT (cathode-ray tube) 31 for displaying images at the center of its front face, and speaker units 32L and 32R for outputting sound at the left-hand and right-hand ends of the front face. Different display screen sizes may also be used for the main unit 1 and the subsidiary units 2.
An image included in a television broadcast signal received by an antenna (not shown) is displayed on the CRT 31, and the L (left) channel sound and R (right) channel sound accompanying the image are output from the speaker units 32L and 32R.
Like the main unit 1, the subsidiary unit 2 is associated with a controller 35 that emits infrared rays IR. By operating the controller 35, the user can change the receiving channel and the volume, and send various commands to the subsidiary unit 2.
The controller 35 can control not only the subsidiary unit 2 but also the main unit 1.
To construct the scalable television system shown in Fig. 1A, the user needs to purchase one main unit 1 and eight subsidiary units 2_11 to 2_33. In this case, since the controller 15 comes with the main unit 1 and a controller 35 comes with each of the eight subsidiary units 2_11 to 2_33, the user ends up with nine controllers, which are troublesome to manage.
The controller 35 of a subsidiary unit 2 may therefore be distributed separately from the subsidiary unit 2 as an option. The controller 15 of the main unit 1 may likewise be distributed separately from the main unit 1 as an option.
As described above, the controllers 15 and 35 can each control both the main unit 1 and the subsidiary units 2. Therefore, even if the user has only one of the controllers 15 and 35, both the main unit 1 and the subsidiary units 2 can be controlled.
Next, Figs. 5A to 5F show the six faces of an example structure of the subsidiary unit 2 shown in Fig. 4.
Fig. 5A shows the front face of the subsidiary unit 2, Fig. 5B the top face, Fig. 5C the bottom face, Fig. 5D the left-hand side face, Fig. 5E the right-hand side face, and Fig. 5F the rear face.
Fixing mechanisms are mounted on the top face (Fig. 5B), the bottom face (Fig. 5C), the left-hand side face (Fig. 5D), and the right-hand side face (Fig. 5E) of the subsidiary unit 2. When the main unit 1 or another subsidiary unit is placed at the top-face side, bottom-face side, left-hand-side side, or right-hand-side side of the subsidiary unit 2, the fixing mechanism mounted on the corresponding face of the subsidiary unit 2 engages with the fixing mechanism mounted on the opposing face of the main unit 1 or the other subsidiary unit, fixing the subsidiary unit 2 to the main unit 1 or the other subsidiary unit so that they are not easily separated.
On the rear face of the subsidiary unit 2, as shown in Fig. 5F, a terminal panel 41, an antenna terminal 42, an input terminal 43, and an output terminal 44 are provided.
The terminal panel 41 has one IEEE-1394 terminal 41_1 for electrically connecting the main unit 1 to the subsidiary unit 2. When the subsidiary unit 2 is, for example, the subsidiary unit 2_11 located at the upper left of the scalable television system shown in Fig. 1A, the IEEE-1394 terminal 41_1 of the terminal panel 41 is connected to the IEEE-1394 terminal 21_11 of the terminal panel 21 shown in Fig. 3F by an IEEE-1394 cable (not shown).
The number of IEEE-1394 terminals provided on the terminal panel 41 is not limited to one.
The antenna terminal 42 is connected to a cable leading from an antenna (not shown). In this way, a television broadcast signal received by the antenna is input to the subsidiary unit 2. Image data and audio data output from, for example, a VTR are input to the input terminal 43. From the output terminal 44, for example, the image data and audio data of the television broadcast signal being received by the subsidiary unit 2 are output.
Nine television receivers in total, constructed as described above, namely one main unit 1 and eight subsidiary units 2_11 to 2_33, are arranged three units each in the horizontal and vertical directions, thereby forming the scalable television system shown in Fig. 1A.
The scalable television system shown in Fig. 1A is constructed by placing television receivers directly above, below, to the left of, or to the right of a television receiver serving as the main unit or a subsidiary unit. The scalable television system may also be constructed, for example, by placing the television receivers in a rack dedicated to the system, as shown in Fig. 6. When a dedicated rack is used in this way, displacement of the television receivers forming the scalable television system can be prevented more reliably.
When the scalable television system is constructed by placing television receivers directly above, below, to the left of, or to the right of a television receiver serving as the main unit or a subsidiary unit, the main unit 1 cannot be placed at the second row and second column shown in Fig. 1A if, for example, the subsidiary unit 2_32 is absent. In contrast, when the scalable television system uses the dedicated rack shown in Fig. 6, the main unit 1 can be placed at the second row and second column even if the subsidiary unit 2_32 is absent.
Next, Fig. 7 is a plan view showing an example structure of the controller 15.
A select button switch 51 can be operated in eight directions in total, namely up, down, left, right, and the four diagonal directions (direction operation). The select button switch 51 can also be pressed perpendicularly to the top face of the controller 15 (select operation). A menu button switch 54 is used to display a menu screen on the CRT 11 of the main unit 1 (or the CRT 31 of a subsidiary unit 2) for entering commands that give various settings (for example, specifying, as described above, that a subsidiary unit 2_ij is located at the i-th row and j-th column of the scalable television system) or that order predetermined processing.
When the menu screen is displayed, a cursor for pointing at items on the menu screen is shown on the CRT 11. The cursor moves in the direction corresponding to the direction in which the select button switch 51 is operated. When the cursor is positioned on a predetermined item, pressing the select button switch 51 determines the selection of that item. In the present embodiment, as described later, icons are displayed as items on the menu, and icons are also clicked by pressing the select button switch 51.
Button switches corresponding to the items and icons displayed on the menu screen may also be provided on the controller 15. In this case, the user can directly operate the controller 15 to specify an item or icon shown on the menu screen without displaying the menu screen.
An exit button switch 55 is operated, for example, when the user wants to return from the menu screen to the usual initial screen.
A volume button switch 52 is used to increase or decrease the volume. A channel up/down button switch 53 is used to increase or decrease the number of the broadcast channel being received.
Numeral button (numeric keypad) switches 58 indicate the numerals from zero to nine and are operated to input the indicated numerals. An enter button switch 57 is operated immediately after the operation of the numeral button switches 58 is completed, to indicate that numeral input has finished. When the channel is switched, the new channel number and the like are displayed by OSD (on-screen display) on the CRT 11 of the main unit 1 (or the CRT 31 of a subsidiary unit 2) for a predetermined time. A display button 56 is used to switch on/off the OSD display of the currently selected channel number, the current volume, and so on.
A TV/video switching button switch 59 is used to switch the input of the main unit 1 (or a subsidiary unit 2) between the input from the built-in tuner 121 shown in Fig. 10, described later (or the tuner 141 shown in Fig. 11, described later), and the input from the input terminal 23 shown in Fig. 3F (or the input terminal 43 shown in Fig. 5F). A TV/DSS switching button switch 60 is used to select, in the tuner 121, a television mode for receiving terrestrial broadcasting or a DSS (Digital Satellite System, a trademark of Hughes Communications) mode for receiving satellite broadcasting. When the numeral button switches 58 are operated to change the channel, the channel selected before the change is stored. A jump button switch 61 is used to return to the channel selected before the change.
A language button 62 is used to select a predetermined language when a program is broadcast in two or more languages. A guide button switch 63 is used to display an EPG (electronic program guide). A favorite button switch 64 is used to select a favorite channel of the user, specified in advance.
A cable button switch 65, a TV button switch 66, and a DSS button switch 67 are used to switch the category of unit to which the command codes of the infrared rays emitted from the controller 15 correspond. More specifically, the controller 15 (and the controller 35) can remotely control not only the television receivers serving as the main unit 1 and the subsidiary units 2 but also an STB (set-top box) and an IRD (integrated receiver and decoder), neither shown. The cable button switch 65 is used when the controller 15 controls an STB that receives signals transmitted through a cable-television network. When the cable button switch 65 is operated, the controller 15 emits infrared rays corresponding to the command codes assigned to the STB unit category. Likewise, the TV button switch 66 is used when the controller 15 controls the main unit 1 (or a subsidiary unit 2). The DSS button switch 67 is used when the controller 15 controls an IRD that receives signals transmitted from a satellite.
When the cable button switch 65, the TV button switch 66, or the DSS button switch 67 is turned on, an LED (light-emitting diode) 68, 69, or 70, respectively, lights up. These LEDs show the user which category of unit the controller 15 is currently allowed to control. When the cable button switch 65, the TV button switch 66, or the DSS button switch 67 is turned off, the LED 68, 69, or 70, respectively, goes out.
A cable power button switch 71, a TV power button switch 72, and a DSS power button switch 73 are used to turn on/off the power of the STB, the main unit 1 (or a subsidiary unit 2), and the IRD, respectively.
A muting button switch 74 is used to set or release the muting state of the main unit 1 (or a subsidiary unit 2). A sleep button switch 75 is used to set or release a sleep mode in which the power is automatically turned off at a predetermined time or when a predetermined period has elapsed.
Next, Fig. 8 is a plan view showing an example structure of the controller 35 of the subsidiary units 2.
Since the controller 35 has a select button switch 81 through a sleep button switch 105 constructed in the same way as the select button switch 51 through the sleep button switch 75 of the controller 15 shown in Fig. 7, a description thereof is omitted.
Fig. 9 is a plan view showing another example structure of the controller 15 of the main unit 1.
In the embodiment shown in Fig. 9, instead of the select button switch 51 shown in Fig. 7, which can be operated in eight directions, arrow button switches 111, 112, 113, and 114 for the four directions up, down, left, and right, and a select button switch 110 for performing the select operation, are provided. In addition, in the embodiment shown in Fig. 9, the cable button switch 65, the TV button switch 66, and the DSS button switch 67 are of an internal-illumination type, and the LEDs 68 to 70 shown in Fig. 7 are omitted. LEDs, not shown, are located at the rear side of the button switches 65 to 67. When the button switches 65 to 67 are operated, the LEDs located at their rear side turn on and off in response to these operations.
The other button switches are basically the same as the corresponding switches shown in Fig. 7, although their positions differ.
The controller 35 of the subsidiary units 2 may likewise be constructed as shown in Fig. 9.
The controller 15 may incorporate a gyroscope for detecting the motion of the controller 15. In this case, the gyroscope built into the controller 15 can detect the direction and distance of the motion of the controller 15, and the cursor displayed on the menu screen can be moved according to the detected direction and distance. When the controller 15 incorporates a gyroscope in this way, the select button switch 51 need not be operable in eight directions in the embodiment shown in Fig. 7, and the arrow button switches 111 to 114 are unnecessary in the embodiment shown in Fig. 9. The controller 35 may likewise incorporate a gyroscope.
Next, Fig. 10 shows an example electrical structure of the main unit 1.
A television broadcast signal received by an antenna (not shown) is sent to a tuner 121, where it is detected and demodulated under the control of a CPU 129. The output of the tuner 121 is sent to a QPSK (quadrature phase-shift keying) demodulation circuit 122 and undergoes QPSK demodulation under the control of the CPU 129. The output of the QPSK demodulation circuit 122 is sent to an error-correction circuit 123, undergoes error detection and correction under the control of the CPU 129, and the result is sent to a demultiplexer 124.
The demultiplexer 124 descrambles the output of the error-correction circuit 123, if necessary, under the control of the CPU 129, and extracts TS (transport stream) packets of a predetermined channel. The demultiplexer 124 then sends image-data (video-data) TS packets to an MPEG (Moving Picture Experts Group) video decoder 125, and sends audio-data TS packets to an MPEG audio decoder 126. The demultiplexer 124 also sends TS packets included in the output of the error-correction circuit 123 to the CPU 129, if necessary. The demultiplexer 124 further receives image data or audio data (including data in TS packets) sent from the CPU 129, and sends it to the MPEG video decoder 125 or the MPEG audio decoder 126.
The MPEG video decoder 125 performs MPEG decoding on the image-data TS packets sent from the demultiplexer 124 and sends the result to a signal processing section 127. The MPEG audio decoder 126 performs MPEG decoding on the audio-data TS packets sent from the demultiplexer 124. The left-channel audio data and right-channel audio data obtained by the decoding in the MPEG audio decoder 126 are sent to the signal processing section 127.
The signal processing section 127 sends the image data sent from the MPEG video decoder 125 to a matrix circuit 128, and sends the audio data (sound data) sent from the MPEG audio decoder 126 to an amplifier 137.
The signal processing section 127 is formed of a DSP (digital signal processor) 127A, an EEPROM (electrically erasable programmable read-only memory) 127B, a RAM (random access memory) 127C, and the like, and applies various kinds of digital signal processing to the image data and audio data sent to it, under the control of the CPU 129.
More specifically, the DSP 127A performs various kinds of signal processing according to programs stored in the EEPROM 127B, using data stored in the EEPROM 127B as necessary. The EEPROM 127B stores the programs and necessary data used when the DSP 127A performs various kinds of processing. The RAM 127C temporarily stores the data and programs required for the DSP 127A to perform various kinds of processing.
The data and programs stored in the EEPROM 127B can be updated by overwriting.
The matrix circuit 128 converts the image data sent from the signal processing section 127 into RGB (red, green, and blue) image data and sends it to the CRT 11 for display. The matrix circuit 128 has a D/A (digital-to-analog) converter, applies D/A conversion to the image data, and outputs it.
The CPU 129 performs various kinds of processing according to programs stored in an EEPROM 130 and a ROM (read-only memory) 131, to control, for example, the tuner 121, the QPSK demodulation circuit 122, the error-correction circuit 123, the demultiplexer 124, the signal processing section 127, an IEEE-1394 interface 133, an IR interface 135, and a modem 136. In addition, the CPU 129 sends data sent from the demultiplexer 124 to the IEEE-1394 interface 133, and sends data sent from the IEEE-1394 interface 133 to the demultiplexer 124 and the signal processing section 127. Furthermore, the CPU 129 performs processing corresponding to commands sent from a front panel 134 or the IR interface 135. Moreover, the CPU 129 controls the modem 136 to access a server, not shown, through a telephone line, and obtains updated programs and necessary data.
The EEPROM 130 stores data and programs that need to be retained even after the power is turned off. The ROM 131 stores, for example, an IPL (initial program loader) program. The data and programs stored in the EEPROM 130 can be updated by overwriting.
The RAM 132 temporarily stores the data and programs required for the operation of the CPU 129.
The IEEE-1394 interface 133 is connected to the terminal panel 21 (its IEEE-1394 terminals 21_11 to 21_33 (Fig. 3F)) and serves as an interface for performing communication conforming to the IEEE-1394 standard. The IEEE-1394 interface 133 thus sends data sent from the CPU 129 to the outside according to the IEEE-1394 standard, receives data sent from the outside according to the IEEE-1394 standard, and sends it to the CPU 129.
Although the front panel 134 is not shown in Fig. 2 or Figs. 3A to 3F, it is provided, for example, at a certain position on the front face of the main unit 1. The front panel 134 includes some of the button switches provided on the controller 15 (Fig. 7 and Fig. 9). More specifically, the front panel 134 includes, for example, button switches corresponding to the volume button switch 52, the channel up/down button switch 53, the menu button switch 54, the numeral button switches 58, and the TV power button switch 72. When a button switch on the front panel 134 is operated, an operation signal corresponding to the operation is sent to the CPU 129. In this case, the CPU 129 performs processing corresponding to the operation signal sent from the front panel 134.
The IR interface 135 receives the infrared rays (light) emitted from the controller 15 in response to operations on the controller 15. The IR interface 135 then photoelectrically converts the received infrared rays and sends the resulting signal to the CPU 129. In this case, the CPU 129 performs processing corresponding to the signal sent from the IR interface 135, that is, processing corresponding to the operation performed on the controller 15. The IR interface 135 also emits infrared rays under the control of the CPU 129. In other words, in the present embodiment, the main unit 1 can perform IEEE-1394 communication through the IEEE-1394 interface 133 described above, communication through the modem 136 described later, and infrared communication through the IR interface 135.
The modem 136 controls communication through a telephone line, thereby sending data sent from the CPU 129 through the telephone line, receiving data sent through the telephone line, and sending it to the CPU 129.
The amplifier 137 amplifies the audio data sent from the signal processing section 127, if necessary, and sends it to the speaker units 12L and 12R for output. The amplifier 137 has a D/A converter, applies D/A conversion to the audio data, and outputs it.
The main unit 1 constructed as described above outputs the image and sound of a television broadcast program (displays the image and outputs the sound) in the following way.
A transport stream, which is the television broadcast signal received by the antenna, is sent to the demultiplexer 124 through the tuner 121, the QPSK demodulation circuit 122, and the error-correction circuit 123. The demultiplexer 124 extracts the TS packets of a predetermined program from the transport stream, sends the image-data TS packets to the MPEG video decoder 125, and sends the audio-data TS packets to the MPEG audio decoder 126.
The MPEG video decoder 125 performs MPEG decoding on the TS packets sent from the demultiplexer 124. The image data thus obtained is sent from the MPEG video decoder 125 through the signal processing section 127 and the matrix circuit 128 to the CRT 11 for display.
The MPEG audio decoder 126 performs MPEG decoding on the TS packets sent from the demultiplexer 124. The audio data thus obtained is sent from the MPEG audio decoder 126 through the signal processing section 127 and the amplifier 137 to the speaker units 12L and 12R for output.
Next, Fig. 11 shows an example electrical structure of a subsidiary unit 2.
Since the subsidiary unit 2 is formed of a tuner 141 through an amplifier 157 constructed in the same way as the tuner 121 through the amplifier 137 shown in Fig. 10, a description thereof is omitted.
Since the main unit 1 and each subsidiary unit 2 have the antenna terminal 22 and the antenna terminal 42, respectively, as shown in Figs. 3F and 5F, an antenna (a cable leading from it) can be connected to each of the main unit 1 and the subsidiary units 2 serving as the television receivers forming the scalable television system shown in Fig. 1A or Fig. 1B. When the antenna is connected to the main unit 1 and every subsidiary unit 2, however, the wiring may be troublesome. Instead, the antenna may be connected to just one of the television receivers forming the scalable television system, and the television broadcast signal received by that television receiver may be distributed to the other television receivers, for example by IEEE-1394 communication.
Next, in the present embodiment, an IEEE-1394 terminal 21_ij of the terminal panel 21 of the main unit 1 (Fig. 3F) is connected by an IEEE-1394 cable to the IEEE-1394 terminal 41_1 of the terminal panel 41 of a subsidiary unit 2_ij (Fig. 5F), thereby electrically connecting the main unit 1 and the subsidiary unit 2 so that IEEE-1394 communication (communication conforming to the IEEE-1394 standard) can be performed between the main unit 1 and the subsidiary unit 2 to exchange various types of data.
IEEE-1394 communication is briefly described below with reference to Fig. 12 through Figs. 21A and 21B.
IEEE 1394 is one of the serial-bus standards. IEEE-1394 communication allows isochronous transfer of data and is therefore suitable for transferring data, such as images and sound, that needs to be reproduced in real time.
More specifically, isochronous transfer can be performed between units having IEEE-1394 interfaces (IEEE-1394 units) using a transfer band of up to 100 μs (microseconds) (it is called a band, although it is measured in time) at intervals of 125 μs. Within this transfer band, isochronous transfer on a plurality of channels is allowed.
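The timing figures above lend themselves to a quick back-of-the-envelope check; the arithmetic below simply restates them and is not part of the patent.

```python
# Isochronous timing as described: cycles repeat every 125 microseconds, and
# up to 100 microseconds of each cycle can carry isochronous traffic (the
# "band", measured in time).

CYCLE_US = 125        # isochronous cycle period, in microseconds
ISO_BAND_US = 100     # maximum isochronous portion of each cycle

cycles_per_second = 1_000_000 // CYCLE_US   # 8000 cycles every second
iso_fraction = ISO_BAND_US / CYCLE_US       # at most 0.8 of the bus time

print(cycles_per_second, iso_fraction)
```

That is, a fresh slice of isochronous data arrives 8000 times per second, which is what makes the scheme suitable for real-time image and sound.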
Fig. 12 shows the layered structure of the IEEE-1394 communication protocol.
The IEEE-1394 protocol has a three-layer structure comprising a transaction layer, a link layer, and a physical layer. The layers communicate with one another and also communicate with serial bus management. In addition, the transaction layer and the link layer also communicate with higher-level applications. Four types of transmitted and received messages are used for the communication: request, indication, response, and confirmation. The arrows in Fig. 12 indicate these types of communication messages.
A communication whose arrow name ends with ".req" indicates a request message, one ending with ".ind" indicates an indication message, one ending with ".resp" indicates a response message, and one ending with ".conf" indicates a confirmation message. For example, TR_CONT.req indicates a request communication message sent from serial bus management to the transaction layer.
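The suffix convention just described can be captured in a small illustrative helper. This classifier is an assumption for explanatory purposes, not part of the IEEE-1394 standard or the patent.

```python
# Classify an IEEE-1394 service-primitive name by its suffix, following the
# naming convention described above:
#   .req -> request, .ind -> indication, .resp -> response, .conf -> confirmation
SUFFIXES = {".req": "request", ".ind": "indication",
            ".resp": "response", ".conf": "confirmation"}

def message_type(name):
    for suffix, kind in SUFFIXES.items():
        if name.endswith(suffix):
            return kind
    raise ValueError(f"unknown primitive name: {name}")

print(message_type("TR_CONT.req"))   # request sent to the transaction layer
print(message_type("TR_DATA.conf"))  # confirmation of a data transfer
```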
Transaction layer is provided for the asynchronous transmission service that basis comes the request of self-application to communicate by letter with another IEEE-1394 unit (unit with IEEE-1394 interface) executing data, and realizes the request response protocol that ISO/IEC13213 is required.More particularly, the data transmission method according to the IEEE-1394 standard comprises above-mentioned synchronous transmission and asynchronous transmission.Transaction layer is carried out asynchronous transmission and is processed.The asynchronous transmission that the three kind processing units required by transaction layer protocol namely read affairs, write affairs and locking transaction is the transmission of data between the IEEE-1394 unit.
The link layer provides data transfer services using acknowledgments, address handling, data error checking, and data framing. One packet-transfer operation performed by the link layer is called a subaction, of which there are two kinds: asynchronous subactions and isochronous subactions.
An asynchronous subaction is performed by specifying the physical ID of a node (an addressable unit on IEEE 1394) and an address within that node, and the node that receives the data returns an acknowledgment. In an asynchronous broadcast subaction, in which data is sent to all nodes on the IEEE-1394 serial bus, the receiving nodes return no acknowledgment.
In an isochronous subaction, data is transferred at fixed intervals (125 μs, as described above) using a specified channel number. No acknowledgment is returned in an isochronous subaction.
The physical layer converts the logical symbols used by the link layer into electrical signals. The physical layer also processes arbitration requests sent from the link layer (arbitrating contention among nodes attempting to perform IEEE-1394 communication), and reconfigures the IEEE-1394 serial bus when a bus reset occurs, automatically assigning physical IDs.
Serial bus management implements the basic bus control functions and provides the CSR (control and status register) architecture of ISO/IEC 13213. Serial bus management comprises the functions of a node controller, an isochronous resource manager, and a bus manager. The node controller controls the node's state, physical ID, and so on, and controls the transaction layer, link layer, and physical layer. The isochronous resource manager provides the usage state of the resources for isochronous communication; for isochronous communication to take place, at least one unit connected to the IEEE-1394 serial bus must have the isochronous resource manager function. The bus manager has the highest-level function of the three, and its purpose is to use the IEEE-1394 serial bus as appropriately as possible. The isochronous resource manager and the bus manager may be omitted.
IEEE-1394 units can be connected by branching or daisy-chaining nodes. When a new IEEE-1394 unit is connected, a bus reset occurs, after which tree identification, the root node, physical IDs, the isochronous resource manager, the cycle master, the bus manager, and so on are determined.
Tree identification determines the hierarchical relationships among the IEEE-1394 units serving as nodes. The root node is the node that grants, through arbitration, the right to use the IEEE-1394 serial bus. Physical IDs are determined by sending a packet called a self-ID packet to each node. A self-ID packet also includes information such as the node's data transfer rate and whether the node can serve as the isochronous resource manager.
The isochronous resource manager is, as described above, the node that provides the usage state of the resources for isochronous communication, and it has a bandwidth register (bandwidth_available register) and a channel number register (channels_available register), described later. The isochronous resource manager also has a register indicating the physical ID of the node serving as the bus manager. When none of the nodes connected by the IEEE-1394 serial bus is a bus manager, the isochronous resource manager serves as a simplified bus manager.
The cycle master transmits a cycle start packet onto the IEEE-1394 serial bus at intervals of 125 μs, the isochronous transfer interval. For this purpose, the cycle master has a cycle time register (cycle_time register) for counting this interval (125 μs). The root node serves as the cycle master; when the root node does not have the cycle master function, the bus manager changes the root node.
The bus manager manages the power supply on the IEEE-1394 serial bus and changes the root node as described above.
Once the isochronous resource manager and the rest have been determined after a bus reset as described above, data transfer over the IEEE-1394 serial bus is ready.
In isochronous transfer, one of the IEEE-1394 data transfer methods, a transmission band and a transmission channel are obtained, and then packets in which data has been placed (isochronous packets) are transmitted.
More specifically, in isochronous transfer, the cycle master broadcasts a cycle start packet onto the IEEE-1394 serial bus at intervals of 125 μs. Once the cycle start packet has been broadcast, isochronous packets can be transmitted.
To perform isochronous transfer, a node must declare its acquisition of resources for isochronous transfer by rewriting data in the bandwidth register, used for obtaining a transmission band, and the channel number register, used for obtaining a channel, both provided by the isochronous resource manager.
The bandwidth register and the channel number register are each assigned to a CSR (control and status register) having the 64-bit address space defined by ISO/IEC 13213, described later.
The bandwidth register is a 32-bit register in which the high-order 19 bits are reserved and the low-order 13 bits represent the currently usable transmission band (bw_remaining).
More specifically, the initial value of the bandwidth register is 00000000000000000001001100110011B ("B" indicates that the preceding value is binary), that is, 4915, for the following reason. IEEE 1394 defines the time required to transfer 32 bits at 1572.864 Mbps (bits per second) as one unit. The 125-μs interval mentioned above thus corresponds to 00000000000000000001100000000000B (= 6144). IEEE 1394, however, specifies that the transmission band usable for isochronous transfer is 80% of the 125-μs interval. The maximum transmission band usable for isochronous transfer is therefore 100 μs, that is, 00000000000000000001001100110011B (= 4915), as given above.
The remaining 25 μs of transmission band, obtained by subtracting the 100-μs maximum isochronous transmission band from 125 μs, is used for asynchronous transfer. Asynchronous transfer is used, for example, to read the values stored in the bandwidth register and the channel number register.
To begin isochronous transfer, a transmission band must be obtained. For example, to perform isochronous transfer using a 10-μs transmission band within each 125-μs interval, a 10-μs transmission band must be obtained. A transmission band is obtained by rewriting the value of the bandwidth register. More specifically, to obtain the 10-μs transmission band mentioned above, the value 492, corresponding to 10 μs, is subtracted from the value of the bandwidth register, and the difference is stored in the bandwidth register. Therefore, when the value of the bandwidth register is 4915 (that is, when no isochronous transfer is in progress) and a 10-μs transmission band is obtained, the value of the bandwidth register changes from 4915 to 4423 (00000000000000000001000101000111B), obtained by subtracting 492, corresponding to 10 μs, from 4915.
When subtracting the transmission band to be obtained (used) from the value of the bandwidth register would yield a value less than zero, the transmission band cannot be obtained, the value of the bandwidth register is not rewritten, and isochronous transfer cannot be performed.
To perform isochronous transfer, a transmission channel must be obtained in addition to a transmission band. A transmission channel is obtained by rewriting the channel number register.
The channel number register is a 64-bit register in which each bit corresponds to one channel. More specifically, when the n-th bit counted from the least significant bit is 1, channel (n-1) is unused, and when the n-th bit is 0, channel (n-1) is in use. Therefore, when no channel is in use, the channel number register holds 1111111111111111111111111111111111111111111111111111111111111111B. When, for example, channel 1 is obtained, the channel number register is rewritten to 1111111111111111111111111111111111111111111111111111111111111101B.
Because the channel number register has 64 bits as described above, a maximum of 64 channels, numbered 0 through 63, can be obtained for isochronous transfer. Channel 63 is used for broadcasting isochronous packets.
As described above, isochronous transfer is performed after a transmission band and a transmission channel have been obtained, so data is transferred at a guaranteed transmission rate. Isochronous transfer is therefore particularly suitable for data that must be reproduced in real time, such as images and sound.
Next, IEEE-1394 communication follows the CSR architecture, which has a 64-bit address space defined by ISO/IEC 13213.
Figure 13 illustrates the address space of the CSR architecture.
The high-order 16 bits of a CSR address represent the node ID of each node, and the remaining 48 bits specify the address space allocated to that node. The high-order 16 bits are further divided into 10 bits representing the bus ID and 6 bits representing the physical ID (the node ID in the narrow sense). Because the value obtained by setting all bits to 1 is reserved for special purposes, 1023 buses and 63 nodes can be specified.
Within the 256-terabyte address space specified by the low-order 48 bits of a CSR address, the space determined by the high-order 20 bits is divided into an initial register space used for the 2048-byte CSR core registers, the IEEE-1394 registers, and the like; a private space; an initial memory space; and others. When the space determined by the high-order 20 bits is the initial register space, the space determined by the low-order 28 bits is used for the configuration ROM, the initial unit space used for purposes unique to the node, plug control registers (PCRs), and the like.
Figure 14 illustrates the offset addresses, names, and functions of the main CSRs.
In Figure 14, the "offset" column shows offset addresses counted from FFFFF0000000h ("h" indicates that the preceding value is hexadecimal), the start of the initial register space. The bandwidth register at offset 220h indicates, as described above, the bandwidth allocated to isochronous communication, and only the value in the bandwidth register of the node serving as the isochronous resource manager is valid. In other words, although every node has the CSRs shown in Figure 13, only the bandwidth register of the isochronous resource manager is valid, so in effect only the isochronous resource manager has the bandwidth register.
In the channel number register at offsets 224h to 228h, each bit corresponds to one of channel numbers 0 to 63 as described above, and a bit of "0" indicates that the corresponding channel is allocated. Only the channel number register of the node serving as the isochronous resource manager is valid.
Returning to Figure 13, the configuration ROM, which follows the general ROM format, is located at addresses 400h to 800h in the initial register space.
Figure 15 illustrates general ROM form.
A node, which is an addressable unit on IEEE 1394, can contain multiple units that share an address space but operate independently. The unit directories can indicate the version and location of the software for each unit. The positions of the bus info block and the root directory are fixed, but the positions of the other blocks are specified by offset addresses.
Figure 16 illustrates the details of the bus info block, root directory, and unit directories.
The company ID stored in the bus info block represents the ID number of the unit's manufacturer. The chip ID stores an ID unique to the unit, equal to no other unit's ID. According to the IEC 1883 standard, in the unit directory of a unit that satisfies IEC 1883, the first octet of the unit spec ID is set to 00h, its second octet to A0h, and its third octet to 2Dh. In addition, the first octet of the unit sw version is set to 01h, and the LSB (least significant bit) of its third octet is set to "1".
A node has the PCRs (plug control registers) specified in IEC 1883 at addresses 900h to 9FFh in the initial register space shown in Figure 13. These registers embody the concept of a plug in order to form, logically, signal paths similar to those of an analog interface.
Figure 17 illustrates the PCR structure.
A PCR comprises oPCRs (output plug control registers) representing output plugs and iPCRs (input plug control registers) representing input plugs. A PCR also has an oMPR (output master plug register) and an iMPR (input master plug register), which hold information common to all the output plugs and all the input plugs of the unit, respectively. An IEEE-1394 unit does not have multiple oMPRs or multiple iMPRs, but may have an oPCR and an iPCR for each plug, depending on the unit's capability. The PCR shown in Figure 17 has 31 oPCRs, #0 to #30, and 31 iPCRs, #0 to #30. Isochronous data flow is controlled by operating the registers corresponding to these plugs.
Figures 18A to 18D illustrate the structures of the oMPR, oPCR, iMPR, and iPCR.
Figure 18 A illustrates the oMPR structure, and Figure 18 B illustrates the oPCR structure, and Figure 18 C illustrates the iMPR structure, and Figure 18 D illustrates the iPCR structure.
The data rate capability fields in the two most significant bits (MSB side) of the oMPR and the iMPR store a code representing the maximum transfer rate of isochronous data that the unit can send or receive. The broadcast channel base field in the oMPR specifies the channel number used for broadcast output.
The output plug number field in the five least significant bits (LSB side) of the oMPR holds a value representing the number of output plugs the unit has, that is, the number of oPCRs. The non-persistent extension field and the persistent extension field are areas defined for future expansion.
The on-line field at the MSB of each oPCR and iPCR indicates the usage state of the plug. More specifically, the plug is on-line when the value is "1" and off-line when the value is "0". The broadcast connection counter field of each oPCR and iPCR indicates whether a broadcast connection exists ("1" if it exists, "0" if not). The six-bit point-to-point connection counter field in each oPCR and iPCR holds a value representing the number of point-to-point connections the plug has.
The six-bit channel number field in each oPCR and iPCR holds a value representing the number of the isochronous channel to which the plug is connected. The two-bit data rate field in the oPCR indicates the actual transfer rate of the isochronous data packets output from the plug. The four-bit overhead ID field in the oPCR holds a code representing the overhead bandwidth of isochronous communication. The ten-bit payload field in the oPCR indicates the maximum amount of data, contained in an isochronous packet, that the plug can handle.
The AV/C command set is specified as a set of control commands for IEEE-1394 units that perform IEEE-1394 communication as described above. In the present embodiment, the master unit 1 and the auxiliary unit 2 can control each other using the AV/C command set. A command system different from the AV/C command set may also be used to control the master unit 1 and the auxiliary unit 2.
The AV/C command set is briefly described below.
Figure 19 illustrates the data structure of a packet of the AV/C command set transmitted in the asynchronous transfer mode.
The AV/C command set is a command set for controlling AV (audio-visual) units. In a control system using the AV/C command set, AV/C command frames and response frames are exchanged through FCP (function control protocol). To avoid placing a load on the bus and the AV units, a response to a command should be sent within 100 ms.
As shown in Figure 19, the data in an asynchronous packet is arranged in 32-bit units (quadlets) in the horizontal direction. The upper part of the figure represents the packet header, and the lower part represents the data block. Destination_ID indicates the destination.
The CTS field represents a command-set ID and is "0000" for the AV/C command set. When the packet contains a command, the ctype/response field represents the functional classification of the command; when the packet contains a response, it represents the result of the command. Commands are broadly divided into four types: (1) commands for controlling a function from the outside (CONTROL); (2) commands for querying a state from the outside (STATUS); (3) commands for querying from the outside whether a control command is supported (GENERAL INQUIRY, asking whether an opcode is supported, and SPECIFIC INQUIRY, asking whether an opcode and operands are supported); and (4) commands requesting that state changes be reported to the outside (NOTIFY).
A response is returned according to the command type. Responses to a CONTROL command include NOT IMPLEMENTED, ACCEPTED, REJECTED, and INTERIM. Responses to a STATUS command include NOT IMPLEMENTED, REJECTED, IN TRANSITION, and STABLE. Responses to GENERAL INQUIRY and SPECIFIC INQUIRY commands include IMPLEMENTED and NOT IMPLEMENTED. Responses to a NOTIFY command include NOT IMPLEMENTED, REJECTED, INTERIM, and CHANGED.
The subunit type field identifies the function of the unit; for example, a tape recorder/player or a tuner is assigned here. To distinguish multiple subunits of the same type, the subunit ID field (located after the subunit type field) serves as a number used for addressing. The opcode field represents a command, and the operand fields represent the command's parameters. The additional operands fields hold additional operands. The padding field holds dummy data so that the packet length equals a predetermined number of bits. The data CRC (cyclic redundancy check) field holds a CRC for checking errors during data transfer.
Next step, Figure 20 A illustrates the specific examples of AV/C order to Figure 20 C.
Figure 20 A illustrates the specific examples in command type/response.The top of this figure represents order, and the bottom represents response." control " is assigned as " 0000 ", and " state " is assigned as " 0001 ", and " ad hoc inquiry " is assigned as " 0010 ", and " notice " is assigned as " 0011 ", and " general inquiry " is assigned as " 0100 "." 0101 " to " 01111 " is preserved for standard in the future." unrealized " is assigned as " 1000 ", and " acceptance " is assigned as " 1001 ", and " refusal " is assigned as " 1010 ", " in conversion " is assigned as " 1011 ", " realize/stable " being assigned as " 1100 ", " variation " is assigned as " 1101 ", and " temporarily " is assigned as " 1111 "." 1110 " are preserved for standard in the future.
Figure 20 B illustrates the specific examples in subelement type (subunit_type) field." video monitor " is assigned as " 00000 ", " dish recorder/player " is assigned as " 00011 ", " magnetic tape recorder/player " is assigned as " 00100 ", " tuner " is assigned as " 00101 ", " video camera " is assigned as " 00111 ", " unique in manufacturer " is assigned as " 11100 ", and " expanding to the subelement type of next byte " is assigned as " 11110 "." unit " is assigned as " 11111 ".This is to use in the time will sending to unit itself as the setting that is switched on or switched off power supply.
Figure 20 C illustrates the specific examples in command code (opcode) field.All there is a function code table for each subelement type, and the command code used when being magnetic tape recorder/player of the cell type of group shown in figure.Operand defines for each command code." manufacturer is relevant " is assigned as " 00h ", " search pattern " is assigned as " 50h ", " timing code " is assigned as " 51h ", and " ATN " is assigned as " 52h ", and " opening MIC " is assigned as " 60h ", " read MIC " and be assigned as " 61h ", " write MIC " and be assigned as " 62h ", " being written into medium " is assigned as " C1h ", and " record " is assigned as " C2h ", " broadcast " is assigned as " C3h ", and " rewinding " is assigned as " C4h ".
Figure 21 A and Figure 21 B illustrate the specific examples of AV/C order and AV/C response.
For example, when a controller (the controlling side) instructs a target (the side to be controlled) serving as a playback unit to play, the controller sends the command shown in Figure 21A to the target. Because this command uses the AV/C command set, CTS is set to "0000". Because a command for controlling a unit from the outside (CONTROL) is used, ctype is set to "0000" (Figure 20A). Because the unit is a tape recorder/player, the subunit type field is set to "00100" (Figure 20B). The subunit ID field is set to "000", indicating an ID of 0. The opcode field is set to "C3h", which means play (Figure 20C), and the operand field is set to "75h", which represents "F.F.". When playback starts, the target returns the response shown in Figure 21B to the controller. ACCEPTED, indicating acceptance, is placed in the response field, which is therefore set to "1001" (see Figure 20A). Because the fields other than the response field hold the same values as in Figure 21A, their descriptions are omitted.
In the scalable television system, various controls are performed between the master unit 1 and the auxiliary unit 2 using the AV/C command set described above. For controls that cannot be handled by the existing commands and responses, new commands and responses are defined, and these new commands and responses are used to perform the various controls between the master unit 1 and the auxiliary unit 2.
Details of the IEEE-1394 communication and the AV/C command set described above can be found in "WHITE SERIES No. 181, IEEE-1394 MULTIMEDIA INTERFACE", published by Triceps.
As described with reference to Figure 10, the IR interface 135 of the master unit 1 can receive as well as emit infrared rays. The controller 15 of the master unit 1, corresponding to this IR interface 135 that can both transmit and receive infrared rays, can likewise both emit and receive infrared rays.
Figure 22 illustrates the example electrical structure of controller 15.
The operation part 161 has the various push-button switches provided on the controller 15, described with reference to Figure 7 or Figure 9, and sends operation signals corresponding to those push-button switches to the control section 162.
The control section 162 receives an operation signal from the operation part 161 and sends a command code representing the process requested by the operation signal to the frame generating portion 163. The control section 162 also performs various processes according to the output of the reception processing part 167. In addition, the control section 162 stores a device code in the device code storage area 168.
The frame generating portion 163 generates frame-structured data (frame data) in which the command code sent from the control section 162 and the device code stored in the device code storage area 168 are placed, and sends the data to the transmission processing part 164.
The transmission processing part 164 modulates a carrier wave having a predetermined frequency according to the frame data sent from the frame generating portion 163, and drives the light radiating portion 165 according to the modulated signal thus obtained.
The light radiating portion 165 is formed of, for example, an LED, and emits infrared rays when driven by the transmission processing part 164. The infrared rays emitted by the light radiating portion 165 are received by, for example, the IR interface 135 (Figure 10).
The light receiving part 166 receives infrared rays, converts them into an electric signal, and sends the signal to the reception processing part 167. The light receiving part 166 receives, for example, infrared rays emitted by the IR interface 135.
The reception processing part 167 demodulates the output of the light receiving part 166 and sends the frame data thereby obtained to the control section 162.
Next, Figure 23 illustrates the frame format of the frame data generated by the frame generating portion 163.
The frame data is formed of a frame header (leader) located at its beginning and two data parts, #1 and #2, located after it.
The frame header holds data formed of a predetermined bit string indicating the start of the frame.
Data part #1 comprises a device code and a command code.
A device code is a code assigned to a device that exchanges frame data. When a device receives frame data, and the device code placed in the frame data matches the device code assigned to that device, the device regards the frame data as addressed to itself and performs the process corresponding to the command code placed in the frame data.
More specifically, in the controller 15 shown in Figure 22, when the control section 162 receives frame data from the reception processing part 167, the control section 162 compares the device code placed in the frame data with the device code stored in the device code storage area 168, and only when they match does the control section 162 perform the process corresponding to the command code placed in the frame data.
When the device code in the frame data sent from the reception processing part 167 does not match the device code stored in the device code storage area 168, the control section 162 ignores (discards) the frame data. In this case, therefore, the control section 162 performs no processing.
Data part #2 holds the same data as data part #1.
In the controller 15 shown in Figure 22, when the control section 162 receives frame data from the reception processing part 167, the control section 162 compares data parts #1 and #2 in the frame data, and only when they match does the control section 162 compare the device codes as described above. Therefore, if data parts #1 and #2 in the frame data do not match, the control section 162 performs no processing.
Because the control section 162 performs no processing when data parts #1 and #2 in the frame data do not match as described above, frame data that has not been received successfully (erroneous frame data) is prevented from being processed.
As described above, the IR interface 155 (Figure 11) of the auxiliary unit 2 is constructed in the same manner as the IR interface 135 of the master unit 1 shown in Figure 10, and can therefore transmit and receive infrared rays. The controller 35 of the auxiliary unit 2, corresponding to this IR interface 155, can likewise both emit and receive infrared rays.
Figure 24 illustrates the example electrical structure of controller 35.
The controller 35 is formed of an operation part 171 through a device code storage area 178, which are constructed in the same manner as the operation part 161 through the device code storage area 168 shown in Figure 22, and a description thereof is therefore omitted.
Next, Figure 25 illustrates an example detailed structure of the IR interface 135 (Figure 10) of the master unit 1.
The control section 182 receives a command from the CPU 129 (Figure 10) and sends a command code corresponding to the command to the frame generating portion 183. The control section 182 also receives frame data from the reception processing part 187 and judges whether data parts #1 and #2 (Figure 23) of the frame data match. When data parts #1 and #2 match, the control section 182 compares the device code (Figure 23) in the frame data with the device code stored in the device code storage area 188, and if they match, the control section 182 sends the command corresponding to the command code (Figure 23) in the frame data to the CPU 129.
In addition, the control section 182 stores a device code in the device code storage area 188.
The frame generating portion 183 generates frame data in which the command code sent from the control section 182 and the device code stored in the device code storage area 188 are placed, and sends the data to the transmission processing part 184.
The transmission processing part 184 modulates a carrier wave having a predetermined frequency according to the frame data sent from the frame generating portion 183, and drives the light radiating portion 185 according to the modulated signal thus obtained.
The light radiating portion 185 is formed of, for example, an LED, and emits infrared rays when driven by the transmission processing part 184. The infrared rays emitted by the light radiating portion 185 are received by, for example, the light receiving part 166 (Figure 22) of the controller 15.
The light receiving part 186 receives infrared rays, converts them into an electric signal, and sends the signal to the reception processing part 187. The light receiving part 186 receives, for example, infrared rays emitted by the light radiating portion 165 (Figure 22) of the controller 15.
The reception processing part 187 demodulates the output of the light receiving part 186 and sends the frame data thereby obtained to the control section 182.
The IR interface 155 of the auxiliary unit 2 is also constructed in the same manner as the IR interface 135 of the master unit 1 shown in Figure 25.
Next, processing performed by the controller 15 of the master unit 1 shown in Figure 22 (controller processing) is described with reference to the flowchart shown in Figure 26. The same processing is also performed in the controller 35 of the auxiliary unit 2 shown in Figure 24.
At step S1, the control section 162 judges whether it has received from the operation part 161 an operation signal serving as a command, generated when the user operates the operation part 161.
When it is judged at step S1 that no operation signal serving as a command has been received, in other words, when the controller 15 has not been operated, processing proceeds to step S2, and the control section 162 judges whether it has received frame data.
When it is judged at step S2 that no frame data has been received, processing returns to step S1, and the same processes are repeated thereafter.
When it is judged at step S2 that frame data has been received, in other words, when the light receiving part 166 has received infrared rays and the reception processing part 167 has sent the frame data corresponding to those infrared rays to the control section 162, processing proceeds to step S3, and the control section 162 judges whether the device code in the frame data matches the device code stored in the device code storage area 168.
When it is judged at step S3 that the device code in the frame data does not match the device code stored in the device code storage area 168, processing returns to step S1, and the same processes are repeated thereafter.
When it is judged at step S3 that the device code in the frame data matches the device code stored in the device code storage area 168, processing proceeds to step S4, the control section 162 performs the process corresponding to the command code in the frame data, and processing returns to step S1.
When step S1 judgement receives the operation signal of taking on order, in other words, when having sent to control section 162 when user's operating operation part 161 and corresponding to the operation signal of this operation, processing enters step S5, and whether control section 162 decision signals ask to arrange device code.
The device code storage area 168 stores a default device code. The user can change this device code. More specifically, the device code of the controller 15 can be specified by a predetermined operation, such as the user simultaneously operating the menu button switch 54 and the TV power button switch 72 (Fig. 7) of the controller 15. At step S5, whether the operation signal requests that a device code be set is determined according to this simultaneous operation of the menu button switch 54 and the TV power button switch 72.
When it is determined at step S5 that the operation signal sent from the operation part 161 does not request that a device code be set, the control section 162 sends the command code corresponding to the operation signal to the frame generating portion 163, and the processing proceeds to step S6.
At step S6, the frame generating portion 163 places the command code sent from the control section 162 and the device code stored in the device code storage area 168 into a frame to generate frame data having the format shown in Figure 23, and sends the data to the transmission processing part 164. The processing proceeds to step S7.
At step S7, the transmission processing part 164 drives the light emitting portion 165 according to the frame data sent from the frame generating portion 163, and the processing returns to step S1. The light emitting portion 165 emits infrared rays corresponding to the frame data.
When it is determined at step S5 that the operation signal sent from the operation part 161 requests that a device code be set, in other words, when the user has simultaneously operated the menu button switch 54 and the TV power button switch 72 (Fig. 7) of the controller 15, the processing proceeds to step S8. The control section 162 waits for an operation signal corresponding to a device code to be sent from the operation part 161, receives the operation signal, and sets (overwrites) the device code corresponding to the operation signal in the device code storage area 168. Then, the processing returns to step S1, and the same processing is repeated thereafter.
The device code can be, for example, a numerical value with a predetermined number of digits. In this case, the user inputs the device code by operating, for example, the numeric button switches 58 (Fig. 7) of the controller 15.
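The receive/transmit loop of steps S1 through S8 can be sketched as follows. This is an illustrative sketch only: the field names and the shape of the frame data are assumptions, since the patent states only that a frame carries a device code and a command code (Fig. 23).

```python
# Hypothetical sketch of the controller processing of Fig. 26 (steps S1-S8).
DEFAULT_DEVICE_CODE = 0  # stand-in for the default in device code storage area 168

class Controller:
    def __init__(self):
        self.device_code = DEFAULT_DEVICE_CODE  # device code storage area 168

    def on_frame_received(self, frame):
        """Steps S2-S4: act on a frame only if its device code matches ours."""
        if frame["device_code"] != self.device_code:
            return None                            # S3: mismatch -> ignore frame
        return f"execute:{frame['command_code']}"  # S4: run the matching command

    def on_user_operation(self, command_code, set_code_requested=False, new_code=None):
        """Steps S5-S8: either overwrite the device code or emit a frame."""
        if set_code_requested:                     # S8: overwrite stored device code
            self.device_code = new_code
            return None
        # S6: build frame data carrying our device code; S7: hand it off for emission
        return {"device_code": self.device_code, "command_code": command_code}

c = Controller()
frame = c.on_user_operation("volume_up")       # frame as would be emitted as infrared
assert c.on_frame_received(frame) == "execute:volume_up"
c.on_user_operation(None, set_code_requested=True, new_code=42)
assert c.on_frame_received(frame) is None      # frame with the old code is now ignored
```

The same loop, with storage area 188 in place of 168, models the IR interface processing of Fig. 27 described next.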
Next, the processing performed by the IR interface 135 of master unit 1 shown in Figure 25 (IR interface processing) will be described with reference to the flowchart shown in Figure 27. The IR interface 155 of auxiliary unit 2 (Figure 11) performs the same processing.
At step S21, the control section 182 determines whether it has received a command from the CPU 129.
When it is determined at step S21 that no command has been received, the processing proceeds to step S22, and the control section 182 determines whether it has received frame data from the reception processing part 187.
When it is determined at step S22 that no frame data has been received, the processing returns to step S21, and the same processing is repeated thereafter.
When it is determined at step S22 that frame data has been received, in other words, when the light receiving part 186 has received, for example, infrared rays sent from the controller 15 and the reception processing part 187 has sent the frame data corresponding to the infrared rays to the control section 182, the processing proceeds to step S23, and the control section 182 determines whether the device code in the frame data matches the device code stored in the device code storage area 188.
When it is determined at step S23 that the device code in the frame data does not match the device code stored in the device code storage area 188, the processing returns to step S21, and the same processing is repeated thereafter.
When it is determined at step S23 that the device code in the frame data matches the device code stored in the device code storage area 188, the processing proceeds to step S24, the control section 182 sends the command corresponding to the command code in the frame data to the CPU 129, and the processing returns to step S21.
Therefore, in this case, the CPU 129 executes the processing corresponding to the command sent from the IR interface 135 (the control section 182).
When it is determined at step S21 that a command has been received from the CPU 129, the processing proceeds to step S25, and the control section 182 determines whether the command requests that a device code be set.
The device code storage area 188 stores a default device code in the same manner as the device code storage area 168 shown in Figure 22. The user can change this device code. More specifically, the device code of the IR interface 135 can be specified by a predetermined operation, such as the user simultaneously operating the two button switches on the front panel 134 (Figure 10) that correspond to the menu button switch 54 and the TV power button switch 72 (Fig. 7) of the controller 15. At step S25, whether the command sent from the CPU 129 requests that a device code be set is determined according to this simultaneous operation of the two button switches.
When it is determined at step S25 that the command sent from the CPU 129 does not request that a device code be set, the control section 182 sends the command code corresponding to the command to the frame generating portion 183, and the processing proceeds to step S26.
At step S26, the frame generating portion 183 places the command code sent from the control section 182 and the device code stored in the device code storage area 188 into a frame to generate frame data having the format shown in Figure 23, and sends the data to the transmission processing part 184. The processing proceeds to step S27.
At step S27, the transmission processing part 184 drives the light emitting portion 185 according to the frame data sent from the frame generating portion 183, and the processing returns to step S21. The light emitting portion 185 emits infrared rays corresponding to the frame data. The infrared rays are received by, for example, the controller 15.
When it is determined at step S25 that the command sent from the CPU 129 requests that a device code be set, in other words, when the user has simultaneously operated the two button switches on the front panel 134 (Figure 10) that correspond to the menu button switch 54 and the TV power button switch 72 (Fig. 7) of the controller 15, the processing proceeds to step S28. The control section 182 waits for a device code to be sent from the CPU 129, receives the device code, and sets (overwrites) it in the device code storage area 188. Then, the processing returns to step S21, and the same processing is repeated thereafter.
In master unit 1, the user can input the device code by operating the button switches on the front panel 134 (Figure 10) that correspond to the numeric button switches 58 of the controller 15.
As described above, the user can set device codes for the controllers 15 and 35, the IR interface 135 of master unit 1, and the IR interface 155 of auxiliary unit 2. Command codes are transmitted only between those devices that have the same device code.
Therefore, for example, to control master unit 1 with the controller 15, the device code of the controller 15 and the device code of the IR interface 135 of master unit 1 need to be set to the same value. Likewise, to control auxiliary unit 2 with the controller 15, the device code of the controller 15 and the device code of the IR interface 155 of auxiliary unit 2 need to be set to the same value. Furthermore, when the device codes of the controller 15, the IR interface 135 of master unit 1, and the IR interface 155 of auxiliary unit 2 are all made equal, operating the controller 15 causes master unit 1 and auxiliary unit 2 to perform the same processing.
For example, even when the user has only the controller 15, if different device codes are set in master unit 1 and the auxiliary units 2_ij, the television receivers composing the scalable television system, the user can use the single controller 15 to independently remote-control any one of master unit 1 and the auxiliary units 2_ij composing the scalable television system, by setting the device code of the controller 15 to the device code of the desired television receiver.
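The addressing scheme above can be sketched as follows: a frame is acted on only by receivers whose stored code equals the code the controller currently carries. The receiver names and code values are illustrative, not taken from the patent.

```python
# Hypothetical sketch of device-code addressing among several receivers.
receivers = {"master 1": 10, "auxiliary 2_11": 11, "auxiliary 2_12": 12}

def deliver(frame, receivers):
    """Return the names of receivers that act on the frame (code matches)."""
    return [name for name, code in receivers.items()
            if code == frame["device_code"]]

# User sets the controller's device code to that of auxiliary unit 2_11:
frame = {"device_code": 11, "command_code": "power"}
assert deliver(frame, receivers) == ["auxiliary 2_11"]

# Retargeting the same controller to the master unit is just a code change:
frame = {"device_code": 10, "command_code": "power"}
assert deliver(frame, receivers) == ["master 1"]
```

Giving every receiver the same code would make them all act on each frame, which is the "identical processing in master unit 1 and auxiliary unit 2" case described above.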
Next, the processing of master unit 1 shown in Figure 10 will be described with reference to the flowchart shown in Figure 28.
First, at step S41, the CPU 129 determines whether an event has occurred, namely whether any unit has been connected to the terminal panel 21 or any command has been sent through the IEEE-1394 interface 133, the IR interface 135, or the like. When it is determined that no event has occurred, the processing returns to step S41.
When it is determined at step S41 that the event of a unit being connected to the terminal panel 21 has occurred, the processing proceeds to step S42, and the CPU 129 performs the authentication processing shown in Figure 29, described later. Then, the processing returns to step S41.
To determine whether a unit has been connected to the terminal panel 21, the state in which the unit is connected to the terminal panel 21 needs to be detected. This detection is performed, for example, in the following manner.
When a unit is connected (through an IEEE-1394 cable) to an IEEE-1394 terminal 21_ij provided on the terminal panel 21 (Fig. 3F), the terminal voltage of the IEEE-1394 terminal 21_ij changes. The IEEE-1394 interface 133 is configured to report this terminal voltage change to the CPU 129. The CPU 129 receives the report of the terminal voltage change from the IEEE-1394 interface 133, and thereby detects the state in which a new unit has been connected to the terminal panel 21. The CPU 129 also identifies, for example by the same method, the state in which a unit has been disconnected from the terminal panel 21.
When it is determined at step S41 that the event of a command being sent through the IEEE-1394 interface 133, the IR interface 135, or the like has occurred, the processing proceeds to step S43, and master unit 1 executes the processing corresponding to the command. Then, the processing returns to step S41.
Next, the authentication processing performed by master unit 1 at step S42 shown in Figure 28 will be described with reference to the flowchart shown in Figure 29.
In the authentication processing performed by master unit 1, two kinds of authentication are carried out: whether the unit newly connected to the terminal panel 21 (hereinafter referred to as the connected unit, where appropriate) is an authorized IEEE-1394 unit, and whether that IEEE-1394 unit is a television receiver serving as a master unit or an auxiliary unit (a scalable-use unit).
More specifically, in the authentication processing performed by master unit 1, first at step S51, the CPU 129 controls the IEEE-1394 interface 133 to send an authentication request command requesting mutual authentication to the connected unit, and the processing proceeds to step S52.
At step S52, the CPU 129 determines whether a response to the authentication request command has been returned from the connected unit. When it is determined at step S52 that no response to the authentication request command has been returned from the connected unit, the processing proceeds to step S53, and the CPU 129 determines whether a timeout has occurred, that is, whether a predetermined time has passed since the authentication request command was sent.
When it is determined at step S53 that a timeout has occurred, in other words, when no response to the authentication request command has been returned from the connected unit even though the predetermined time has passed after the authentication request command was sent to the connected unit, the processing proceeds to step S54. The CPU 129 determines that the connected unit is not an authorized IEEE-1394 unit and that authentication has failed, and sets the operating mode to a stand-alone mode in which no data is transmitted to or from the connected unit. The processing then returns.
Therefore, thereafter, master unit 1 transmits no data to or from a connected unit that is not an authorized IEEE-1394 unit, and performs no IEEE-1394 communication with it.
When it is determined at step S53 that no timeout has occurred, the processing returns to step S52, and the same processing is repeated thereafter.
When it is determined at step S52 that a response to the authentication request command has been returned from the connected unit, in other words, when the response from the connected unit has been received by the IEEE-1394 interface 133 and sent to the CPU 129, the processing proceeds to step S55, and the CPU 129 generates a random number (pseudo random number) R1 according to a predetermined algorithm and sends it to the connected unit through the IEEE-1394 interface 133.
Then, the processing proceeds to step S56, and the CPU 129 determines whether an encrypted random number E'(R1) has been sent from the connected unit, where the encrypted random number E'(R1) is obtained by encrypting the random number R1 sent at step S55 according to a predetermined algorithm (a secret-key encryption method such as DES (Data Encryption Standard), FEAL (Fast Data Encipherment Algorithm), or RC5).
When it is determined at step S56 that no encrypted random number E'(R1) has been sent from the connected unit, the processing proceeds to step S57, and the CPU 129 determines whether a timeout has occurred, that is, whether a predetermined time has passed since the random number R1 was sent.
When it is determined at step S57 that a timeout has occurred, in other words, when no encrypted random number E'(R1) has been returned from the connected unit even though the predetermined time has passed after the random number R1 was sent to the connected unit, the processing proceeds to step S54. The CPU 129 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operating mode to the stand-alone mode. The processing then returns.
When it is determined at step S57 that no timeout has occurred, the processing returns to step S56, and the same processing is repeated thereafter.
When it is determined at step S56 that the encrypted random number E'(R1) has been returned from the connected unit, in other words, when the encrypted random number E'(R1) from the connected unit has been received by the IEEE-1394 interface 133 and sent to the CPU 129, the processing proceeds to step S58, and the CPU 129 encrypts the random number R1 generated at step S55 according to the predetermined encryption algorithm to generate an encrypted random number E(R1). The processing proceeds to step S59.
At step S59, the CPU 129 determines whether the encrypted random number E'(R1) sent from the connected unit equals the encrypted random number E(R1) generated by the CPU 129 at step S58.
When it is determined at step S59 that the encrypted random number E'(R1) is not equal to the encrypted random number E(R1), in other words, when the encryption algorithm employed by the connected unit (including, where applicable, the key used for encryption) differs from the encryption algorithm employed by the CPU 129, the processing proceeds to step S54, and the CPU 129 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operating mode to the stand-alone mode. The processing then returns.
When it is determined at step S59 that the encrypted random number E'(R1) equals the encrypted random number E(R1), in other words, when the encryption algorithm employed by the connected unit is identical to that employed by the CPU 129, the processing proceeds to step S60, and the CPU 129 determines whether a random number R2, used by the connected unit to authenticate master unit 1, has been sent from the connected unit.
When it is determined at step S60 that the random number R2 has not been received, the processing proceeds to step S61, and the CPU 129 determines whether a timeout has occurred, that is, for example, whether a predetermined time has passed after it was determined at step S59 that the encrypted random number E'(R1) equals the encrypted random number E(R1).
When it is determined at step S61 that a timeout has occurred, in other words, when no random number R2 has been sent from the connected unit even though the predetermined time has passed, the processing proceeds to step S54, and the CPU 129 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operating mode to the stand-alone mode. The processing then returns.
On the other hand, when it is determined at step S61 that no timeout has occurred, the processing returns to step S60, and the same processing is repeated thereafter.
When it is determined at step S60 that the random number R2 has been sent from the connected unit, in other words, when the random number R2 from the connected unit has been received by the IEEE-1394 interface 133 and sent to the CPU 129, the processing proceeds to step S62, and the CPU 129 encrypts the random number R2 according to the predetermined encryption algorithm to generate an encrypted random number E(R2), and sends it to the connected unit through the IEEE-1394 interface 133.
When it is determined at step S60 that the random number R2 has been sent from the connected unit, the connected unit has been successfully authenticated as an authorized IEEE-1394 unit.
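The mutual challenge-response of steps S55 through S62 can be sketched as follows. The patent names secret-key ciphers (DES, FEAL, RC5) for E(·); HMAC-SHA256 is used here purely as a stand-in keyed function for illustration, and the shared key is an assumption.

```python
# Hypothetical sketch of the mutual challenge-response (Figs. 29 and 31).
import hashlib
import hmac
import secrets

SHARED_KEY = b"authorized-ieee1394"  # assumed key held by authorized units

def E(key, r):
    """Stand-in for the patent's secret-key encryption E(.) of a random number."""
    return hmac.new(key, r, hashlib.sha256).digest()

# Master side: challenge the connected unit with R1 (S55); the connected unit
# returns E'(R1) (S87 of Fig. 31); the master recomputes E(R1) and compares (S58-S59).
R1 = secrets.token_bytes(8)
E_prime_R1 = E(SHARED_KEY, R1)                       # computed by the connected unit
assert hmac.compare_digest(E(SHARED_KEY, R1), E_prime_R1)

# Connected-unit side: challenge the master with R2 (S89); the master returns
# E(R2) (S62); the connected unit recomputes E'(R2) and compares (S92-S93).
R2 = secrets.token_bytes(8)
E_R2 = E(SHARED_KEY, R2)                             # computed by the master
assert hmac.compare_digest(E(SHARED_KEY, R2), E_R2)

# A unit using a different key (or algorithm) fails the comparison, which is
# the case that drives the operating mode to stand-alone at S54/S83.
assert not hmac.compare_digest(E(b"wrong-key", R1), E_prime_R1)
```

Both directions use the same keyed function, so each side proves possession of the shared secret without ever transmitting it.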
Then, the processing proceeds to step S63, and the CPU 129 controls the IEEE-1394 interface 133 to send to the connected unit a function-information request command requesting the unit ID and function information of the connected unit, together with the unit ID and function information of master unit 1.
The unit ID is a unique ID that identifies a television receiver serving as master unit 1 or auxiliary unit 2.
The function information is information related to the functions the unit has, and includes, for example, the types of commands it accepts from the outside (for example, which of the commands for controlling the power switch, volume adjustment, channel, brightness, and sharpness it accepts from the outside), whether on-screen display (OSD display) is allowed, whether a mute state is allowed, and whether a sleep state is allowed. In addition, the function information also indicates whether the unit has the function of a master unit or the function of an auxiliary unit.
Master unit 1 can store the unit ID and the function information in, for example, the EEPROM 130 or the vendor_dependent_information field of the configuration ROM shown in Figure 15.
Then, the processing proceeds to step S64, and the CPU 129 waits for the unit ID and function information of the connected unit, which arrive in response to the function-information request command sent to the connected unit at step S63. The CPU 129 receives the unit ID and function information through the IEEE-1394 interface 133 and stores them in the EEPROM 130, and the processing proceeds to step S65.
At step S65, the CPU 129 refers to the function information stored in the EEPROM 130 to determine whether the connected unit is an auxiliary unit. When it is determined at step S65 that the connected unit is an auxiliary unit, in other words, when the connected unit has been successfully authenticated as an auxiliary unit, the processing skips steps S66 and S67 and proceeds to step S68. The CPU 129 sets the operating mode to a multi-viewpoint display enabled mode, in which the virtual multi-viewpoint display function described later is provided together with the connected unit serving as the auxiliary unit. The processing then returns.
On the other hand, when it is determined at step S65 that the connected unit is not an auxiliary unit, the processing proceeds to step S66, and the CPU 129 refers to the function information stored in the EEPROM 130 to determine whether the connected unit is a master unit. When it is determined at step S66 that the connected unit is a master unit, in other words, when the connected unit has been successfully authenticated as a master unit, the processing proceeds to step S67, and the CPU 129 performs master/auxiliary arbitration processing with the connected unit serving as a master unit.
Specifically, in this case, since master unit 1 is connected to another master unit, two television receivers serve as master units among the television receivers composing the scalable television system. In the present embodiment, only one master unit is allowed in a scalable television system. Therefore, at step S67, master/auxiliary arbitration processing is performed, in which it is decided whether master unit 1 or the connected unit serving as a master unit is to act as the television receiver serving as the master unit.
More specifically, for example, the master unit that became part of the scalable television system earliest, that is, master unit 1 in the present embodiment, is decided to be the television receiver serving as the master unit. Master units other than the one decided to be the master unit, that is, other than master unit 1, serve as auxiliary units.
After the master/auxiliary arbitration processing is performed at step S67, the processing proceeds to step S68, and the CPU 129 sets the operating mode to the multi-viewpoint display enabled mode as described above, and the processing returns.
When it is determined at step S66 that the connected unit is not a master unit, in other words, when the connected unit is neither a master unit nor an auxiliary unit and therefore is not authenticated as a master unit or an auxiliary unit, the processing proceeds to step S69, and the CPU 129 sets the operating mode to a common-command exchange mode, in which the existing AV/C command set can be transmitted to and from the connected unit, but the control commands for providing the multi-viewpoint display function cannot be transmitted, and the processing returns.
In other words, in this case, the connected unit is neither a master unit nor an auxiliary unit, so even though it is connected to master unit 1, the multi-viewpoint display function is not provided. However, in this case, since the connected unit is an authorized IEEE-1394 unit, transmission of the existing AV/C command set between master unit 1 and the connected unit is allowed. Thus, in this case, between master unit 1 and the connected unit (or another IEEE-1394 unit connected to master unit 1), one can control the other through the existing AV/C command set.
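The operating-mode decision of steps S65 through S69 reduces to a small lookup over the received function information. A minimal sketch, assuming illustrative mode names and dictionary keys not taken from the patent:

```python
# Hypothetical sketch of the mode decision in Fig. 29 (steps S65-S69).
def decide_mode(authenticated, function_info):
    """Return the operating mode the master unit selects for a connected unit."""
    if not authenticated:                        # S54: not an authorized unit
        return "stand-alone"
    if function_info.get("is_auxiliary"):        # S65 -> S68
        return "multi-viewpoint display enabled"
    if function_info.get("is_master"):           # S66 -> S67 arbitration -> S68
        return "multi-viewpoint display enabled (after master/auxiliary arbitration)"
    return "common-command exchange (AV/C only)"  # S69

assert decide_mode(False, {}) == "stand-alone"
assert decide_mode(True, {"is_auxiliary": True}) == "multi-viewpoint display enabled"
assert decide_mode(True, {}) == "common-command exchange (AV/C only)"
```

The auxiliary unit's decision at steps S96 through S98, described below with reference to Figure 31, is the same shape with only the `is_master` branch enabling multi-viewpoint display.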
Next, the processing of auxiliary unit 2 shown in Figure 11 will be described with reference to the flowchart shown in Figure 30.
First, at step S71, the CPU 149 determines whether an event has occurred, namely whether any unit has been connected to the terminal panel 41 or any command has been sent through the IEEE-1394 interface 153 or the IR interface 155. When it is determined that no event has occurred, the processing returns to step S71.
When it is determined at step S71 that the event of a unit being connected to the terminal panel 41 has occurred, the processing proceeds to step S72, and the CPU 149 performs the authentication processing shown in Figure 31, described later. Then, the processing returns to step S71.
To determine whether a unit has been connected to the terminal panel 41, the state in which the unit is connected to the terminal panel 41 needs to be detected. This detection is performed, for example, in the same manner as described for step S41 shown in Figure 28.
When it is determined at step S71 that the event of a command being sent through the IEEE-1394 interface 153 or the IR interface 155 has occurred, the processing proceeds to step S73, and auxiliary unit 2 executes the processing corresponding to the command. Then, the processing returns to step S71.
Next, the authentication processing performed by auxiliary unit 2 at step S72 shown in Figure 30 will be described with reference to the flowchart shown in Figure 31.
In the authentication processing performed by auxiliary unit 2, two kinds of authentication are carried out: whether the unit newly connected to the terminal panel 41 (hereinafter referred to as the connected unit, where appropriate) is an authorized IEEE-1394 unit, and whether that IEEE-1394 unit is a master unit.
More specifically, in the authentication processing performed by auxiliary unit 2, first at step S81, the CPU 149 determines whether an authentication request command requesting mutual authentication has been sent from the connected unit. When it is determined that the command has not been sent, the processing proceeds to step S82.
At step S82, the CPU 149 determines whether a timeout has occurred, that is, whether a predetermined time has passed since the authentication processing was started.
When it is determined at step S82 that a timeout has occurred, in other words, when no authentication request command has been sent from the connected unit even though the predetermined time has passed since the authentication processing was started, the processing proceeds to step S83. The CPU 149 determines that the connected unit is not an authorized IEEE-1394 unit and that authentication has failed, and sets the operating mode to the stand-alone mode in which no data is transmitted to or from the connected unit. The processing then returns.
Therefore, in the same manner as master unit 1, auxiliary unit 2 transmits no data to or from a connected unit that is not an authorized IEEE-1394 unit, and performs no IEEE-1394 communication with it.
When it is determined at step S82 that no timeout has occurred, the processing returns to step S81, and the same processing is repeated thereafter.
When it is determined at step S81 that the authentication request command has been sent from the connected unit, in other words, when the authentication request command sent at step S51 shown in Figure 29 from master unit 1 serving as the connected unit has been received by the IEEE-1394 interface 153 and sent to the CPU 149, the processing proceeds to step S84, and the CPU 149 controls the IEEE-1394 interface 153 to send a response to the authentication request command to the connected unit.
In the present embodiment, master unit 1 performs the processing of steps S51 to S53 shown in Figure 29, and auxiliary unit 2 performs the processing of steps S81, S82, and S84 shown in Figure 31. Alternatively, auxiliary unit 2 may perform the processing of steps S51 to S53 shown in Figure 29, and master unit 1 may perform the processing of steps S81, S82, and S84 shown in Figure 31. In other words, either master unit 1 or auxiliary unit 2 can send the authentication request command.
Then, the processing proceeds to step S85, and the CPU 149 determines whether the random number R1 has been sent from the connected unit. When it is determined that the random number R1 has not been sent, the processing proceeds to step S86.
At step S86, the CPU 149 determines whether a timeout has occurred, that is, whether a predetermined time has passed since the CPU 149 sent the response to the authentication request command at step S84.
When it is determined at step S86 that a timeout has occurred, in other words, when no random number R1 has been sent from the connected unit even though the predetermined time has passed after the CPU 149 sent the response to the authentication request command, the processing proceeds to step S83. The CPU 149 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operating mode to the stand-alone mode in which no data is transmitted to or from the connected unit. The processing then returns.
When it is determined at step S86 that no timeout has occurred, the processing returns to step S85, and the same processing is repeated thereafter.
When it is determined at step S85 that the random number R1 has been sent from the connected unit, in other words, when the random number R1 sent at step S55 shown in Figure 29 from master unit 1 serving as the connected unit has been received by the IEEE-1394 interface 153 and sent to the CPU 149, the processing proceeds to step S87, and the CPU 149 encrypts the random number R1 according to the predetermined encryption algorithm to generate the encrypted random number E'(R1). In addition, at step S87, the CPU 149 controls the IEEE-1394 interface 153 to send the encrypted random number E'(R1) to the connected unit. The processing proceeds to step S89.
At step S89, the CPU 149 generates a random number (pseudo random number) R2, and controls the IEEE-1394 interface 153 to send the random number R2 to the connected unit. The processing proceeds to step S90.
At step S90, the CPU 149 determines whether the encrypted random number E(R2) has been sent from the connected unit, where the encrypted random number E(R2) is generated at step S62 shown in Figure 29 by master unit 1 serving as the connected unit, by encrypting the random number R2.
When it is determined at step S90 that the encrypted random number E(R2) has not been received, the processing proceeds to step S91, and the CPU 149 determines whether a timeout has occurred, that is, whether a predetermined time has passed after the CPU 149 sent the random number R2.
When it is determined at step S91 that a timeout has occurred, in other words, when no encrypted random number E(R2) has been sent from the connected unit even though the predetermined time has passed after the CPU 149 sent the random number R2 to the connected unit, the processing proceeds to step S83, and the CPU 149 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operating mode to the stand-alone mode. The processing then returns.
On the other hand, when it is determined at step S91 that no timeout has occurred, the processing returns to step S90, and the same processing is repeated thereafter.
When it is determined at step S90 that the encrypted random number E(R2) has been sent from the connected unit, in other words, when the encrypted random number E(R2) from the connected unit has been received by the IEEE-1394 interface 153 and sent to the CPU 149, the processing proceeds to step S92, and the CPU 149 encrypts the random number R2 generated at step S89 according to the predetermined encryption algorithm to generate an encrypted random number E'(R2). The processing proceeds to step S93.
At step S93, the CPU 149 determines whether the encrypted random number E(R2) sent from the connected unit equals the encrypted random number E'(R2) generated by auxiliary unit 2 at step S92.
When it is determined at step S93 that the encrypted random number E(R2) is not equal to the encrypted random number E'(R2), in other words, when the encryption algorithm employed by the connected unit (including, where applicable, the key used for encryption) differs from the encryption algorithm employed by the CPU 149, the processing proceeds to step S83, and the CPU 149 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operating mode to the stand-alone mode. The processing then returns.
When it is determined at step S93 that the encrypted random number E(R2) equals the encrypted random number E'(R2), in other words, when the encryption algorithm employed by the connected unit is identical to that employed by the CPU 149 and the connected unit has thereby been successfully authenticated as an authorized IEEE-1394 unit, the processing proceeds to step S94, and the CPU 149 receives, through the IEEE-1394 interface 153, the unit ID and function information sent together with the function-information request command at step S63 shown in Figure 29 by master unit 1 serving as the connected unit, and stores them in the EEPROM 150.
Then, the processing proceeds to step S95, and the CPU 149 controls the IEEE-1394 interface 153 to send the unit ID and function information of auxiliary unit 2 to the connected unit, in response to the function-information request command received from the connected unit at step S94. Then, the processing proceeds to step S96.
Auxiliary unit 2 can store the unit ID and the function information in the EEPROM 150 or the vendor_dependent_information field of the configuration ROM shown in Figure 15, in the same manner as master unit 1 described with reference to Figure 29.
At step S96, the CPU 149 refers to the function information stored in the EEPROM 150 to determine whether the connected unit is a master unit. When it is determined at step S96 that the connected unit is a master unit, in other words, when the connected unit has been successfully authenticated as a master unit, the processing proceeds to step S97, and the CPU 149 sets the operating mode to the multi-viewpoint display enabled mode, in which the virtual multi-viewpoint display function can be provided together with the connected unit serving as the master unit. The processing then returns.
On the other hand, when it is determined at step S96 that the connected unit is not a master unit, in other words, when the connected unit is not authenticated as a master unit, the processing proceeds to step S98, and the CPU 149 sets the operating mode to the common-command exchange mode, in which the existing AV/C command set can be transmitted to and from the connected unit, but the control commands for providing the multi-viewpoint display function cannot be transmitted, and the processing returns.
In other words, in this case, since the connected unit is not a master unit, the multi-viewpoint display function is not provided even though the connected unit is connected to auxiliary unit 2. Therefore, merely connecting another auxiliary unit to auxiliary unit 2 does not provide the multi-viewpoint display function. However, in this case, since the connected unit is an authorized IEEE-1394 unit, transmission of the existing AV/C command set between auxiliary unit 2 and the connected unit is allowed. Thus, in this case, between auxiliary unit 2 and the connected unit (including another auxiliary unit), one can control the other through the existing AV/C command set.
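Both flowcharts (Figs. 29 and 31) repeat one wait pattern many times: poll for a message, and fall back to the stand-alone mode on timeout (for example at steps S52/S53, S85/S86, and S90/S91). A sketch of that pattern, with illustrative names and intervals:

```python
# Hypothetical sketch of the poll-with-timeout pattern used throughout the
# authentication processing.
import time

def wait_for(poll, timeout_s, interval_s=0.001):
    """Poll until `poll()` returns a message or `timeout_s` elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        msg = poll()
        if msg is not None:
            return msg           # message arrived -> proceed to the next step
        time.sleep(interval_s)
    return None                  # timeout -> caller sets the stand-alone mode

# A poll source that yields a response only on its third call:
calls = {"n": 0}
def poll_once():
    calls["n"] += 1
    return "response" if calls["n"] >= 3 else None

assert wait_for(poll_once, timeout_s=1.0) == "response"
assert wait_for(lambda: None, timeout_s=0.01) is None  # simulated timeout
```

A `None` return at any step corresponds to the branch into step S54 (master) or S83 (auxiliary), after which no IEEE-1394 communication takes place with the connected unit.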
Next, after master unit 1 and auxiliary unit 2 have successfully completed the authentication processing described with reference to Figures 29 and 31, respectively, and their operating modes have been set to the multi-viewpoint display enabled mode, master unit 1 and auxiliary unit 2 perform the virtual multi-viewpoint display processing, described later, when the user operates the controller 15 (or the controller 35) to request multi-viewpoint display.
The instruction to perform the virtual multi-viewpoint display processing can be issued, for example, from a menu screen.
More specifically, as described above, when the user operates the menu button switch 54 on the controller 15 (Fig. 7) (or the menu button switch 84 on the controller 35 (Fig. 8)), a menu screen is displayed on the CRT 11 of master unit 1 (or the CRT 31 of auxiliary unit 2). On this menu screen, for example, an icon representing the virtual multi-viewpoint display processing (referred to as the virtual multi-viewpoint display icon, where appropriate) is displayed. When the user operates the controller 15 to click the virtual multi-viewpoint display icon, master unit 1 and auxiliary unit 2 perform the virtual multi-viewpoint display processing.
Figure 32 shows a first example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that performs the virtual multi-viewpoint display processing. When the virtual multi-viewpoint display icon is clicked, the CPU 129 of the master unit 1 controls the signal processing section 127 so that the DSP 127A executes a predetermined program stored in the EEPROM 127B. The functional structure shown in Figure 32 is implemented when the DSP 127A executes the program stored in the EEPROM 127B. The other functional structures of the signal processing section 127 described later are implemented in the same manner.
Frame memories 191, 192, and 193 temporarily store, in units of frames (or in units of fields), the luminance signal Y and the chrominance signals R-Y and B-Y of the image data output from the MPEG video decoder 125 (Figure 10). Specifically, the MPEG video decoder 125 performs MPEG decoding on the image-data TS packets of the program on a predetermined channel output from the demultiplexer 124, and outputs, as the decoding result, image data formed of the luminance signal Y and the chrominance signals R-Y and B-Y. The frame memories 191, 192, and 193 store the luminance signal Y and the chrominance signals R-Y and B-Y output from the MPEG video decoder 125 in this way.
In the embodiment shown in Figure 32, the frame memories 191 to 193 each have a storage capacity that allows at least two frames (two fields) of image data to be stored. Specifically, the frame memories 191 to 193 each have two banks, each of which can store one frame of image data, and image data is alternately stored in these two banks.
Therefore, when the most recent frame stored in the frame memory 191 is called the current frame, the frame memory 191 always stores the image data of the current frame and of the frame immediately preceding it (referred to as the preceding frame, where appropriate). The frame memories 192 and 193 store image data in the same manner.
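The two-bank arrangement described above can be sketched as follows. This is a minimal Python illustration of the idea only; the class and member names are hypothetical and do not appear in the patent, which describes hardware frame memories.

```python
class DoubleBankFrameMemory:
    """Two banks, each holding one frame; writes alternate between banks,
    so the memory always holds the current frame and the preceding frame."""

    def __init__(self):
        self.banks = [None, None]  # each bank stores one frame's worth of data
        self.write_bank = 0        # bank that receives the next incoming frame

    def store(self, frame):
        # Overwrite the older bank with the newly decoded frame.
        self.banks[self.write_bank] = frame
        self.write_bank ^= 1       # alternate banks on every frame

    @property
    def current(self):
        # The most recently written bank holds the current frame.
        return self.banks[self.write_bank ^ 1]

    @property
    def previous(self):
        # The other bank still holds the frame stored before it.
        return self.banks[self.write_bank]
```

After two frames have been stored, `current` and `previous` always refer to the last two frames received, which is exactly what the difference detection described below needs.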
Frame memories 194, 195, and 196 respectively store the luminance signal Y and the chrominance signals R-Y and B-Y of one frame (or one field) of image data sent from a memory control section 197 out of the data stored in the frame memories 191, 192, and 193.
The memory control section 197 is controlled by a system controller 201; it sends the image data of the current frame (the luminance signal Y and the chrominance signals R-Y and B-Y) stored in the frame memories 191 to 193 to the frame memories 194 to 196, respectively, and stores the data there in an overwriting manner.
A difference detecting section 198 obtains the difference between the luminance signals Y of the image data of the current frame and of the preceding frame stored in the frame memory 191, as a feature of the current-frame image data, and sends the difference to the system controller 201. Specifically, for example, the difference detecting section 198 obtains the sum of absolute differences between the luminance signal Y of the pixels of the current frame and the luminance signal of the corresponding pixels of the preceding frame as the feature of the current-frame image, and sends the sum to the system controller 201.
A counter section 199 counts a predetermined value under the control of the system controller 201 and sends the count to the system controller 201. The counter section 199 also resets its count according to control from the system controller 201.
An output control section 200 reads the luminance signal Y and the chrominance signals R-Y and B-Y of the one frame of image data stored in the frame memories 194 to 196, and sends them to the CPU 129 under the control of the system controller 201.
The system controller 201 controls the memory control section 197, the counter section 199, and the output control section 200.
More specifically, the system controller 201 compares a predetermined threshold with the sum of absolute differences sent from the difference detecting section 198 as the feature of the current-frame image data, and controls the counter section 199 according to the comparison result. The system controller 201 also controls the memory control section 197 and the output control section 200 according to the count of the counter section 199.
Here, the system controller 201 controls the counter section 199 according to the sum of absolute differences of the luminance signal Y. Alternatively, the difference detecting section 198 may obtain the sum of absolute differences of the chrominance signal R-Y or B-Y, and the system controller 201 may control the counter section 199 with the sum of absolute differences of the chrominance signal R-Y or B-Y also taken into account.
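The feature computed by the difference detecting section 198 can be sketched as follows. This is a hedged Python illustration over flat lists of pixel luminance values; the function names and the example threshold are assumptions for illustration, not values from the patent.

```python
def sad_luminance(current, previous):
    """Sum of absolute differences between the luminance (Y) values of
    corresponding pixels in the current frame and the preceding frame."""
    return sum(abs(c - p) for c, p in zip(current, previous))

def is_almost_still(current, previous, threshold=4):
    # A near-zero sum means the frame barely changed from the preceding
    # frame, so it can be regarded as a still picture (step S103).
    return sad_luminance(current, previous) <= threshold
```

The system controller then compares this sum against a small positive threshold (to detect still pictures, as in Figure 33) or a large threshold Th1 (to detect scene changes, as in Figure 39).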
The luminance signal Y and the chrominance signals R-Y and B-Y of the image data output from the MPEG video decoder 125 (Figure 10) are sent to the frame memories 191 to 193 and also to the matrix circuit 128 located downstream of the signal processing section 127. The matrix circuit 128 converts the luminance signal Y and the chrominance signals R-Y and B-Y sent in this way into RGB image data.
Figure 32 does not show the audio data output from the MPEG audio decoder 126 (Figure 10) of the master unit 1. The audio data output from the MPEG audio decoder 126 is, for example, sent as-is to the amplifier 137 in the subsequent stage.
Next, the virtual multi-viewpoint display processing performed in the master unit by the signal processing section 127 shown in Figure 32 is described with reference to the flowchart shown in Figure 33.
First, at step S101, the frame memories 191 to 193 wait for the luminance signal Y and the chrominance signals R-Y and B-Y of one frame of image data to arrive from the MPEG video decoder 125 (Figure 10), and store the luminance signal Y and the chrominance signals R-Y and B-Y. The processing proceeds to step S102.
At step S102, the difference detecting section 198 detects (obtains), as the feature of the current-frame image data, the sum of absolute differences (referred to as the sum of absolute differences of the current frame, where appropriate) between the luminance signal Y of the image data stored in the frame memory 191 at the immediately preceding execution of step S101, that is, the luminance signal Y of the current-frame image data, and the luminance signal Y of the image data stored in the frame memory 191 at the execution of step S101 before that, that is, the luminance signal Y of the preceding-frame image data, and sends the sum to the system controller 201. The processing proceeds to step S103.
At step S103, the system controller 201 determines whether the sum of absolute differences of the current frame is almost zero, namely, whether the sum is equal to or smaller than (or smaller than) a small positive threshold.
When it is determined at step S103 that the sum of absolute differences of the current frame is neither zero nor a value close to zero, the processing skips steps S104 to S108 and proceeds to step S109.
When it is determined at step S103 that the sum of absolute differences of the current frame is zero or a value close to zero, in other words, when the current-frame image shows almost no (or no) change from the preceding-frame image and the current-frame image can therefore be regarded as a still picture, the processing proceeds to step S104, where the system controller 201 controls the counter section 199 to increment its count by one. The processing proceeds to step S105.
At step S105, the system controller 201 refers to the count of the counter section 199 to determine whether the count is greater than (or equal to or greater than) a predetermined threshold Th_c (for example, 5).
When it is determined at step S105 that the count of the counter section 199 is not greater than the threshold Th_c, the processing skips steps S106 to S108 and proceeds to step S109.
When it is determined at step S105 that the count of the counter section 199 is greater than the threshold Th_c, in other words, when the image data of the predetermined number of frames output from the MPEG video decoder 125 contains no motion, the processing proceeds to step S106, where the system controller 201 controls the memory control section 197 so as to send the current-frame image data (the luminance signal Y and the chrominance signals R-Y and B-Y) stored in the frame memories 191 to 193 to the frame memories 194 to 196 and store the data there in an overwriting manner. The processing proceeds to step S107.
At step S107, the system controller 201 resets the count of the counter section 199. The processing proceeds to step S108. At step S108, the system controller 201 controls the output control section 200 to read the luminance signal Y and the chrominance signals R-Y and B-Y of the one frame of image data stored in the frame memories 194 to 196 and to send these signals to the CPU 129. In addition, at step S108, the system controller 201 sends to the CPU 129 a display request command instructing that the image data be displayed on a predetermined auxiliary unit 2_ij. The processing proceeds to step S109.
When the CPU 129 receives the display request command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 to send the one frame of image data (the luminance signal Y and the chrominance signals R-Y and B-Y) sent from the output control section 200, together with the display request command instructing that the image data be displayed, to the auxiliary unit 2_ij. When the auxiliary unit 2_ij, which is performing the virtual multi-viewpoint display processing, receives the image data and the display request command from the master unit 1, the auxiliary unit displays the image data, as described later.
Therefore, for example, when the above-mentioned threshold Th_c is 5, if the image data of six consecutive frames is almost identical, the image data of the sixth frame is sent from the master unit 1 to the auxiliary unit 2_ij and displayed.
At step S109, the system controller 201 determines whether it has received from the CPU 129 a termination command instructing that the virtual multi-viewpoint display processing be terminated.
When it is determined at step S109 that the termination command has not been received, the processing returns to step S101, and the same processing is repeated thereafter.
When it is determined at step S109 that the termination command has been received, in other words, for example, when the user has operated the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and has clicked the virtual multi-viewpoint display icon on the menu screen again, thereby instructing the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 has sent the termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
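The loop of steps S101 to S109 can be condensed into a short Python sketch. This is an illustrative model only, under two stated assumptions: the "greater than or equal to" reading of the Th_c comparison is used so that the sixth of six identical frames is captured, as in the example above, and the run counter is restarted when motion is detected, consistent with the "consecutive frames" wording; the function and parameter names are hypothetical.

```python
def detect_still_scenes(frames, diff_threshold=4, count_threshold=5):
    """Model of steps S101-S108: count consecutive near-identical frames
    and capture a frame once the run reaches count_threshold."""
    captured = []   # frames that would be sent to the auxiliary unit
    count = 0
    prev = None
    for frame in frames:                 # S101: one frame arrives and is stored
        if prev is not None:
            # S102: sum of absolute luminance differences vs. preceding frame
            sad = sum(abs(c - p) for c, p in zip(frame, prev))
            if sad <= diff_threshold:    # S103: almost zero -> still picture
                count += 1               # S104: increment the counter
                if count >= count_threshold:   # S105: run long enough?
                    captured.append(frame)     # S106/S108: send this frame out
                    count = 0                  # S107: reset the counter
            else:
                count = 0                # motion: the still run is broken
        prev = frame
    return captured
```

With the default thresholds, six consecutive near-identical frames cause exactly one capture, matching the Th_c = 5 example in the text.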
Figure 34 shows an example functional structure of the signal processing section 147 (Figure 11) in the auxiliary unit 2_ij that performs the virtual multi-viewpoint display processing. When the virtual multi-viewpoint display icon is clicked, the CPU 129 of the master unit 1 controls the IEEE-1394 interface 133 (Figure 10) as described above to send a start command, instructing the auxiliary units to perform the virtual multi-viewpoint display processing, to each auxiliary unit 2_ij. In the auxiliary unit 2_ij, when the CPU 149 (Figure 11) receives the start command through the IEEE-1394 interface 153, the CPU 149 causes the DSP 147A of the signal processing section 147 to execute a predetermined program stored in the EEPROM 147B. The functional structure shown in Figure 34 is implemented when the DSP 147A executes the program stored in the EEPROM 147B. The other functional structures of the signal processing section 147 described later are implemented in the same manner.
Frame memories 211, 212, and 213 temporarily store the luminance signal Y and the chrominance signals R-Y and B-Y of one frame (or one field) of image data sent from the CPU 149.
More specifically, according to the virtual multi-viewpoint display processing in the master unit described with reference to Figure 33, when the CPU 129 receives the display request command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 to send the one frame of image data (the luminance signal Y and the chrominance signals R-Y and B-Y) sent from the output control section 200, together with the display request command, to the auxiliary unit 2_ij. In the auxiliary unit 2_ij, the CPU 149 receives the display request command and the one frame of image data through the IEEE-1394 interface 153 (Figure 11). The CPU 149 sends the display request command to a system controller 219 described later, and sends the luminance signal Y and the chrominance signals R-Y and B-Y of the one frame of image data to the frame memories 211, 212, and 213. The frame memories 211, 212, and 213 temporarily store the luminance signal Y and the chrominance signals R-Y and B-Y of the one frame of image data sent from the CPU 149 in this way.
Frame memories 214, 215, and 216 respectively store the luminance signal Y and the chrominance signals R-Y and B-Y of one frame (or one field) of image data sent from a memory control section 217 out of the data stored in the frame memories 211, 212, and 213.
The memory control section 217 is controlled by the system controller 219; it sends the one frame of image data (the luminance signal Y and the chrominance signals R-Y and B-Y) stored in the frame memories 211 to 213 to the frame memories 214 to 216, respectively, and stores the data there in an overwriting manner.
Under the control of the system controller 219, a selector 218 selects either the luminance signal Y and the chrominance signals R-Y and B-Y of the one frame of image data stored in the frame memories 214 to 216, or the luminance signal Y and the chrominance signals R-Y and B-Y of the image data output from the MPEG video decoder 145 (Figure 11) of the auxiliary unit 2_ij, and sends them to the matrix circuit 148 (Figure 11) in the subsequent stage.
Therefore, when the selector 218 selects the image data output from the MPEG video decoder 145 (Figure 11), the image data of the program on the channel selected by the tuner 141 (Figure 11) is displayed on the CRT 31 of the auxiliary unit 2_ij; when the selector 218 selects the image data stored in the frame memories 214 to 216, that image data, namely, the image data sent from the master unit 1 as described above, is displayed on the CRT 31 of the auxiliary unit 2_ij.
The system controller 219 controls the memory control section 217 and the selector 218 under the control of the CPU 149.
As in Figure 32, Figure 34 does not show the audio data output from the MPEG audio decoder 146 (Figure 11) of the auxiliary unit 2_ij. The audio data output from the MPEG audio decoder 146 is, for example, sent as-is to the amplifier 157 in the subsequent stage.
Next, the virtual multi-viewpoint display processing performed in the auxiliary unit by the signal processing section 147 shown in Figure 34 is described with reference to the flowchart shown in Figure 35.
First, at step S121, the system controller 219 controls the selector 218 to select the luminance signal Y and the chrominance signals R-Y and B-Y of the one frame of image data stored in the frame memories 214 to 216, and starts displaying the data. More specifically, the selector 218 thus repeatedly reads the luminance signal Y and the chrominance signals R-Y and B-Y of the one frame of image data stored in the frame memories 214 to 216, and sends them to the matrix circuit 148 (Figure 11) in the subsequent stage. In this way, the image data sent from the master unit 1 in the manner described with reference to Figure 33 is displayed on the CRT 31 of the auxiliary unit 2_ij.
When the virtual multi-viewpoint display processing is not being performed in the auxiliary unit 2, the selector 218 selects the image data output from the MPEG video decoder 145 (Figure 11) and outputs it to the matrix circuit 148 in the subsequent stage. In this case, therefore, the image data of the program on the channel selected by the tuner 141 is displayed on the CRT 31 of the auxiliary unit 2.
Then, the processing proceeds to step S122, where the system controller 219 determines whether the CPU 149 has sent one frame of image data together with a display request command.
When it is determined at step S122 that the display request command and the image data have not been received, the processing skips steps S123 and S124 and proceeds to step S125, and the same processing is repeated thereafter.
When it is determined at step S122 that the display request command and the image data have been received, in other words, when the master unit 1 has sent the display request command and the image data to the auxiliary unit 2_ij through the virtual multi-viewpoint display processing in the master unit described with reference to Figure 33, the processing proceeds to step S123, where the frame memories 211 to 213 respectively store the luminance signal Y and the chrominance signals R-Y and B-Y of the image data. The processing proceeds to step S124.
At step S124, the system controller 219 controls the memory control section 217 so as to wait for the selector 218 to finish reading one frame of image data from the frame memories 214 to 216, then to send the luminance signal Y and the chrominance signals R-Y and B-Y stored in the frame memories 211 to 213 at the immediately preceding execution of step S123 to the frame memories 214 to 216, respectively, and to store them there in an overwriting manner. The processing proceeds to step S125.
In this way, from the next display timing, the selector 218 reads the image data newly stored in the frame memories 214 to 216. The image data is sent to the CRT 31 through the matrix circuit 148 in the subsequent stage, and is displayed.
At step S125, the system controller 219 determines whether a termination command has been sent from the CPU 149 (Figure 11).
More specifically, as described with reference to Figure 33, when instructed to terminate the virtual multi-viewpoint display processing, the CPU 129 (Figure 10) of the master unit 1 sends the termination command to the system controller 201. At the same time, the CPU 129 controls the IEEE-1394 interface 133 (Figure 10) to also send the termination command to the auxiliary unit 2_ij. In the auxiliary unit 2_ij, the CPU 149 receives the termination command from the master unit 1 through the IEEE-1394 interface 153. When the CPU 149 receives the termination command, it sends the command to the system controller 219. At step S125, it is determined whether the termination command has been sent in this way from the CPU 149 to the system controller 219.
When it is determined at step S125 that the termination command has not been sent from the CPU 149, the processing returns to step S122, and the same processing is repeated thereafter.
When it is determined at step S125 that the termination command has been sent from the CPU 149, the processing proceeds to step S126, where the system controller 219 controls the selector 218 to return its selection state to the state used before the change made at step S121, and the virtual multi-viewpoint display processing in the auxiliary unit is terminated.
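The selector behavior over steps S121 to S126 can be summarized in a small sketch. This is a hypothetical software model of the hardware selector 218 and the latched frame memories, with one slot standing in for the frame memories 214 to 216; the names and the message shapes are illustrative assumptions.

```python
class AuxiliarySelector:
    """Switches the displayed source between the unit's own tuner output
    and the frame latched from the master unit (selector 218, simplified)."""

    def __init__(self):
        self.stored_frame = None   # frame memories 214-216, collapsed to one slot
        self.use_stored = False    # True while multi-viewpoint display is active

    def start(self):
        self.use_stored = True     # S121: select the stored frame for display

    def receive(self, frame):
        self.stored_frame = frame  # S123/S124: latch a frame from the master

    def stop(self):
        self.use_stored = False    # S126: restore the previous selection state

    def output(self, tuner_frame):
        # Emit either the latched frame or the live tuner frame.
        return self.stored_frame if self.use_stored else tuner_frame
```

Until `start` is called, and again after `stop`, the unit simply shows its own tuner's program, as the flowchart describes.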
According to the virtual multi-viewpoint display processing described with reference to Figures 32 to 35, when the master unit 1 is displaying the image data of a program as shown in Figure 36A, if consecutive frames contain no (or almost no) motion, one frame of the image data containing no motion is sent from the master unit 1 to an auxiliary unit of the scalable television system, for example the auxiliary unit 2_11, and is displayed on its CRT 31 as shown in Figure 36B.
Then, if consecutive frames with no motion again appear in the image data displayed on the master unit 1, one frame of the image data containing no motion is sent from the master unit 1 to the auxiliary unit 2_11, and is displayed on its CRT 31 in place of the image data displayed so far, as shown in Figure 36C.
In Figure 36A, the master unit 1 displays the image data of a live television broadcast of a professional baseball game. In Figure 36B, the master unit 1 continues to display the live television broadcast of the professional baseball game, and the auxiliary unit 2_11 displays the image data of a scoreboard as one frame of image data containing no motion. Further, in Figure 36C, the master unit 1 continues to display the image data of the live television broadcast of the professional baseball game, and the auxiliary unit 2_11 displays the image data of the players' bench as one frame of image data containing no motion.
In other words, in the embodiment shown in Figures 36A to 36C, the master unit 1 displays the image data of the live television broadcast of the professional baseball game in the same manner as an ordinary television receiver. Because the camera is switched during the live broadcast so that the scoreboard is shown in close-up, and this scene continues for some frames, the master unit 1 sends the scoreboard image data to the auxiliary unit 2_11, and the auxiliary unit 2_11 displays that image data (Figure 36B). Further, because the camera is then switched again so that the players' bench is shown, and this scene also continues for some frames, the master unit 1 sends the players'-bench image data to the auxiliary unit 2_11, and the auxiliary unit 2_11 displays that image data (Figure 36C).
As described above, according to the virtual multi-viewpoint display processing described with reference to Figures 32 to 35, when consecutive frames containing no motion appear in the program displayed on the master unit 1, one frame of that image data is sent to the auxiliary unit 2 and displayed. Therefore, the user can simultaneously watch the image data displayed on the master unit 1 and a scene different from that image data; in other words, the user can watch image data taken from a plurality of viewpoints at the same time.
Usually, a live broadcast of a professional baseball game shows the scoreboard in close-up at the beginning of each inning. In this case, according to the virtual multi-viewpoint display processing, even when the user did not see the scoreboard shown in close-up at the beginning of an inning, since that scene is displayed on the auxiliary unit 2, the user can still know the score immediately.
In the embodiment shown in Figures 36A to 36C, image data is always sent from the master unit 1 to the auxiliary unit 2_11. The auxiliary unit 2_ij to which the image data is sent from the master unit 1 and on which it is displayed can, however, be changed.
Specifically, for example, the image data of a first group of frames containing no motion may be sent from the master unit 1 to the auxiliary unit 2_11, as shown in Figure 37A; the image data of a second group of frames containing no motion may be sent from the master unit 1 to the auxiliary unit 2_12, as shown in Figure 37B; and the auxiliary unit 2_ij to which the image data is sent and on which it is displayed may be changed in order in the same manner. In this case, when image data has been sent to, and displayed on, all the auxiliary units 2_ij as shown in Figure 37C, for example, the image data of the next group of frames containing no motion may be sent to the auxiliary unit 2_11, to which image data was sent and displayed first, and displayed in place of the image data displayed so far.
In this case, the user can simultaneously watch the image data displayed on the master unit 1 and many scenes different from that image data.
In the embodiment shown in Figures 37A to 37C, image data is sent from the master unit 1 to all the auxiliary units 2_ij of the scalable television system and displayed. Alternatively, image data may be sent from the master unit 1 only to certain auxiliary units 2_ij of the scalable television system and displayed there. The particular auxiliary units 2_ij can be specified, for example, on the menu screen.
Next, Figure 38 shows a second example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that performs the virtual multi-viewpoint display processing. In the figure, the parts corresponding to those shown in Figure 32 are assigned the same reference symbols as used in Figure 32, and descriptions thereof are omitted where unnecessary. Specifically, except that the counter section 199 is not provided, the signal processing section 127 shown in Figure 38 has basically the same structure as that shown in Figure 32.
Next, the virtual multi-viewpoint display processing performed in the master unit by the signal processing section 127 shown in Figure 38 is described with reference to the flowchart shown in Figure 39.
At steps S131 and S132, the same processing as that at steps S101 and S102 shown in Figure 33 is performed.
After the difference detecting section 198 has detected, at step S132, the sum of absolute differences of the current frame as the feature of the current-frame image data and has sent the sum to the system controller 201, the processing proceeds to step S133, where the system controller 201 determines whether the sum of absolute differences of the current frame is greater than (or equal to or greater than) a predetermined threshold Th1.
When it is determined at step S133 that the sum of absolute differences of the current frame is not greater than the threshold Th1, the processing skips steps S134 and S135 and proceeds to step S136.
When it is determined at step S133 that the sum of absolute differences of the current frame is greater than the threshold Th1, in other words, when the current-frame image has changed greatly from the preceding-frame image and the scene has therefore changed in the current-frame image, the processing proceeds to step S134, where the system controller 201 controls the memory control section 197 in the same manner as at step S106 shown in Figure 33, so as to send the luminance signal Y and the chrominance signals R-Y and B-Y of the current-frame image data stored in the frame memories 191 to 193 to the frame memories 194 to 196 and store the data there in an overwriting manner. The processing proceeds to step S135.
At step S135, the system controller 201 controls the output control section 200 to read the luminance signal Y and the chrominance signals R-Y and B-Y of the one frame of image data stored in the frame memories 194 to 196 and to send these signals to the CPU 129. In addition, at step S135, the system controller 201 sends to the CPU 129 a display request command instructing that the image data be displayed on a predetermined auxiliary unit 2_ij. The processing proceeds to step S136.
When the CPU 129 receives the display request command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 as described above to send the one frame of image data (the luminance signal Y and the chrominance signals R-Y and B-Y) sent from the output control section 200, together with the display request command, to the auxiliary unit 2_ij. In this case, the signal processing section 147 of the auxiliary unit 2_ij is structured as shown in Figure 34 and performs the virtual multi-viewpoint display processing described with reference to Figure 35. Therefore, as described with reference to Figures 36A to 36C and Figures 37A to 37C, the auxiliary unit 2_ij displays the image data sent from the master unit 1 together with the display request command.
In the embodiment shown in Figure 39, since the first frame obtained after each scene change is sent from the master unit 1 to the auxiliary unit 2_ij, the auxiliary unit 2_ij displays a digest of the program displayed on the master unit 1.
At step S136, the system controller 201 determines whether it has received from the CPU 129 a termination command instructing that the virtual multi-viewpoint display processing be terminated.
When it is determined at step S136 that the termination command has not been received, the processing returns to step S131, and the same processing is repeated thereafter.
When it is determined at step S136 that the termination command has been received, in other words, for example, when the user has operated the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and has clicked the virtual multi-viewpoint display icon on the menu screen again, thereby instructing the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 has sent the termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
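The scene-change variant of Figure 39 differs from the still-picture variant only in the comparison at step S133. A hedged Python sketch (illustrative names and an arbitrary example threshold, not values from the patent) makes the difference clear:

```python
def detect_scene_changes(frames, threshold=50):
    """Model of steps S131-S135: capture the first frame after each large
    jump in the luminance difference (sum of absolute differences > Th1),
    i.e. the frame immediately after a scene change."""
    captured = []   # frames that would be sent to the auxiliary unit
    prev = None
    for frame in frames:                 # S131: one frame arrives
        if prev is not None:
            # S132: sum of absolute luminance differences vs. preceding frame
            sad = sum(abs(c - p) for c, p in zip(frame, prev))
            if sad > threshold:          # S133: large change -> scene change
                captured.append(frame)   # S134/S135: send this frame out
        prev = frame
    return captured
```

Note that no counter is needed here, which is why Figure 38 omits the counter section 199: a single large frame-to-frame difference is enough to trigger a capture.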
Next, Figure 40 shows a third example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that performs the virtual multi-viewpoint display processing. In the figure, the parts corresponding to those shown in Figure 32 are assigned the same reference symbols as used in Figure 32, and descriptions thereof are omitted where unnecessary. In other words, except that the frame memories 192 to 196, the memory control section 197, and the output control section 200 are not provided, the signal processing section 127 shown in Figure 40 has basically the same structure as that shown in Figure 32.
Next, the virtual multi-viewpoint display processing performed in the master unit by the signal processing section 127 shown in Figure 40 is described with reference to the flowchart shown in Figure 41.
At steps S141 to S145, the same processing as that at steps S101 to S105 shown in Figure 33 is performed. At step S141, however, only the luminance signal Y of the image data output from the MPEG video decoder 125 (Figure 10) is stored in the frame memory 191.
When it is determined at step S145 that the count of the counter section 199 is not greater than the threshold Th_c, the processing skips steps S146 and S147 and proceeds to step S148.
When it is determined at step S145 that the count of the counter section 199 is greater than the threshold Th_c, in other words, when the image data of the predetermined number of frames output from the MPEG video decoder 125 contains no motion, the processing proceeds to step S146, where the system controller 201 sends to the CPU 129 a freeze command instructing the auxiliary unit 2_ij to tune to the channel currently selected by the tuner 121 of the master unit 1 (the current channel) and to freeze and display one frame of the image data of the program broadcast on that channel. The processing proceeds to step S147.
When the CPU 129 receives the freeze command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 to send the freeze command to the auxiliary unit 2_ij. When the auxiliary unit 2_ij, which is performing the virtual multi-viewpoint display processing, receives the freeze command from the master unit 1, the auxiliary unit receives the channel specified by the freeze command, stores one frame of the image data of the program on that channel, and displays it, as described later.
In the embodiments shown in Figures 32 and 33, when the image data displayed on the master unit 1 does not change (hardly changes) over some consecutive frames, the unchanging image data is sent from the master unit 1 to the auxiliary unit 2_ij and displayed. In the embodiments shown in Figures 40 and 41, however, instead of sending the image data from the master unit 1 to the auxiliary unit 2_ij, a freeze command merely containing the channel on which the image-data program is being broadcast is sent. In the auxiliary unit 2_ij, as described later, the tuner 141 selects the channel contained in the freeze command, and the image data of the program broadcast on that channel is stored and displayed.
At step S147, the system controller 201 resets the count of the counter section 199 to zero. The processing proceeds to step S148.
At step S148, the system controller 201 determines whether it has received from the CPU 129 a termination command instructing that the virtual multi-viewpoint display processing be terminated.
When it is determined at step S148 that the termination command has not been received, the processing returns to step S141, and the same processing is repeated thereafter.
When it is determined at step S148 that the termination command has been received, in other words, for example, when the user has operated the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and has clicked the virtual multi-viewpoint display icon on the menu screen again, thereby instructing the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 has sent the termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
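The key design choice of this third embodiment is that the master sends a small command naming a channel instead of shipping a full frame of pixels over the IEEE-1394 bus. A hedged Python sketch of the message exchange follows; the dictionary shape and the function names are illustrative assumptions, since the patent does not specify the command's wire format.

```python
def make_freeze_command(channel):
    """Step S146 (master side): build a freeze command carrying only the
    channel number, not the image data itself (message shape illustrative)."""
    return {"command": "freeze", "channel": channel}

def handle_freeze_command(cmd, tune):
    """Steps S152-S153 (auxiliary side): extract the channel from the
    freeze command and hand it to the unit's own tuner via `tune`."""
    if cmd.get("command") != "freeze":
        raise ValueError("not a freeze command")
    return tune(cmd["channel"])
```

The auxiliary unit then decodes and freezes one frame of that channel's broadcast locally, so the bus carries only a few bytes per still scene rather than a full frame.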
Next, Figure 42 shows an example functional structure of the signal processing section 147 (Figure 11) in the auxiliary unit 2_ij corresponding to the case in which the signal processing section 127 of the master unit 1 is structured as shown in Figure 40. In Figure 42, the parts corresponding to those shown in Figure 34 are assigned the same reference symbols as used in Figure 34, and descriptions thereof are omitted where unnecessary. Specifically, the signal processing section 147 shown in Figure 42 has basically the same structure as that shown in Figure 34.
In the embodiment shown in Figure 42, however, it is not the image data output from the CPU 149 (Figure 11) but the image data output from the MPEG video decoder 145 (Figure 11) that is sent to the frame memories 211 to 213.
Next step, the virtual many viewpoints Graphics Processing in the auxiliary unit of being carried out by signal processing shown in Figure 42 147 with reference to flow chart description shown in Figure 43.
At first, at step S151, system controller 219 is controlled selector 218 with brightness signal Y and chrominance signal R-Y and the B-Y of the view data of Selective storage in frame memory 214 to 216, and begins to show data.More particularly, therefore, selector 218 repeats to read brightness signal Y and chrominance signal R-Y and the B-Y that is stored in the frame image data in frame memory 214 to 216, and they are sent to the matrix circuit 148 (Figure 11) of rear one-level.Like this, the view data that is stored in frame memory 214 to 216 is presented at auxiliary unit 2 ij CRT 32 on.
In embodiment shown in Figure 43, before the virtual many viewpoints Graphics Processing of beginning, suppose that frame memory 214 to 216 storages for example have the view data of black level.In this case, be adjacent to the view data that will have black level after the processing of execution in step S151 and be presented at auxiliary unit 2 ij CRT 32 on.
Processing then proceeds to step S152, and the system controller 219 determines whether a freeze command has been received. When it is determined that no freeze command has been received, processing skips steps S153 and S154 and proceeds to step S155.

When it is determined at step S152 that a freeze command has been received, in other words, when the freeze command sent by the main unit 1 at step S146 of Figure 41 has been received by the CPU 149 through the IEEE-1394 interface 153 (Figure 11) and sent to the system controller 219, processing proceeds to step S153, and the system controller 219 requests the CPU 149 to have the tuner 141 receive the channel contained in the freeze command. The CPU 149 controls the tuner 141 according to the request from the system controller 219 so as to receive the channel contained in the freeze command.

In this way, the tuner 141 receives the channel contained in the freeze command, and the received signal is sent to the signal processing section 147 through the QPSK demodulation circuit 142, the error correcting circuit 143, the demultiplexer 144, and the MPEG video decoder 145 and MPEG audio decoder 146.

The frame memories 211 to 213 of the signal processing section 147 start storing the image data of the channel contained in the freeze command, sent as described above, and processing proceeds to step S154.

The frame memories 211 to 213 sequentially store the frames of image data sent to them, in overwrite mode.

At step S154, the system controller 219 controls the memory control section 217 to wait until the most recent frame of image data is stored in the frame memories 211 to 213, to send the luminance signal Y and the chrominance signals R-Y and B-Y of that image data to the frame memories 214 to 216, respectively, and to store them there in overwrite mode. Processing proceeds to step S155.

The selector 218 thus reads the image data newly stored in the frame memories 214 to 216 at step S154. Consequently, the image data newly stored in the frame memories 214 to 216, that is, the image data of the same channel as the channel being received by the main unit 1, obtained immediately after the main unit 1 displayed image data containing no (or little) motion over several successive frames, is displayed on the CRT 32 of the subsidiary unit 2_ij.
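The overwrite-mode storage and the freeze-triggered copy described above can be sketched as follows. This is a minimal illustration in Python, assuming each frame is represented by a single placeholder value; the class and function names are hypothetical and the real frame memories 211 to 216 hold separate Y, R-Y, and B-Y planes.

```python
from collections import deque

class FrameMemory:
    """Sketch of an overwrite-mode frame memory: only the most recent
    frames are kept; older ones are overwritten."""
    def __init__(self, depth=1):
        self.frames = deque(maxlen=depth)  # deque drops oldest entries

    def store(self, frame):
        self.frames.append(frame)

    def latest(self):
        return self.frames[-1]

# receive-side memories (like 211-213) and display-side memories (like 214-216)
receive_memory = FrameMemory()
display_memory = FrameMemory()

def on_freeze_command(decoded_frames):
    """On a freeze command: store the decoded frames as they arrive,
    then copy the most recent one into the display memory (step S154)."""
    for f in decoded_frames:
        receive_memory.store(f)
    display_memory.store(receive_memory.latest())
    return display_memory.latest()

shown = on_freeze_command(["frame-1", "frame-2", "frame-3"])
```

Because the display-side memory is only overwritten when a freeze command arrives, the selector keeps re-reading (and the CRT keeps showing) the same still frame until the next command.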
At step S155, the system controller 219 determines whether a termination command has been sent from the CPU 149 (Figure 11).

More specifically, as described above, the CPU 129 (Figure 10) of the main unit 1 sends a termination command to the system controller 201; at the same time, the CPU 129 also sends the termination command to the subsidiary units 2_ij. In each subsidiary unit 2_ij, the CPU 149 receives the termination command from the main unit 1 through the IEEE-1394 interface 153. When the CPU 149 receives the termination command, it sends the command to the system controller 219. At step S155, it is determined whether a termination command has been sent from the CPU 149 to the system controller 219 in this way.

When it is determined at step S155 that no termination command has been sent from the CPU 149, processing returns to step S152, and the same processing is repeated thereafter.

When it is determined at step S155 that a termination command has been sent from the CPU 149, processing proceeds to step S156, and the system controller 219 controls the tuner 141 to return it to the tuning state used immediately before the virtual multi-viewpoint display processing began. Processing proceeds to step S157.

At step S157, the system controller 219 controls the selector 218 to return its selection state to the state used before it was changed at step S151, and the virtual multi-viewpoint display processing in the subsidiary unit is terminated.

According to the virtual multi-viewpoint display processing described with reference to Figures 40 to 43, in the same way as the virtual multi-viewpoint display processing described with reference to Figures 32 to 35, when several successive frames of the image data of the program being displayed on the main unit 1 contain no (or little) motion, the subsidiary units 2_ij display the image data of the motionless frames, as described with reference to Figures 36A to 36C or Figures 37A to 37C.
Next, Figure 44 shows a fourth example functional structure of the signal processing section 127 (Figure 10) in the main unit 1 for performing the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 38 are assigned the same symbols as used in Figure 38, and their descriptions are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 44 has basically the same structure as that of Figure 38, except that the frame memories 192 to 196, the memory control section 197, and the output control section 200 are not provided.

In the embodiments shown in Figures 38 and 39, when a scene change occurs in the image data displayed on the main unit 1, the image data of the frame immediately after the scene change is sent from the main unit 1 to a subsidiary unit 2_ij and displayed there. In the embodiment shown in Figure 44, however, as in the case shown in Figures 40 and 41, it is not the image data but a freeze command containing just the channel being received by the main unit 1 that is sent from the main unit 1 to the subsidiary units 2_ij. In each subsidiary unit 2_ij, as described with reference to Figures 42 and 43, the tuner 141 selects the channel contained in the freeze command sent from the main unit 1, and the image data of the program broadcast on that channel is immediately stored and displayed.
Specifically, Figure 45 is a flowchart showing the virtual multi-viewpoint display processing performed in the main unit by the signal processing section 127 shown in Figure 44.

At steps S161 to S163, the same processing as that at steps S131 to S133 shown in Figure 39 is performed.

When it is determined at step S163 that the sum of absolute differences for the current frame is not greater than the threshold Th1, processing skips step S164 and proceeds to step S165.

When it is determined at step S163 that the sum of absolute differences for the current frame is greater than the threshold Th1, in other words, when the current-frame image differs greatly from the preceding-frame image and a scene change therefore occurs at the current frame, processing proceeds to step S164, and the system controller 201 sends a freeze command to the CPU 129 in the same way as at step S146 of Figure 41. Processing proceeds to step S165.
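The scene-change test applied at step S163 can be sketched as follows. This is a minimal illustration in Python, assuming frames are one-dimensional lists of luminance values and using a made-up threshold `TH1`; the actual processing compares full frames of decoded image data.

```python
def frame_difference(cur, prev):
    """Sum of absolute pixel differences between the current frame and
    the preceding frame (the quantity compared with threshold Th1)."""
    return sum(abs(c - p) for c, p in zip(cur, prev))

def scene_changed(cur, prev, th1):
    # A large difference from the preceding frame is treated as a scene change
    return frame_difference(cur, prev) > th1

prev_frame = [10, 10, 10, 10]
same_scene = [11, 9, 10, 12]    # small change: no scene change
new_scene  = [200, 5, 180, 90]  # large change: scene change detected

TH1 = 50  # hypothetical threshold value
```

When `scene_changed` returns true, the main unit would issue the freeze command of step S164 rather than transmit the frame itself.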
When the CPU 129 receives the freeze command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 to send the freeze command to the subsidiary units 2_ij.

In this case, the signal processing section 147 of each subsidiary unit 2_ij is structured as shown in Figure 42 and performs the virtual multi-viewpoint display processing described with reference to Figure 43. Therefore, when a subsidiary unit 2_ij receives the freeze command from the main unit 1, it immediately starts receiving the channel that has been received by the main unit 1, and immediately stores and displays the image data of the program on that channel. In other words, in this case as well, the subsidiary units 2_ij display a digest of the program displayed on the main unit 1, in the same way as in the embodiments described with reference to Figures 38 and 39.

At step S165, the system controller 201 determines whether a termination command instructing it to stop the virtual multi-viewpoint display processing has been received from the CPU 129.

When it is determined at step S165 that no termination command has been received, processing returns to step S161, and the same processing is repeated thereafter.

When it is determined at step S165 that a termination command has been received, in other words, when, for example, the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and clicks the virtual multi-viewpoint display icon on the menu screen again, thereby instructing the CPU 129 to stop the virtual multi-viewpoint display processing, and the CPU 129 sends a termination command to the system controller 201, the virtual multi-viewpoint display processing in the main unit is terminated.
In the embodiments shown in Figures 40 to 45, the freeze command is sent from the main unit 1 to the subsidiary units 2_ij by IEEE-1394 communication. The freeze command can also be sent through the controller 15 (or 35) by infrared communication, for example, as shown in Figure 46.

More specifically, in the embodiment shown in Figure 46, the CPU 129 of the main unit 1 instructs the IR interface 135 to send the freeze command to a subsidiary unit 2_ij. In response to the instruction from the CPU 129, the IR interface 135 emits infrared rays corresponding to a transfer command that requests the transfer of the freeze command to the subsidiary unit 2_ij. The infrared rays are received by the controller 15, and in response to the transfer command corresponding to the received infrared rays, the controller 15 emits infrared rays corresponding to the freeze command toward the subsidiary unit 2_ij. The infrared rays corresponding to the freeze command are received by the IR interface 155 of the subsidiary unit 2_ij, and the IR interface 155 sends the freeze command corresponding to the infrared rays to the CPU 149.

The IR interface 135 of the main unit 1 and the controller 15 send frame data having the format described with reference to Figure 23 by infrared rays. The frame data sent from the IR interface 135 is called F1, and the frame data sent from the controller 15 is called F2. The frame data F1 contains the device code assigned to the IR interface 135 and the controller 15, so that the frame data F1 sent from the IR interface 135 is received by the controller 15.

In the present case, the frame data F1 sent from the IR interface 135 to the controller 15 requests that the freeze command be sent to a subsidiary unit 2_ij. Therefore, the frame data F1 needs to contain a transfer command indicating a transfer to the subsidiary unit 2_ij, the freeze command to be transferred, and the device code of the subsidiary unit 2_ij serving as the transfer destination.

In the command code of the frame data F1, the command code of the transfer command is placed as the operation code proper, and the command code of the freeze command and the device code of the transfer destination to which the freeze command is to be delivered by the transfer command (in this case, the device code of the IR interface 155 of the subsidiary unit 2_ij) are placed as operands.

In this case, the main unit 1 that sends the frame data F1 needs to know the device code of the subsidiary unit 2_ij serving as the transfer destination to which the freeze command is to be delivered by the transfer command. The device code of a subsidiary unit 2_ij is reported to the main unit 1 by IEEE-1394 communication after the above-described authentication processing (Figure 31) is performed, for example, immediately after the subsidiary unit 2_ij is connected to the main unit 1 through an IEEE-1394 cable. In this way, the main unit 1 learns the device codes of all the subsidiary units 2_ij forming the scalable television system.
When the controller 15 receives the above-described frame data F1, at step S4 of the controller processing described with reference to Figure 26, the controller 15 performs the processing corresponding to the command code placed in the frame data F1 to generate frame data F2 in which the freeze command code and the transfer-destination device code contained in the frame data F1 are placed, and sends it to the subsidiary unit 2_ij.

More specifically, in this case, the controller 15 shown in Figure 22 responds to the transfer command placed in the frame data F1 and performs the command transfer processing according to the flowchart shown in Figure 47, as the processing corresponding to the command code performed at step S4 of Figure 26.

In the command transfer processing, first, at step S171, when the control section 162 receives the frame data F1 from the reception processing section 167, the control section 162 controls the frame generating section 163 to put the transfer-destination device code placed in the command code of the frame data F1 into the device code of the frame data F2. Processing proceeds to step S172.

At step S172, the control section 162 controls the frame generating section 163 to place the freeze command contained in the command code of the frame data F1 into the command code of the frame data F2. Processing proceeds to step S173.

At step S173, the frame generating section 163 sends the frame data F2, in which the transfer-destination device code and the command code of the freeze command have been placed as described above, to the transmission processing section 164, so that the frame data F2 is output by infrared rays, and the processing ends.
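The controller-side relay of steps S171 to S173 can be sketched as follows. This is a minimal illustration in Python under stated assumptions: the frame layout is reduced to a dictionary, and all device codes and command names (`CONTROLLER_15`, `SUBUNIT_2_11`, `TRANSFER`, `FREEZE`) are made up; the actual Figure 23 frame format is not reproduced here.

```python
def build_frame(device_code, command_code, operands=()):
    """Hypothetical frame layout: destination device code, command code,
    then any operands carried with the command."""
    return {"device": device_code, "command": command_code,
            "operands": list(operands)}

def relay_transfer_command(frame_f1):
    """Controller-side command transfer (steps S171-S173): take the
    freeze command code and destination device code carried as operands
    of F1's transfer command and repack them into frame F2."""
    assert frame_f1["command"] == "TRANSFER"
    freeze_code, destination = frame_f1["operands"]
    return build_frame(destination, freeze_code)

# The main unit's IR interface asks the controller to forward a freeze
# command to a subsidiary unit (all codes are illustrative).
f1 = build_frame("CONTROLLER_15", "TRANSFER", ["FREEZE", "SUBUNIT_2_11"])
f2 = relay_transfer_command(f1)
```

The point of the two-hop design is that F1 is addressed to the controller while F2 is addressed to the subsidiary unit, so the same infrared frame format serves both hops.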
In this case, the frame data F2 contains the device code of the transfer destination, that is, the device code of the subsidiary unit 2_ij. Therefore, in the subsidiary unit 2_ij, the IR interface 155 receives the frame data F2 and sends the command corresponding to its command code, namely the freeze command, to the CPU 149.

Commands sent from the main unit 1 to the subsidiary units 2_ij by infrared communication are not limited to the freeze command; other commands can also be transferred.
Next, Figure 48 shows a fifth example functional structure of the signal processing section 127 (Figure 10) in the main unit 1 for performing the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 32 are assigned the same symbols as used in Figure 32, and their descriptions are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 48 has basically the same structure as that of Figure 32, except that the difference detecting section 198 and the counter section 199 are not provided, and a circular buffer 221, an audio comparing section 222, and an audio pattern storage section 223 are newly provided.

The circular buffer 221 receives the audio data output from the MPEG audio decoder 126 (Figure 10) and sequentially stores it.

The audio data output from the MPEG audio decoder 126 is sent as-is to the circular buffer 221 and to the amplifier 137 at the subsequent stage.

The audio comparing section 222 uses the audio data stored in the circular buffer 221 as an input pattern, compares the input pattern with the audio data stored in the audio pattern storage section 223 as a reference pattern, and sends the comparison result to the system controller 201.

The audio pattern storage section 223 stores audio data serving as the reference pattern.

The audio data output from the MPEG audio decoder 126 (Figure 10) and stored in the circular buffer 221 is sent to the audio pattern storage section 223, and under the control of the system controller 201, the audio pattern storage section 223 can store the audio data held in the circular buffer 221 as a new reference pattern. In other words, the audio reference pattern stored in the audio pattern storage section 223 can be updated.
Next, the virtual multi-viewpoint display processing performed in the main unit by the signal processing section 127 shown in Figure 48 is described with reference to the flowchart shown in Figure 49.

The frame memories 191 to 193 sequentially store, in overwrite mode, the luminance signal Y and the chrominance signals R-Y and B-Y of the image data sent from the MPEG video decoder 125 (Figure 10).

The circular buffer 221 also sequentially stores, in overwrite mode, the audio data sent from the MPEG audio decoder 126 (Figure 10).

In the virtual multi-viewpoint display processing, first, at step S181, the system controller 201 determines whether an audio-pattern registration request has been sent from the CPU 129.

As described above, the audio pattern storage section 223 can store the audio data held in the circular buffer 221 as a new reference pattern; in other words, a new reference pattern can be registered. This registration request can be made, for example, by clicking an audio-pattern registration request icon on the menu screen displayed when the menu button switch 84 on the controller 15 (Fig. 7) is operated.

At step S181, it is determined whether the audio-pattern registration request icon has been clicked in this way.

When it is determined at step S181 that no audio-pattern registration request has been received, processing skips step S182 and proceeds to step S183.

When it is determined at step S181 that an audio-pattern registration request has been received, in other words, when the user has clicked the audio-pattern registration request icon and an audio-pattern registration request asking for registration of a new reference pattern has therefore been sent from the CPU 129 to the system controller 201, processing proceeds to step S182, and the system controller 201 stores in the audio pattern storage section 223, as a new reference pattern, the audio data held in the circular buffer 221 from, for example, the most recent audio sample back to the sample a predetermined time earlier.
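The registration of step S182, taking the most recent stretch of samples out of an overwrite-mode circular buffer, can be sketched as follows. This is a minimal illustration in Python with toy values: the sample rate, buffer depth, and rollback time are all assumptions, not values from the document.

```python
from collections import deque

SAMPLE_RATE = 8       # samples per second (toy value)
ROLLBACK_SECONDS = 2  # how far back from the newest sample to copy

# circular buffer holding the last 4 seconds of audio (overwrite mode)
ring = deque(maxlen=SAMPLE_RATE * 4)

def register_reference_pattern(buffer):
    """Step S182: copy the samples from the most recent one back to a
    predetermined time earlier, to serve as the new reference pattern."""
    n = SAMPLE_RATE * ROLLBACK_SECONDS
    return list(buffer)[-n:]

for sample in range(40):  # audio samples keep arriving; old ones are dropped
    ring.append(sample)

reference = register_reference_pattern(ring)
```

Because the deque overwrites its oldest entries, the snapshot always covers the most recent portion of the program sound, which is exactly the sound the user was hearing when the registration icon was clicked.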
Therefore, when audio data that the user wants to use as a reference pattern is output while the user is listening to the program sound output from the speaker units 12L and 12R, the user can register that audio data as the reference pattern by operating the controller 15.

The audio pattern storage section 223 may store a single reference pattern, that is, store a new reference pattern so as to overwrite the reference pattern already stored in the audio pattern storage section 223, or it may store a plurality of reference patterns, that is, store a new reference pattern in addition to the reference patterns already stored in the audio pattern storage section 223.

When the new reference pattern has been stored in the audio pattern storage section 223 at step S182, processing proceeds to step S183, and the audio comparing section 222 reads, for example, all the audio data stored in the circular buffer 221 as the input pattern. Processing proceeds to step S184.

At step S184, the audio comparing section 222 reads the reference pattern stored in the audio pattern storage section 223 and compares it with the input pattern. More specifically, the audio comparing section 222 obtains the distance between the input pattern and the reference pattern (hereinafter called the audio pattern distance, where necessary) while elongating or shortening the time axis at predetermined ratios, obtains the minimum audio pattern distance as the feature of the input pattern (with respect to the reference pattern), and sends it to the system controller 201.
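The comparison of step S184, taking the minimum distance over several time-axis ratios, can be sketched as follows. This is a minimal illustration in Python under stated assumptions: nearest-neighbour resampling stands in for the time-axis elongation/shortening, the ratio set `(0.5, 1.0, 2.0)` is made up, and the distance is a mean absolute difference rather than whatever measure the actual audio comparing section 222 uses.

```python
def resample(pattern, length):
    """Nearest-neighbour resampling: stretches or shrinks the time axis."""
    return [pattern[int(i * len(pattern) / length)] for i in range(length)]

def audio_pattern_distance(input_pattern, reference, ratios=(0.5, 1.0, 2.0)):
    """Compare the input against the reference at several time-axis
    ratios and return the smallest distance found."""
    best = None
    for r in ratios:
        stretched = resample(reference, max(1, int(len(reference) * r)))
        n = min(len(stretched), len(input_pattern))
        d = sum(abs(a - b) for a, b in zip(input_pattern[:n], stretched[:n])) / n
        best = d if best is None else min(best, d)
    return best

ref = [0, 1, 2, 3]
slow = [0, 0, 1, 1, 2, 2, 3, 3]  # same shape as ref, time axis doubled
```

Taking the minimum over the ratios makes the match tolerant of the same sound being played back slightly faster or slower than when the reference was registered.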
Processing then proceeds to step S185, and the system controller 201 determines whether the audio pattern distance obtained as the feature of the input pattern is equal to or less than (or less than) a predetermined threshold.

When it is determined at step S185 that the audio pattern distance is greater than the predetermined threshold, processing skips steps S186 and S187 and proceeds to step S188.

When it is determined at step S185 that the audio pattern distance is not greater than the predetermined threshold, in other words, when the input pattern can be regarded as matching the reference pattern, processing proceeds to steps S186 and S187 to perform the same processing as that performed at steps S106 and S108 of Figure 33. Processing then proceeds to step S188.

In this way, in the main unit 1, when audio data identical or similar to the reference pattern is output from the MPEG audio decoder 126, the frame of image data output from the MPEG video decoder 125 is sent to a subsidiary unit 2_ij.

In this case, the signal processing section 147 of the subsidiary unit 2_ij is structured as shown in Figure 34 and performs the virtual multi-viewpoint display processing shown in Figure 35. Therefore, the subsidiary unit 2_ij displays the frame of image data sent from the main unit 1 in the manner described above.
At step S188, the system controller 201 determines whether a termination command instructing it to stop the virtual multi-viewpoint display processing has been received from the CPU 129.

When it is determined at step S188 that no termination command has been received, processing returns to step S181, and the same processing is repeated.

When it is determined at step S188 that a termination command has been received, in other words, when, for example, the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and clicks the virtual multi-viewpoint display icon on the menu screen again, thereby instructing the CPU 129 to stop the virtual multi-viewpoint display processing, and the CPU 129 sends a termination command to the system controller 201, the virtual multi-viewpoint display processing in the main unit is terminated.

According to the virtual multi-viewpoint display processing shown in Figure 49, when audio data identical or similar to the reference pattern is output from the MPEG audio decoder 126, the frame of image data output from the MPEG video decoder 125 is displayed on a subsidiary unit 2_ij. Therefore, for example, when the audio data output while a special-news telop is being broadcast is stored as the reference pattern, the image data broadcast when that audio data is output, namely image data containing the special-news telop, is displayed on a subsidiary unit 2_ij.
When the audio pattern storage section 223 stores a plurality of reference patterns, the input pattern is compared with each of the reference patterns one by one at step S184 of Figure 49. If the input pattern can be regarded as matching at least one of the reference patterns, for example, the processing of steps S186 and S187 is performed.

In the cases described above, the audio pattern distance is obtained as the feature of the audio data serving as the input pattern. Alternatively, the power (or amplitude level) of the audio data serving as the input pattern can be obtained as its feature. In that case, the power of the audio data serving as the input pattern can be compared with a predetermined threshold, and the frame of image data output from the MPEG video decoder 125 immediately after the power of the audio data becomes greater (or less) than (or equal to or greater than, or equal to or less than) the predetermined threshold can be displayed on a subsidiary unit 2_ij.
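The power-based alternative can be sketched as follows. This is a minimal illustration in Python, assuming mean squared amplitude as the power measure and a made-up threshold; the document does not specify how the power is computed or what threshold is used.

```python
def audio_power(samples):
    """Mean squared amplitude of the input-pattern audio."""
    return sum(s * s for s in samples) / len(samples)

def power_trigger(samples, threshold):
    """Alternative feature: report a match (i.e. send the current frame
    to a subsidiary unit) when the audio power exceeds the threshold."""
    return audio_power(samples) > threshold

quiet = [1, -1, 1, -1]      # low-power audio: no trigger
loud  = [10, -10, 10, -10]  # high-power audio: trigger
```

Compared with pattern matching, this variant needs no registered reference and simply reacts to loud (or quiet) passages, such as sudden crowd noise or an alert chime.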
In addition, the image data to be displayed on a subsidiary unit 2_ij may be transferred from the main unit 1 to the subsidiary unit 2_ij; alternatively, as described above, it may be received by the subsidiary unit 2_ij itself after a freeze command is sent from the main unit 1 to the subsidiary unit 2_ij.
Next, Figure 50 shows a sixth example functional structure of the signal processing section 127 (Figure 10) in the main unit 1 for performing the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 32 are assigned the same symbols as used in Figure 32, and their descriptions are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 50 has basically the same structure as that of Figure 32, except that the difference detecting section 198 and the counter section 199 are not provided, and an image comparing section 232 and an image pattern storage section 233 are newly provided.

The image comparing section 232 uses the image data stored in the frame memory 191 as an input pattern, compares the input pattern with the image data stored in the image pattern storage section 233 as a reference pattern, and sends the comparison result to the system controller 201.

The image pattern storage section 233 stores image data serving as the reference pattern.

The image data (its luminance signal Y) output from the MPEG video decoder 125 (Figure 10) and stored in the frame memory 191 is sent to the image pattern storage section 233, and under the control of the system controller 201, the image pattern storage section 233 can store the image data held in the frame memory 191 as a new reference pattern. In other words, the image reference pattern stored in the image pattern storage section 233 can be updated.
Next, the virtual multi-viewpoint display processing performed in the main unit by the signal processing section 127 shown in Figure 50 is described with reference to the flowchart shown in Figure 51.

The frame memories 191 to 193 sequentially store, in overwrite mode, the luminance signal Y and the chrominance signals R-Y and B-Y of the image data sent from the MPEG video decoder 125 (Figure 10).

In the virtual multi-viewpoint display processing, first, at step S191, the system controller 201 determines whether an image-pattern registration request has been sent from the CPU 129.

As described above, the image pattern storage section 233 can store the image data held in the frame memory 191 as a new reference pattern; in other words, a new reference pattern can be registered. This registration request can be made, for example, by clicking an image-pattern registration request icon on the menu screen displayed when the menu button switch 84 on the controller 15 (Fig. 7) is operated.

At step S191, it is determined whether the image-pattern registration request icon has been clicked in this way.

When it is determined at step S191 that no image-pattern registration request has been received, processing skips step S192 and proceeds to step S193.

When it is determined at step S191 that an image-pattern registration request has been received, in other words, when the user has clicked the image-pattern registration request icon and an image-pattern registration request asking for registration of a new reference pattern has therefore been sent from the CPU 129 to the system controller 201, processing proceeds to step S192, and the system controller 201 stores the most recent frame of image data held in the frame memory 191 in the image pattern storage section 233 as a new reference pattern.

Therefore, when image data that the user wants to use as a reference pattern is displayed while the user is watching the program image shown on the CRT 11, the user can register that image data as the reference pattern by operating the controller 15.

Like the audio pattern storage section 223, the image pattern storage section 233 may store a single reference pattern or a plurality of reference patterns.
When the new reference pattern has been stored in the image pattern storage section 233 at step S192, processing proceeds to step S193, and the image comparing section 232 reads, for example, the most recent frame of image data stored in the frame memory 191 as the input pattern. Processing proceeds to step S194.

At step S194, the image comparing section 232 reads the reference pattern stored in the image pattern storage section 233 and compares it with the input pattern. More specifically, the image comparing section 232 obtains the distance between the input pattern and the reference pattern (hereinafter called the image pattern distance, where necessary) as the feature of the input pattern (with respect to the reference pattern), and sends it to the system controller 201.

The image pattern distance can be, for example, the sum of absolute differences between the pixel values of the pixels of the image data serving as the input pattern and the pixel values of the corresponding pixels of the image data serving as the reference pattern.

Here, the input pattern and the reference pattern are both one frame of image data. A part of a frame of image data can also be used as the input pattern and the reference pattern.

Furthermore, one frame of image data can be used as the input pattern while a part of one frame of image data is used as the reference pattern. In this case, a plurality of image pattern distances can be obtained while changing the positional correspondence between the frame of image data serving as the input pattern and the part of a frame of image data serving as the reference pattern, and the minimum value obtained can be used as the final image pattern distance.
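The sum-of-absolute-differences distance and the sliding-position minimum described above can be sketched together as follows. This is a minimal illustration in Python, showing the frame one-dimensionally for brevity; the actual comparison operates on two-dimensional luminance data.

```python
def sad(a, b):
    """Sum of absolute differences between two equal-length pixel runs."""
    return sum(abs(x - y) for x, y in zip(a, b))

def image_pattern_distance(frame, reference_part):
    """Slide the reference sub-pattern across the input frame, compute
    the SAD at each position, and return the minimum as the final
    image pattern distance."""
    k = len(reference_part)
    return min(sad(frame[i:i + k], reference_part)
               for i in range(len(frame) - k + 1))

frame = [5, 5, 9, 7, 3, 5, 5]  # input pattern: one "frame" of pixels
part  = [9, 7, 3]              # reference pattern: part of a frame
```

A distance of zero means the reference sub-pattern appears somewhere in the frame exactly, which is the condition under which the threshold test of step S195 would fire.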
When the image pattern distance has been obtained at step S194, processing proceeds to step S195, and the system controller 201 determines whether the image pattern distance obtained as the feature of the input pattern is equal to or less than (or less than) a predetermined threshold.

When it is determined at step S195 that the image pattern distance is greater than the predetermined threshold, processing skips steps S196 and S197 and proceeds to step S198.

When it is determined at step S195 that the image pattern distance is not greater than the predetermined threshold, in other words, when the input pattern can be regarded as matching the reference pattern, processing proceeds to steps S196 and S197 to perform the same processing as that performed at steps S106 and S108 of Figure 33. Processing then proceeds to step S198.

In this way, in the main unit 1, when image data identical or similar to the reference pattern is output from the MPEG video decoder 125, that frame of image data is sent to a subsidiary unit 2_ij.

In this case, the signal processing section 147 of the subsidiary unit 2_ij is structured as shown in Figure 34 and performs the virtual multi-viewpoint display processing shown in Figure 35. Therefore, the subsidiary unit 2_ij displays the frame of image data sent from the main unit 1 in the manner described above.
At step S198, the system controller 201 determines whether a termination command instructing it to stop the virtual multi-viewpoint display processing has been received from the CPU 129.

When it is determined at step S198 that no termination command has been received, processing returns to step S191, and the same processing is repeated.

When it is determined at step S198 that a termination command has been received, in other words, when, for example, the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and clicks the virtual multi-viewpoint display icon on the menu screen again, thereby instructing the CPU 129 to stop the virtual multi-viewpoint display processing, and the CPU 129 sends a termination command to the system controller 201, the virtual multi-viewpoint display processing in the main unit is terminated.

According to the virtual multi-viewpoint display processing shown in Figure 51, when image data identical or similar to the reference pattern is output from the MPEG video decoder 125, that frame of image data is displayed on a subsidiary unit 2_ij.

Therefore, when image data of a scoreboard shown in close-up in a live broadcast of a professional baseball game is stored as the reference pattern, image data broadcast later that is identical or similar in pattern to the image data serving as the reference pattern, namely the scoreboard image data subsequently broadcast in close-up, is displayed on a subsidiary unit 2_ij.
More particularly, when the scoreboard view data that shows in the close-up shot mode in the real time programme that is stored in MLB Slam match master unit 1 is receiving the MLB Slam contest show as shown in Figure 52 A when during as reference mode, if certain office afterwards broadcasts the scoreboard view data in the close-up shot mode, for example Figure 52 B is shown in auxiliary unit 2 11In show the scoreboard view data in the close-up shot mode.In addition, after, when again broadcasting the scoreboard view data in the close-up shot mode in next office, for example shown in Figure 52 C, in auxiliary unit 2 12In show the scoreboard view data in the close-up shot mode.
When the beginning of each innings in the real-time television broadcast program in professional baseball game is broadcasted the scoreboard image in the close-up shot mode, as mentioned above, forming the auxiliary unit 2 that to expand television system ijIn sequentially show the scoreboard image in the close-up shot mode.
Therefore, in this case, the user can watch auxiliary unit 2 ijDemonstration know that the score of every innings changes.
In a quick-report program of election returns, image data showing the face of a winning candidate in close-up, together with an indication of the political party to which the candidate belongs, is broadcast. When this image data is stored in the image-pattern storage section 233 as a reference pattern, and the master unit 1 receives the quick-report program of election returns, the image data broadcast in the program showing the faces of the winning candidates in close-up is sequentially displayed in the auxiliary units 2_ij forming the scalable television system, as shown in Figure 53.
Therefore, in this case, the user can learn the winning candidates by viewing the displays of the auxiliary units 2_ij.
Further, for example, broadcasting stations broadcast weather forecasts frequently during the day. When image data of a map of Japan used in weather forecast programs (or a map of a certain area, such as the Kanto area) is stored in the image-pattern storage section 233 as a reference pattern, and the master unit 1 receives a weather forecast program, the weather maps broadcast in the weather forecast are sequentially displayed in the auxiliary units 2_ij forming the scalable television system, as shown in Figure 54.
Therefore, in this case, the user can easily learn, by viewing the displays of the auxiliary units 2_ij, the weather forecasts broadcast on the same channel in different time slots and the weather forecasts broadcast on different channels.
The image data to be displayed in the auxiliary units 2_ij is sent from the master unit 1 to the auxiliary units 2_ij. Alternatively, as described above, the image data can also be acquired by the auxiliary units 2_ij themselves when the master unit 1 sends them a freeze command.
Next, Figure 55 shows a seventh example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that executes the virtual multi-viewpoint display processing. In the figure, the same symbols as those used in Figure 32, Figure 48, or Figure 50 are assigned to the parts corresponding to those shown in Figure 32, Figure 48, or Figure 50, and descriptions thereof are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 55 has basically the same structure as that shown in Figure 32, except that the difference detecting section 198 and the counter section 199 shown in Figure 32 are not provided, and that the circular buffer 221, the audio comparing section 222, and the audio-pattern storage section 223 shown in Figure 48, as well as the image comparing section 232 and the image-pattern storage section 233 shown in Figure 50, are newly provided.
Next, the virtual multi-viewpoint display processing in the master unit executed by the signal processing section 127 shown in Figure 55 will be described with reference to the flowchart shown in Figure 56.
The frame memories 191 to 193 sequentially store, in an overwriting manner, the luminance signal Y and the chrominance signals R-Y and B-Y of the image data sent from the MPEG video decoder 125 (Figure 10).
The circular buffer 221 also sequentially stores, in an overwriting manner, the audio data sent from the MPEG audio decoder 126 (Figure 10).
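Both of these stores follow the same overwriting policy: keep only the most recent data, silently discarding the oldest when capacity is reached. A fixed-length deque gives a minimal sketch of such a circular buffer; the capacity value is an arbitrary assumption for illustration, and a real buffer would hold decoded audio samples rather than small integers.

```python
from collections import deque

# Overwriting storage: a circular buffer keeps only the most recent
# samples; appending beyond capacity silently discards the oldest.
CAPACITY = 4  # arbitrary value, for illustration only
ring = deque(maxlen=CAPACITY)

for sample in [1, 2, 3, 4, 5, 6]:
    ring.append(sample)

# Only the last CAPACITY samples survive.
print(list(ring))  # -> [3, 4, 5, 6]
```

The frame memories 191 to 193 behave the same way with a capacity of one frame each: storing a new frame overwrites the previous one.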
In the virtual multi-viewpoint display processing, first, at step S201, the system controller 201 determines whether an audio-pattern registration request has been sent from the CPU 129, in the same manner as at step S181 of Figure 49. When it is determined that no audio-pattern registration request has been received, the processing skips step S202 and proceeds to step S203.
When it is determined at step S201 that an audio-pattern registration request has been received, the processing proceeds to step S202, and the system controller 201 stores the audio data stored in the circular buffer 221 at that point into the audio-pattern storage section 223 as a new reference pattern, in the same manner as at step S183 of Figure 49. The processing proceeds to step S203.
At step S203, the system controller 201 determines whether an image-pattern registration request has been sent from the CPU 129, in the same manner as at step S191 of Figure 51. When it is determined that no image-pattern registration request has been received, the processing skips step S204 and proceeds to step S205.
When it is determined at step S203 that an image-pattern registration request has been received, the processing proceeds to step S204, and the system controller 201 stores the image data of the most recent frame stored in the frame memory 191 into the image-pattern storage section 233 as a new reference pattern, in the same manner as at step S192 of Figure 51. The processing proceeds to step S205.
At step S205, the audio comparing section 222 reads the audio data stored in the circular buffer 221 as an audio input pattern, in the same manner as at step S183 of Figure 49. Also at step S205, the image comparing section 232 reads the image data stored in the frame memory 191 as an image input pattern, in the same manner as at step S193 of Figure 51. The processing proceeds to step S206.
At step S206, the audio comparing section 222 compares the audio reference patterns stored in the audio-pattern storage section 223 with the audio input pattern, in the same manner as at step S184 of Figure 49, obtains the audio-pattern distance serving as the feature of the audio data serving as the audio input pattern, and sends it to the system controller 201. Also at step S206, the image comparing section 232 compares the image reference patterns stored in the image-pattern storage section 233 with the image input pattern, in the same manner as at step S194 of Figure 51, obtains the image-pattern distance serving as the feature of the image data serving as the image input pattern, and sends it to the system controller 201.
Then, the processing proceeds to step S207, and the system controller 201 determines whether the audio-pattern distance obtained as the feature of the audio input pattern is equal to or smaller than (or smaller than) a predetermined threshold, in the same manner as at step S185 of Figure 49, or whether the image-pattern distance obtained as the feature of the image input pattern is equal to or smaller than (or smaller than) a predetermined threshold, in the same manner as at step S195 of Figure 51.
When it is determined at step S207 that the audio-pattern distance is larger than the predetermined threshold and the image-pattern distance is also larger than the predetermined threshold, the processing skips steps S208 and S209 and proceeds to step S210.
When it is determined at step S207 that the audio-pattern distance is equal to or smaller than the predetermined threshold, or that the image-pattern distance is equal to or smaller than the predetermined threshold, in other words, when the audio input pattern can be regarded as matching an audio reference pattern, or the image input pattern can be regarded as matching an image reference pattern, the processing proceeds sequentially to steps S208 and S209 to execute the same processing as that executed at steps S106 and S108 of Figure 33. Then, the processing proceeds to step S210.
In this way, in the master unit 1, when audio data output from the MPEG audio decoder 126 is identical or similar to an audio reference pattern, or when image data output from the MPEG video decoder 125 is identical or similar to an image reference pattern, the frame of image data output from the MPEG video decoder 125 is sent to the auxiliary units 2_ij.
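The gating just described can be sketched as follows. The function names, the flat-list representation of the patterns, and the use of a sum of absolute differences as the pattern distance are illustrative assumptions, not details fixed by the embodiment.

```python
def pattern_distance(input_pattern, reference_pattern):
    """Sum of absolute differences between two equal-length sample sequences."""
    return sum(abs(a - b) for a, b in zip(input_pattern, reference_pattern))

def should_send_frame(audio_in, image_in, audio_refs, image_refs,
                      audio_threshold, image_threshold):
    """Return True when the audio OR image input pattern matches some stored
    reference pattern, i.e. its distance falls within the threshold
    (the decision made at steps S206 and S207)."""
    audio_match = any(pattern_distance(audio_in, r) <= audio_threshold
                      for r in audio_refs)
    image_match = any(pattern_distance(image_in, r) <= image_threshold
                      for r in image_refs)
    return audio_match or image_match
```

In the stricter variant in which both patterns must match before a frame is forwarded, the final `or` would simply become `and`.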
At step S210, the system controller 201 determines whether a termination command for terminating the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is determined at step S210 that no termination command has been received, the processing returns to step S201, and the same processing is repeated.
When it is determined at step S210 that a termination command has been received, in other words, for example, when the user operates the remote controller 15 (Figure 7) to display the menu screen on the CRT 11 and clicks the virtual multi-viewpoint display icon in the menu screen again, thereby instructing the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends a termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
When the signal processing section 127 of the master unit 1 is structured as shown in Figure 55, the signal processing sections 147 of the auxiliary units 2_ij are structured as shown in Figure 34, and execute the virtual multi-viewpoint display processing shown in Figure 35. Therefore, the auxiliary units 2_ij display the frames of image data sent from the master unit 1 in the manner described above. More specifically, the auxiliary units 2_ij display a frame of image data output from the MPEG video decoder 125 when the audio data output from the MPEG audio decoder 126 is identical or similar to an audio reference pattern, or when the image data output from the MPEG video decoder 125 is identical or similar to an image reference pattern.
In the above case, the processing of steps S208 and S209 is executed when the audio-pattern distance is equal to or smaller than the predetermined threshold or the image-pattern distance is equal to or smaller than the predetermined threshold. Alternatively, for example, the processing of steps S208 and S209 may be executed only when the audio-pattern distance is equal to or smaller than the predetermined threshold and the image-pattern distance is equal to or smaller than the predetermined threshold, in other words, only when the audio input pattern can be regarded as matching an audio reference pattern and the image input pattern can be regarded as matching an image reference pattern.
In that case, the auxiliary units 2_ij display a frame of image data only if the image data output from the MPEG video decoder 125 is identical or similar to an image reference pattern at the same time as the audio data output from the MPEG audio decoder 126 is identical or similar to an audio reference pattern.
Next, Figure 57 shows an eighth example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that executes the virtual multi-viewpoint display processing. In the figure, the same symbols as those used in Figure 38 are assigned to the parts corresponding to those shown in Figure 38, and descriptions thereof are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 57 has basically the same structure as the signal processing section 127 shown in Figure 38.
Next, the virtual multi-viewpoint display processing in the master unit executed by the signal processing section 127 shown in Figure 57 will be described with reference to the flowchart shown in Figure 58.
First, at step S221, the system controller 201 specifies a default still-picture auxiliary unit and a default scene-change auxiliary unit among the auxiliary units 2 forming the scalable television system.
A still-picture auxiliary unit is an auxiliary unit for displaying, among the image data displayed in the master unit 1, image data of a still picture, that is, image data which can be regarded as containing no (or little) motion. A scene-change auxiliary unit is an auxiliary unit for displaying, among the image data displayed in the master unit 1, image data obtained immediately after a scene change.
At step S221, for example, the system controller 201 sets the auxiliary unit 2_21 adjacent to the left of the master unit 1 as the default still-picture auxiliary unit, and sets the auxiliary unit 2_23 adjacent to the right of the master unit as the default scene-change auxiliary unit.
Then, the processing proceeds to step S222, and the system controller 201 determines whether the CPU 129 has specified a still-picture auxiliary unit.
More specifically, an auxiliary unit 2_ij other than the auxiliary unit 2_21, which has been specified as the default still-picture auxiliary unit, can be specified as the still-picture auxiliary unit. For example, the user can operate the menu button switch 84 on the remote controller 15 (Figure 7) to display the menu screen, and click an icon for specifying a still-picture auxiliary unit to specify the still-picture auxiliary unit.
At step S222, it is determined whether the icon for specifying a still-picture auxiliary unit has been clicked.
When it is determined at step S222 that no still-picture auxiliary unit has been specified, the processing skips step S223 and proceeds to step S224.
When it is determined at step S222 that a still-picture auxiliary unit has been specified, in other words, when the user operates the remote controller 15, clicks the icon for specifying a still-picture auxiliary unit, and specifies an auxiliary unit 2_ij as the still-picture auxiliary unit, so that the CPU 129 sends to the system controller 201 a command indicating that the auxiliary unit 2_ij has been specified as the still-picture auxiliary unit, the processing proceeds to step S223, and the system controller 201 sets the specified auxiliary unit 2_ij as the still-picture auxiliary unit (recognizes it as the still-picture auxiliary unit). Then, the processing proceeds to step S224.
At step S224, the system controller 201 determines whether the CPU 129 has specified a scene-change auxiliary unit.
More specifically, an auxiliary unit 2_ij other than the auxiliary unit 2_23, which has been specified as the default scene-change auxiliary unit, can be specified as the scene-change auxiliary unit. For example, the user can operate the menu button switch 84 on the remote controller 15 (Figure 7) to display the menu screen, and click an icon for specifying a scene-change auxiliary unit to specify the scene-change auxiliary unit.
At step S224, it is determined whether the icon for specifying a scene-change auxiliary unit has been clicked.
When it is determined at step S224 that no scene-change auxiliary unit has been specified, the processing skips step S225 and proceeds to step S226.
When it is determined at step S224 that a scene-change auxiliary unit has been specified, in other words, when the user operates the remote controller 15, clicks the icon for specifying a scene-change auxiliary unit, and specifies an auxiliary unit 2_ij as the scene-change auxiliary unit, so that the CPU 129 sends to the system controller 201 a command indicating that the auxiliary unit 2_ij has been specified as the scene-change auxiliary unit, the processing proceeds to step S225, and the system controller 201 sets the specified auxiliary unit 2_ij as the scene-change auxiliary unit. Then, the processing proceeds to step S226.
At step S226, the frame memories 191 to 193 wait for the luminance signal Y and the chrominance signals R-Y and B-Y of one frame of image data to arrive from the MPEG video decoder 125 (Figure 10), and store the luminance signal Y and the chrominance signals R-Y and B-Y. The processing proceeds to step S227.
At step S227, the difference detecting section 198 detects the sum of absolute differences between the luminance signal Y of the image data stored in the frame memory 191 at the immediately preceding step S226 (the luminance signal Y of the image data of the current frame) and the luminance signal Y of the image data stored in the frame memory 191 at step S226 of the previous iteration (the luminance signal Y of the image data of the preceding frame), that is, the sum of absolute differences for the current frame, as the feature of the image data of the current frame, and sends the sum to the system controller 201.
Then, the processing proceeds to step S228. The system controller 201 determines whether the sum of absolute differences for the current frame is almost zero, that is, whether it is smaller than (or equal to or smaller than) a small positive threshold Th2.
When it is determined at step S228 that the sum of absolute differences for the current frame is smaller than the threshold Th2, that is, when the image of the current frame changes very little (or not at all) from the image of the preceding frame and the current frame can therefore be regarded as a still picture, the processing proceeds to step S229, and the system controller 201 controls the memory control section 197 to send the image data of the current frame stored in the frame memories 191 to 193 (its luminance signal Y and chrominance signals R-Y and B-Y) to the frame memories 194 to 196 and store the data therein in an overwriting manner. The processing proceeds to step S230.
At step S230, the system controller 201 controls the output control section 200 to read the luminance signal Y and the chrominance signals R-Y and B-Y stored in the frame memories 194 to 196 and send them to the CPU 129. Also at step S230, the system controller 201 sends to the CPU 129 a display request command for displaying the image data in the still-picture auxiliary unit. The processing proceeds to step S234.
When the CPU 129 receives from the system controller 201 the display request command for display in the still-picture auxiliary unit, the CPU 129 controls the IEEE-1394 interface 133 to send the frame of image data sent from the output control section 200 (its luminance signal Y and chrominance signals R-Y and B-Y) to the still-picture auxiliary unit, together with a display request command for displaying the image data. The signal processing section 147 of the auxiliary unit 2_ij serving as the still-picture auxiliary unit is structured as shown in Figure 34, and executes the virtual multi-viewpoint display processing shown in Figure 35. Therefore, among the image data displayed in the master unit 1, the image data of a current frame that changes very little from the preceding frame is sent to the still-picture auxiliary unit and displayed there.
On the other hand, when it is determined at step S228 that the sum of absolute differences for the current frame is not smaller than the threshold Th2, the processing proceeds to step S231, and the system controller 201 determines whether the sum of absolute differences for the current frame is larger than (or equal to or larger than) a threshold Th1, where the threshold Th1 is sufficiently larger than the threshold Th2.
When it is determined at step S231 that the sum of absolute differences for the current frame is not larger than the threshold Th1, the processing skips steps S232 and S233 and proceeds to step S234.
When it is determined at step S231 that the sum of absolute differences for the current frame is larger than the threshold Th1, in other words, when the image of the current frame changes substantially from the image of the preceding frame and a scene change therefore occurs at the current frame, the processing proceeds to step S232, and the system controller 201 controls the memory control section 197 in the same manner as at step S229 to send the luminance signal Y and the chrominance signals R-Y and B-Y of the image data of the current frame stored in the frame memories 191 to 193 to the frame memories 194 to 196 and store them therein in an overwriting manner. The processing proceeds to step S233.
At step S233, the system controller 201 controls the output control section 200 to read the luminance signal Y and the chrominance signals R-Y and B-Y of the frame of image data stored in the frame memories 194 to 196 and send them to the CPU 129. Also at step S233, the system controller 201 sends to the CPU 129 a display request command for displaying the image data in the scene-change auxiliary unit. The processing proceeds to step S234.
When the CPU 129 receives from the system controller 201 the display request command for display in the scene-change auxiliary unit, the CPU 129 controls the IEEE-1394 interface 133 to send the frame of image data sent from the output control section 200 (its luminance signal Y and chrominance signals R-Y and B-Y) to the scene-change auxiliary unit, together with the display request command. The signal processing section 147 of the auxiliary unit 2_ij serving as the scene-change auxiliary unit is structured as shown in Figure 34, and executes the virtual multi-viewpoint display processing shown in Figure 35. Therefore, among the image data displayed in the master unit 1, the image data obtained immediately after a scene change is sent to the scene-change auxiliary unit and displayed there.
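The classification performed at steps S228 and S231 amounts to a three-way decision on the luminance difference: a near-zero sum means a still picture, a very large sum means a scene change, and anything in between means ordinary motion. The sketch below illustrates this; the function names and the flat-list representation of the luminance signal Y are assumptions for illustration.

```python
def sum_abs_diff(lum_current, lum_previous):
    """Sum of absolute differences between the luminance signals Y of the
    current frame and the preceding frame (difference detecting section 198)."""
    return sum(abs(c - p) for c, p in zip(lum_current, lum_previous))

def classify_frame(lum_current, lum_previous, th1, th2):
    """th2 is a small positive threshold; th1 is sufficiently larger than th2.
    Returns which auxiliary unit, if any, should display the current frame."""
    d = sum_abs_diff(lum_current, lum_previous)
    if d < th2:          # almost no change: still picture (step S228)
        return "still_picture_unit"
    if d > th1:          # large change: scene change (step S231)
        return "scene_change_unit"
    return None          # ordinary motion: no auxiliary display
```

With Th2 = 5 and Th1 = 100, for example, a frame whose luminance differs from its predecessor by a total of 3 would go to the still-picture auxiliary unit, one differing by 400 would go to the scene-change auxiliary unit, and one differing by 50 would be displayed only in the master unit.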
At step S234, the system controller 201 determines whether a termination command for terminating the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is determined at step S234 that no termination command has been received, the processing returns to step S221, and the same processing is repeated thereafter.
When it is determined at step S234 that a termination command has been received, in other words, for example, when the user operates the remote controller 15 (Figure 7) to display the menu screen on the CRT 11 and clicks the virtual multi-viewpoint display icon in the menu screen again, thereby instructing the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends a termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
As described above, according to the embodiment shown in Figure 57 and Figure 58, among the program being received by the master unit 1, image data containing no motion is displayed in the still-picture auxiliary unit, and image data obtained after a scene change in the program is displayed in the scene-change auxiliary unit.
Here, one auxiliary unit serves as the still-picture auxiliary unit. Alternatively, a plurality of auxiliary units may serve as still-picture auxiliary units, and the plurality of auxiliary units may sequentially display the image data sent from the master unit 1, as described with reference to Figures 37A to 37C. The same applies to the scene-change auxiliary unit.
Next, Figure 59 shows a ninth example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that executes the virtual multi-viewpoint display processing.
The frame memory 241 receives the image data output from the MPEG video decoder 125 (Figure 10) and temporarily stores the image data. More specifically, the frame memory 241 has a storage capacity that allows, for example, at least two frames of image data to be stored, and sequentially stores image data in such a manner that the most recent image data overwrites the oldest frame of image data.
In the present embodiment, as described above, the MPEG video decoder 125 outputs the luminance signal Y and the chrominance signals R-Y and B-Y as image data. In Figure 59, the luminance signal Y and the chrominance signals R-Y and B-Y are collectively referred to as image data.
The image data output from the MPEG video decoder 125 is sent as-is to the matrix circuit 128 (Figure 10) at the subsequent stage and to the frame memory 241.
In the embodiment shown in Figure 59, the audio data output from the MPEG audio decoder 126 is sent through the signal processing section 127 to the amplifier 137 at the subsequent stage, but the audio data is not shown in Figure 59.
The N frame memories 242_1 to 242_N temporarily store the image data sent from the memory control section 243 out of the image data stored in the frame memory 241.
The memory control section 243 is controlled by the system controller 247; it sends the image data of the current frame stored in the frame memory 241 (its luminance signal Y and chrominance signals R-Y and B-Y) to one of the N frame memories 242_1 to 242_N, and stores the data therein in an overwriting manner.
The still-picture detecting section 244 detects a still picture (image data that can be regarded as a still picture) in the image data stored in the frame memory 241, reads the image data from the frame memory 241, and sends it to the comparing section 245.
More specifically, the still-picture detecting section 244 obtains, for example, the sum of absolute differences between the image data of the most recent frame (the current frame) stored in the frame memory 241 and the image data of the preceding frame, and when the sum of absolute differences is zero or close to zero, the still-picture detecting section 244 reads the image data of the current frame from the frame memory 241 as a still picture (containing no (or little) motion), and sends it to the comparing section 245.
Alternatively, the still-picture detecting section 244 may detect the image data of the last of several successive frames as a still picture when the sum of absolute differences of each of those successive frames is zero or close to zero.
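The alternative just described, declaring a still picture only after the difference stays near zero for several successive frames, could be sketched as follows; `run_length`, the flat-list frame representation, and the threshold value are illustrative assumptions.

```python
def detect_still_pictures(frames, threshold, run_length):
    """Yield the index of each frame that closes a run of `run_length`
    successive near-identical frame pairs, i.e. frames the still-picture
    detecting section 244 would read out as still pictures."""
    consecutive = 0
    for i in range(1, len(frames)):
        # sum of absolute differences between this frame and the previous one
        d = sum(abs(a - b) for a, b in zip(frames[i], frames[i - 1]))
        consecutive = consecutive + 1 if d <= threshold else 0
        if consecutive >= run_length:
            yield i
```

Requiring a run of frames rather than a single near-zero difference makes the detector less likely to mistake a momentary pause in motion for a genuine still picture.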
The comparing section 245 compares the still-picture image data sent from the still-picture detecting section 244 with the image data stored in each of the frame memories 242_1 to 242_N (determines whether the two sets of image data match), and sends the comparison results to the system controller 247.
More specifically, the comparing section 245 obtains, for example, the sum of absolute differences between the still-picture image data sent from the still-picture detecting section 244 and the image data stored in each of the frame memories 242_1 to 242_N, and sends the results to the system controller 247 as the comparison results.
The output control section 246 reads, under the control of the system controller 247, one frame of image data stored in a frame memory 242_n, and sends the data to the CPU 129.
The system controller 247 controls the memory control section 243 and the output control section 246 according to control by the CPU 129 and the output of the comparing section 245.
Next, the virtual multi-viewpoint display processing in the master unit executed by the signal processing section 127 shown in Figure 59 will be described with reference to the flowchart of Figure 60.
First, at step S241, the system controller 247 sets the maximum number N of still-picture auxiliary units to the total number of auxiliary units 2 forming the scalable television system. Therefore, in the present embodiment, the maximum number N of still-picture auxiliary units is set to eight at step S241.
Then, the processing proceeds to step S242, and the system controller 247 sets the storage flags flg(1) to flg(N), which indicate whether the frame memories 242_1 to 242_N respectively store image data, to a value indicating that no image data has been stored yet, such as "0". The processing proceeds to step S243.
The storage flags flg(1) to flg(N) are stored in a memory, not shown, built into the system controller 247.
At step S243, the system controller 247 determines whether the CPU 129 has specified still-picture auxiliary units.
More specifically, in the embodiment shown in Figure 60, all the auxiliary units 2 forming the scalable television system are specified as still-picture auxiliary units by default. The user can specify which auxiliary units 2 serve as still-picture auxiliary units. For example, the user can operate the menu button switch 84 on the remote controller 15 (Figure 7) to display the menu screen, and click an icon for specifying still-picture auxiliary units to specify the auxiliary units. At step S243, it is determined whether the icon for specifying still-picture auxiliary units has been clicked.
When it is determined at step S243 that still-picture auxiliary units have been specified, in other words, when the user operates the remote controller 15, clicks the icon for specifying still-picture auxiliary units, and specifies one or more auxiliary units 2 as still-picture auxiliary units, so that the CPU 129 outputs to the system controller 247 a command indicating that the one or more auxiliary units 2 have been specified as still-picture auxiliary units, the processing proceeds to step S244, and the system controller 247 sets the specified one or more auxiliary units 2 as still-picture auxiliary units (recognizes them as still-picture auxiliary units). Then, the processing proceeds to step S245.
At step S245, the system controller 247 sets the maximum number N of still-picture auxiliary units again, this time to the number of auxiliary units specified as still-picture auxiliary units by the CPU 129. The processing proceeds to step S248.
On the other hand, when it is determined at step S243 that no still-picture auxiliary unit has been specified, the processing proceeds to step S246, and the system controller 247 determines whether the CPU 129 has sent a clear request for clearing the image data stored in a frame memory 242_n.
As described later, a frame memory 242_n stores still-picture image data received by the master unit 1 in the past, and when the master unit 1 newly receives still-picture image data similar to the still-picture image data stored in the frame memory 242_n, the storage contents of the frame memory 242_n are updated. Therefore, once a frame memory 242_n stores still-picture image data, only still-picture image data similar to the stored still-picture image data will subsequently be stored in that frame memory 242_n.
In the embodiment shown in Figure 60, the user can clear the storage contents of a frame memory 242_n by operating the remote controller 15. At step S246, it is determined whether the user has operated the remote controller 15 to clear a frame memory 242_n, so that the CPU 129 has sent to the system controller 247 a clear request for clearing the image data stored in the frame memory 242_n.
The request for clearing a frame memory 242_n can be made, for example, on the menu screen.
When it is determined at step S246 that no request for clearing a frame memory 242_n has been sent, the processing skips step S247 and proceeds to step S248.
When it is determined at step S246 that a request for clearing a frame memory 242_n has been sent, in other words, when the system controller 247 receives from the CPU 129 a command for clearing the frame memory 242_n, the processing proceeds to step S247, and the system controller 247 sets the storage flag flg(n) to zero. Then, the processing proceeds to step S248.
At step S248, the frame memory 241 waits for a new frame of image data to be sent from the MPEG video decoder 125 (Figure 10), and stores the image data. The processing proceeds to step S249.
At step S249, the still-picture detecting section 244 determines whether the image data of the current frame stored in the frame memory 241 at the immediately preceding step S248 shows a still picture. When it is determined that the image data does not show a still picture, the processing skips steps S250 to S259 and proceeds to step S260.
When it is determined at step S249 that the image data of the current frame shows a still picture, the still-picture detecting section 244 reads the image data of the current frame showing the still picture from the frame memory 241, and sends it to the comparing section 245. The processing proceeds to step S250. At step S250, the system controller 247 sets a variable "n", which indicates the frame memory 242_n to be processed among the frame memories 242_1 to 242_N, to zero. The processing proceeds to step S251. At step S251, the system controller 247 increments the variable "n" by one. The processing proceeds to step S252, and the system controller 247 determines whether the storage flag flg(n) is zero.
When it is determined at step S252 that the storage flag flg(n) is zero, in other words, when the frame memory 242_n has not yet stored image data, the processing proceeds to step S253, and the system controller 247 sets the storage flag flg(n) to a value indicating that the frame memory 242_n has stored still-picture image data, such as "1". The processing proceeds to step S254.
At step S254, the system controller 247 controls the memory control section 243 to send the image data of the current frame stored in the frame memory 241 (its luminance signal Y and chrominance signals R-Y and B-Y) to the frame memory 242_n and store it therein in an overwriting manner. The processing proceeds to step S258.
At step S258, the system controller 247 controls the output control section 246 to read one frame of image data from the frame memory 242_n and send it to the CPU 129. Also at step S258, the system controller 247 sends to the CPU 129 a display request command for displaying, in a still-picture auxiliary unit, the image data stored in the frame memory 242_n. The processing proceeds to step S259.
When the CPU 129 receives from the system controller 247 the display request command for displaying the image data stored in the frame memory 242_n, the CPU 129 controls the IEEE-1394 interface 133 to send the frame of image data sent from the output control section 246 (its luminance signal Y and chrominance signals R-Y and B-Y), together with a display request command for displaying the image data, to the auxiliary unit 2_ij serving as the still-picture auxiliary unit corresponding to the frame memory 242_n.
More specifically, the number N of frame memories 242_1 to 242_N equals the number (maximum number) N of auxiliary units 2 serving as still-picture auxiliary units. Immediately after step S241 or step S244, the system controller 247 assigns one of the auxiliary units 2 serving as still-picture auxiliary units to each frame memory 242_n, so that one frame memory 242_n corresponds to one auxiliary unit 2 serving as a still-picture auxiliary unit.
When the CPU 129 receives from the system controller 247 the display request command for displaying the image data stored in the frame memory 242_n, the CPU 129 sends the display request command to the auxiliary unit corresponding to that frame memory 242_n.
The signal processing section 147 of an auxiliary unit 2 serving as a still-picture auxiliary unit is structured as shown in Figure 34, and executes the virtual multi-viewpoint display processing shown in Figure 35. Therefore, when the auxiliary unit 2 receives the image data stored in the frame memory 242_n together with the display request command, it displays the image data stored in the frame memory 242_n.
On the other hand, when it is determined in step S252 that the storage flag flg(n) is not zero, in other words, when the frame memory 242_n stores image data, the processing proceeds to step S255, and the comparison section 245 compares the current-frame image data showing a still frame, sent from the still-frame detection section 244, with the image data stored in the frame memory 242_n, obtains the sum of absolute differences (the absolute-difference sum of the current frame) as the feature of the current-frame image data, and sends the sum to the system controller 247.
When the system controller 247 receives the absolute-difference sum of the current frame from the comparison section 245, the system controller 247 determines in step S256 whether the absolute-difference sum of the current frame is almost zero, that is, equal to or smaller than (or smaller than) a small positive threshold.
When it is determined in step S256 that the absolute-difference sum of the current frame is neither zero nor close to zero, the processing skips steps S257 and S258 and proceeds to step S259.
When it is determined in step S256 that the absolute-difference sum of the current frame is zero or close to zero, in other words, when the image data of the current frame, received by the master unit 1, has a pattern similar to, and thus approximates, the image data stored in the frame memory 242_n, the processing proceeds to step S257, and the system controller 247 controls the output control section 246 so that the current-frame image data showing a still frame, stored in the frame memory 241, is sent to the frame memory 242_n and stored therein in an overwriting manner, updating the storage contents of the frame memory 242_n.
Then the processing proceeds to step S258, and, as described above, the system controller 247 controls the output control section 246 to read the one-frame image data stored in the frame memory 242_n and send it to CPU 129. In addition, in step S258, the output control section 246 sends to CPU 129 the image data stored in the frame memory 242_n together with a display request command instructing the still-frame subsidiary unit to display the image data stored in the frame memory 242_n. The processing then proceeds to step S259.
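The per-frame-memory decision in steps S252 to S258 can be sketched as follows. This is a minimal illustration under our own assumptions: frames are modeled as flat lists of pixel values, and the function names, the threshold value, and the return convention are ours, not the patent's.

```python
# Hypothetical sketch of steps S252-S258: for one frame memory 242_n, compare
# the current still frame with the stored frame by the sum of absolute
# differences; when the two are nearly identical, the stored frame is
# overwritten with the newer one and flagged for re-sending to the
# still-frame subsidiary unit.

def absolute_difference_sum(frame_a, frame_b):
    """Feature used by the comparison section 245: sum of per-pixel |a - b|."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b))

def update_frame_memory(memory, flag, current_frame, threshold=10):
    """Returns (memory, flag, send_to_subsidiary).

    memory        -- frame memory 242_n contents (None when flg(n) == 0)
    flag          -- storage flag flg(n)
    current_frame -- still-frame image data from frame memory 241
    """
    if flag == 0:                          # S252: memory empty -> store and send
        return current_frame, 1, True
    sad = absolute_difference_sum(current_frame, memory)   # S255
    if sad <= threshold:                   # S256: almost zero -> same pattern
        return current_frame, flag, True   # S257/S258: overwrite and re-send
    return memory, flag, False             # different pattern: keep as-is
```

Note that a frame whose pattern differs from every stored frame is left to the empty-memory branch of the flowchart (steps S253 and S254), which this fragment does not reproduce.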
In this way, as described above, the current-frame image data newly stored in the frame memory 242_n is sent from the master unit 1 to the subsidiary unit 2 corresponding to the frame memory 242_n and displayed.
In step S259, the system controller 247 determines whether the variable "n" equals the maximum number N of still-frame subsidiary units. When it is determined that they are not equal, the processing returns to step S251, and the same processes are repeated thereafter.
When it is determined in step S259 that the variable "n" equals the maximum number N of still-frame subsidiary units, in other words, when the comparison section 245 has finished comparing the current-frame image data showing a still frame, stored in the frame memory 241, with the image data stored in each of the frame memories 242_1 to 242_N, the processing proceeds to step S260, and the system controller 247 determines whether it has received from CPU 129 a termination command instructing termination of the virtual multiple-viewpoint display processing.
When it is determined in step S260 that a termination command has not yet been received, the processing returns to step S243, and the same processes are repeated thereafter.
When it is determined in step S260 that a termination command has been received, in other words, when, for example, the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and again clicks the virtual multiple-viewpoint display icon in the menu screen, so that CPU 129 is instructed to terminate the virtual multiple-viewpoint display processing and sends a termination command to the system controller 247, the virtual multiple-viewpoint display processing in the master unit is terminated.
According to the virtual multiple-viewpoint display processing shown in Figure 58, as shown in Figure 61A, for example, after the master unit 1 starts displaying a live broadcast of an MLB baseball game, when the master unit 1 displays still-frame image data showing a scoreboard in a close-up manner, the scoreboard image data is stored in the frame memory 242_1, sent to the still-frame subsidiary unit corresponding to the frame memory 242_1, and displayed.
Specifically, for example, when the still-frame subsidiary unit corresponding to the frame memory 242_1 is the subsidiary unit 2_11, the scoreboard image-data frame stored in the frame memory 242_1 is displayed in the subsidiary unit 2_11, as shown in Figure 61B.
In addition, thereafter, for example, when the master unit 1 displays still-frame image data showing a baseball player being interviewed in a close-up manner, the still-frame image data showing the baseball player in the close-up manner is stored in the frame memory 242_2, sent to the subsidiary unit corresponding to the frame memory 242_2, and displayed.
Specifically, for example, when the still-frame subsidiary unit corresponding to the frame memory 242_2 is the subsidiary unit 2_12, the baseball-player image-data frame stored in the frame memory 242_2 is displayed in the subsidiary unit 2_12, as shown in Figure 61B.
Then, after that, for example, when the master unit 1 again displays still-frame image data showing the scoreboard in a close-up manner, the storage contents of the frame memory 242_1 are updated to this scoreboard image data, and the updated image data in the frame memory 242_1 is sent to the still-frame subsidiary unit 2_11 corresponding to the frame memory 242_1 and displayed. In this case, the scoreboard image data displayed in the subsidiary unit 2_11 is updated to the most recent one.
Therefore, the user can view the most recent image data of the various still frames displayed in the master unit 1.
Next, Figure 62 shows another example electrical structure of the master unit 1. In the figure, the same symbols as those used in Figure 10 are assigned to the parts corresponding to those shown in Figure 10, and descriptions thereof are omitted where unnecessary.
Whereas the master unit 1 shown in Figure 10 is a television receiver for receiving digital broadcasts, the master unit 1 shown in Figure 62 is a television receiver for receiving analog broadcasts.
A tuner 251 detects and demodulates an analog television broadcast signal in a predetermined channel. An A/D conversion section 252 applies A/D conversion to the output of the tuner 251, sends the image data in the A/D conversion result to a Y/C separation section 253, and sends the audio data to the signal processing section 127.
The Y/C separation section 253 separates the luminance signal Y and the chrominance signals R-Y and B-Y from the output of the tuner 251, and sends them to the signal processing section 127.
Even a television receiver for receiving analog broadcasts, structured in the above-described manner, can serve as the master unit 1 of the expandable television system.
Like the television receiver shown in Figure 62, which serves as the master unit, a television receiver for receiving analog broadcasts can also be used as a television receiver serving as a subsidiary unit 2.
In the above-described embodiments, the selling price of a television receiver forming the expandable television system can be differentiated according to, for example, whether the television receiver can serve as a master unit or as a subsidiary unit and, when it serves as a subsidiary unit, how many subsidiary units have been used before that subsidiary unit is used.
Since, as described above, the virtual multiple-viewpoint display function cannot be provided in the expandable television system without a master unit, the master unit has a high value, and the selling price of the master unit can be set high.
It can be expected that the user will additionally purchase a plurality of subsidiary units as necessary after purchasing the master unit. The selling prices of the first several subsidiary units can be set lower than the selling price of the master unit, for example, but higher than the price of an ordinary television receiver. The selling prices of subsidiary units purchased later can be set even lower.
In the above-described embodiments, a television receiver serving as the master unit in the expandable television system can be constructed, for example, by adding the signal processing section 127 to an ordinary digital television receiver and changing the program executed by CPU 129. Since a television receiver serving as the master unit in the expandable television system can therefore be manufactured relatively easily from an ordinary digital television receiver, it provides high cost efficiency in view of the advanced function of the virtual multiple-viewpoint display provided by the expandable television system described above. The same applies to a television receiver serving as a subsidiary unit.
In addition, in the above-described embodiments, the image data sent from the master unit 1 (or the image data received by the tuner 141 (Figure 11) according to the freeze command sent from the master unit 1) is stored in an overwriting manner in the frame memories 214 to 216 (Figure 34) in the subsidiary unit 2. The subsidiary unit 2 may instead be provided with a hard disk or the like, the image data sent from the master unit 1 may be recorded on the hard disk, and the image data may be reproduced later according to a user instruction.
In the above-described embodiments, the master unit 1 sends one-frame image data to the subsidiary unit 2, and it is displayed in the subsidiary unit 2. The master unit 1 may instead send multiple-frame image data, and the subsidiary unit 2 may repeatedly display a scene formed of the multiple-frame image data. In addition, when the image data received by the tuner (Figure 11) of the subsidiary unit 2 according to the freeze command sent from the master unit 1 is displayed in the subsidiary unit 2, a scene formed of multiple-frame image data may be repeatedly displayed in the same manner. A scene may be one-frame image data or one-field image data, or may be the multiple-frame image data from the frame immediately after a scene change up to the frame of the next scene change.
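The scene variant described above can be sketched as follows. The helper names and the representation of frames as list elements are ours, not the patent's; a scene is taken here as the frames from the one immediately after a scene change up to, but excluding, the next scene-change frame.

```python
# Hypothetical sketch: split a frame sequence into scenes at scene-change
# positions, and form the frame sequence a subsidiary unit would show when
# repeatedly displaying one scene.

def split_scenes(frames, scene_change_indices):
    """Split a frame list into scenes at the given scene-change frame indices."""
    scenes = []
    starts = list(scene_change_indices)
    for k, start in enumerate(starts):
        end = starts[k + 1] if k + 1 < len(starts) else len(frames)
        scenes.append(frames[start:end])
    return scenes

def repeated_display(scene, repeats):
    """Frames shown when a subsidiary unit loops one scene a given number of times."""
    return scene * repeats
```

A one-frame or one-field "scene" is simply the degenerate case where each scene list holds a single element.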
Also, in the above-described embodiments, the virtual multiple-viewpoint display processing is applied to a television program received by the master unit 1. The virtual multiple-viewpoint display processing can also be applied, for example, to image data and audio data sent to the master unit 1 from a VTR or the like serving as an external device and displayed.
Next, Figure 63A and Figure 63B are perspective views showing example structures of another embodiment of an expandable television system to which the present invention is applied. In the figures, the same symbols as those used in Figure 1A and Figure 1B are assigned to the parts corresponding to those shown in Figure 1A and Figure 1B, and descriptions thereof are omitted where unnecessary. In other words, the expandable television systems shown in Figure 63A and Figure 63B basically have the same structures as the expandable television system shown in Figure 1A and Figure 1B.
In the embodiment shown in Figure 63A, the expandable television system is formed of nine television receivers: 1, 2_11, 2_12, 2_13, 2_21, 2_23, 2_31, 2_32, and 2_33. In the embodiment shown in Figure 63B, the expandable television system is formed of 25 television receivers: 1, 2_11, 2_12, 2_13, 2_14, 2_15, 2_21, 2_22, 2_23, 2_24, 2_25, 2_31, 2_32, 2_34, 2_35, 2_41, 2_42, 2_43, 2_44, 2_45, 2_51, 2_52, 2_53, 2_54, and 2_55.
The number of television receivers forming the expandable television system is not limited to nine or 25. In other words, the expandable television system can be formed of any plural number of television receivers. The arrangement of the television receivers forming the expandable television system is not limited to the 3×3 or 5×5 horizontal-and-vertical layouts shown in Figure 63A and Figure 63B. In other words, the arrangement of the television receivers forming the expandable television system can be another horizontal-and-vertical layout, such as a 1×2, 2×1, or 2×3 layout. In addition, the spatial arrangement of the television receivers forming the expandable television system is not limited to the matrix shapes shown in Figures 63A and 63B, and can be a pyramid shape.
Since the expandable television system can in this way be formed of any plural number of television receivers, arranged in any numbers in the horizontal and vertical directions, it can be called an "expandable" system.
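The row-and-column naming used throughout this embodiment can be sketched as follows. This is an illustrative fragment under our own assumptions: the function names are ours, and the placement of the master unit at the centre cell reflects only the odd-sized square layouts of Figures 63A and 63B.

```python
# Hypothetical sketch of the layout convention: receiver 2_ij sits at row i,
# column j (counted from the top and from the left, starting at 1), and the
# master unit 1 occupies the centre cell of an odd-sized square layout.

def unit_label(i, j, center):
    """Label for the receiver at row i, column j of the layout."""
    return "master 1" if (i, j) == center else f"subsidiary 2_{i}{j}"

def layout(rows, cols):
    """Build the grid of unit labels for an odd rows x cols layout."""
    center = (rows // 2 + 1, cols // 2 + 1)   # centre cell for odd rows/cols
    return [[unit_label(i, j, center) for j in range(1, cols + 1)]
            for i in range(1, rows + 1)]
```

As the specification notes below, the master unit need not sit at the centre; the centre placement here is only the example used in the figures.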
Among the television receivers forming the expandable television system, there are two types: a main television receiver (hereinafter called a master unit as necessary), which can control other television receivers; and subsidiary television receivers (called subsidiary units as necessary), which can be controlled by another television receiver but cannot control other television receivers.
In the expandable television system, as described later, full-screen display is possible, in which an image is shown on all the display screens of all the television receivers forming the expandable television system.
To perform full-screen display in the expandable television system, the expandable television system needs to be formed of television receivers compatible with the expandable television system (hereinafter called expansion-capable units), and one of them needs to be a master unit. Therefore, in each of the embodiments shown in Figure 63A and Figure 63B, among the television receivers forming the expandable television system, the television receiver located at the centre, for example, serves as the master unit 1.
For the above-described reason, when a television receiver which is not an expansion-capable unit is used among the television receivers forming the expandable television system, full-screen display cannot be provided. In addition, even when all the television receivers forming the expandable television system are expansion-capable units, if all of them are subsidiary units, full-screen display cannot be provided.
Therefore, to use the full-screen display function of the expandable television system, the user needs to purchase at least one master unit, or one master unit and one or more subsidiary units.
Since a master unit can also serve as a subsidiary unit, a plurality of master units can be used among the television receivers forming the expandable television system.
In the embodiment shown in Figure 63A, the television receiver 1 located at the centre (second from the left and second from the top) of the 3×3 television receivers serves as the master unit, and the other eight television receivers 2_11, 2_12, 2_13, 2_21, 2_23, 2_31, 2_32, and 2_33 serve as subsidiary units. In the embodiment shown in Figure 63B, the television receiver 1 located at the centre (third from the left and third from the top) of the 5×5 television receivers serves as the master unit, and the other 24 television receivers 2_11, 2_12, 2_13, 2_14, 2_15, 2_21, 2_22, 2_23, 2_24, 2_25, 2_31, 2_32, 2_34, 2_35, 2_41, 2_42, 2_43, 2_44, 2_45, 2_51, 2_52, 2_53, 2_54, and 2_55 serve as subsidiary units.
Therefore, in each of the embodiments shown in Figure 63A and Figure 63B, the master unit 1 is located at the centre of the television receivers forming the expandable television system. The position of the master unit 1 is, however, not limited to the centre of the television receivers forming the expandable television system. The master unit 1 can be located at any position, such as the upper-left corner or the lower-right corner.
To simplify the following description, it is assumed that the expandable television system is formed of 3×3 television receivers as shown in Figure 63A, and that the master unit 1 is located at the centre of the television receivers forming the expandable television system.
The suffixes i and j of a subsidiary unit 2_ij forming the expandable television system indicate that the subsidiary unit 2_ij is located at the i-th row and the j-th column of the expandable television system (the i-th row from the top and the j-th column from the left).
Hereinafter, when there is no need to distinguish the subsidiary units 2_ij, they are collectively called subsidiary units 2.
Since the external structures of the master unit 1 and the subsidiary units 2_ij forming the expandable television systems shown in Figure 63A and Figure 63B are the same as those described with reference to Fig. 2, Figs. 3A to 3F, Fig. 4, and Figs. 5A to 5F, descriptions thereof are omitted. The master unit 1 and the subsidiary units 2_ij have the controller 15 (Fig. 2) and the controller 35 (Fig. 4), respectively, as accessories; since their structures are the same as those described with reference to Figs. 7 to 9, descriptions thereof are omitted.
Next, Figure 64 shows an example electrical structure of the master unit 1 shown in Figure 63A and Figure 63B.
A television broadcast signal received by an antenna, not shown, is sent to a tuner 1121, and is detected and demodulated under the control of a CPU 1129. The output of the tuner 1121 is sent to a QPSK (quadrature phase-shift keying) demodulation circuit 1122 and is QPSK-demodulated under the control of the CPU 1129. The output of the QPSK demodulation circuit 1122 is sent to an error correction circuit 1123, undergoes error detection and correction under the control of the CPU 1129, and is then sent to a demultiplexer 1124.
The demultiplexer 1124 descrambles the output of the error correction circuit 1123 as necessary under the control of the CPU 1129, and extracts TS (transport stream) packets in a predetermined channel. Then, the demultiplexer 1124 sends the image-data (video-data) TS packets to an MPEG (Moving Picture Experts Group) video decoder 1125, and sends the audio-data TS packets to an MPEG audio decoder 1126. The demultiplexer 1124 also sends TS packets included in the output of the error correction circuit 1123 to the CPU 1129 as necessary. The demultiplexer 1124 also receives image data or audio data (including such data in TS packets) sent from the CPU 1129, and sends it to the MPEG video decoder 1125 or the MPEG audio decoder 1126.
The MPEG video decoder 1125 applies MPEG decoding to the image-data TS packets sent from the demultiplexer 1124, and sends the result to a selector 1127. The MPEG audio decoder 1126 applies MPEG decoding to the audio-data TS packets sent from the demultiplexer 1124. The left-channel audio data and right-channel audio data obtained by decoding in the MPEG audio decoder 1126 are sent to the selector 1127.
Under the control of the CPU 1129, the selector 1127 selects the image data output from the MPEG video decoder 1125, the image data output from the CPU 1129, or the image data sent from a security-system section 1137, and sends it to an NTSC (National Television System Committee) encoder 1128. The NTSC encoder 1128 converts the image data sent from the selector 1127 into NTSC image data, and sends it to the CRT 11 for display. Under the control of the CPU 1129, the selector 1127 also selects the left-channel and right-channel audio data sent from the MPEG audio decoder 1126, the audio data sent from the CPU 1129, or the audio data sent from the security-system section 1137, and sends it to an amplifier 1138.
The CPU 1129 performs various types of processing according to programs stored in an EEPROM (electrically erasable programmable read-only memory) 1130 and a ROM (read-only memory) 1131, to control, for example, the tuner 1121, the QPSK demodulation circuit 1122, the error correction circuit 1123, the demultiplexer 1124, the selector 1127, an IEEE-1394 interface 1133, a modem 1136, and the security-system section 1137. In addition, the CPU 1129 sends data sent from the demultiplexer 1124 to the IEEE-1394 interface 1133, and sends data sent from the IEEE-1394 interface 1133 to the demultiplexer 1124 and the selector 1127. The CPU 1129 also executes processing corresponding to commands sent from a front panel 1134 or an IR receiving section 1135. Furthermore, the CPU 1129 controls the modem 1136 to access a server, not shown, through a telephone line, to obtain updated programs and necessary data.
The EEPROM 1130 stores data and programs that need to be retained even after the power is turned off. The ROM 1131 stores, for example, an IPL (initial program loader) program. The data and programs stored in the EEPROM 1130 can be updated by overwriting.
The RAM 1132 temporarily stores data and programs required for the operation of the CPU 1129.
The IEEE-1394 interface 1133 is connected to the terminal panel 21 (its IEEE-1394 terminals 21_11 to 21_33 (Fig. 3F)), and serves as an interface for performing communication conforming to the IEEE 1394 standard. In this way, the IEEE-1394 interface 1133 sends data sent from the CPU 1129 to the outside according to the IEEE 1394 standard, receives data sent from the outside according to the IEEE 1394 standard, and sends it to the CPU 1129.
The front panel 1134 is provided at a certain position on the front face of the master unit 1. The front panel 1134 includes some of the button switches provided on the controller 15 (Fig. 7). When a button switch on the front panel 1134 is operated, an operation signal corresponding to the operation is sent to the CPU 1129. In this case, the CPU 1129 executes processing corresponding to the operation signal sent from the front panel 1134.
The IR receiving section 1135 receives infrared rays (light) sent from the controller 15 in response to an operation on the controller 15. In addition, the IR receiving section 1135 applies photoelectric conversion to the received infrared rays, and sends the resulting signal to the CPU 1129. In this case, the CPU 1129 executes processing corresponding to the signal sent from the IR receiving section 1135, that is, processing corresponding to the operation on the controller 15.
The modem 1136 controls communication through the telephone line, thereby sending data sent from the CPU 1129 through the telephone line, receiving data sent through the telephone line, and sending it to the CPU 1129.
The security-system section 1137 is formed of a security controller 1137A, a wireless interface 1137B, a data processing section 1137C, and a warning processing section 1137D, and performs, under the control of the CPU 1129, various types of processing, described later, for warning the user that an emergency (an abnormal event) has occurred.
More specifically, the security controller 1137A controls the wireless interface 1137B, the data processing section 1137C, and the warning processing section 1137D under the control of the CPU 1129.
The wireless interface 1137B serves as an interface for performing wireless communication: it receives image (moving-image) data and audio data sent from a video camera 1162 (Figure 67), described later, and sends them to the selector 1127 and the data processing section 1137C. For example, a NIC (network interface card) which performs communication through a so-called wireless LAN, as specified in IEEE 802.11, can be employed as the wireless interface 1137B.
A unit which performs wireless communication according to a standard other than IEEE 802.11 can also be employed as the wireless interface 1137B. It is preferred, however, that the wireless interface 1137B conform to a standard having a transmission band sufficient for transmitting and receiving moving-image data.
The data processing section 1137C detects, at predetermined intervals, a feature quantity of the image data or audio data sent from the wireless interface 1137B, and sends it to the warning processing section 1137D.
The warning processing section 1137D analyzes the fluctuation over time of the feature quantity of the image data or audio data sent from the data processing section 1137C. In addition, the warning processing section 1137D determines whether the fluctuation over time of the feature quantity of the image data or audio data satisfies a predetermined condition, and requests the security controller 1137A to warn the user according to the determination result.
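The warning decision can be sketched as follows. The patent does not specify the feature quantity or the predetermined condition, so this is a minimal sketch under our own assumptions: the feature is an arbitrary scalar sampled at fixed intervals, and the condition is simply a step-to-step jump larger than a threshold.

```python
# Hypothetical sketch of the warning processing section 1137D: the data
# processing section 1137C samples a feature quantity at fixed intervals,
# and a warning is requested when the fluctuation over time satisfies the
# predetermined condition (here, a jump larger than a threshold).

def should_warn(feature_history, jump_threshold):
    """True when any step-to-step change in the feature exceeds the threshold."""
    return any(abs(b - a) > jump_threshold
               for a, b in zip(feature_history, feature_history[1:]))
```

In the actual embodiment the condition could equally be, for example, a sustained deviation from a baseline rather than a single jump; the threshold form here is only illustrative.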
The amplifier 1138 amplifies the audio data sent from the selector 1127 as necessary, and sends it to the speaker units 12L and 12R. The amplifier 1138 has a D/A (digital-to-analog) converter, applies D/A conversion to the audio data as necessary, and outputs it.
The master unit 1 structured as described above outputs the image and sound of a television program (displays the image and outputs the sound) in the following manner.
A television broadcast signal received by the antenna is sent as a transport stream to the demultiplexer 1124 through the tuner 1121, the QPSK demodulation circuit 1122, and the error correction circuit 1123. The demultiplexer 1124 extracts the TS packets of a predetermined program from the transport stream, sends the image-data TS packets to the MPEG video decoder 1125, and sends the audio-data TS packets to the MPEG audio decoder 1126.
The MPEG video decoder 1125 applies MPEG decoding to the TS packets sent from the demultiplexer 1124. The image data obtained thereby is sent from the MPEG video decoder 1125 to the CRT 11 through the selector 1127 and the NTSC encoder 1128, and displayed.
The MPEG audio decoder 1126 applies MPEG decoding to the TS packets sent from the demultiplexer 1124. The audio data obtained thereby is sent from the MPEG audio decoder 1126 to the speaker units 12L and 12R through the selector 1127 and the amplifier 1138, and output.
In the master unit 1, the IEEE-1394 interface 1133 receives TS packets sent from another unit. Among those TS packets, the image-data TS packets and the audio-data TS packets are sent to the MPEG video decoder 1125 and the MPEG audio decoder 1126 through the CPU 1129 and the demultiplexer 1124, and are output (displayed) in the same manner as the image data and audio data of a television broadcast signal.
In the master unit 1, the wireless interface 1137B of the security-system section 1137 receives image data and audio data sent from another unit. The image data received by the wireless interface 1137B is sent to the CRT 11 through the selector 1127 and the NTSC encoder 1128, and displayed. The audio data received by the wireless interface 1137B is sent to the speaker units 12L and 12R through the selector 1127 and the amplifier 1138, and output.
Therefore, in the embodiment shown in Figure 64, the master unit 1 has three input systems for receiving image data and audio data: the tuner 1121, the IEEE-1394 interface 1133, and the wireless interface 1137B. Although not shown in Figure 64, the input terminal 23 shown in Fig. 3F is also provided on the master unit. Therefore, when the input terminal 23 is included, the master unit 1 has four input systems.
The number of input systems provided for the master unit 1 is not limited.
Next, Figure 65 shows an example electrical structure of the subsidiary unit 2 shown in Figure 63A and Figure 63B.
Since the subsidiary unit 2 is formed of a tuner 1141 to an amplifier 1158 structured in the same manner as the tuner 1121 to the amplifier 1138 shown in Figure 64, a description thereof is omitted.
Next, Figure 66 shows an example structure of a security system to which the expandable television system shown in Figure 63A and Figure 63B is applied.
In the embodiment shown in Figure 66, the security system is formed of an expandable television system 1161 having a plurality of television receivers and three video cameras 1162_1, 1162_2, and 1162_3.
The expandable television system 1161 is formed, for example, in the same manner as the expandable television system shown in Figure 63A.
The video cameras 1162_1, 1162_2, and 1162_3 are, for example, digital video cameras; they capture images, collect sound, and send the resulting image data and audio data wirelessly to predetermined television receivers forming the expandable television system 1161.
The security system in the embodiment shown in Figure 66 is formed of three video cameras 1162_1 to 1162_3. The number of video cameras included in the security system is not limited to three. The security system can be provided with one or more video cameras.
In the security system, the maximum number of television receivers which receive image data and audio data from the video cameras 1162 can equal the number of television receivers forming the expandable television system 1161. Therefore, when the security system is provided with more video cameras 1162 than the number of television receivers forming the expandable television system 1161, the expandable television system 1161 cannot receive the image data and audio data sent from any video camera 1162 not corresponding to a television receiver. Each television receiver of the expandable television system 1161 can also switch the video camera from which it receives image data and audio data.
Hereinafter, when there is no need to distinguish the video cameras 1162_1 to 1162_3, they are collectively called video cameras 1162.
Next, Figure 67 shows an example structure of the video camera 1162 shown in Figure 66.
Light from an object is incident on an optical system 1171 formed of a lens, a focus adjustment mechanism, an aperture adjustment mechanism, and the like, and is collected on the light-receiving plane of a CCD (charge-coupled device) 1172. The CCD 1172 applies photoelectric conversion to the light sent from the optical system 1171, and sends the resulting electric signal, serving as image data, to an amplifier 1173. The amplifier 1173 amplifies the image data sent from the CCD 1172, and sends it to an A/D (analog/digital) conversion section 1174. The A/D conversion section 1174 samples and quantizes (A/D-converts) the analog image data sent from the amplifier 1173 to convert it into digital image data, and sends it to a memory 1175. The memory 1175 temporarily stores the image data sent from the A/D conversion section 1174.
A microphone 1176 converts sound, serving as air vibration around the microphone, into an electric audio signal, and sends it to an amplifier 1177. The amplifier 1177 amplifies the audio data sent from the microphone 1176, and sends it to an A/D conversion section 1178. The A/D conversion section 1178 applies A/D conversion to the analog audio data sent from the amplifier 1177 to obtain digital audio data, and sends it to a memory 1179. The memory 1179 temporarily stores the audio data sent from the A/D conversion section 1178.
A wireless interface 1180 is an interface identical to the wireless interface 1137B described with reference to Figure 64, and sends the image data and audio data stored in the memories 1175 and 1179 wirelessly to a predetermined television receiver forming the expandable television system 1161.
The wireless communication between the wireless interfaces 1137B and 1180 can be performed, for example, through TCP/IP (Transmission Control Protocol/Internet Protocol). In this case, IP addresses can be used to specify the television receivers of the expandable television system to which the video cameras 1162_1 to 1162_3 send image data and audio data.
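The TCP/IP transfer described above can be sketched as follows. The patent states only that TCP/IP and IP addressing are used; the camera-to-address mapping, the addresses themselves, and the length-prefixed framing below are our assumptions, added because TCP delivers a byte stream and the receiver must be able to recover image and audio chunk boundaries.

```python
# Hypothetical sketch: each camera is mapped to the IP address of one
# receiver, and each image/audio chunk is length-prefixed so the receiving
# wireless interface can reassemble the chunks from the TCP byte stream.

import struct

RECEIVER_ADDRESSES = {          # assumed mapping: camera id -> receiver IP
    "1162_1": "192.168.0.11",
    "1162_2": "192.168.0.12",
    "1162_3": "192.168.0.13",
}

def frame_chunk(payload: bytes) -> bytes:
    """Prefix a chunk with its 4-byte big-endian length."""
    return struct.pack(">I", len(payload)) + payload

def parse_chunks(stream: bytes):
    """Recover the original chunks from a concatenated byte stream."""
    chunks, pos = [], 0
    while pos < len(stream):
        (length,) = struct.unpack_from(">I", stream, pos)
        pos += 4
        chunks.append(stream[pos:pos + length])
        pos += length
    return chunks
```

Switching the camera a receiver listens to, as mentioned above, would amount to changing which entry of a mapping like `RECEIVER_ADDRESSES` is consulted.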
The video camera 1162 can be a portable camera; in this case, the user can easily install the video camera 1162 so as to shoot a desired location.
The video camera 1162 may also encode the image data and audio data by MPEG or another method before sending them.
Next, the processing of the master unit 1 (Figure 64) serving as a television receiver forming the expandable television system 1161 shown in Figure 66 will be described with reference to the flowchart shown in Figure 68.
First, in step S1041, the CPU 1129 determines whether an event has occurred in which a unit is connected to the terminal panel 21 or a command is sent through the IEEE-1394 interface 1133 or the IR receiving section 1135. When it is determined that no event has occurred, the processing returns to step S1041.
When it is determined in step S1041 that an event has occurred in which a unit is connected to the terminal panel 21, the processing proceeds to step S1042, and the CPU 1129 performs the authentication processing shown in Figure 69, described later. Then, the processing returns to step S1041.
To determine whether a unit has been connected to the terminal panel 21, the state in which the unit is connected to the terminal panel 21 needs to be detected. This detection is performed, for example, in the following manner.
When a unit is connected (by an IEEE-1394 cable) to an IEEE-1394 terminal 21_ij provided on the terminal panel 21 (Fig. 3F), the terminal voltage of the IEEE-1394 terminal 21_ij changes. The IEEE-1394 interface 1133 is configured to report this terminal-voltage change to the CPU 1129. The CPU 1129 receives the report of the terminal-voltage change from the IEEE-1394 interface 1133, and thereby detects the state in which a new unit has been connected to the terminal panel 21. The CPU 1129 also identifies the state in which a unit has been disconnected from the terminal panel 21 by, for example, the same method.
When it is determined in step S1041 that an event has occurred in which a command is sent through the IEEE-1394 interface 1133 or the IR receiving section 1135, the processing proceeds to step S1043, and the master unit 1 executes processing corresponding to the command. Then, the processing returns to step S1041.
Next, the authentication processing performed by the master unit 1 at step S1042 shown in Fig. 68 will be described with reference to the flowchart shown in Fig. 69.
In the authentication processing performed by the master unit 1, two kinds of authentication are performed: whether the unit newly connected to the terminal panel 21 (hereinafter called the connected unit, where appropriate) is an authorized IEEE-1394 unit, and whether that IEEE-1394 unit is a television receiver serving as a master unit or an auxiliary unit (a unit usable in the scalable system).
More specifically, in the authentication processing performed by the master unit 1, first at step S1051, the CPU 1129 controls the IEEE-1394 interface 1133 to send an authentication request command, which requests mutual authentication, to the connected unit, and the processing proceeds to step S1052.
At step S1052, the CPU 1129 determines whether a response to the authentication request command has been returned from the connected unit. When it is determined at step S1052 that a response to the authentication request command has not yet been returned from the connected unit, the processing proceeds to step S1053, and the CPU 1129 determines whether a time-out has occurred, that is, whether a predetermined time has elapsed since the authentication request command was sent.
When it is determined at step S1053 that a time-out has occurred, in other words, when a response to the authentication request command has not been returned from the connected unit even though the predetermined time has elapsed since the authentication request command was sent to the connected unit, the processing proceeds to step S1054. The CPU 1129 determines that the connected unit is not an authorized IEEE-1394 unit and that authentication has failed, and sets the operation mode to a stand-alone mode in which no data is transferred with the connected unit. The processing returns.
Therefore, the master unit 1 thereafter transfers no data with a connected unit that is not an authorized IEEE-1394 unit, and does not perform IEEE-1394 communication with it.
When it is determined at step S1053 that a time-out has not yet occurred, the processing returns to step S1052, and the same processes are repeated thereafter.
When it is determined at step S1052 that a response to the authentication request command has been returned from the connected unit, in other words, when the response from the connected unit has been received by the IEEE-1394 interface 1133 and sent to the CPU 1129, the processing proceeds to step S1055, and the CPU 1129 generates a random number (pseudo-random number) R1 according to a predetermined algorithm and sends it to the connected unit through the IEEE-1394 interface 1133.
Then, the processing proceeds to step S1056, and the CPU 1129 determines whether an encrypted random number E'(R1) has been sent from the connected unit, where the encrypted random number E'(R1) is obtained by encrypting the random number R1 sent at step S1055 according to a predetermined algorithm (a secret-key encryption method such as DES (Data Encryption Standard), FEAL (Fast data Encipherment ALgorithm), or RC5).
When it is determined at step S1056 that the encrypted random number E'(R1) has not yet been sent from the connected unit, the processing proceeds to step S1057, and the CPU 1129 determines whether a time-out has occurred, that is, whether a predetermined time has elapsed since the random number R1 was sent.
When it is determined at step S1057 that a time-out has occurred, in other words, when the encrypted random number E'(R1) has not been returned from the connected unit even though the predetermined time has elapsed after the random number R1 was sent to the connected unit, the processing proceeds to step S1054. The CPU 1129 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the stand-alone mode. The processing returns.
When it is determined at step S1057 that a time-out has not yet occurred, the processing returns to step S1056, and the same processes are repeated thereafter.
When it is determined at step S1056 that the encrypted random number E'(R1) has been returned from the connected unit, in other words, when the encrypted random number E'(R1) from the connected unit has been received by the IEEE-1394 interface 1133 and sent to the CPU 1129, the processing proceeds to step S1058, and the CPU 1129 encrypts the random number R1 generated at step S1055 according to the predetermined encryption algorithm to generate an encrypted random number E(R1). The processing proceeds to step S1059.
At step S1059, the CPU 1129 determines whether the encrypted random number E'(R1) sent from the connected unit equals the encrypted random number E(R1) generated by the CPU 1129 at step S1058.
When it is determined at step S1059 that the encrypted random number E'(R1) does not equal the encrypted random number E(R1), in other words, when the encryption algorithm employed by the connected unit (including, where appropriate, the key used for encryption) differs from the encryption algorithm employed by the CPU 1129, the processing proceeds to step S1054, and the CPU 1129 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the stand-alone mode. The processing returns.
When it is determined at step S1059 that the encrypted random number E'(R1) equals the encrypted random number E(R1), in other words, when the encryption algorithm employed by the connected unit is identical to that employed by the CPU 1129, the processing proceeds to step S1060, and the CPU 1129 determines whether a random number R2, used by the connected unit to authenticate the master unit 1, has been sent from the connected unit.
When it is determined at step S1060 that the random number R2 has not yet been received, the processing proceeds to step S1061, and the CPU 1129 determines whether a time-out has occurred, that is, for example, whether a predetermined time has elapsed after it was determined at step S1059 that the encrypted random number E'(R1) equals the encrypted random number E(R1).
When it is determined at step S1061 that a time-out has occurred, in other words, when the random number R2 has not been sent from the connected unit even though the predetermined time has elapsed, the processing proceeds to step S1054, and the CPU 1129 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the stand-alone mode. The processing returns.
On the other hand, when it is determined at step S1061 that a time-out has not yet occurred, the processing returns to step S1060, and the same processes are repeated thereafter.
When it is determined at step S1060 that the random number R2 has been sent from the connected unit, in other words, when the random number R2 from the connected unit has been received by the IEEE-1394 interface 1133 and sent to the CPU 1129, the processing proceeds to step S1062, and the CPU 1129 encrypts the random number R2 according to the predetermined encryption algorithm to generate an encrypted random number E(R2), and sends it to the connected unit through the IEEE-1394 interface 1133.
When it is determined at step S1060 that the random number R2 has been sent from the connected unit, the connected unit has been successfully authenticated as an authorized IEEE-1394 unit.
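The mutual challenge-response exchange described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the patent names DES, FEAL, or RC5 as the secret-key method, whereas this sketch substitutes HMAC-SHA256 (and an example shared key) purely so it runs with the Python standard library. The step numbers in the comments map each line back to the flowcharts of Figs. 69 and 71.

```python
import hashlib
import hmac
import secrets

# Hypothetical shared secret; stands in for the key the authorized units hold.
SHARED_KEY = b"example-shared-key"

def encrypt(key: bytes, r: bytes) -> bytes:
    # Stand-in for E(.): the patent uses a secret-key cipher such as DES,
    # FEAL, or RC5; HMAC-SHA256 is used here only to keep the sketch runnable.
    return hmac.new(key, r, hashlib.sha256).digest()

def mutual_auth(master_key: bytes, connected_key: bytes) -> bool:
    # S1055: the master sends random number R1 to the connected unit.
    r1 = secrets.token_bytes(16)
    # S1087: the connected unit returns E'(R1).
    e_prime_r1 = encrypt(connected_key, r1)
    # S1058-S1059: the master computes E(R1) and compares it with E'(R1).
    if encrypt(master_key, r1) != e_prime_r1:
        return False  # -> S1054: stand-alone mode
    # S1089: the connected unit sends R2 to authenticate the master.
    r2 = secrets.token_bytes(16)
    # S1062: the master returns E(R2).
    e_r2 = encrypt(master_key, r2)
    # S1092-S1093: the connected unit compares E(R2) with its own E'(R2).
    return e_r2 == encrypt(connected_key, r2)

# Units sharing the algorithm and key authenticate; any other unit falls
# to the stand-alone mode.
print(mutual_auth(SHARED_KEY, SHARED_KEY))        # True
print(mutual_auth(SHARED_KEY, b"different-key"))  # False
```

Because each side both issues and answers a challenge, a unit that merely replays old traffic or lacks the key fails on one of the two comparisons.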
Then, the processing proceeds to step S1063, and the CPU 1129 controls the IEEE-1394 interface 1133 to send to the connected unit a function-information request command, which requests the unit ID and function information of the connected unit, together with the unit ID and function information of the master unit 1.
The unit ID is a unique ID identifying a television receiver serving as the master unit 1 or an auxiliary unit 2.
The function information is information about the functions the unit has, and includes, for example, the types of commands the unit accepts from the outside (for example, which of the commands for controlling the power switch, volume adjustment, channel, brightness, and sharpness it accepts from the outside), whether on-screen display (OSD display) is allowed, whether a mute state is allowed, and whether a sleep state is allowed. In addition, the function information also includes whether the unit has the function of a master unit or the function of an auxiliary unit.
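As a reading aid, the kinds of information listed above can be pictured as a record like the following. All field names here are hypothetical: the patent describes only what the function information carries, not a concrete layout or encoding.

```python
from dataclasses import dataclass, field

@dataclass
class FunctionInfo:
    # Hypothetical field names; the patent does not fix a wire format.
    accepted_commands: set = field(default_factory=set)  # e.g. {"power", "volume"}
    osd_allowed: bool = False    # whether on-screen display is allowed
    mute_allowed: bool = False   # whether a mute state is allowed
    sleep_allowed: bool = False  # whether a sleep state is allowed
    is_master: bool = False      # has the function of a master unit
    is_auxiliary: bool = False   # has the function of an auxiliary unit

master_info = FunctionInfo(
    accepted_commands={"power", "volume", "channel", "brightness", "sharpness"},
    osd_allowed=True, mute_allowed=True, sleep_allowed=True, is_master=True)

# S1065/S1066-style checks the master makes on the exchanged information:
print(master_info.is_auxiliary)  # False
print(master_info.is_master)     # True
```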
The master unit 1 can store the unit ID and the function information, for example, in the EEPROM 1130 or in the vendor_dependent_information field of the configuration ROM shown in Fig. 15.
Then, the processing proceeds to step S1064, and the CPU 1129 waits for the unit ID and function information of the connected unit, which arrive in response to the function-information request command sent to the connected unit at step S1063. The CPU 1129 receives the unit ID and function information through the IEEE-1394 interface 1133 and stores them in the EEPROM 1130, and the processing proceeds to step S1065.
At step S1065, the CPU 1129 refers to the function information stored in the EEPROM 1130 to determine whether the connected unit is an auxiliary unit. When it is determined at step S1065 that the connected unit is an auxiliary unit, in other words, when the connected unit has been successfully authenticated as an auxiliary unit, the processing skips steps S1066 and S1067 and proceeds to step S1068. The CPU 1129 sets the operation mode to a full-screen-display-enabled mode, in which full-screen display is allowed. The processing returns.
On the other hand, when it is determined at step S1065 that the connected unit is not an auxiliary unit, the processing proceeds to step S1066, and the CPU 1129 refers to the function information stored in the EEPROM 1130 to determine whether the connected unit is a master unit. When it is determined at step S1066 that the connected unit is a master unit, in other words, when the connected unit has been successfully authenticated as a master unit, the processing proceeds to step S1067, and the CPU 1129 performs master-unit/auxiliary-unit arbitration processing with the connected unit serving as a master unit.
Specifically, in this case, since the master unit 1 is connected to another master unit, two television receivers serving as master units exist among the television receivers forming the scalable television system. In the present embodiment, only one master unit is allowed in a scalable television system. Therefore, at step S1067, master-unit/auxiliary-unit arbitration processing is performed to determine which of the master unit 1 and the connected unit serving as a master unit is to be used as the television receiver serving as the master unit.
More specifically, for example, the master unit that became part of the scalable television system earliest, that is, the master unit 1 in the present embodiment, is determined to be the television receiver serving as the master unit. The other master units not so determined are used as auxiliary units.
After the master-unit/auxiliary-unit arbitration processing is performed at step S1067, the processing proceeds to step S1068, and the CPU 1129 sets the operation mode to the full-screen-display-enabled mode as described above, and the processing returns.
When it is determined at step S1066 that the connected unit is not a master unit, in other words, when the connected unit is neither a master unit nor an auxiliary unit and therefore is not authenticated as a master unit or an auxiliary unit, the processing proceeds to step S1069, and the CPU 1129 sets the operation mode to an ordinary-command exchange mode, in which the existing AV/C command set can be transferred with the connected unit but the control commands for providing the full-screen display function cannot, and the processing returns.
In other words, in this case, since the connected unit is neither a master unit nor an auxiliary unit, the full-screen display function is not provided even when the connected unit is connected to the master unit 1. However, since the connected unit is an authorized IEEE-1394 unit, the transfer of the existing AV/C command set between the master unit 1 and the connected unit is allowed. Thus, in this case, one of the master unit 1 and the connected unit (or another IEEE-1394 unit connected to the master unit 1) can control the other by the existing AV/C command set.
Next, the processing of an auxiliary unit 2 (Fig. 65), a television receiver forming the scalable television system shown in Fig. 66, will be described with reference to the flowchart shown in Fig. 70.
First, at step S1071, the CPU 1149 determines whether an event has occurred in which a unit is connected to the terminal panel 41 or a command is sent through the IEEE-1394 interface 1153 or the IR receiving section 1155. When it is determined that no event has occurred, the processing returns to step S1071.
When it is determined at step S1071 that an event has occurred in which a unit is connected to the terminal panel 41, the processing proceeds to step S1072, and the CPU 1149 performs the authentication processing shown in Fig. 71, described later. Then, the processing returns to step S1071.
To determine whether a unit has been connected to the terminal panel 41, the state in which the unit is connected to the terminal panel 41 needs to be detected. This detection is performed in the same way as described for step S1041 shown in Fig. 68.
When it is determined at step S1071 that an event has occurred in which a command is sent through the IEEE-1394 interface 1153 or the IR receiving section 1155, the processing proceeds to step S1073, and the auxiliary unit 2 performs processing corresponding to the command. Then, the processing returns to step S1071.
Next, the authentication processing performed by the auxiliary unit 2 at step S1072 shown in Fig. 70 will be described with reference to the flowchart shown in Fig. 71.
In the authentication processing performed by the auxiliary unit 2, two kinds of authentication are performed: whether the unit newly connected to the terminal panel 41 (hereinafter called the connected unit, where appropriate) is an authorized IEEE-1394 unit, and whether that IEEE-1394 unit is a master unit.
More specifically, in the authentication processing performed by the auxiliary unit 2, first at step S1081, the CPU 1149 determines whether an authentication request command, which requests mutual authentication, has been sent from the connected unit. When it is determined that the command has not yet been sent, the processing proceeds to step S1082.
At step S1082, the CPU 1149 determines whether a time-out has occurred, that is, whether a predetermined time has elapsed since the authentication processing was started.
When it is determined at step S1082 that a time-out has occurred, in other words, when the authentication request command has not been sent from the connected unit even though the predetermined time has elapsed after the authentication processing was started, the processing proceeds to step S1083. The CPU 1149 determines that the connected unit is not an authorized IEEE-1394 unit and that authentication has failed, and sets the operation mode to the stand-alone mode in which no data is transferred with the connected unit. The processing returns.
Therefore, in the same way as the master unit 1, the auxiliary unit 2 transfers no data with a connected unit that is not an authorized IEEE-1394 unit, and does not perform IEEE-1394 communication with it.
When it is determined at step S1082 that a time-out has not yet occurred, the processing returns to step S1081, and the same processes are repeated thereafter.
When it is determined at step S1081 that the authentication request command has been sent from the connected unit, in other words, when the authentication request command sent at step S1051 shown in Fig. 69 by the master unit 1 serving as the connected unit has been received by the IEEE-1394 interface 1153 and sent to the CPU 1149, the processing proceeds to step S1084, and the CPU 1149 controls the IEEE-1394 interface 1153 to send a response to the authentication request command to the connected unit.
In the present embodiment, the master unit 1 performs the processing of steps S1051 to S1053 shown in Fig. 69, and the auxiliary unit 2 performs the processing of steps S1081, S1082, and S1084 shown in Fig. 71. It is also possible for the auxiliary unit 2 to perform the processing of steps S1051 to S1053 shown in Fig. 69 and for the master unit 1 to perform the processing of steps S1081, S1082, and S1084 shown in Fig. 71.
Then, the processing proceeds to step S1085, and the CPU 1149 determines whether the random number R1 has been sent from the connected unit. When it is determined that the random number R1 has not yet been sent, the processing proceeds to step S1086.
At step S1086, the CPU 1149 determines whether a time-out has occurred, that is, whether a predetermined time has elapsed since the CPU 1149 sent the response to the authentication request command at step S1084.
When it is determined at step S1086 that a time-out has occurred, in other words, when the random number R1 has not been sent from the connected unit even though the predetermined time has elapsed after the CPU 1149 sent the response to the authentication request command, the processing proceeds to step S1083. The CPU 1149 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the stand-alone mode in which no data is transferred with the connected unit. The processing returns.
When it is determined at step S1086 that a time-out has not yet occurred, the processing returns to step S1085, and the same processes are repeated thereafter.
When it is determined at step S1085 that the random number R1 has been sent from the connected unit, in other words, when the random number R1 sent at step S1055 shown in Fig. 69 by the master unit 1 serving as the connected unit has been received by the IEEE-1394 interface 1153 and sent to the CPU 1149, the processing proceeds to step S1087, and the CPU 1149 encrypts the random number R1 according to the predetermined encryption algorithm to generate the encrypted random number E'(R1). In addition, at step S1087, the CPU 1149 controls the IEEE-1394 interface 1153 to send the encrypted random number E'(R1) to the connected unit. The processing proceeds to step S1089.
At step S1089, the CPU 1149 generates a random number (pseudo-random number) R2, and controls the IEEE-1394 interface 1153 to send the random number R2 to the connected unit. The processing proceeds to step S1090.
At step S1090, the CPU 1149 determines whether an encrypted random number E(R2) has been sent from the connected unit, where the encrypted random number E(R2) is generated by the master unit 1 serving as the connected unit by encrypting the random number R2 at step S1062 shown in Fig. 69.
When it is determined at step S1090 that the encrypted random number E(R2) has not yet been received, the processing proceeds to step S1091, and the CPU 1149 determines whether a time-out has occurred, that is, whether a predetermined time has elapsed after the CPU 1149 sent the random number R2.
When it is determined at step S1091 that a time-out has occurred, in other words, when the encrypted random number E(R2) has not been sent from the connected unit even though the predetermined time has elapsed after the CPU 1149 sent the random number R2 to the connected unit, the processing proceeds to step S1083, and the CPU 1149 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the stand-alone mode. The processing returns.
On the other hand, when it is determined at step S1091 that a time-out has not yet occurred, the processing returns to step S1090, and the same processes are repeated thereafter.
When it is determined at step S1090 that the encrypted random number E(R2) has been sent from the connected unit, in other words, when the encrypted random number E(R2) from the connected unit has been received by the IEEE-1394 interface 1153 and sent to the CPU 1149, the processing proceeds to step S1092, and the CPU 1149 encrypts the random number R2 generated at step S1089 according to the predetermined encryption algorithm to generate an encrypted random number E'(R2). The processing proceeds to step S1093.
At step S1093, the CPU 1149 determines whether the encrypted random number E(R2) sent from the connected unit equals the encrypted random number E'(R2) generated by the auxiliary unit 2 at step S1092.
When it is determined at step S1093 that the encrypted random number E(R2) does not equal the encrypted random number E'(R2), in other words, when the encryption algorithm employed by the connected unit (including, where appropriate, the key used for encryption) differs from the encryption algorithm employed by the CPU 1149, the processing proceeds to step S1083, and the CPU 1149 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the stand-alone mode. The processing returns.
When it is determined at step S1093 that the encrypted random number E(R2) equals the encrypted random number E'(R2), in other words, when the encryption algorithm employed by the connected unit is identical to that employed by the CPU 1149 and the connected unit has thereby been successfully authenticated as an authorized IEEE-1394 unit, the processing proceeds to step S1094, and the CPU 1149 receives, through the IEEE-1394 interface 1153, the unit ID and function information sent together with the function-information request command by the master unit 1 serving as the connected unit at step S1063 shown in Fig. 69, and stores them in the EEPROM 1150.
Then, the processing proceeds to step S1095, and the CPU 1149, in response to the function-information request command received from the connected unit at step S1094, controls the IEEE-1394 interface 1153 to send the unit ID and function information of the auxiliary unit 2 to the connected unit. Then, the processing proceeds to step S1096.
The auxiliary unit 2 can store the unit ID and function information in the EEPROM 1150 or in the vendor_dependent_information field of the configuration ROM shown in Fig. 15, in the same way as described for the master unit 1 with reference to Fig. 69.
At step S1096, the CPU 1149 refers to the function information stored in the EEPROM 1150 to determine whether the connected unit is a master unit. When it is determined at step S1096 that the connected unit is a master unit, in other words, when the connected unit has been successfully authenticated as a master unit, the processing proceeds to step S1097, and the CPU 1149 sets the operation mode to the full-screen-display-enabled mode described for step S1068 of Fig. 69. The processing returns.
On the other hand, when it is determined at step S1096 that the connected unit is not a master unit, in other words, when the connected unit is not authenticated as a master unit, the processing proceeds to step S1098, and the CPU 1149 sets the operation mode to the ordinary-command exchange mode, in which the existing AV/C command set can be transferred with the connected unit but the control commands for providing the full-screen display function cannot, and the processing returns.
In other words, in this case, since the connected unit is not a master unit, the full-screen display function is not provided even when the connected unit is connected to the auxiliary unit 2. Accordingly, merely connecting another auxiliary unit to the auxiliary unit 2 does not provide the full-screen display function. However, since the connected unit is an authorized IEEE-1394 unit, the transfer of the existing AV/C command set between the auxiliary unit 2 and the connected unit is allowed. Thus, in this case, one of the auxiliary unit 2 and the connected unit (including another auxiliary unit) can control the other by the existing AV/C command set.
Next, when the master unit 1 and the auxiliary units 2, the television receivers forming the scalable television system shown in Fig. 66, have successfully completed the authentication processing described with reference to Figs. 69 and 71, respectively, and their operation modes have been set to the full-screen-display-enabled mode, the scalable television system 1161 can perform full-screen display as shown in Figs. 72A and 72B.
In other words, for example, when the master unit 1 is displaying image data as shown in Fig. 72A, if the controller 15 (or 35) is operated to direct full-screen display, the scalable television system 1161 displays the image data shown on the master unit 1 across the entire display screens of the television receivers forming the scalable television system, as shown in Fig. 72B.
Specifically, when the master unit 1 is outputting the image and sound of a program (displaying the image and outputting the sound), for example, if the user operates the menu button switch 54 on the controller 15 (Fig. 7) to turn it on, the controller 15 emits infrared rays corresponding to the user's operation. The infrared rays are received by the IR receiving section 1135 of the master unit 1 (Fig. 64), and a menu screen is displayed on the CRT 11 of the master unit 1. An icon indicating full-screen display (called a full-screen-display icon, where appropriate) is displayed, for example, on the menu screen. When the user operates the controller 15 to click the full-screen-display icon, the master unit 1 and the auxiliary units 2 perform full-screen display.
In other words, in this case, the CPU 1129 (Fig. 64) of the master unit 1 obtains, from the image data displayed on the CRT 11, the region of the image data to be displayed by the master unit 1 and the region of the image data to be displayed by each auxiliary unit 2_ij. In addition, the CPU 1129 of the master unit 1 controls the IEEE-1394 interface 1133 to send the image data to be displayed by each auxiliary unit 2_ij to that auxiliary unit, together with a full-screen-display command directing that full-screen display be performed. The CPU 1129 of the master unit 1 converts the image data of the region to be displayed by the master unit 1, for example by interpolation, into image data having the full size of the CRT 11, sends it to the CRT 11 through the selector 1127 and the NTSC encoder 1128, and displays it there. According to the full-screen-display command sent from the master unit 1, each auxiliary unit 2_ij performs the same processing as the master unit 1 to display the image data sent from the master unit 1 on its entire CRT 31.
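The region assignment above can be sketched as follows for a receiver grid. This is a hypothetical helper under stated assumptions: the patent does not fix the grid size or a frame resolution, so a 3×3 grid and a 720×480 frame are used only as examples, with the master at the center position.

```python
def split_regions(width: int, height: int, rows: int = 3, cols: int = 3):
    """Return the pixel rectangle (left, top, right, bottom) of the source
    frame that the receiver at grid position (i, j) should enlarge to fill
    its own screen. Positions are 1-based, matching the 2_ij numbering."""
    regions = {}
    for i in range(rows):
        for j in range(cols):
            left, right = j * width // cols, (j + 1) * width // cols
            top, bottom = i * height // rows, (i + 1) * height // rows
            regions[(i + 1, j + 1)] = (left, top, right, bottom)
    return regions

regions = split_regions(720, 480)
# With the master unit 1 at the center of a 3x3 system, position (2, 2):
print(regions[(2, 2)])  # (240, 160, 480, 320)
```

Each receiver then interpolates its assigned rectangle up to its full screen size, so the nine enlarged tiles together reproduce the original frame.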
By the foregoing operations, the television receivers forming the scalable television system 1161 display the image data on their entire display screens, as shown in Fig. 72B.
When the image data displayed across the entire display screens of the television receivers forming the scalable television system 1161 is called full-screen image data, the scalable television system 1161 does not actually display part of the full-screen image data. Since the television receivers forming the scalable television system 1161 have housings, the housings lie at the boundaries between adjacent television receivers, and no image is displayed at these boundaries. In Figs. 72A and 72B, the housings at the boundaries between adjacent television receivers are omitted to simplify the figures. In reality, the housings are present at those boundaries; therefore, although a housing is part of a television receiver, the full-screen image data is not displayed on the housings, and the image data is divided.
However, since human vision has an interpolation function that fills in, from the peripheral parts of an image, a part of the image hidden by thin lines, the above problem is not a serious one when the full-screen image data is viewed.
After the full-screen image data is displayed as described above, when the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and clicks the full-screen-display icon on the menu screen, infrared rays indicating a full-screen-display termination command, which directs that full-screen display be stopped, are emitted from the controller 15 in correspondence with the operation, received by the IR receiving section 1135, and sent to the CPU 1129. The display of the master unit 1 then returns, for example, to the ordinary-size display shown in Fig. 72A. In this case, the full-screen-display termination command is sent from the master unit 1 to each auxiliary unit 2_ij through the IEEE-1394 interface 1133, so that the display of each auxiliary unit 2_ij returns to its original state.
In the present embodiment, the full-screen display function is provided only when the authentication described with reference to Figs. 69 and 71 is successfully performed. It is, however, also possible to provide the full-screen display function even when authentication fails.
Next, the processing for warning the user of the occurrence of an emergency, performed by the safety-system section 1137 (Fig. 64) of the master unit 1, will be described.
The safety-system section 1157 (Fig. 65) of the auxiliary unit 2 also performs processing for warning the user of the occurrence of an emergency. Since this processing is the same as that performed by the safety-system section 1137 of the master unit 1, a description thereof is omitted.
The processing for warning the user of the occurrence of an emergency can be performed independently by a television receiver serving as the master unit 1 or an auxiliary unit. In the same way as for the full-screen display function, it is also possible to perform this processing only when the authentication described with reference to Figs. 69 and 71 is successfully performed.
Fig. 73 shows an example structure of the data processing section 1137C of the safety-system section 1137 shown in Fig. 64.
In the safety-system section 1137, the image data and audio data sent from the cameras 1162 and received by the wireless interface 1137B are sent to an image processing section 1191 and an audio processing section 1192, respectively.
The image processing section 1191 detects the feature of the image data sent from the wireless interface 1137B, for example in units of frames or fields, and sends it to a fluctuation calculation section 1194.
More specifically, the image processing section 1191 is formed of a frame memory 1201, a difference calculation section 1202, and a difference storage 1203. The frame memory 1201 temporarily stores the image data sent from the wireless interface 1137B under the control of a memory controller 1193. The difference calculation section 1202 sequentially sets each of the time-sequential frames of image data stored in the frame memory 1201 as a designated frame, calculates the differences between the image data of the designated frame and the image data of the frame one frame earlier than the designated frame (called the preceding frame, where appropriate), and sends the difference image data formed of those differences to the difference storage 1203. The difference storage 1203 temporarily stores, in units of frames, the difference image data sent from the difference calculation section 1202.
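The frame-to-frame operation of the difference calculation section 1202 amounts to a per-pixel absolute difference. A minimal sketch, with frames represented as lists of rows of pixel values (the patent itself works on hardware frame memories, not Python lists):

```python
def difference_image(designated, preceding):
    """Per-pixel absolute difference between the designated frame and the
    preceding frame, as computed by the difference calculation section 1202."""
    return [[abs(a - b) for a, b in zip(row_d, row_p)]
            for row_d, row_p in zip(designated, preceding)]

prev = [[10, 10], [10, 10]]
curr = [[10, 50], [8, 10]]
print(difference_image(curr, prev))  # [[0, 40], [2, 0]]
```

A static scene yields a difference image of zeros, so large values flag motion between consecutive frames.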
The audio processing section 1192 detects the feature of the audio data sent from the wireless interface 1137B in units of frames, each formed of a predetermined number of samples, and sends it to the fluctuation calculation section 1194.
More specifically, the audio processing section 1192 is formed of a ring buffer 1206, an integration calculation section 1207, and an integration memory 1208. The ring buffer 1206 temporarily stores the audio data sent from the wireless interface 1137B under the control of the memory controller 1193. The integration calculation section 1207 sequentially sets each of a plurality of frames, each formed of a predetermined number of time-sequential audio data samples stored in the ring buffer 1206, as a designated frame, integrates (adds) the audio data samples forming the designated frame, and sends the integrated value to the integration memory 1208. The integration memory 1208 temporarily stores, in units of frames, the audio-data integrated values sent from the integration calculation section 1207.
Instead of calculating the integrated value of the audio data in units of frames, the integration calculation section 1207 can also, for example, extract a predetermined sample of the audio data forming each frame.
Memory Controller 1193 divides 1192 circular buffer 1206 control writing address and reading address for the frame memory 1201 of image processing section 1191 and audio treatment part.
Fluctuation calculating section 1194 reads error image data take frame as unit as the characteristic quantity of frame image data from the Difference Storage device 1203 of image processing section 1191, and divides 1192 integration memory 1208 to read integrated value take frame as unit as the characteristic quantity of frame voice data from audio treatment part.In addition, fluctuation calculating section 1194 obtains wave component in the view data characteristic quantity and the wave component in the voice data characteristic quantity, and they is sent to the warning processing section 1137D (Figure 64) of rear one-level as fluctuation information.
In the data processing section 1137C constructed as described above, image-data processing is performed on the image data sent from the wireless interface 1137B, and audio-data processing is performed on the audio data.
The image-data processing and the audio-data processing performed by the data processing section 1137C will be described with reference to the flowcharts shown in Figure 74A and Figure 74B.
First, the image-data processing will be described with reference to the flowchart of Figure 74A.
In the image-data processing, first, at step S1101, the frame memory 1201 stores the frame image data sent from the wireless interface 1137B. The processing proceeds to step S1102. At step S1102, the difference calculating section 1202 sets, as a designated frame, the image-data frame stored in the frame memory 1201 in the immediately preceding processing of step S1101, reads the image data of the designated frame, and also reads the image data of the preceding frame, which was stored in the frame memory 1201 in the processing of step S1101 one iteration before. In addition, at step S1102, the difference calculating section 1202 subtracts the pixel value of each corresponding pixel forming the preceding-frame image data from the pixel value of each pixel forming the designated-frame image data, and takes the absolute values of the resulting differences to generate difference-image data having these absolute values as pixel values. The difference-image data of the designated frame is sent to the difference memory 1203 and stored therein.
Then, the processing proceeds to step S1103, and the fluctuation calculating section 1194 reads the difference-image data of the designated frame from the difference memory 1203 and binarizes it. More specifically, the fluctuation calculating section 1194 compares the pixel value of each pixel forming the difference-image data with a predetermined threshold. When the pixel value is greater than (or equal to or greater than) the predetermined threshold, the fluctuation calculating section 1194 sets the pixel value to, for example, "1", and when the pixel value is equal to or less than (or less than) the predetermined threshold, the fluctuation calculating section 1194 sets the pixel value to, for example, "0". The processing proceeds to step S1104.
At step S1104, the fluctuation calculating section 1194 counts specific pixels in the designated-frame image data having the pixel values "0" and "1" obtained as described above (called binary image data, if necessary). More specifically, of the pixel values "0" and "1" in the binary image data, the fluctuation calculating section 1194 counts the pixels having the pixel value of, for example, "1", and outputs the count to the warning processing section 1137D as the fluctuation information of the designated-frame image data.
Then, the section waits for the image data of the next frame to be sent from the wireless interface 1137B. The processing returns to step S1101, and the same procedure is repeated thereafter.
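The per-frame image-data processing of steps S1102 to S1104 (difference image, binarization, pixel counting) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name, the list-of-lists frame representation, and the threshold value are illustrative assumptions.

```python
def image_fluctuation(prev_frame, curr_frame, threshold):
    """Return the fluctuation information of a frame: the number of pixels
    whose absolute inter-frame difference exceeds `threshold`."""
    count = 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            diff = abs(c - p)                       # difference-image pixel value (S1102)
            binary = 1 if diff > threshold else 0   # binarization (S1103)
            count += binary                         # count pixels set to "1" (S1104)
    return count
```

A frame of a nearly static scene yields a small count, while large motion between frames yields a large count, which is what the fluctuation information is meant to capture.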
Next, the audio-data processing will be described with reference to the flowchart shown in Figure 74B.
In the audio-data processing, first, at step S1111, the ring buffer 1206 stores the frame audio data sent from the wireless interface 1137B. The processing proceeds to step S1112. At step S1112, the integration operation section 1207 sets, as a designated frame, the audio-data frame stored in the ring buffer 1206 in the immediately preceding processing of step S1111, reads the audio data of the designated frame, and calculates the sum of the audio-data samples of the designated frame, that is, their integral. The integrated value of the audio data is sent to the integration memory 1208 and stored therein.
Then, the processing proceeds to step S1113, and the fluctuation calculating section 1194 reads the integrated value of the audio data of the designated frame and the integrated value of the audio data of the frame one frame earlier than the designated frame (the preceding frame), and obtains the difference between them. The fluctuation calculating section 1194 outputs this difference to the warning processing section 1137D as the fluctuation information of the designated-frame audio data.
Then, the section waits for the audio data of the next frame to be sent from the wireless interface 1137B. The processing returns to step S1111, and the same procedure is repeated thereafter.
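The audio-data processing above (per-frame integration at S1112, frame-to-frame difference at S1113) can be sketched as below. This is an illustrative sketch under the assumption of a flat list of samples and a fixed frame size; the function names are not from the patent.

```python
def frame_integrals(samples, frame_size):
    """S1112: sum (integrate) the audio samples of each fixed-length frame."""
    return [sum(samples[i:i + frame_size])
            for i in range(0, len(samples) - frame_size + 1, frame_size)]

def audio_fluctuation(integrals):
    """S1113: fluctuation info is the difference between each frame's
    integrated value and that of the preceding frame."""
    return [integrals[i] - integrals[i - 1] for i in range(1, len(integrals))]
```

A steady sound level gives differences near zero, while a sudden loud sound produces a large difference for the frame in which it begins.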
Next, Figure 75 shows an example structure of the warning processing section 1137D shown in Figure 64.
The fluctuation-information storage section 1211 temporarily stores the fluctuation information of the image data and the audio data sent from the data processing section 1137C as described with reference to Figure 73, Figure 74A, and Figure 74B.
The fluctuation-information analysis section 1212 analyzes the fluctuation information of the image data and the audio data stored in the fluctuation-information storage section 1211 to obtain the temporal fluctuation components of the feature quantities of the image data and the audio data, and sends them to the abnormality determination section 1213.
The abnormality determination section 1213 determines whether the temporal fluctuation components of the feature quantities of the image data and the audio data sent from the fluctuation-information analysis section 1212 satisfy the abnormality condition stored in the abnormality-condition storage section 1214, and sends the determination result to the warning-processing request section 1215.
According to the determination result sent from the abnormality determination section 1213, the warning-processing request section 1215 requests the security controller 1137A (Figure 64) to perform warning processing for warning the user that an emergency (abnormality) has occurred.
The abnormality-condition storage section 1214 stores the abnormality condition that the temporal fluctuation components of the feature quantities of the image data and the audio data should satisfy when the abnormality determination section 1213 determines that an abnormality has occurred.
The abnormality-condition storage section 1214 may store an abnormality condition specified in advance, or an abnormality condition specified by the user. Specifically, the user can operate, for example, the controller 15 to input an abnormality condition. The abnormality condition input by operating the controller 15 is sent through the IR receiving section 1135, the CPU 1129, and the security controller 1137A to the abnormality-condition storage section 1214 of the warning processing section 1137D, and is stored therein.
In the warning processing section 1137D constructed as described above, whether an abnormality has occurred is determined according to the fluctuation information of the image data and the audio data, and, when an abnormality has occurred, abnormality determination/warning processing for warning the user of the abnormality is performed.
The abnormality determination/warning processing performed by the warning processing section 1137D will be described with reference to the flowchart shown in Figure 76.
The fluctuation-information storage section 1211 continuously receives the fluctuation information of the image data and the audio data output from the data processing section 1137C at the preceding stage, and temporarily stores the fluctuation information.
At step S1121, the fluctuation-information analysis section 1212 analyzes the fluctuation information of the image data and the audio data stored in the fluctuation-information storage section 1211 to obtain the temporal fluctuation components of the feature quantities of the image data and the audio data, and sends them to the abnormality determination section 1213. The processing proceeds to step S1122.
At step S1122, the abnormality determination section 1213 determines whether the temporal fluctuation components of the feature quantities of the image data and the audio data sent from the fluctuation-information analysis section 1212 satisfy the abnormality condition stored in the abnormality-condition storage section 1214.
When it is determined at step S1122 that the temporal fluctuation components of the feature quantities of the image data and the audio data sent from the fluctuation-information analysis section 1212 do not satisfy the abnormality condition, the section waits for the fluctuation information of the next image data and audio data to be stored in the fluctuation-information storage section 1211. The processing returns to step S1121.
When it is determined at step S1122 that the temporal fluctuation components of the feature quantities of the image data and the audio data sent from the fluctuation-information analysis section 1212 satisfy the abnormality condition, the abnormality determination section 1213 sends a determination result indicating that the abnormality condition is satisfied to the warning-processing request section 1215. The processing proceeds to step S1123.
At step S1123, in response to the determination result, sent from the abnormality determination section 1213, indicating that the abnormality condition is satisfied, the warning-processing request section 1215 requests the security controller 1137A (Figure 64) to perform warning processing for warning the user that an emergency has occurred. The section waits for the fluctuation information of the next image data and audio data to be stored in the fluctuation-information storage section 1211, and the processing returns to step S1121.
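The loop of steps S1121 to S1123 can be sketched as an event loop that accumulates fluctuation information and issues a warning request whenever the stored condition is met. This is a hypothetical skeleton; the callable `condition` stands in for whatever abnormality condition is stored in the abnormality-condition storage section 1214.

```python
def monitor(fluctuation_stream, condition):
    """For each incoming fluctuation value, store it (S1121), check the
    abnormality condition against the history (S1122), and record a warning
    request (S1123) when the condition holds. Returns the indices at which
    warnings were requested."""
    warnings = []
    history = []
    for value in fluctuation_stream:
        history.append(value)                  # S1121: store/analyze
        if condition(history):                 # S1122: condition check
            warnings.append(len(history) - 1)  # S1123: request warning
    return warnings
```

Note that, as in the flowchart, the loop keeps running after a warning: monitoring resumes at S1121 rather than terminating.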
Next, the processing of the abnormality determination section 1213 will be further described with reference to Figure 77A to Figure 77D and Figure 78A to Figure 78I.
The abnormality determination section 1213 determines that the abnormality condition is satisfied when the fluctuation information of the image data or the audio data changes, for example, in a manner different from before.
Specifically, it is determined that the abnormality condition is satisfied when, for example, fluctuation information that has not been changing significantly suddenly changes significantly, or when fluctuation information that has been changing to a certain extent suddenly stops changing.
Whether fluctuation information that has not been changing significantly has suddenly changed significantly can be determined, for example, by having the fluctuation-information analysis section 1212 differentiate the fluctuation information (sequentially calculate the differences between temporally adjacent pieces of fluctuation information) and having the abnormality determination section 1213 determine whether the absolute value of the derivative is equal to or greater than a predetermined threshold. In this case, the predetermined threshold is stored in the abnormality-condition storage section 1214 as the abnormality condition.
Whether fluctuation information that has been changing to a certain extent has suddenly stopped changing can be determined, for example, by having the abnormality determination section 1213 determine whether the fluctuation information continues to have a value close to zero for a period equal to or longer than a predetermined time. In this case, the predetermined time is stored in the abnormality-condition storage section 1214 as the abnormality condition.
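The two abnormality conditions just described can be sketched as simple predicates over the sequence of fluctuation values. A minimal sketch, assuming a list of per-frame fluctuation values; the threshold, epsilon, and duration parameters are illustrative placeholders for the values stored in the abnormality-condition storage section 1214.

```python
def sudden_change(fluct, threshold):
    """Condition 1: the absolute value of the first difference (the
    'derivative') of the fluctuation info reaches the threshold."""
    return any(abs(fluct[i] - fluct[i - 1]) >= threshold
               for i in range(1, len(fluct)))

def stays_near_zero(fluct, eps, min_frames):
    """Condition 2: the fluctuation info stays close to zero for at least
    min_frames consecutive frames (the 'predetermined time')."""
    run = 0
    for v in fluct:
        run = run + 1 if abs(v) <= eps else 0
        if run >= min_frames:
            return True
    return False
```

In the fall-detection example below, both predicates fire in sequence: a large jump followed by a near-zero stretch.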
Figure 77A to Figure 77D show examples of image data captured by the video camera 1162 and its fluctuation information.
For example, as shown in Figure 77A, when the video camera 1162 is capturing an image of an elderly person walking indoors, the fluctuation information of the image data changes moderately, as shown in Figure 77B. When, as shown in Figure 77C, the video camera 1162 captures an image of the elderly person who has been walking but suddenly falls, the fluctuation information of the image data changes significantly and then becomes almost zero, as shown in Figure 77D.
Therefore, when the state in which the fluctuation information of the image data suddenly exceeds a predetermined threshold, that is, the state in which the derivative of the fluctuation information of the image data becomes equal to or greater than a predetermined threshold, is set as the abnormality condition, the abnormal state in which the elderly person has fallen can be detected. By warning the user of this abnormal state, the elderly person can be attended to (rescued) quickly.
As shown in Figure 77C, if a person suddenly falls, the fluctuation information of the image data suddenly exceeds the predetermined threshold and then becomes close to zero. In other words, the derivative of the fluctuation information of the image data becomes equal to or greater than the predetermined threshold, and the fluctuation information of the image data thereafter becomes close to zero. When the state in which the fluctuation information stays close to zero for the predetermined time or longer is set as the abnormality condition, the abnormal state in which a person has fallen can also be detected.
Figure 78A to Figure 78I show examples of image data and audio data captured by the video camera 1162 and their fluctuation information.
For example, as shown in Figure 78A, when the video camera 1162 is capturing an image of a baby crawling indoors, the fluctuation information of the image data and the audio data changes moderately, as shown in Figure 78B and Figure 78C. When, as shown in Figure 78D, the video camera 1162 captures an image of the baby who has been crawling indoors but has fallen asleep, the fluctuation information of the image data and the audio data becomes almost zero and changes very little, as shown in Figure 78E and Figure 78F.
Therefore, in this case, when the state in which the fluctuation information of the image data and the audio data becomes close to zero and stays close to zero for a predetermined time is set as the abnormality condition, the abnormal state in which the baby has fallen asleep can be detected. By warning the user of this abnormal state, the user can quickly take some action, such as covering the baby with a blanket.
As shown in Figure 78G, if the sleeping baby suddenly wakes up and starts crying, the fluctuation information of the image data and the audio data captured by the video camera 1162 becomes as shown in Figure 78H and Figure 78I.
In this case, the crying baby moves more than when asleep in bed, but less than when crawling, so the fluctuation information of the image data does not change much, as shown in Figure 78H.
However, when the sleeping baby starts crying, since the baby's crying continues, the fluctuation information of the audio data suddenly takes a large value and keeps the large value for a certain period, as shown in Figure 78I.
Therefore, in this case, when the state in which the fluctuation information of the audio data suddenly becomes a large value and keeps the large value for a certain period is set as the abnormality condition, the abnormal state in which the baby has woken up and is crying can be detected. By warning the user of this abnormal state, it can be quickly reported that the baby is awake.
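The crying-baby condition (audio fluctuation suddenly large and sustained) can be sketched as one more predicate in the same style as the earlier two. The level and duration parameters are illustrative assumptions, not values given in the patent.

```python
def loud_and_sustained(fluct, level, min_frames):
    """Abnormality condition for the crying-baby example: the audio
    fluctuation info reaches a large value and stays at or above it for
    min_frames consecutive frames."""
    run = 0
    for v in fluct:
        run = run + 1 if v >= level else 0
        if run >= min_frames:
            return True
    return False
```

A brief spike (a door slamming, say) does not satisfy this condition, while continuous crying does, which is what distinguishes it from the simple sudden-change condition.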
Next, as described with reference to Figure 75 and Figure 76, when an abnormal state satisfying the abnormality condition occurs, the warning processing section 1137D requests the security controller 1137A (Figure 64) to perform warning processing for warning the user that an emergency has occurred. The warning processing performed by the security controller 1137A when it receives the warning-processing request will be described with reference to Figure 79A and Figure 79B and Figure 80A and Figure 80B.
When the security controller 1137A receives the warning-processing request, for example, the security controller 1137A requests the CPU 1129 to make the selector 1127 select the image data and the audio data output from the wireless interface 1137B.
Accordingly, the selector 1127 selects the image data and the audio data output from the wireless interface 1137B, and sends them to the NTSC encoder 1128 and the amplifier 1138. The image data sent to the NTSC encoder 1128 is sent to the CRT 11 and displayed thereon. The audio data sent to the amplifier 1138 is sent to the speaker units 12L and 12R and output therefrom.
As a result, the CRT 11 displays the image data sent from the video camera 1162 and received by the wireless interface 1137B, and the speaker units 12L and 12R output the audio data sent from the video camera 1162 and received by the wireless interface 1137B.
Specifically, for example, when the master unit 1 and the auxiliary units 2ᵢⱼ forming the scalable television system 1161 are outputting the image data and the audio data of a television program on a specific channel, as shown in Figure 79A, if the warning processing section 1157D of the auxiliary unit 2₁₃ outputs a warning-processing request to the security controller 1157A, the display on the CRT 31 of the auxiliary unit 2₁₃ is switched from the image data of the television program being received to the image data sent from the video camera 1162 that sends image data and audio data to the auxiliary unit 2₁₃, as shown in Figure 79B. In the embodiment shown in Figure 79A and Figure 79B, the display on the CRT 31 of the auxiliary unit 2₁₃ is switched from the image data of the television program to image data, sent from the video camera 1162, showing a person falling.
In addition, in this case, in the auxiliary unit 2₁₃, the audio data sent from the video camera 1162 is output from the speaker units 32L and 32R (Figure 65).
In this way, even while watching a television program, the user can immediately learn of an abnormality occurring in the situation being captured by the video camera 1162 that sends image data and audio data to the auxiliary unit 2₁₃.
Instead of the audio data sent from the video camera 1162, the speaker units 32L and 32R may output a predetermined warning sound (beep-beep-beep...), as shown in Figure 79B.
In these cases, the display on the CRT 31 of the auxiliary unit 2₁₃ is switched from the image data of the television program being received to the image data sent from the video camera 1162, as shown in Figure 79B. When the power of the auxiliary unit 2₁₃ is off, for example, the power of the auxiliary unit 2₁₃ can first be turned on, and then the image data sent from the video camera 1162 can be displayed on the CRT 31.
In the scalable television system 1161, the image data sent from the video camera 1162 capturing the scene of the abnormality is displayed, as described above, on the television receiver that has been receiving the image data and the audio data sent from that video camera 1162. In addition, as shown in Figure 80A and Figure 80B, the image data sent from the video camera 1162 may be displayed in full-screen mode.
Specifically, for example, when the master unit 1 and the auxiliary units 2ᵢⱼ forming the scalable television system 1161 are outputting the image data and the audio data of a television program on a specific channel, as shown in Figure 80A, if any television receiver forming the scalable television system 1161 outputs a warning-processing request, the displays of all the television receivers forming the scalable television system 1161 may be switched so that the image data sent from the video camera 1162 and received by the television receiver that output the warning-processing request is displayed in full-screen mode, as shown in Figure 80B.
Also in this case, the user can immediately learn of the abnormality.
When full-screen display is performed, since it is not known which video camera 1162 is capturing the scene of the abnormality, it is preferable to display a message or the like indicating the occurrence of the abnormality on all or part of the display screen of the television receiver, in the scalable television system 1161, that has been receiving the image data and the audio data sent from the video camera 1162 capturing the scene of the abnormality.
Full-screen display may be performed, for example, when an abnormality of a high emergency level occurs. Specifically, when an abnormality of a low emergency level occurs, only the television receiver that has been receiving the image data and the audio data sent from the video camera 1162 capturing the scene of the abnormality switches its display, as described with reference to Figure 79A and Figure 79B, and when an abnormality of a high emergency level occurs, the displays of all the television receivers forming the scalable television system 1161 are switched to realize full-screen display, as shown in Figure 80A and Figure 80B. Since the image is displayed conspicuously in full-screen display, the user learns that an abnormality of a high emergency level has occurred.
The size at which the scalable television system 1161 displays the image data sent from the video camera 1162 capturing the scene of the abnormality (called urgent image data, if necessary) may also be changed according to the emergency level of the abnormality. Specifically, when the emergency level is low, the urgent image data may be displayed at the size of the display screen of one television receiver forming the scalable television system 1161; when the emergency level is medium, at the size of the display screens of 2 × 2 adjacent television receivers; and when the emergency level is high, at the size of the display screens of 3 × 3 adjacent television receivers.
The emergency levels can be stored in the abnormality-condition storage section 1214 (Figure 75) together with the abnormality conditions. In this case, the emergency level can be identified according to which abnormality condition is satisfied.
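The mapping from emergency level to display size described above can be sketched as a small lookup. The string level names and the tuple return type are illustrative assumptions; the 1 × 1 / 2 × 2 / 3 × 3 screen spans are the ones given in the description.

```python
def display_span(emergency_level):
    """Map the emergency level to the grid of adjacent television receivers
    used to display the urgent image data."""
    if emergency_level == "low":
        return (1, 1)   # one receiver's display screen
    if emergency_level == "medium":
        return (2, 2)   # 2 x 2 adjacent receivers
    return (3, 3)       # high: 3 x 3 adjacent receivers
```

Since the emergency level is identified by which abnormality condition was satisfied, such a lookup would naturally sit alongside the abnormality-condition storage section 1214.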
Next, Figure 81 shows another example structure of a television receiver serving as the master unit 1 forming the scalable television system 1161. In the figure, parts corresponding to those shown in Figure 64 are assigned the same symbols as those used in Figure 64, and descriptions thereof are omitted where unnecessary. In other words, the master unit 1 shown in Figure 81 is constructed basically in the same manner as that shown in Figure 64, except that a warning display section 1139 is newly provided.
The warning display section 1139 lights up or blinks under the control of the CPU 1129.
The television receiver serving as the auxiliary unit 2 shown in Figure 65 can also be equipped with a warning display section in the same manner as the master unit 1 shown in Figure 81.
When the master unit 1 in the scalable television system 1161 is equipped with the warning display section 1139 and the auxiliary units 2 are also equipped with warning display sections, the warning display section 1139 can light up or blink when an abnormality occurs, as shown in Figure 82A and Figure 82B.
Specifically, for example, when the master unit 1 and the auxiliary units 2ᵢⱼ forming the scalable television system 1161 are outputting the image data and the audio data of a television program on a specific channel, as shown in Figure 82A, if the warning processing section 1137D of the master unit 1 outputs a warning-processing request to the security controller 1137A, the display on the CRT 11 of the master unit 1 is switched from the image data of the television program being received to the image data (urgent image data) sent from the video camera 1162 that sends image data and audio data to the master unit 1, as shown in Figure 82B.
In addition, the security controller 1137A requests the CPU 1129 to light up or blink the warning display section 1139, so that the warning display section 1139, which is usually off, lights up or blinks as shown in Figure 82B.
Also in this case, even while watching a television program, the user can immediately learn of an abnormality occurring in the situation being captured by the video camera 1162 that sends image data and audio data to the master unit 1.
When the emergency level is low, it is possible only to light up or blink the warning display section 1139, or only to output an alarm sound, without switching to the display of the urgent image data.
Next, Figure 83 shows another example structure of a television receiver serving as the master unit 1 forming the scalable television system 1161. In the figure, parts corresponding to those shown in Figure 64 are assigned the same symbols as those used in Figure 64, and descriptions thereof are omitted where unnecessary.
Specifically, the master unit 1 shown in Figure 64 is a television receiver for receiving digital broadcasts, whereas the master unit 1 shown in Figure 83 is a television receiver for receiving analog broadcasts.
A tuner 1221 detects and demodulates an analog television broadcast signal on a predetermined channel. The tuner 1221 sends the image data obtained by the demodulation to a Y/C separation section 1222, and sends the audio data to the selector 1127.
The Y/C separation section 1222 separates the luminance signal Y and the chrominance signals R-Y and B-Y from the output of the tuner 1221, and sends them to the selector 1127.
A matrix circuit 1223 converts the color space of the image data sent from the selector 1127, if necessary, and sends it to the CRT 11.
A television receiver for receiving analog broadcasts, constructed in the foregoing manner, can also be used to form the scalable television system 1161.
The television receiver serving as the auxiliary unit 2 can also be constructed as a television receiver for receiving analog broadcasts, in the same manner as the television receiver serving as the master unit 1 shown in Figure 83.
The above-described processing sequences can be implemented by hardware or by software. When the processing sequences are implemented by software, a program forming the software is installed in a general-purpose computer or the like.
Figure 84 shows an example structure of a computer, according to an embodiment, in which the program for executing the above-described processing sequences is installed.
The program can be recorded in advance in a hard disk 305 or a ROM 303 serving as a recording medium built into the computer.
Alternatively, the program can be temporarily or permanently recorded in a removable recording medium 311, such as a floppy disk, a CD-ROM (compact disc read-only memory), an MO (magneto-optical) disc, a DVD (digital versatile disc), a magnetic disk, or a semiconductor memory. Such a removable recording medium 311 can be provided as so-called packaged software.
The program can be installed in the computer from the removable recording medium 311 as described above. In addition, the program can be transferred to the computer wirelessly from a download site via an artificial satellite for digital satellite broadcasting, or transferred to the computer by wire through a network such as a LAN (local area network) or the Internet; the program transferred in such a way can be received by a communication section 308 and installed in the hard disk 305 built into the computer.
The computer includes a CPU (central processing unit) 302. The CPU 302 is connected to an input/output interface 310 through a bus 301. When the user operates an input section 307, formed of a keyboard, a mouse, a microphone, and the like, to input a command through the input/output interface 310, the CPU 302 executes a program stored in a ROM (read-only memory) 303 according to the command. Alternatively, the CPU 302 loads into a RAM (random access memory) 304 and executes a program stored in the hard disk 305; a program transferred via a satellite or a network, received by the communication section 308, and installed in the hard disk 305; or a program read from the removable recording medium 311 and installed in the hard disk 305. In this way, the CPU 302 performs the processing according to the above-described flowcharts or the processing performed by the structures shown in the above-described block diagrams. Then, as required, the CPU 302, for example, outputs the processing result through the input/output interface 310 from an output section 306 formed of an LCD (liquid crystal display), a speaker, and the like; transmits the processing result from the communication section 308; or records the processing result in the hard disk 305.
In this specification, the processing steps describing the program for making the computer perform various kinds of processing are not necessarily processed in a time-sequential manner in the order described in the flowcharts, and include processing executed in parallel or individually (such as parallel processing or processing by objects).
The program may be processed by one computer, or may be processed by a plurality of computers in a distributed manner. In addition, the program may be transferred to a remote computer and executed there.
The selling price of a television receiver forming the scalable television system can be varied, for example, according to whether the television receiver serves as a master unit or an auxiliary unit and, when it serves as an auxiliary unit, according to how many auxiliary units have already been used before that auxiliary unit is used.
Specifically, since the scalable television system cannot provide the full-screen display function without the master unit, as described above, the master unit has a high value, and the selling price of the master unit can be set high.
It can be expected that the user will additionally buy a plurality of auxiliary units as necessary after buying the master unit. The selling prices of the first several auxiliary units can be set, for example, lower than the selling price of the master unit but higher than the price of an ordinary television receiver. The selling prices of auxiliary units bought later can be set still lower.
A television receiver serving as a master unit in the scalable television system can be constructed, for example, by adding the security system section 1137 to an ordinary digital television receiver and changing the program executed by the CPU 1129. Therefore, since a television receiver serving as a master unit in the scalable television system can be made relatively easily from an ordinary digital television receiver, it is highly cost-effective in view of the full-screen display function and the warning function described above. The same applies to a television receiver serving as an auxiliary unit.
The present invention can be applied not only to television receivers, which are display apparatuses with built-in tuners, but also to display apparatuses without tuners, which output images and sound sent from the outside.
In addition, in the security system shown in Figure 66, the image data and the audio data can be sent from the video camera 1162 to the television receivers forming the scalable television system 1161 not wirelessly but by wire (by IEEE-1394 communication or USB (universal serial bus) communication, for example).
As the video camera 1162, besides a video camera prepared for the security system, a video camera installed for a so-called door phone (video intercom) or a surveillance camera, for example, can also be used.
In addition, the security system can be used not only for monitoring a baby or an elderly person, but also for monitoring the amount of hot water supplied to a bathtub, or for monitoring water boiling in a kettle and sounding an alarm when the water boils.
In the present embodiment, the processing is applied to image data captured by an ordinary video camera 1162. A thermographic camera, for example, may be used as the video camera 1162, and the processing may be applied to image data representing a temperature distribution, obtained from that camera.
In addition, a temperature change can be monitored by sensing infrared rays, and a warning can be issued accordingly.
Industrial applicability
As described above, according to the present invention, when a large number of display apparatuses are connected and used, a higher function is implemented than when a display apparatus is used alone.
In addition, according to the present invention, the user can easily learn of an abnormality (emergency) through an existing television receiver or the like, so that existing television receivers can be used effectively.

Claims (4)

1. A display apparatus connected to another display apparatus, characterized by comprising:
storage means for storing one scene of image data included in input data that include at least image data, according to authentication with the other display apparatus;
display means for displaying the image data stored in the storage means; and
authentication means for performing authentication with the other display apparatus by sending an ID and function information of the display apparatus to the other display apparatus and receiving an ID and function information of the other display apparatus,
and further characterized in that, when the authentication is successfully performed, the storage means stores one scene of the image data included in the input data under the control of the other display apparatus, and the display means displays the image data stored in the storage means.
2. The display apparatus according to claim 1, characterized by further comprising:
communication means for communicating with the other display apparatus; and
receiving means for receiving the input data,
and characterized in that, when the communication means receives from the other display apparatus a command instructing display of one scene of the image data included in the input data,
the receiving means receives the input data;
the storage means stores one scene of the image data included in the input data received by the receiving means; and
the display means displays the image data stored in the storage means.
3. The display apparatus according to claim 1, characterized by further comprising:
communication means for communicating with the other display apparatus,
and characterized in that, when the communication means receives one scene of the image data included in the input data,
the storage means stores the one scene of the image data included in the input data received by the communication means; and
the display means displays the image data stored in the storage means.
4. A control method for a display apparatus, wherein the display apparatus is connected to another display apparatus, the control method being characterized by comprising:
a storing step of storing, in accordance with authentication with the other display apparatus, one scene of image data included in input data that includes at least image data;
a display step of displaying the image data stored in the storing step; and
an authentication step of performing authentication with the other display apparatus by sending an ID and function information of the display apparatus to the other display apparatus and receiving an ID and function information of the other display apparatus,
and further characterized in that, when the authentication succeeds, one scene of the image data included in the input data is stored under the control of the other display apparatus, and the image data stored in the storing step is displayed.
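The authentication recited in claims 1 and 4 amounts to a mutual exchange: each apparatus sends its ID and function information and evaluates what it receives from the peer. The claims do not specify a concrete protocol or data format, so the following is only an illustrative sketch; the names (`DeviceInfo`, `authenticate`, `known_ids`, the particular function strings) are hypothetical and not taken from the patent.

```python
# Illustrative sketch of the claimed authentication exchange between a main
# unit and a subsidiary unit. All identifiers here are hypothetical; the
# patent does not define a concrete message format or acceptance rule.
from dataclasses import dataclass, field


@dataclass
class DeviceInfo:
    device_id: str                       # ID of the display apparatus
    functions: set[str] = field(default_factory=set)  # "function information"


def authenticate(local: DeviceInfo, remote: DeviceInfo,
                 known_ids: set[str]) -> bool:
    """Accept the peer only if its ID is recognized and its function
    information shows it can store and display a scene, as the claims
    require of the subsidiary unit."""
    required = {"store_scene", "display"}
    return remote.device_id in known_ids and required <= remote.functions


main_unit = DeviceInfo("main-1", {"store_scene", "display", "detect_scene_change"})
subsidiary = DeviceInfo("sub-11", {"store_scene", "display"})

# Each side sends its own DeviceInfo and evaluates the one it receives.
print(authenticate(main_unit, subsidiary, known_ids={"sub-11"}))  # True
```

Only after this check succeeds would the subsidiary unit, under the control of the main unit, store and display one scene of the image data, which matches the "when the authentication succeeds" condition in the claims.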
CN 200610092547 2001-12-28 2002-12-12 Display apparatus and control method Expired - Fee Related CN1921632B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP399865/01 2001-12-28
JP399867/01 2001-12-28
JP2001399867A JP2003199092A (en) 2001-12-28 2001-12-28 Display device and method for controlling, program and recording medium as well as display system
JP2001399865A JP3624883B2 (en) 2001-12-28 2001-12-28 Display control apparatus, display control method, recording medium, program, and display system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN028074300A Division CN1500345B (en) 2001-12-28 2002-12-12 Display apparatus and control method

Publications (2)

Publication Number Publication Date
CN1921632A CN1921632A (en) 2007-02-28
CN1921632B true CN1921632B (en) 2013-05-08

Family

ID=27604720

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200610092547 Expired - Fee Related CN1921632B (en) 2001-12-28 2002-12-12 Display apparatus and control method

Country Status (2)

Country Link
JP (1) JP3624883B2 (en)
CN (1) CN1921632B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4736659B2 (en) 2005-09-15 2011-07-27 ソニー株式会社 Multi-screen television receiver remote control system, remote controller and operation method, multi-screen television receiver and operation method, recording medium, and program
CN102724424B (en) * 2011-11-29 2017-09-12 新奥特(北京)视频技术有限公司 A kind of method that teletext packaging scene switching is carried out using data file

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0749243A1 (en) * 1995-06-13 1996-12-18 Matsushita Electric Industrial Co., Ltd. Video monitor system
JP2000324465A (en) * 1999-05-13 2000-11-24 Sony Corp Pay tv broadcasting reception system, tv receiver and pay tv broadcasting reception method


Also Published As

Publication number Publication date
CN1921632A (en) 2007-02-28
JP3624883B2 (en) 2005-03-02
JP2003198972A (en) 2003-07-11

Similar Documents

Publication Publication Date Title
CN1500345B (en) Display apparatus and control method
US7305680B2 (en) Listening module for asynchronous messages sent between electronic devices of a distributed network
US6907301B2 (en) Method and system for selecting and controlling devices in a home network
KR101557176B1 (en) Touch sensitive video signal display for a programmable multimedia controller
US7017171B1 (en) System and method for interfacing multiple electronic devices
US6844886B1 (en) Network control system
EP0957636B1 (en) On-screen electronic resources guide
CN100390767C (en) Browser based command and control home network
US20050235329A1 (en) Systems and methods for integrated control within a home entertainment system
KR101168248B1 (en) Method for providing a user interface for controlling an appliance in a network of distributed stations, as well as a network appliance for carrying out the method
US6785720B1 (en) Method for connecting to server devices in browser-based home network apparatus therefor
US20030106062A1 (en) Home network environment as a state machine
KR101508401B1 (en) Systems and methods for activity-based control of consumer electronics
CN1921632B (en) Display apparatus and control method
JP2002007240A (en) Information processor, electrical product management system, and information control method
JP3969401B2 (en) Display device and method, and program
JP4618319B2 (en) Display control device and method, recording medium, program, display device, and display system
JP4602027B2 (en) Display control apparatus and method, display apparatus, and display system
JP2000333269A (en) Gui (graphical user interface) driven type network control system and device and controller used for the gui driven type network control system
JP2003199092A (en) Display device and method for controlling, program and recording medium as well as display system
JP4618258B2 (en) Display control apparatus and method, program
EP1345424B1 (en) Method for controlling a peripheral consumer electronic device
JP2007159158A (en) Display system
JP3606378B2 (en) Video wireless transmission / reception system
JP2005020104A (en) Display, server apparatus, controller, communication system equipped with these, communication method, program, and recording medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130508

Termination date: 20191212