CN1921632A - Display apparatus and control method - Google Patents

Display apparatus and control method

Info

Publication number
CN1921632A
CN1921632A, CNA2006100925474A, CN200610092547A
Authority
CN
China
Prior art keywords
unit
subsidiary
data
frame
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2006100925474A
Other languages
Chinese (zh)
Other versions
CN1921632B (en)
Inventor
Tetsujiro Kondo (近藤哲二郎)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2001399867A (JP2003199092A)
Application filed by Sony Corp
Publication of CN1921632A
Application granted
Publication of CN1921632B
Anticipated expiration
Expired - Fee Related (current legal status)

Landscapes

  • Controls And Circuits For Display Device (AREA)
  • Television Receiver Circuits (AREA)

Abstract

When a large number of television receivers are connected and used together, higher functions are realized than when a single television receiver is used alone. A main unit (television receiver) 1 displays a television broadcast program and also detects its features. The main unit 1 detects a scene change from the features of the television broadcast program and controls a subsidiary unit (television receiver) 2_11 so that it displays the image data of the frame immediately after the scene change. The subsidiary unit 2_11 displays that frame. When another scene change occurs in the image data displayed on the main unit 1, the image data of the frame immediately after the new scene change is displayed on the subsidiary unit 2_11 in place of the image data displayed until then. The present invention can be applied, for example, to television receivers.

Description

Display apparatus and control method
This application is a divisional of the invention patent application entitled "Display apparatus and control method", filed on December 12, 2002 under application number 02807430.0.
Technical field
The present invention relates to a display apparatus and a control method, and more particularly to a display apparatus and a control method that, for example, provide higher functions when a large number of display apparatuses are connected and used together than when a display apparatus is used alone, and that allow existing television receivers to be used effectively.
Background art
A television receiver, for example, receives a television broadcast signal, displays the image of a television program, and outputs the sound accompanying the image.
Conventional television receivers are manufactured to work on their own. Therefore, when a user buys a new television receiver, the old television receiver is no longer needed and, even though it is still usable, it is discarded in many cases.
If connecting a large number of television receivers provided higher functions than a single television receiver, usable television receivers would not be discarded, which would contribute to the effective use of resources.
For example, in a household with a baby, a small child, or an elderly person, it is difficult to watch over them at all times, and usually a caregiver in the household looks in on them regularly or irregularly.
In such cases, however, the caregiver's workload is heavy, and means for reducing the caregiver's workload are needed.
As such means, for example, a monitoring system can be used in which the room where the baby, child, or elderly person stays is captured by a camera and the image is displayed on a display unit in another room.
In a conventional monitoring system, however, the image sent from the camera still has to be checked regularly or occasionally, so although the burden placed on the caregiver is reduced to some extent, it remains considerable.
Television receivers are used in many rooms. When a monitoring system is used, a display unit for the system has to be installed in addition to the television receiver already in use, which reduces the space in the room.
The image sent from the camera of the monitoring system could be displayed on the television receiver already used in the room.
However, while the television receiver in the room is displaying the image sent from the camera of the monitoring system, the caregiver cannot watch television programs on that television receiver.
The caregiver could also switch between the television program and the image sent from the camera of the monitoring system. However, this switching operation is troublesome; moreover, the caregiver may forget to switch and, in the worst case, may fail to notice an emergency.
Summary of the invention
The present invention has been made in view of the foregoing. An object of the present invention is to provide higher functions when a large number of display apparatuses, such as television receivers, are connected and used together than when a display apparatus is used alone, and to allow the user to easily recognize an emergency, thereby making effective use of existing television receivers.
A first display apparatus of the present invention is characterized by comprising: feature detecting means for detecting a feature of input data that includes at least image data; and control means for performing control, according to the result of the feature detection performed by the feature detecting means, such that an image included in the input data is displayed on at least one of display means and another display apparatus.
A first control method of the present invention is characterized by comprising: a feature detecting step of detecting a feature of input data that includes at least image data; and a control step of performing control, according to the result of the feature detection performed in the feature detecting step, such that an image included in the input data is displayed on at least one of display means and another display apparatus.
A first program of the present invention is characterized by comprising: a feature detecting step of detecting a feature of input data that includes at least image data; and a control step of performing control, according to the result of the feature detection performed in the feature detecting step, such that an image included in the input data is displayed on at least one of display means and another display apparatus.
A first storage medium of the present invention is characterized by storing a program comprising: a feature detecting step of detecting a feature of input data that includes at least image data; and a control step of performing control, according to the result of the feature detection performed in the feature detecting step, such that an image included in the input data is displayed on at least one of display means and another display apparatus.
A second display apparatus of the present invention is characterized by comprising: storage means for storing, under the control of another display apparatus, one scene of the image data included in input data that includes at least image data; and display means for displaying the image data stored in the storage means.
A second control method of the present invention is characterized by comprising: a storage step of storing, under the control of another display apparatus, one scene of the image data included in input data that includes at least image data; and a display step of displaying the image data stored in the storage step.
A second program of the present invention is characterized by comprising: a storage step of storing, under the control of another display apparatus, one scene of the image data included in input data that includes at least image data; and a display step of displaying the image data stored in the storage step.
A second storage medium of the present invention is characterized by storing a program comprising: a storage step of storing, under the control of another display apparatus, one scene of the image data included in input data that includes at least image data; and a display step of displaying the image data stored in the storage step.
A display system of the present invention is characterized in that a first display apparatus comprises: first display means for displaying image data; feature detecting means for detecting a feature of input data that includes at least image data; and control means for performing control, according to the result of the feature detection of the input data, such that image data included in the input data is displayed on at least one of the first display apparatus and one or more second display apparatuses, and each of the one or more second display apparatuses comprises: storage means for storing, under the control of the first display apparatus, one scene of the image data included in the input data; and second display means for displaying the image data stored in the storage means.
In the first display apparatus, the first control method, the first program, and the first storage medium of the present invention, a feature of input data that includes at least image data is detected, and control is performed according to the result of the feature detection such that an image included in the input data is displayed on at least one of display means and another display apparatus.
In the second display apparatus, the second control method, the second program, and the second storage medium of the present invention, one scene of the image data included in input data that includes at least image data is stored under the control of another display apparatus, and the stored image data is displayed.
In the display system of the present invention, in the first display apparatus, a feature of input data that includes at least image data is detected, and control is performed according to the detection result such that image data included in the input data is displayed on at least one of the first display apparatus and the one or more second display apparatuses. In each of the one or more second display apparatuses, one scene of the image data included in the input data is stored under the control of the first display apparatus, and the stored image data is displayed.
According to one aspect of the present invention, there is provided a display apparatus connected to another display apparatus, comprising: storage means for storing, under the control of the other display apparatus, one scene of the image data included in input data that includes at least image data; display means for displaying the image data stored in the storage means; and authentication means for performing authentication with the other display apparatus, wherein, when the authentication succeeds, the storage means stores one scene of the image data included in the input data under the control of the other display apparatus, and the display means displays the image data stored in the storage means.
According to another aspect of the present invention, there is provided a control method for a display apparatus connected to another display apparatus, the control method comprising: a storage step of storing, under the control of the other display apparatus, one scene of the image data included in input data that includes at least image data; a display step of displaying the image data stored in the storage step; and an authentication step of performing authentication with the other display apparatus, wherein, when the authentication succeeds, one scene of the image data included in the input data is stored under the control of the other display apparatus, and the image data stored in the storage step is displayed.
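The behavior summarized above can be pictured with the following minimal sketch in Python, given for illustration only: the class names, the frame-difference feature, the threshold, and the shared-secret check are all assumptions of this sketch and are not taken from the patent. A first display apparatus detects a feature of the input image data and, after successful authentication, has a second display apparatus store and display one scene.

```python
# Hypothetical sketch of the summarized behaviour; names, the authentication
# scheme, and the scene-change threshold are assumptions, not from the patent.

class SubsidiaryUnit:
    """Second display apparatus: stores one scene under external control."""
    def __init__(self, shared_secret):
        self.shared_secret = shared_secret
        self.authenticated = False
        self.stored_frame = None

    def authenticate(self, response):
        # Toy authentication step (a real unit would run a proper protocol).
        self.authenticated = (response == self.shared_secret)
        return self.authenticated

    def store_and_display(self, frame):
        if not self.authenticated:
            return                         # control accepted only after authentication
        self.stored_frame = frame          # storage means
        print("subsidiary displays:", frame)  # display means


class MainUnit:
    """First display apparatus: detects a feature and controls a subsidiary."""
    def __init__(self, subsidiary, shared_secret, threshold=50):
        self.subsidiary = subsidiary
        self.shared_secret = shared_secret
        self.threshold = threshold
        self.prev_frame = None

    def scene_changed(self, frame):
        # Feature detection: mean absolute difference between successive frames.
        if self.prev_frame is None:
            return True
        diff = sum(abs(a - b) for a, b in zip(frame, self.prev_frame)) / len(frame)
        return diff > self.threshold

    def process(self, frame):
        print("main displays:", frame)
        if self.scene_changed(frame):
            # Control: have the subsidiary keep the frame right after the change.
            self.subsidiary.store_and_display(frame)
        self.prev_frame = frame


if __name__ == "__main__":
    sub = SubsidiaryUnit(shared_secret="scalable-tv")
    main = MainUnit(sub, shared_secret="scalable-tv")
    sub.authenticate(main.shared_secret)                     # authentication succeeds
    for f in ([10, 10, 10], [12, 11, 10], [200, 190, 180]):  # toy "frames"
        main.process(f)
```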
Description of drawings
Figure 1A is a perspective view showing an example structure of an embodiment of a scalable television system to which the present invention is applied;
Figure 1B is a perspective view showing an example structure of an embodiment of a scalable television system to which the present invention is applied;
Figure 2 is a perspective view showing an example external structure of the main unit 1;
Figure 3A is a front view showing the example external structure of the main unit 1;
Figure 3B is a top view showing the example external structure of the main unit 1;
Figure 3C is a right-side view showing the example external structure of the main unit 1;
Figure 3D is a left-side view showing the example external structure of the main unit 1;
Figure 3E is a bottom view showing the example external structure of the main unit 1;
Figure 3F is a rear view showing the example external structure of the main unit 1;
Figure 4 is a perspective view showing an example external structure of a subsidiary unit 2;
Figure 5A is a front view showing the example external structure of the subsidiary unit 2;
Figure 5B is a top view showing the example external structure of the subsidiary unit 2;
Figure 5C is a right-side view showing the example external structure of the subsidiary unit 2;
Figure 5D is a left-side view showing the example external structure of the subsidiary unit 2;
Figure 5E is a bottom view showing the example external structure of the subsidiary unit 2;
Figure 5F is a rear view showing the example external structure of the subsidiary unit 2;
Figure 6 is a perspective view showing an example external structure of a special-purpose rack for housing the main unit 1 and the subsidiary units 2 that constitute the scalable television system;
Figure 7 is a plan view showing an example external structure of the controller 15;
Figure 8 is a plan view showing an example external structure of the controller 35;
Figure 9 is a plan view showing another example external structure of the controller 15;
Figure 10 is a block diagram showing an example electrical structure of the main unit 1;
Figure 11 is a block diagram showing an example electrical structure of a subsidiary unit 2;
Figure 12 is a diagram showing the layer structure of the IEEE-1394 communication protocol;
Figure 13 is a diagram showing the address space of the CSR architecture;
Figure 14 is a diagram showing the offset addresses, names, and functions of CSRs;
Figure 15 is a diagram showing the general ROM format;
Figure 16 is a diagram showing details of a bus info block, a root directory, and a unit directory;
Figure 17 is a diagram showing the structure of a PCR;
Figure 18A is a diagram showing the structure of an oMPR;
Figure 18B is a diagram showing the structure of an oPCR;
Figure 18C is a diagram showing the structure of an iMPR;
Figure 18D is a diagram showing the structure of an iPCR;
Figure 19 is a diagram showing the data structure of a packet transmitted in the asynchronous transfer mode for AV/C commands;
Figure 20A is a diagram showing the command type/response (ctype/response) field of AV/C commands;
Figure 20B is a diagram showing the subunit type field of AV/C commands;
Figure 20C is a diagram showing the opcode field of AV/C commands;
Figure 21A is a diagram showing an example specific AV/C command;
Figure 21B is a diagram showing an example specific AV/C response;
Figure 22 is a block diagram showing an example electrical structure of the controller 15;
Figure 23 is a diagram showing the format of the frame data transmitted and received by the controller 15;
Figure 24 is a block diagram showing an example electrical structure of the controller 35;
Figure 25 is a block diagram showing an example electrical structure of the IR interface 135;
Figure 26 is a flowchart illustrating the processing of the controller 15;
Figure 27 is a flowchart illustrating the processing of the IR interface 135;
Figure 28 is a flowchart illustrating the processing of the main unit 1;
Figure 29 is a flowchart illustrating the authentication processing of the main unit 1;
Figure 30 is a flowchart illustrating the processing of a subsidiary unit 2;
Figure 31 is a flowchart illustrating the authentication processing of a subsidiary unit 2;
Figure 32 is a block diagram showing a first example structure of the signal processing unit 127;
Figure 33 is a flowchart illustrating first virtual multi-viewpoint display processing performed by the main unit with the signal processing unit 127;
Figure 34 is a block diagram showing a first example structure of the signal processing unit 147;
Figure 35 is a flowchart illustrating first virtual multi-viewpoint display processing performed by a subsidiary unit with the signal processing unit 147;
Figure 36A is a diagram showing an example display on the main unit 1 and the subsidiary units 2 that constitute the scalable television system;
Figure 36B is a diagram showing an example display on the main unit 1 and the subsidiary units 2 that constitute the scalable television system;
Figure 36C is a diagram showing an example display on the main unit 1 and the subsidiary units 2 that constitute the scalable television system;
Figure 37A is a diagram showing an example display on the main unit 1 and the subsidiary units 2 that constitute the scalable television system;
Figure 37B is a diagram showing an example display on the main unit 1 and the subsidiary units 2 that constitute the scalable television system;
Figure 37C is a diagram showing an example display on the main unit 1 and the subsidiary units 2 that constitute the scalable television system;
Figure 38 is a block diagram showing a second example structure of the signal processing unit 127;
Figure 39 is a flowchart illustrating second virtual multi-viewpoint display processing performed by the main unit with the signal processing unit 127;
Figure 40 is a block diagram showing a third example structure of the signal processing unit 127;
Figure 41 is a flowchart illustrating third virtual multi-viewpoint display processing performed by the main unit with the signal processing unit 127;
Figure 42 is a block diagram showing a second example structure of the signal processing unit 147;
Figure 43 is a flowchart illustrating second virtual multi-viewpoint display processing performed by a subsidiary unit with the signal processing unit 147;
Figure 44 is a block diagram showing a fourth example structure of the signal processing unit 127;
Figure 45 is a flowchart illustrating fourth virtual multi-viewpoint display processing performed by the main unit with the signal processing unit 127;
Figure 46 is a diagram showing command transfer from the main unit 1 to a subsidiary unit 2 by infrared communication;
Figure 47 is a flowchart illustrating the processing performed by the controller 15 when a command is transmitted from the main unit 1 to a subsidiary unit 2;
Figure 48 is a block diagram showing a fifth example structure of the signal processing unit 127;
Figure 49 is a flowchart illustrating fifth virtual multi-viewpoint display processing performed by the main unit with the signal processing unit 127;
Figure 50 is a block diagram showing a sixth example structure of the signal processing unit 127;
Figure 51 is a flowchart illustrating sixth virtual multi-viewpoint display processing performed by the main unit with the signal processing unit 127;
Figure 52 A illustrates composition can expand the master unit 1 of television system and the figure that the example in the auxilliary unit 2 shows;
Figure 52 B illustrates composition can expand the master unit 1 of television system and the figure that the example in the auxilliary unit 2 shows;
Figure 52 C illustrates composition can expand the master unit 1 of television system and the figure that the example in the auxilliary unit 2 shows;
Figure 53 illustrates composition can expand the master unit 1 of television system and the figure that the example in the auxilliary unit 2 shows;
Figure 54 illustrates composition can expand the master unit 1 of television system and the figure that the example in the auxilliary unit 2 shows;
Figure 55 is the block diagram that the 7th example structure of signal processing 127 is shown;
Figure 56 is explanation is shown processing by the 7th virtual many viewpoints of the master unit of signal processing unit 127 execution a flow chart;
Figure 57 is the block diagram that the 8th example structure of signal processing 127 is shown;
Figure 58 is explanation is shown processing by the 8th virtual many viewpoints of the master unit of signal processing unit 127 execution a flow chart;
Figure 59 is the block diagram that the 9th example structure of signal processing 127 is shown;
Figure 60 is explanation is shown processing by the 9th virtual many viewpoints of the master unit of signal processing unit 127 execution a flow chart;
Figure 61 A illustrates composition can expand the master unit 1 of television system and the figure that the example in the auxilliary unit 2 shows;
Figure 61 B illustrates composition can expand the master unit 1 of television system and the figure that the example in the auxilliary unit 2 shows;
Figure 61 C illustrates composition can expand the master unit 1 of television system and the figure that the example in the auxilliary unit 2 shows;
Figure 62 is the block diagram that another example electrical structure of master unit 1 is shown;
Figure 63 A is the perspective view that the example structure of using another embodiment that expands television system of the present invention is shown;
Figure 63 B is the perspective view that the example structure of using another embodiment that expands television system of the present invention is shown;
Figure 64 is the block diagram that the example electrical structure of master unit 1 is shown;
Figure 65 is the block diagram that the example electrical structure of auxilliary unit 2 is shown;
Figure 66 is the block diagram that the example structure of an embodiment who uses safety system of the present invention is shown;
Figure 67 is the block diagram that the example structure of video camera 1162 is shown;
Figure 68 is the flow chart of the processing of explanation master unit 1;
Figure 69 is the flow chart of the authentication processing of explanation master unit 1;
Figure 70 is the flow chart of the processing of the auxilliary unit 2 of explanation;
Figure 71 is the flow chart of the authentication processing of the auxilliary unit 2 of explanation;
Figure 72 A illustrates the figure that the example in the master unit 1 that can expand television system 1161 shows;
Figure 72 B is the figure that the example full screen display that can expand in the television system 1161 is shown;
Figure 73 is the block diagram that the example structure of data processing section 1137C is shown;
Figure 74 A is the flow chart that the view data of explanation data processing section 1137C is handled;
Figure 74 B is the flow chart that the voice data of explanation data processing section 1137C is handled;
Figure 75 is the block diagram that the example structure of warning processing section 1137D is shown;
Figure 76 is the flow chart that the unusual thing judgement/warning of explanation warning processing section 1137D is handled;
Figure 77 A is the figure that the specification exception thing is judged the processing of part 1213;
Figure 77 B is the figure that the specification exception thing is judged the processing of part 1213;
Figure 77 C is the figure that the specification exception thing is judged the processing of part 1213;
Figure 77 D is the figure that the specification exception thing is judged the processing of part 1213;
Figure 78 A is the figure that the specification exception thing is judged the processing of part 1213;
Figure 78 B is the figure that the specification exception thing is judged the processing of part 1213;
Figure 78 C is the figure that the specification exception thing is judged the processing of part 1213;
Figure 78 D is the figure that the specification exception thing is judged the processing of part 1213;
Figure 78 E is the figure that the specification exception thing is judged the processing of part 1213;
Figure 78 F is the figure that the specification exception thing is judged the processing of part 1213;
Figure 78 G is the figure that the specification exception thing is judged the processing of part 1213;
Figure 78 H is the figure that the specification exception thing is judged the processing of part 1213;
Figure 78 I is the figure that the specification exception thing is judged the processing of part 1213;
Figure 79 A is the figure that the warning of explanation safety governor 1137A is handled;
Figure 79 B is the figure that the warning of explanation safety governor 1137A is handled;
Figure 80 A is the figure that the warning of explanation safety governor 1137A is handled;
Figure 80 B is the figure that the warning of explanation safety governor 1137A is handled;
Figure 81 is the block diagram that another example electrical structure of master unit 1 is shown;
Figure 82 A is the figure that the warning of explanation safety governor 1137A is handled;
Figure 82 B is the figure that the warning of explanation safety governor 1137A is handled;
Figure 83 is the block diagram that another example electrical structure of master unit 1 is shown;
Figure 84 is the block diagram that the example structure of an embodiment who uses computer of the present invention is shown.
Embodiment
Figure 1A and Figure 1B are perspective views showing an example structure of an embodiment of a scalable television (TV) system to which the present invention is applied (the term "system" here means an entity in which a plurality of apparatuses are logically assembled, regardless of whether the apparatuses are housed in the same cabinet).
In the embodiment shown in Figure 1A, the scalable television system is formed by nine television receivers 1, 2_11, 2_12, 2_13, 2_21, 2_23, 2_31, 2_32, and 2_33. In the embodiment shown in Figure 1B, the scalable television system is formed by 25 television receivers 1, 2_11, 2_12, 2_13, 2_14, 2_15, 2_21, 2_22, 2_23, 2_24, 2_25, 2_31, 2_32, 2_34, 2_35, 2_41, 2_42, 2_43, 2_44, 2_45, 2_51, 2_52, 2_53, 2_54, and 2_55.
The number of television receivers constituting the scalable television system is not limited to nine or 25; in other words, the scalable television system can be formed by any plural number of television receivers. The arrangement of the television receivers constituting the scalable television system is likewise not limited to the 3×3 or 5×5 horizontal-by-vertical arrangements shown in Figure 1A and Figure 1B; other horizontal-by-vertical arrangements, such as 1×2, 2×1, or 2×3, may be used. Furthermore, the arrangement of the television receivers constituting the scalable television system is not limited to the matrix shape shown in Figures 1A and 1B, and may be, for example, a pyramid shape.
Since the scalable television system can thus be constituted by any plural number of television receivers arranged in any numbers in the horizontal and vertical directions, it can be called a "scalable" system.
Among the television receivers constituting the scalable television system, there are two types: main television receivers (hereinafter called main units where appropriate), which can control other television receivers; and subsidiary television receivers (hereinafter called subsidiary units where appropriate), which can be controlled by another television receiver but cannot control other television receivers.
To perform the various types of processing described later, the scalable television system must be formed of television receivers compatible with the scalable television system (hereinafter called scalable-compatible units), and at least one of them must be a main unit. Therefore, in each of the embodiments shown in Figure 1A and Figure 1B, among the television receivers constituting the scalable television system, the television receiver located at the center, for example, serves as the main unit 1.
For the above reasons, when a television receiver that is not a scalable-compatible unit is included among the television receivers constituting the scalable television system, that television receiver cannot use the functions of the scalable television system. In addition, even when all the television receivers constituting the scalable television system are scalable-compatible units, if all of them are subsidiary units, they cannot use the functions of the scalable television system.
Therefore, to use the functions of the scalable television system, the user needs to purchase at least one or more main units, or one main unit and one or more subsidiary units.
Since a main unit can also serve as a subsidiary unit, a plurality of main units may be used among the television receivers constituting the scalable television system.
In the embodiment shown in Figure 1A, the television receiver 1 located at the center of the 3×3 television receivers (second from the left and second from the top) serves as the main unit, and the other eight television receivers 2_11, 2_12, 2_13, 2_21, 2_23, 2_31, 2_32, and 2_33 serve as subsidiary units. In the embodiment shown in Figure 1B, the television receiver 1 located at the center of the 5×5 television receivers (third from the left and third from the top) serves as the main unit, and the other 24 television receivers 2_11, 2_12, 2_13, 2_14, 2_15, 2_21, 2_22, 2_23, 2_24, 2_25, 2_31, 2_32, 2_34, 2_35, 2_41, 2_42, 2_43, 2_44, 2_45, 2_51, 2_52, 2_53, 2_54, and 2_55 serve as subsidiary units.
Therefore, in each of the embodiments shown in Figure 1A and Figure 1B, the main unit 1 is located at the center of the television receivers constituting the scalable television system. The position of the main unit 1, however, is not limited to the center; the main unit 1 may be placed at any position, such as the upper-left corner or the lower-right corner.
Even when the main unit 1 is placed at an arbitrary position in the scalable television system, each process described later is performed in the same manner, with the television receiver located at the center regarded as the main unit.
To simplify the following description, it is assumed that the scalable television system is formed by the 3×3 television receivers shown in Figure 1A and that the main unit 1 is located at the center of the television receivers constituting the scalable television system.
The suffixes i and j of a subsidiary unit 2_ij constituting the scalable television system indicate that the subsidiary unit 2_ij is located in the i-th row and the j-th column of the scalable television system (the i-th row from the top and the j-th column from the left).
Hereinafter, when there is no particular need to distinguish the subsidiary units 2_ij from one another, they are collectively referred to as the subsidiary units 2.
Figure 2 is a perspective view showing an example structure of the television receiver serving as the main unit 1.
The main unit 1 is a television receiver with a display screen size of, for example, 14 inches or 15 inches. A CRT (cathode ray tube) 11 for displaying images is located at the center of its front face, and speaker units 12L and 12R for outputting sound are located at the left-hand and right-hand ends of the front face.
The image included in a television broadcast signal received by an antenna, not shown, is displayed on the CRT 11, and the L (left) channel sound and R (right) channel sound accompanying the image are output from the speaker units 12L and 12R.
The main unit 1 is associated with a remote commander (hereinafter called the controller where appropriate) 15 that emits infrared rays IR. By operating the controller 15, the user can change the reception channel and the volume and can send various commands to the main unit 1.
The controller 15 is not limited to a remote commander that performs infrared communication; a remote commander that performs radio communication, such as Bluetooth (trademark), may also be used.
The controller 15 can control not only the main unit 1 but also the subsidiary units 2.
Next, Figures 3A to 3F show six faces of the example structure of the main unit 1 shown in Figure 2.
Figure 3A shows the front face of the main unit 1, Figure 3B shows its top face, Figure 3C shows its bottom face, Figure 3D shows its left-hand side face, Figure 3E shows its right-hand side face, and Figure 3F shows its rear face.
Fixing mechanisms are provided on the top face (Figure 3B), bottom face (Figure 3C), left-hand side face (Figure 3D), and right-hand side face (Figure 3E) of the main unit 1. As described later, similar fixing mechanisms are provided on the top, bottom, left-hand side, and right-hand side faces of the television receivers serving as subsidiary units. When a subsidiary unit 2 or another main unit is placed against the top, bottom, left-hand side, or right-hand side of the main unit 1, the fixing mechanism provided on the corresponding face of the main unit 1 engages with, for example, the fixing mechanism provided on the facing surface of the subsidiary unit 2 or the other main unit, so that the main unit 1 is fixed to the subsidiary unit 2 or the other main unit and they are not easily separated. This prevents the television receivers constituting the scalable television system from being displaced.
The fixing mechanisms may be mechanical mechanisms, or they may be formed by, for example, magnets.
On the rear face of the main unit 1, as shown in Figure 3F, a terminal panel 21, an antenna terminal 22, an input terminal 23, and an output terminal 24 are provided.
The terminal panel 21 has eight IEEE (Institute of Electrical and Electronics Engineers) 1394 terminals 21_11, 21_12, 21_13, 21_21, 21_23, 21_31, 21_32, and 21_33 for electrically connecting the main unit 1 to the eight subsidiary units 2_11, 2_12, 2_13, 2_21, 2_23, 2_31, 2_32, and 2_33 that constitute the scalable television system shown in Figure 1A.
In the embodiment shown in Figure 3F, the IEEE-1394 terminal 21_ij in the terminal panel 21 that is to be connected to the subsidiary unit 2_ij is placed at the position corresponding to the position of the subsidiary unit 2_ij in the scalable television system shown in Figure 1A as seen by the user from the rear side, so that the main unit 1 knows the position of each subsidiary unit 2_ij in the scalable television system shown in Figure 1A.
Therefore, in the scalable television system shown in Figure 1A, the user is required to connect the main unit 1 to the subsidiary unit 2_11 through the IEEE-1394 terminal 21_11, to the subsidiary unit 2_12 through the IEEE-1394 terminal 21_12, to the subsidiary unit 2_13 through the IEEE-1394 terminal 21_13, to the subsidiary unit 2_21 through the IEEE-1394 terminal 21_21, to the subsidiary unit 2_23 through the IEEE-1394 terminal 21_23, to the subsidiary unit 2_31 through the IEEE-1394 terminal 21_31, to the subsidiary unit 2_32 through the IEEE-1394 terminal 21_32, and to the subsidiary unit 2_33 through the IEEE-1394 terminal 21_33.
In the scalable television system shown in Figure 1A, there is no restriction on which IEEE-1394 terminal of the terminal panel 21 a subsidiary unit 2_ij is connected to. When a subsidiary unit 2_ij is connected to an IEEE-1394 terminal other than the IEEE-1394 terminal 21_ij, however, it is necessary to specify in the main unit 1 (the user is required to specify) that the subsidiary unit 2_ij is located in the i-th row and the j-th column of the scalable television system shown in Figure 1A.
In the embodiment shown in Figure 3F, the terminal panel 21 has eight IEEE-1394 terminals 21_11 to 21_33, and the main unit 1 is connected to the eight subsidiary units 2_11 to 2_33 in parallel. The main unit 1 may instead be connected to the eight subsidiary units 2_11 to 2_33 in series; in that case as well, it is necessary to specify in the main unit 1 that each subsidiary unit 2_ij is located in the i-th row and the j-th column of the scalable television system shown in Figure 1A. Therefore, the number of IEEE-1394 terminals provided on the terminal panel 21 is not limited to eight.
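Purely as an illustration of the correspondence described above (the data structure and function below are assumptions made for this sketch, not part of the patent), the default wiring can be viewed as a table from each IEEE-1394 terminal 21_ij to the grid position (i, j), with a user-specified entry taking precedence when a subsidiary unit is plugged into a different terminal:

```python
# Hypothetical helper: map IEEE-1394 terminals of terminal panel 21 to the
# (row, column) positions of Figure 1A. Names are illustrative only.

# Default correspondence: terminal "21_ij" <-> subsidiary unit in row i, column j.
# The centre position (2, 2) is the main unit 1 itself and has no terminal.
DEFAULT_TERMINAL_POSITION = {
    f"21_{i}{j}": (i, j)
    for i in range(1, 4)
    for j in range(1, 4)
    if (i, j) != (2, 2)
}

def position_of(terminal, user_specified=None):
    """Return the grid position of the subsidiary unit on a given terminal.

    user_specified maps a terminal to a position the user entered from the
    menu screen when the default wiring is not followed.
    """
    if user_specified and terminal in user_specified:
        return user_specified[terminal]
    return DEFAULT_TERMINAL_POSITION[terminal]

# Example: a subsidiary unit meant for row 3, column 2 was plugged into 21_13.
print(position_of("21_13"))                                    # -> (1, 3)
print(position_of("21_13", user_specified={"21_13": (3, 2)}))  # -> (3, 2)
```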
The electrical connection between the television receivers in the scalable television system is not limited to IEEE 1394; a LAN (IEEE 802) or the like may also be used. The electrical connection between the television receivers in the scalable television system may also be made wirelessly rather than by wire.
The antenna terminal 22 is connected to a cable leading from an antenna, not shown, so that the television broadcast signal received by the antenna is input to the main unit 1. Image data and audio data output from, for example, a VTR (video tape recorder) are input to the input terminal 23. From the output terminal 24, the image data and audio data of, for example, the television broadcast signal being received by the main unit 1 are output.
Figure 4 is a perspective view showing an example structure of a television receiver serving as a subsidiary unit 2.
The subsidiary unit 2 is a television receiver having the same display screen size as the main unit 1 shown in Figure 2. A CRT (cathode ray tube) 31 for displaying images is located at the center of its front face, and speaker units 32L and 32R for outputting sound are located at the left-hand and right-hand ends of the front face. The main unit 1 and the subsidiary units 2 may also have different display screen sizes.
The image included in a television broadcast signal received by an antenna, not shown, is displayed on the CRT 31, and the L (left) channel sound and R (right) channel sound accompanying the image are output from the speaker units 32L and 32R.
Like the main unit 1, the subsidiary unit 2 is associated with a controller 35 that emits infrared rays IR. By operating the controller 35, the user can change the reception channel and the volume and can send various commands to the subsidiary unit 2.
The controller 35 can control not only the subsidiary unit 2 but also the main unit 1.
To build the scalable television system shown in Figure 1A, the user needs to purchase one main unit 1 and eight subsidiary units 2_11 to 2_33. In this case, since the controller 15 comes with the main unit 1 and a controller 35 comes with each of the eight subsidiary units 2_11 to 2_33, the user ends up with nine controllers, which are troublesome to manage.
The controller 35 for a subsidiary unit 2 may therefore be offered as an option sold separately from the subsidiary unit 2. Likewise, the controller 15 for the main unit 1 may be offered as an option sold separately from the main unit 1.
As described above, the controllers 15 and 35 can each control both the main unit 1 and the subsidiary units 2. Therefore, if the user has only one of the controllers 15 and 35, the user can control both the main unit 1 and the subsidiary units 2 with it.
Next, Figures 5A to 5F show six faces of the example structure of the subsidiary unit 2 shown in Figure 4.
Figure 5A shows the front face of the subsidiary unit 2, Figure 5B shows its top face, Figure 5C shows its bottom face, Figure 5D shows its left-hand side face, Figure 5E shows its right-hand side face, and Figure 5F shows its rear face.
Fixing mechanisms are provided on the top face (Figure 5B), bottom face (Figure 5C), left-hand side face (Figure 5D), and right-hand side face (Figure 5E) of the subsidiary unit 2. When the main unit 1 or another subsidiary unit is placed against the top, bottom, left-hand side, or right-hand side of the subsidiary unit 2, the fixing mechanism provided on the corresponding face of the subsidiary unit 2 engages with the fixing mechanism provided on the facing surface of the main unit 1 or the other subsidiary unit, so that the subsidiary unit 2 is fixed to the main unit 1 or the other subsidiary unit and they are not easily separated.
On the rear face of the subsidiary unit 2, as shown in Figure 5F, a terminal panel 41, an antenna terminal 42, an input terminal 43, and an output terminal 44 are provided.
The terminal panel 41 has one IEEE-1394 terminal 41_1 for electrically connecting the main unit 1 to the subsidiary unit 2. When the subsidiary unit 2 is, for example, the subsidiary unit 2_11 located at the upper left of the scalable television system shown in Figure 1A, the IEEE-1394 terminal 41_1 in the terminal panel 41 is connected by an IEEE-1394 cable, not shown, to the IEEE-1394 terminal 21_11 in the terminal panel 21 shown in Figure 3F.
The number of IEEE-1394 terminals provided on the terminal panel 41 is not limited to one.
The antenna terminal 42 is connected to a cable leading from an antenna, not shown, so that the television broadcast signal received by the antenna is input to the subsidiary unit 2. Image data and audio data output from, for example, a VTR are input to the input terminal 43. From the output terminal 44, the image data and audio data of, for example, the television broadcast signal being received by the subsidiary unit 2 are output.
Nine television receivers configured as described above, namely one main unit 1 and eight subsidiary units 2_11 to 2_33, are arranged three by three in the horizontal and vertical directions to form the scalable television system shown in Figure 1A.
The scalable television system shown in Figure 1A is built by directly placing television receivers above, below, to the left of, or to the right of the television receiver serving as the main unit or a subsidiary unit. The scalable television system may also be built, for example, by placing the television receivers in a system-specific rack, as shown in Figure 6. When such a special-purpose rack is used, displacement of the television receivers constituting the scalable television system can be prevented more reliably.
When the scalable television system is built by directly stacking television receivers above, below, to the left of, or to the right of the television receiver serving as the main unit or a subsidiary unit, the main unit 1 cannot, for example, be placed in the second row and second column shown in Figure 1A without the subsidiary unit 2_32 beneath it. In contrast, when the scalable television system uses the rack shown in Figure 6, the main unit 1 can be placed in the second row and second column even without the subsidiary unit 2_32.
Next, Figure 7 is a plan view showing an example structure of the controller 15.
A select button switch 51 can be operated in a total of eight directions: up, down, left, right, and the four diagonal directions between them (direction operation). The select button switch 51 can also be pressed perpendicular to the top face of the controller 15 (select operation). A menu button switch 54 is used to display a menu screen on the CRT 11 of the main unit 1 (or the CRT 31 of a subsidiary unit 2), from which commands for various settings (for example, specifying, as described above, that a subsidiary unit 2_ij is located in the i-th row and the j-th column of the scalable television system) and commands for predetermined processing are entered.
When the menu screen is displayed, a cursor for pointing at items on the menu screen is displayed on the CRT 11. The cursor moves in the direction corresponding to the direction in which the select button switch 51 is operated. When the cursor is positioned on a predetermined item and the select button switch 51 is pressed, that item is determined to have been selected. In the present embodiment, as described later, icons are displayed as items on the menu, and icons are also clicked by means of the select button switch 51.
The controller 15 may be provided with button switches corresponding to the items and icons displayed on the menu screen. In that case, the user can specify an item or icon shown on the menu screen by operating the controller 15 directly, without displaying the menu screen.
An exit button switch 55 is operated, for example, when the user wants to return from the menu screen to the usual original screen.
A volume button switch 52 is used to turn the volume up or down. A channel up/down button switch 53 is used to increase or decrease the number of the broadcast channel being received.
Numeric button (numeric keypad) switches 58 represent the digits zero to nine and are used to enter those digits. An enter button switch 57 is operated immediately after the operation of the numeric button switches 58 is completed, to indicate that the entry of digits is finished. When the channel is switched, the new channel number or the like is displayed by OSD (on-screen display) on the CRT 11 of the main unit 1 (or the CRT 31 of a subsidiary unit 2) for a predetermined time. A display button 56 is used to switch on and off the OSD display of the currently selected channel number or the current volume.
A TV/video switching button switch 59 is used to switch the input of the main unit 1 (or a subsidiary unit 2) between the input from the built-in tuner 121 shown in Figure 10, described later (or the tuner 141 shown in Figure 11, described later), and the input from the input terminal 23 shown in Figure 3F (or the input terminal 43 shown in Figure 5F). A TV/DSS switching button switch 60 is used to select, for the tuner 121, a television mode for receiving terrestrial broadcasts or a DSS (Digital Satellite System, a trademark of Hughes Communications) mode for receiving satellite broadcasts. When the channel is changed by operating the numeric button switches 58, the channel selected before the change is stored. A jump button switch 61 is used to return to the channel that was selected before the change.
A language button 62 is used to select a predetermined language when a program is broadcast in two or more languages. A guide button switch 63 is used to display an EPG (electronic program guide). A favorite button switch 64 is used to select a favorite channel specified by the user in advance.
A cable button switch 65, a TV button switch 66, and a DSS button switch 67 are used to switch the category of unit to which the command codes carried by the infrared rays emitted from the controller 15 are directed. More specifically, in addition to the television receivers serving as the main unit 1 and the subsidiary units 2, the controller 15 (and the controller 35) can also remotely control an STB (set-top box) and an IRD (integrated receiver and decoder), neither of which is shown. The cable button switch 65 is used when the controller 15 controls an STB that receives signals transmitted through a cable television network; when the cable button switch 65 is operated, the controller 15 emits infrared rays corresponding to command codes assigned to the STB unit category. Likewise, the TV button switch 66 is used when the controller 15 controls the main unit 1 (or a subsidiary unit 2), and the DSS button switch 67 is used when the controller 15 controls an IRD that receives signals transmitted from a satellite.
When the cable button switch 65, the TV button switch 66, or the DSS button switch 67 is turned on, an LED (light-emitting diode) 68, 69, or 70 is lit, respectively. These LEDs show the user which category of unit the controller 15 is currently allowed to control. When the cable button switch 65, the TV button switch 66, or the DSS button switch 67 is turned off, the corresponding LED 68, 69, or 70 is turned off.
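As a rough illustration of this category switching (the numeric codes and the two-byte layout below are assumptions; the actual frame format handled by the controller 15 is the subject of Figure 23), a transmitted command can be thought of as a unit-category code followed by a command code:

```python
# Hypothetical encoding of a remote-commander command: a unit-category code
# selected by the cable/TV/DSS button switches, followed by a command code.
# All values are illustrative; they are not taken from the patent.

UNIT_CATEGORY = {"STB": 0x01, "TV": 0x02, "IRD": 0x03}   # cable / TV / DSS
COMMAND_CODE = {"POWER": 0x10, "CHANNEL_UP": 0x11, "VOLUME_UP": 0x12}

def build_ir_frame(category, command):
    """Return the byte pair a controller such as 15 or 35 might emit."""
    return bytes([UNIT_CATEGORY[category], COMMAND_CODE[command]])

# Pressing the TV button switch 66 and then the channel up/down button switch 53:
print(build_ir_frame("TV", "CHANNEL_UP").hex())   # -> "0211"
```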
A cable power button switch 71, a TV power button switch 72, and a DSS power button switch 73 are used to turn the power of the STB, the main unit 1 (or a subsidiary unit 2), and the IRD on and off, respectively.
A muting button switch 74 is used to set or release the muted state of the main unit 1 (or a subsidiary unit 2). A sleep button switch 75 is used to set or release a sleep mode in which the power is automatically turned off at a predetermined time or after a predetermined time has elapsed.
Next, Figure 8 is a plan view showing an example structure of the controller 35 for a subsidiary unit 2.
Since the controller 35 has a select button switch 81 through a sleep button switch 105 configured in the same way as the select button switch 51 through the sleep button switch 75 of the controller 15 shown in Figure 7, a description of the controller 35 is omitted.
Figure 9 is a plan view showing another example structure of the controller 15 for the main unit 1.
In the embodiment shown in Figure 9, instead of the select button switch 51 shown in Figure 7, which can be operated in eight directions, arrow button switches 111, 112, 113, and 114 for the up, down, left, and right directions and a select button switch 110 for performing the select operation are provided. In addition, in the embodiment shown in Figure 9, the cable button switch 65, the TV button switch 66, and the DSS button switch 67 are of an internally illuminated type, and the LEDs 68 to 70 shown in Figure 7 are omitted; LEDs, not shown, are placed behind the button switches 65 to 67 and are turned on and off in response to operations of those switches.
Although the positions of the other button switches differ somewhat, they are basically the same as the corresponding switches shown in Figure 7.
The controller 35 for a subsidiary unit 2 may likewise be configured as shown in Figure 9.
The controller 15 may include a gyroscope for detecting the motion of the controller 15. In that case, the gyroscope built into the controller 15 detects the direction and distance of the movement of the controller 15, and the cursor displayed on the menu screen is moved in accordance with that direction and distance. When the controller 15 includes a gyroscope in this way, the select button switch 51 in the embodiment shown in Figure 7 does not need to be operable in eight directions, and the arrow button switches 111 to 114 in the embodiment shown in Figure 9 become unnecessary. The controller 35 may likewise include a gyroscope.
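A minimal sketch, assuming an arbitrary scale factor and screen size, of how the detected direction and distance of the controller's movement might be turned into cursor movement on the menu screen (names and values are illustrative only):

```python
# Hypothetical cursor update driven by a gyroscope-equipped controller.
# The scale factor and screen size are assumptions made for illustration.

def move_cursor(cursor, motion, screen=(720, 480), scale=4):
    """Shift the cursor by the detected controller motion, clamped to the screen."""
    dx, dy = motion                       # direction and distance of the movement
    x = min(max(cursor[0] + scale * dx, 0), screen[0] - 1)
    y = min(max(cursor[1] + scale * dy, 0), screen[1] - 1)
    return (x, y)

print(move_cursor((100, 100), (5, -3)))   # -> (120, 88)
```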
Next, Figure 10 shows an example electrical structure of the main unit 1.
A television broadcast signal received by an antenna, not shown, is supplied to a tuner 121 and is detected and demodulated under the control of a CPU 129. The output of the tuner 121 is supplied to a QPSK (quadrature phase shift keying) demodulation circuit 122 and is QPSK-demodulated under the control of the CPU 129. The output of the QPSK demodulation circuit 122 is supplied to an error correction circuit 123, where errors are detected and corrected under the control of the CPU 129, and the corrected output is supplied to a demultiplexer 124.
The demultiplexer 124 descrambles the output of the error correction circuit 123 if necessary under the control of the CPU 129, and extracts TS (transport stream) packets of a predetermined channel. The demultiplexer 124 then supplies the image data (video data) TS packets to an MPEG (Moving Picture Experts Group) video decoder 125 and supplies the audio data TS packets to an MPEG audio decoder 126. The demultiplexer 124 also supplies TS packets included in the output of the error correction circuit 123 to the CPU 129 as needed. The demultiplexer 124 further receives image data or audio data (including data in TS packets) sent from the CPU 129 and supplies it to the MPEG video decoder 125 or the MPEG audio decoder 126.
The MPEG video decoder 125 MPEG-decodes the image data TS packets supplied from the demultiplexer 124 and supplies the result to a signal processing unit 127. The MPEG audio decoder 126 MPEG-decodes the audio data TS packets supplied from the demultiplexer 124; the left-channel audio data and right-channel audio data obtained by the decoding are supplied to the signal processing unit 127.
The signal processing unit 127 supplies the image data sent from the MPEG video decoder 125 to a matrix circuit 128 and supplies the audio data sent from the MPEG audio decoder 126 to an amplifier 137.
The signal processing unit 127 is formed by a DSP (digital signal processor) 127A, an EEPROM (electrically erasable programmable read-only memory) 127B, a RAM (random access memory) 127C, and the like, and applies various kinds of digital signal processing to the image data and audio data supplied to it, under the control of the CPU 129.
More specifically, the DSP 127A performs various kinds of signal processing according to a program stored in the EEPROM 127B, using the data stored in the EEPROM 127B as needed. The EEPROM 127B stores the program and the data required when the DSP 127A performs the various kinds of processing. The RAM 127C temporarily stores the data and programs required when the DSP 127A performs the various kinds of processing.
The data and programs stored in the EEPROM 127B can be updated by overwriting them.
The matrix circuit 128 converts the image data sent from the signal processing unit 127 into RGB (red, green, blue) image data and supplies it to the CRT 11 to be displayed. The matrix circuit 128 includes a D/A (digital-to-analog) converter, which D/A-converts the image data before it is output.
The CPU 129 performs various kinds of processing according to programs stored in an EEPROM 130 and a ROM (read-only memory) 131, thereby controlling, for example, the tuner 121, the QPSK demodulation circuit 122, the error correction circuit 123, the demultiplexer 124, the signal processing unit 127, an IEEE-1394 interface 133, an IR interface 135, and a modem 136. In addition, the CPU 129 supplies data sent from the demultiplexer 124 to the IEEE-1394 interface 133, and supplies data sent from the IEEE-1394 interface 133 to the demultiplexer 124 and the signal processing unit 127. The CPU 129 also performs processing corresponding to commands sent from a front panel 134 or the IR interface 135. Furthermore, the CPU 129 controls the modem 136 to access a server, not shown, through a telephone line and obtain updated programs and necessary data.
The EEPROM 130 stores data and programs that must be retained even after the power is turned off. The ROM 131 stores, for example, an IPL (initial program loader) program. The data and programs stored in the EEPROM 130 can be updated by overwriting them.
The RAM 132 temporarily stores data and programs required for the operation of the CPU 129.
The IEEE-1394 interface 133 is connected to the terminal panel 21 (its IEEE-1394 terminals 21_11 to 21_33 (Figure 3F)) and serves as an interface for performing communication conforming to the IEEE 1394 standard. The IEEE-1394 interface 133 thus sends data supplied from the CPU 129 to the outside according to the IEEE 1394 standard, and receives data sent from the outside according to the IEEE 1394 standard and supplies it to the CPU 129.
Although the front panel 134 is not shown in Figure 2 or Figures 3A to 3F, it is provided, for example, at a certain position on the front face of the main unit 1. The front panel 134 includes some of the button switches provided on the controller 15 (Figures 7 and 9); more specifically, it includes, for example, button switches corresponding to the volume button switch 52, the channel up/down button switch 53, the menu button switch 54, the numeric button switches 58, and the TV power button switch 72. When a button switch on the front panel 134 is operated, an operation signal corresponding to the operation is sent to the CPU 129, and the CPU 129 performs processing corresponding to that operation signal.
The IR interface 135 receives the infrared rays (light) emitted from the controller 15 in response to operations on the controller 15. The IR interface 135 photoelectrically converts the received infrared rays and sends the resulting signal to the CPU 129. In this case, the CPU 129 performs processing corresponding to the signal sent from the IR interface 135, that is, processing corresponding to the operation on the controller 15. The IR interface 135 also emits infrared rays under the control of the CPU 129. In other words, in the present embodiment, the main unit 1 can perform IEEE-1394 communication through the above-described IEEE-1394 interface 133, communication through the modem 136 described later, and infrared communication through the IR interface 135.
The modem 136 controls communication through a telephone line, thereby sending data supplied from the CPU 129 over the telephone line, and receiving data sent over the telephone line and supplying it to the CPU 129.
The amplifier 137 amplifies the audio data sent from the signal processing unit 127 as needed and supplies it to the speaker units 12L and 12R to be output. The amplifier 137 includes a D/A converter, which D/A-converts the audio data before it is output.
The master unit 1 configured as described above outputs the images and sound of a television program (displays images and outputs sound) in the following manner.
A transport stream serving as the television broadcast signal received by the antenna is sent to the demultiplexer 124 through the tuner 121, the QPSK demodulation circuit 122, and the error correction circuit 123. The demultiplexer 124 extracts the TS packets of a predetermined program from the transport stream, sends the image-data TS packets to the MPEG video decoder 125, and sends the audio-data TS packets to the MPEG audio decoder 126.
The MPEG video decoder 125 applies MPEG decoding to the TS packets sent from the demultiplexer 124. The image data thus obtained is sent from the MPEG video decoder 125 through the signal processing section 127 and the matrix circuit 128 to the CRT 11 for display.
The MPEG audio decoder 126 applies MPEG decoding to the TS packets sent from the demultiplexer 124. The audio data thus obtained is sent from the MPEG audio decoder 126 through the signal processing section 127 and the amplifier 137 to the speaker units 12L and 12R for output.
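The routing performed by the demultiplexer 124 can be summarized in the following minimal Python sketch. The PID values, packet shape, and decoder callables are illustrative assumptions and are not specified by the embodiment:
VIDEO_PID = 0x100   # assumed PID of the selected program's video stream
AUDIO_PID = 0x101   # assumed PID of the selected program's audio stream

def route_transport_stream(ts_packets, video_decoder, audio_decoder):
    """Send video TS packets to the MPEG video decoder and audio TS packets
    to the MPEG audio decoder, as demultiplexer 124 does for one program."""
    for packet in ts_packets:
        if packet["pid"] == VIDEO_PID:
            video_decoder(packet["payload"])   # -> signal processing 127 -> matrix 128 -> CRT 11
        elif packet["pid"] == AUDIO_PID:
            audio_decoder(packet["payload"])   # -> signal processing 127 -> amplifier 137 -> speakers

# e.g. route_transport_stream(packets, mpeg_video_decode, mpeg_audio_decode)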
Next, Fig. 11 shows an example electrical structure of the auxiliary unit 2.
Since the auxiliary unit 2 is formed of a tuner 141 through an amplifier 157 configured in the same way as the tuner 121 through the amplifier 137 shown in Fig. 10, a description thereof is omitted.
Since the master unit 1 and the auxiliary unit 2 have the antenna terminal 22 and the antenna terminal 42, respectively, as shown in Fig. 3F and Fig. 5F, an antenna (a cable extending from it) can be connected to each of the master unit 1 and the auxiliary units 2 serving as the television receivers forming the scalable television system shown in Fig. 1A or Fig. 1B. However, when the antenna is connected to both the master unit 1 and the auxiliary units 2, the wiring may become troublesome. Instead, the antenna may be connected to just one of the television receivers forming the scalable television system, and the television broadcast signal received by that television receiver may be distributed to the other television receivers by, for example, IEEE-1394 communication.
In the present embodiment, an IEEE-1394 terminal 21_ij of the terminal panel 21 of the master unit 1 (Fig. 3F) is connected by an IEEE-1394 cable to the IEEE-1394 terminal 41_1 of the terminal panel 41 of an auxiliary unit 2_ij (Fig. 5F), thereby electrically connecting the master unit 1 and the auxiliary unit 2 so that IEEE-1394 communication (communication conforming to the IEEE-1394 standard) can be performed between them to exchange various types of data.
IEEE-1394 communication will be briefly described below with reference to Fig. 12 to Fig. 21B.
IEEE 1394 is one of the serial bus standards. IEEE-1394 communication allows isochronous data transfer and is therefore suited to transferring data, such as images and sound, that must be reproduced in real time.
More specifically, isochronous data transfer can be performed between units having IEEE-1394 interfaces (IEEE-1394 units) at intervals of 125 µs, within a transmission band of up to 100 µs (although it is measured in time, it is referred to as a band). Within this transmission band, isochronous transfer can be performed on a plurality of channels.
Fig. 12 shows the hierarchical structure of the IEEE-1394 communication protocol.
The IEEE-1394 protocol has a three-layer hierarchical structure comprising a transaction layer, a link layer, and a physical layer. The layers communicate with one another and also communicate with serial bus management. In addition, the transaction layer and the link layer also communicate with higher-level applications. Four kinds of messages are used for this communication: request, indication, response, and confirmation. The arrows in Fig. 12 indicate these message types.
A communication whose arrow label ends with ".req" indicates a request message, one ending with ".ind" indicates an indication message, one ending with ".resp" indicates a response message, and one ending with ".conf" indicates a confirmation message. For example, TR_CONT.req refers to a request message sent from serial bus management to the transaction layer.
The transaction layer provides an asynchronous transfer service for performing data communication in response to requests from an application or from another IEEE-1394 unit (a unit having an IEEE-1394 interface), and implements the request-response protocol required by ISO/IEC 13213. More specifically, the data transfer methods conforming to the IEEE-1394 standard include the isochronous transfer described above and asynchronous transfer; the transaction layer performs the asynchronous transfer processing. Data is transferred between IEEE-1394 units by asynchronous transfer using the three kinds of processing units required by the transaction layer protocol, namely read transactions, write transactions, and lock transactions.
The link layer provides data transfer services using acknowledgment, address handling, data error checking, and data framing. One packet-transfer operation performed by the link layer is called a subaction. There are two kinds of subactions: asynchronous subactions and isochronous subactions.
An asynchronous subaction is performed by specifying the physical ID of a node (an addressable unit in IEEE 1394) and an address within that node, and the node that receives the data returns an acknowledgment. In an asynchronous broadcast subaction, in which data is sent to all nodes on the IEEE-1394 serial bus, the nodes that receive the data do not return acknowledgments.
In an isochronous subaction, data is transferred at the fixed interval (125 µs, as described above) with a specified channel number. No acknowledgment is returned in an isochronous subaction.
The physical layer converts the logical symbols used by the link layer into electrical signals. In addition, the physical layer processes arbitration requests sent from the link layer (arbitrating the contention among IEEE-1394 nodes attempting to communicate), and performs reconfiguration of the IEEE-1394 serial bus when the bus is reset, automatically assigning physical IDs.
Serial bus management implements the basic bus control functions and provides the CSR (control and status register) architecture of ISO/IEC 13213. Serial bus management has the functions of a node controller, an isochronous resource manager, and a bus manager. The node controller controls the state of the node, its physical ID, and so on, and controls the transaction layer, the link layer, and the physical layer. The isochronous resource manager provides the usage state of the resources used for isochronous communication. For isochronous communication to be performed, at least one of the units connected to the IEEE-1394 serial bus must be an IEEE-1394 unit having the isochronous resource manager function. The bus manager has the highest-level functions among these, and its purpose is to use the IEEE-1394 serial bus as appropriately as possible. The isochronous resource manager and the bus manager may be omitted.
IEEE-1394 units can be connected by branching at nodes or by daisy-chaining nodes. When a new IEEE-1394 unit is connected, a bus reset occurs, and tree identification, the root node, physical IDs, the isochronous resource manager, the cycle master, the bus manager, and so on are determined.
Tree identification determines the hierarchical relationship among the IEEE-1394 units serving as nodes. The root node is the node that, in arbitration, grants the right to use the IEEE-1394 serial bus. Physical IDs are determined when packets called self-ID packets are sent to each node. A self-ID packet also contains information such as the node's data transfer rate and whether the node can serve as the isochronous resource manager.
The isochronous resource manager is, as described above, the node that provides the usage state of the resources used for isochronous communication, and has the bandwidth register (bandwidth_available register) and the channel number registers (channels_available registers) described later. The isochronous resource manager also has a register indicating the physical ID of the node serving as the bus manager. When none of the nodes, that is, none of the IEEE-1394 units connected to the IEEE-1394 serial bus, serves as the bus manager, the isochronous resource manager acts as a simplified bus manager.
The cycle master sends a cycle start packet onto the IEEE-1394 serial bus at intervals of 125 µs, which is the isochronous transfer interval. For this purpose, the cycle master has a cycle time register (cycle_time register) for counting this interval (125 µs). The root node serves as the cycle master. When the root node does not have the cycle master function, the bus manager changes the root node.
The bus manager manages the power supply on the IEEE-1394 serial bus and changes the root node as described above.
When the isochronous resource manager and the others have been determined after a bus reset as described above, data transfer over the IEEE-1394 serial bus becomes ready.
In isochronous transfer, which is one of the IEEE-1394 data transfer methods, a transmission band and a transmission channel are acquired, and then packets containing data (isochronous packets) are transferred.
More specifically, in isochronous transfer, the cycle master broadcasts a cycle start packet onto the IEEE-1394 serial bus at intervals of 125 µs. When the cycle start packet is broadcast, transfer of isochronous packets becomes possible.
To perform isochronous transfer, the bandwidth register used for acquiring a transmission band and the channel number registers used for acquiring a channel, both provided by the isochronous resource manager, must be rewritten to declare that the resources used for isochronous transfer have been acquired.
Both the bandwidth register and the channel number registers are assigned as CSRs (control and status registers), described later, which have the 64-bit address space defined by ISO/IEC 13213.
The bandwidth register is a 32-bit register in which the upper 19 bits are reserved and the lower 13 bits indicate the transmission band currently available (bw_remaining).
More specifically, the initial value of the bandwidth register is 00000000000000000001001100110011B (B indicates that the preceding value is binary) (= 4915), for the following reason. IEEE 1394 defines as 1 the time required to transfer 32 bits at 1572.864 Mbps (bits per second). The 125-µs interval described above corresponds to 00000000000000000001100000000000B (= 6144). However, IEEE 1394 specifies that the transmission band usable for isochronous transfer is 80% of the 125-µs interval. Therefore, the maximum transmission band usable for isochronous transfer is 100 µs, namely 00000000000000000001001100110011B (= 4915), as given above.
The remaining 25-µs transmission band, obtained by subtracting the maximum 100-µs transmission band used for isochronous transfer from 125 µs, is used for asynchronous transfer. Asynchronous transfer is used, for example, for reading the values stored in the bandwidth register and the channel number registers.
To start isochronous transfer, a transmission band must be acquired. For example, to perform isochronous transfer using a 10-µs transmission band within the 125-µs interval, that 10-µs transmission band must be acquired. A transmission band is acquired by rewriting the value of the bandwidth register. More specifically, to acquire the 10-µs transmission band mentioned above, the value 492, corresponding to 10 µs, is subtracted from the value of the bandwidth register, and the difference is stored in the bandwidth register. Therefore, when the value of the bandwidth register is 4915 (that is, when no isochronous transfer is being performed) and a 10-µs transmission band is acquired, the value of the bandwidth register changes from 4915 to 4423 (00000000000000000001000101000111B), obtained by subtracting 492, corresponding to 10 µs, from 4915.
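A minimal Python sketch of this bandwidth bookkeeping follows. It models only the arithmetic described above; the actual register access would be a lock transaction on the isochronous resource manager, which is not shown:
BANDWIDTH_UNITS_PER_125US = 6144      # 125 us expressed in IEEE-1394 bandwidth units
INITIAL_BW_AVAILABLE = 4915           # 80% of 6144, i.e. the 100-us isochronous maximum

def units_for_microseconds(us):
    """Convert a requested band in microseconds into bandwidth units (rounded up)."""
    return -(-us * BANDWIDTH_UNITS_PER_125US // 125)   # ceiling division

def acquire_bandwidth(bw_register, requested_us):
    """Try to claim a band from the bandwidth_available register.
    Returns (success, new_register_value); the register is unchanged on failure."""
    needed = units_for_microseconds(requested_us)
    if bw_register - needed < 0:
        return False, bw_register
    return True, bw_register - needed

# Example matching the text: acquiring 10 us from the initial value 4915 leaves 4423.
ok, remaining = acquire_bandwidth(INITIAL_BW_AVAILABLE, 10)
assert ok and remaining == 4423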
When the value obtained by subtracting the transmission band to be acquired (used) from the value of the bandwidth register is less than zero, the transmission band cannot be acquired, the value of the bandwidth register is not rewritten, and isochronous transfer cannot be performed.
To perform isochronous transfer, a transmission channel must be acquired in addition to the transmission band. A transmission channel is acquired by rewriting the channel number registers.
The channel number registers form a 64-bit register in which each bit corresponds to one channel. More specifically, when the n-th bit (counted from the least significant bit) is 1, channel (n-1) is unused, and when the n-th bit is 0, channel (n-1) is in use. Therefore, when no channel is in use, the channel number register is 1111111111111111111111111111111111111111111111111111111111111111B. When, for example, channel 1 is acquired, the channel number register becomes 1111111111111111111111111111111111111111111111111111111111111101B.
Since the channel number register has 64 bits as described above, a maximum of 64 channels in total, from channel 0 to channel 63, can be acquired for isochronous transfer. Channel 63 is used for broadcasting isochronous packets.
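The channel bookkeeping can be sketched in the same spirit, again modelling only the bit manipulation described in the text and not the lock transaction itself:
ALL_CHANNELS_FREE = (1 << 64) - 1      # 64 ones: no channel in use

def acquire_channel(channels_register, channel):
    """Mark `channel` (0-63) as in use by clearing its bit, following the convention
    that a 1 bit means the channel is free and a 0 bit means it is in use.
    Returns (success, new_register_value)."""
    if not 0 <= channel <= 63:
        raise ValueError("IEEE-1394 isochronous channels are 0..63")
    mask = 1 << channel
    if channels_register & mask == 0:
        return False, channels_register            # already in use
    return True, channels_register & ~mask

# Example matching the text: acquiring channel 1 clears the second bit from the LSB.
ok, reg = acquire_channel(ALL_CHANNELS_FREE, 1)
assert ok and reg == int("1" * 62 + "01", 2)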
As described above, since isochronous transfer is performed after a transmission band and a transmission channel have been acquired, data is transferred at a guaranteed transfer rate. As mentioned earlier, this makes isochronous transfer particularly suitable for data that must be reproduced in real time, such as images and sound.
IEEE-1394 communication follows the CSR architecture, which has the 64-bit address space defined by ISO/IEC 13213.
Fig. 13 shows the address space of the CSR architecture.
The upper 16 bits of a CSR address indicate the ID of each node, and the remaining 48 bits are used to specify the address space allocated to each node. The upper 16 bits are further divided into 10 bits indicating the bus ID and 6 bits indicating the physical ID (the node ID in the narrow sense). Since the value obtained by setting all bits to 1 is reserved for special purposes, 1023 buses and 63 nodes can be specified.
Of the 256-terabyte address space specified by the lower 48 bits of a CSR address, the space determined by the upper 20 bits is divided into the initial register space used for the 2048-byte CSR-specific registers, the IEEE-1394-specific registers, and the like; private space; initial memory space; and others. When the space determined by the upper 20 bits is the initial register space, the space determined by the lower 28 bits is used for the configuration ROM, the initial unit space used specifically by the node, the plug control registers (PCRs), and so on.
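As a rough illustration, a 64-bit CSR address can be assembled from the bus ID, physical ID, and 48-bit offset as in the following sketch. Only the field packing is shown; the bus ID and physical ID used in the example are illustrative, and the register offset is the one listed in Fig. 14 below:
def csr_address(bus_id, physical_id, offset48):
    """Pack a 64-bit CSR address: 10-bit bus ID, 6-bit physical ID, 48-bit offset."""
    assert 0 <= bus_id < 1024 and 0 <= physical_id < 64 and 0 <= offset48 < (1 << 48)
    return (bus_id << 54) | (physical_id << 48) | offset48

INITIAL_REGISTER_SPACE = 0xFFFFF0000000    # base of the initial register space
BANDWIDTH_AVAILABLE_OFFSET = 0x220         # offset of the bandwidth register (Fig. 14)

# Address of the bandwidth_available register on the local bus (bus ID 3FFh) at a
# node with, say, physical ID 3 -- both IDs are assumptions made for this example.
addr = csr_address(0x3FF, 3, INITIAL_REGISTER_SPACE + BANDWIDTH_AVAILABLE_OFFSET)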
Fig. 14 shows the offset addresses, names, and functions of the main CSRs.
In Fig. 14, the "offset" column shows offset addresses counted from FFFFF0000000h ("h" indicates that the preceding value is hexadecimal), where the initial register space begins. The bandwidth register at offset 220h indicates, as described above, the band that can be allocated to isochronous communication. Only the value held in the bandwidth register of the node serving as the isochronous resource manager is valid. In other words, each node has the CSRs shown in Fig. 13, but only the bandwidth register of the isochronous resource manager is valid. In effect, therefore, only the isochronous resource manager has the bandwidth register.
In the channel number registers at offsets 224h to 228h, each bit corresponds to one of the channel numbers 0 to 63 as described above, and a bit of 0 indicates that the corresponding channel has been allocated. Only the channel number registers of the node serving as the isochronous resource manager are valid.
Returning to Fig. 13, the configuration ROM, which follows the general ROM format, is located at addresses 400h to 800h in the initial register space.
Fig. 15 shows the general ROM format.
A node, which is the unit of addressing in IEEE 1394, can contain a plurality of units that share the address space but operate independently within the node. A unit directory can indicate the version and location of the software assigned to that unit. The positions of the bus info block and the root directory are fixed, but the positions of the other blocks are specified by offset addresses.
Fig. 16 shows the details of the bus info block, the root directory, and a unit directory.
The Company ID in the bus info block stores an ID number indicating the manufacturer of the unit. The Chip ID stores an ID unique to the unit, not equal to the ID of any other unit. According to the IEC 1883 standard, the first octet of the unit spec ID in the unit directory of a unit conforming to IEC 1883 is set to 00h, its second octet to A0h, and its third octet to 2Dh. In addition, the first octet of the unit sw version is set to 01h, and the LSB (least significant bit) of its third octet is set to 1.
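Under those conventions, a check of whether a unit directory declares IEC 1883 compliance might look like the following sketch. The argument names are simplified; a real configuration ROM parser would first walk the directory entries to find these values:
def is_iec1883_unit(unit_spec_id, unit_sw_version):
    """Check the octet values stated above: unit_spec_id must be 00h A0h 2Dh,
    and unit_sw_version must have first octet 01h and the LSB of its third octet set."""
    spec_ok = ((unit_spec_id >> 16) & 0xFF == 0x00
               and (unit_spec_id >> 8) & 0xFF == 0xA0
               and unit_spec_id & 0xFF == 0x2D)
    ver_ok = ((unit_sw_version >> 16) & 0xFF == 0x01
              and unit_sw_version & 0x01 == 0x01)
    return spec_ok and ver_ok

# Example: values an IEC 1883 compliant unit would carry (hypothetical sw version).
assert is_iec1883_unit(0x00A02D, 0x010001)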
At addresses 900h to 9FFh in the initial register space shown in Fig. 13, a node has the PCRs (plug control registers) specified in IEC 1883. These result from giving concrete form to the concept of a plug so as to logically form signal paths similar to those of an analog interface.
Fig. 17 shows the structure of the PCRs.
The PCRs include oPCRs (output plug control registers) representing output plugs and iPCRs (input plug control registers) representing input plugs. The PCRs also include an oMPR (output master plug register) and an iMPR (input master plug register), which hold information on the output plugs and input plugs specific to each unit, respectively. An IEEE-1394 unit does not have a plurality of oMPRs or iMPRs, but can have an oPCR and an iPCR for each plug, depending on the capability of the IEEE-1394 unit. The PCRs shown in Fig. 17 include 31 oPCRs, #0 to #30, and 31 iPCRs, #0 to #30. An isochronous data flow is controlled by operating the registers corresponding to these plugs.
Figs. 18A to 18D show the structures of the oMPR, oPCR, iMPR, and iPCR.
Fig. 18A shows the oMPR structure, Fig. 18B shows the oPCR structure, Fig. 18C shows the iMPR structure, and Fig. 18D shows the iPCR structure.
The data rate capability field in the two most significant bits of the oMPR and the iMPR stores a code indicating the maximum transfer rate of the isochronous data that the unit can send or receive. The broadcast channel base field of the oMPR specifies the channel number used for broadcast output.
The number-of-output-plugs field in the five least significant bits of the oMPR holds a value indicating the number of output plugs the unit has, that is, the number of oPCRs. The non-persistent extension field and the persistent extension field are areas defined for future extension.
The on-line field at the MSB of the oPCR and the iPCR indicates the usage state of the plug. More specifically, when its value is 1 the plug is on line, and when its value is 0 the plug is off line. The broadcast connection counter field of the oPCR and the iPCR indicates whether a broadcast connection exists (1 indicates that one exists, 0 that none exists). The six-bit-wide point-to-point connection counter field of the oPCR and the iPCR holds a value indicating the number of point-to-point connections the plug has.
The six-bit-wide channel number field of the oPCR and the iPCR holds a value indicating the isochronous channel number to which the plug is connected. The two-bit-wide data rate field of the oPCR indicates the actual transfer rate of the isochronous data packets output from the plug. The four-bit-wide overhead ID field of the oPCR holds a code indicating the overhead bandwidth of the isochronous communication. The ten-bit-wide payload field of the oPCR indicates the maximum amount of data that can be contained in an isochronous packet handled by the plug.
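A sketch of unpacking the oPCR fields just listed from a 32-bit register value is shown below. The text gives only the field names and widths, so the exact bit positions used here (the usual oPCR layout) should be treated as an assumption:
def decode_opcr(value):
    """Split a 32-bit oPCR value into the fields named above; bit positions assumed."""
    return {
        "on_line":           (value >> 31) & 0x1,
        "broadcast_counter": (value >> 30) & 0x1,
        "p2p_counter":       (value >> 24) & 0x3F,   # 6 bits
        "channel_number":    (value >> 16) & 0x3F,   # 6 bits
        "data_rate":         (value >> 14) & 0x3,    # 2 bits
        "overhead_id":       (value >> 10) & 0xF,    # 4 bits
        "payload":           value & 0x3FF,          # 10 bits
    }

# Example: an on-line plug with one point-to-point connection on channel 1.
fields = decode_opcr((1 << 31) | (1 << 24) | (1 << 16))
assert fields["on_line"] == 1 and fields["p2p_counter"] == 1 and fields["channel_number"] == 1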
The AV/C command set is specified as the set of control commands for IEEE-1394 units that perform IEEE-1394 communication as described above. In the present embodiment, the master unit 1 and the auxiliary units 2 can control one another by using the AV/C command set. A proprietary command system different from the AV/C command set can also be used to control the master unit 1 and the auxiliary units 2.
The AV/C command set will be briefly described below.
Fig. 19 shows the data structure of an AV/C command set packet transferred in asynchronous transfer mode.
The AV/C command set is a command set for controlling AV (audio-visual) units. In a control system using the AV/C command set, AV/C command frames and response frames are transferred by FCP (function control protocol). In order not to place a load on the bus and the AV units, a response to a command should be returned within 100 ms.
As shown in Fig. 19, the data in an asynchronous packet is formed of 32 bits (one quadlet) in the horizontal direction. The upper part of the figure shows the packet header, and the lower part shows the data block. Destination_ID indicates the destination.
The CTS field indicates the command set ID and is "0000" for the AV/C command set. When the packet contains a command, the ctype/response field indicates the functional classification of the command; when the packet contains a response, it indicates the result of processing the command. Commands can be roughly divided into four types: (1) commands for controlling functions from the outside (CONTROL); (2) commands for querying the state from the outside (STATUS); (3) commands for asking from the outside whether a control command is supported (GENERAL INQUIRY, asking whether an opcode is supported, and SPECIFIC INQUIRY, asking whether an opcode and operands are supported); and (4) commands for requesting that state changes be reported to the outside (NOTIFY).
A response is returned according to the command type. Responses to a CONTROL command include NOT IMPLEMENTED, ACCEPTED, REJECTED, and INTERIM. Responses to a STATUS command include NOT IMPLEMENTED, REJECTED, IN TRANSITION, and STABLE. Responses to GENERAL INQUIRY and SPECIFIC INQUIRY commands include IMPLEMENTED and NOT IMPLEMENTED. Responses to a NOTIFY command include NOT IMPLEMENTED, REJECTED, INTERIM, and CHANGED.
The subunit type field is used to identify the function of the unit; for example, a tape recorder/player or a tuner is assigned. To distinguish among a plurality of subunits of the same type, the subunit ID field (located after the subunit type field) is used as a number for addressing. The opcode field indicates a command, and the operand field indicates command parameters. The additional operands field is a field in which additional operands are placed. The padding field is a field in which dummy data is placed so that the packet length equals a predetermined number of bits. The data CRC (cyclic redundancy check) field holds a CRC used for checking for errors during data transfer.
Next, Figs. 20A to 20C show specific examples of AV/C commands.
Fig. 20A shows specific examples of ctype/response values. The upper part of the figure shows commands, and the lower part shows responses. CONTROL is assigned "0000", STATUS "0001", SPECIFIC INQUIRY "0010", NOTIFY "0011", and GENERAL INQUIRY "0100"; "0101" to "0111" are reserved for future specification. NOT IMPLEMENTED is assigned "1000", ACCEPTED "1001", REJECTED "1010", IN TRANSITION "1011", IMPLEMENTED/STABLE "1100", CHANGED "1101", and INTERIM "1111"; "1110" is reserved for future specification.
Fig. 20B shows specific examples of subunit type values. VIDEO MONITOR is assigned "00000", DISC RECORDER/PLAYER "00011", TAPE RECORDER/PLAYER "00100", TUNER "00101", VIDEO CAMERA "00111", VENDOR UNIQUE "11100", and SUBUNIT TYPE EXTENDED TO NEXT BYTE "11110". UNIT is assigned "11111"; this is used when a command, such as switching the power on or off, is sent to the unit itself.
Fig. 20C shows specific examples of opcode values. An opcode table exists for each subunit type, and the figure shows the opcodes used when the subunit type is TAPE RECORDER/PLAYER. Operands are defined for each opcode. VENDOR-DEPENDENT is assigned "00h", SEARCH MODE "50h", TIME CODE "51h", ATN "52h", OPEN MIC "60h", READ MIC "61h", WRITE MIC "62h", LOAD MEDIUM "C1h", RECORD "C2h", PLAY "C3h", and REWIND "C4h".
Figs. 21A and 21B show a specific example of an AV/C command and an AV/C response.
For example, when the controller (the controlling side) instructs a playback unit serving as the target (the controlled side) to play, the controller sends the command shown in Fig. 21A to the target. Since this command uses the AV/C command set, CTS is set to "0000". Since a command for controlling the unit from the outside (CONTROL) is used, ctype is set to "0000" (Fig. 20A). Since the subunit is a tape recorder/player, the subunit type field is set to "00100" (Fig. 20B). The subunit ID field is set to "000", indicating ID 0. The opcode field is set to "C3h", indicating playback (Fig. 20C). The operand field is set to "75h", indicating fast forward. When playback starts, the target returns the response shown in Fig. 21B to the controller. ACCEPTED, indicating acceptance, is placed in the response field, so the response field is set to "1001" (see Fig. 20A). Since the fields other than the response field have the same values as in Fig. 21A, a description thereof is omitted.
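A sketch of assembling the FCP data block for this PLAY command follows. It packs only the fields discussed above (CTS/ctype, subunit type and ID, opcode, operand) and leaves out the IEEE-1394 packet header, which the link layer would add:
CTYPE_CONTROL = 0x0
SUBUNIT_TAPE_RECORDER_PLAYER = 0b00100
OPCODE_PLAY = 0xC3
OPERAND_FORWARD = 0x75

def avc_frame(ctype, subunit_type, subunit_id, opcode, operands):
    """Build the AV/C frame bytes: CTS (0000 for AV/C) and ctype, then
    subunit_type/subunit_id, then opcode and operands."""
    byte0 = (0x0 << 4) | (ctype & 0x0F)
    byte1 = ((subunit_type & 0x1F) << 3) | (subunit_id & 0x07)
    return bytes([byte0, byte1, opcode]) + bytes(operands)

# The PLAY example of Fig. 21A: CONTROL a tape recorder/player, subunit ID 0.
frame = avc_frame(CTYPE_CONTROL, SUBUNIT_TAPE_RECORDER_PLAYER, 0, OPCODE_PLAY, [OPERAND_FORWARD])
assert frame == bytes([0x00, 0x20, 0xC3, 0x75])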
In the scalable television system, various types of control are performed between the master unit 1 and the auxiliary units 2 by using the AV/C command set described above. Among the control performed between the master unit 1 and the auxiliary units 2 in the present embodiment, new commands and responses are defined for control that cannot be handled by the existing commands and responses, and those new commands and responses are used to perform the various types of control.
For details of the IEEE-1394 communication and the AV/C command set described above, see "WHITE SERIES No. 181, IEEE-1394 MULTIMEDIA INTERFACE," published by Triceps.
As described with reference to Fig. 10, the IR interface 135 of the master unit 1 can both receive and emit infrared rays, and the controller 15 of the master unit 1, corresponding to the IR interface 135 that can both transmit and receive infrared rays, can likewise both emit and receive infrared rays.
Fig. 22 shows an example electrical structure of the controller 15.
The operation section 161 includes the various push-button switches provided on the controller 15, described with reference to Fig. 7 or Fig. 9, and sends operation signals corresponding to the push-button switches to the control section 162.
The control section 162 receives operation signals from the operation section 161 and sends command codes indicating the processing requested by the operation signals to the frame generating section 163. The control section 162 also performs various types of processing according to the output of the reception processing section 167. In addition, the control section 162 stores a device code in the device code storage section 168.
The frame generating section 163 generates frame-structured data (frame data) in which the command code sent from the control section 162 and the device code stored in the device code storage section 168 are placed, and sends this data to the transmission processing section 164.
The transmission processing section 164 modulates a carrier of a predetermined frequency according to the frame data sent from the frame generating section 163, and drives the light emitting section 165 according to the resulting modulated signal.
The light emitting section 165 is formed of an LED, for example, and emits infrared rays when driven by the transmission processing section 164. The infrared rays emitted by the light emitting section 165 are received, for example, by the IR interface 135 (Fig. 10).
The light receiving section 166 receives infrared rays, converts them into an electrical signal, and sends this signal to the reception processing section 167. The light receiving section 166 receives, for example, infrared rays emitted by the IR interface 135.
The reception processing section 167 demodulates the output of the light receiving section 166 and sends the resulting frame data to the control section 162.
Next, Fig. 23 shows the frame format of the frame data generated by the frame generating section 163.
The frame data is formed of a frame header (leader) located at the beginning, followed by two data parts #1 and #2.
The frame header holds data formed of a predetermined bit string indicating the beginning of the frame.
Data part #1 contains a device code and a command code.
The device code is a code assigned to a device that transmits and receives frame data. When a device receives frame data, if the device code placed in the frame data matches the device code assigned to that device, the device regards the frame data as addressed to itself and executes the processing corresponding to the command code placed in the frame data.
More specifically, in the controller 15 shown in Fig. 22, when the control section 162 receives frame data from the reception processing section 167, the control section 162 compares the device code placed in the frame data with the device code stored in the device code storage section 168, and only when they match does the control section 162 execute the processing corresponding to the command code placed in the frame data.
When the device code in the frame data sent from the reception processing section 167 does not match the device code stored in the device code storage section 168, the control section 162 ignores (discards) the frame data. In this case, therefore, the control section 162 performs no processing.
Data part #2 holds the same data as data part #1.
In the controller 15 shown in Fig. 22, when the control section 162 receives frame data from the reception processing section 167, the control section 162 compares data parts #1 and #2 in the frame data, and only when they match does the control section 162 compare the device codes as described above. Therefore, if data parts #1 and #2 in the frame data do not match, the control section 162 performs no processing.
Performing no processing when data parts #1 and #2 in the frame data do not match, as described above, prevents the processing of frame data that has not been received correctly (erroneous frame data).
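A minimal sketch of this frame format and the receive-side checks follows. The header bit string and the one-byte widths of the device code and command code are assumptions; the embodiment states only that the header is a predetermined bit string and that data part #2 duplicates data part #1:
FRAME_HEADER = b"\xAA"          # assumed predetermined header bit string

def build_frame(device_code, command_code):
    """Frame = header + data part #1 + data part #2, with #2 duplicating #1."""
    data_part = bytes([device_code & 0xFF, command_code & 0xFF])
    return FRAME_HEADER + data_part + data_part

def accept_frame(frame, own_device_code):
    """Return the command code if the frame passes both checks described in the text
    (duplicate parts match, device code matches); otherwise return None."""
    if len(frame) != 5 or frame[:1] != FRAME_HEADER:
        return None
    part1, part2 = frame[1:3], frame[3:5]
    if part1 != part2:                        # not received correctly -> ignore
        return None
    device_code, command_code = part1
    if device_code != own_device_code:        # addressed to another device -> ignore
        return None
    return command_code

# A device with code 3 accepts a frame carrying command 10h addressed to code 3...
assert accept_frame(build_frame(3, 0x10), own_device_code=3) == 0x10
# ...and ignores a frame addressed to device code 5.
assert accept_frame(build_frame(5, 0x10), own_device_code=3) is None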
As described above, the IR interface 155 (Fig. 11) of the auxiliary unit 2 is constructed in the same way as the IR interface 135 of the master unit 1 shown in Fig. 10, and can therefore both transmit and receive infrared rays. The controller 35 of the auxiliary unit 2, corresponding to the IR interface 155 that can transmit and receive infrared rays, can likewise both emit and receive infrared rays.
Fig. 24 shows an example electrical structure of the controller 35.
Since the controller 35 is formed of an operation section 171 through a device code storage section 178 configured in the same way as the operation section 161 through the device code storage section 168 shown in Fig. 22, a description thereof is omitted.
Next, Fig. 25 shows an example detailed structure of the IR interface 135 (Fig. 10) of the master unit 1.
The control section 182 receives a command from the CPU 129 (Fig. 10) and sends a command code corresponding to that command to the frame generating section 183. The control section 182 also receives frame data from the reception processing section 187 and determines whether data parts #1 and #2 (Fig. 23) of the frame data match. When data parts #1 and #2 of the frame data match, the control section 182 compares the device code of the frame data (Fig. 23) with the device code stored in the device code storage section 188, and if they match, the control section 182 sends the command corresponding to the command code of the frame data (Fig. 23) to the CPU 129.
In addition, the control section 182 stores a device code in the device code storage section 188.
The frame generating section 183 generates frame data in which the command code sent from the control section 182 and the device code stored in the device code storage section 188 are placed, and sends this data to the transmission processing section 184.
The transmission processing section 184 modulates a carrier of a predetermined frequency according to the frame data sent from the frame generating section 183, and drives the light emitting section 185 according to the resulting modulated signal.
The light emitting section 185 is formed of an LED, for example, and emits infrared rays when driven by the transmission processing section 184. The infrared rays emitted by the light emitting section 185 are received, for example, by the light receiving section 166 (Fig. 22) of the controller 15.
The light receiving section 186 receives infrared rays, converts them into an electrical signal, and sends this signal to the reception processing section 187. The light receiving section 186 receives, for example, infrared rays emitted by the light emitting section 165 (Fig. 22) of the controller 15.
The reception processing section 187 demodulates the output of the light receiving section 186 and sends the resulting frame data to the control section 182.
The IR interface 155 of the auxiliary unit 2 is also constructed in the same way as the IR interface 135 of the master unit 1 shown in Fig. 25.
Next, the processing of the controller 15 of the master unit 1 shown in Fig. 22 (controller processing) will be described with reference to the flowchart shown in Fig. 26. The controller 35 of the auxiliary unit 2 shown in Fig. 24 performs the same processing.
In step S1, the control section 162 determines whether it has received from the operation section 161 an operation signal serving as a command, generated when the user operates the operation section 161.
When it is determined in step S1 that no operation signal serving as a command has been received, in other words, when the controller 15 has not been operated, the processing proceeds to step S2, and the control section 162 determines whether it has received frame data.
When it is determined in step S2 that no frame data has been received, the processing returns to step S1, and the same processing is repeated thereafter.
When it is determined in step S2 that frame data has been received, in other words, when the light receiving section 166 has received infrared rays and the reception processing section 167 has sent frame data corresponding to the infrared rays to the control section 162, the processing proceeds to step S3, and the control section 162 determines whether the device code of the frame data matches the device code stored in the device code storage section 168.
When it is determined in step S3 that the device code of the frame data does not match the device code stored in the device code storage section 168, the processing returns to step S1, and the same processing is repeated thereafter.
When it is determined in step S3 that the device code of the frame data matches the device code stored in the device code storage section 168, the processing proceeds to step S4, where the control section 162 executes the processing corresponding to the command code in the frame data, and the processing returns to step S1.
When it is determined in step S1 that an operation signal serving as a command has been received, in other words, when the user has operated the operation section 161 and an operation signal corresponding to that operation has been sent to the control section 162, the processing proceeds to step S5, and the control section 162 determines whether the signal requests that a device code be set.
The device code storage section 168 stores a default device code, which the user can change. More specifically, the device code of the controller 15 can be specified by a predetermined operation, for example, by operating the menu button switch 54 and the TV power button switch 72 (Fig. 7) of the controller 15 at the same time. In step S5, whether the signal requests that a device code be set is determined according to whether the menu button switch 54 and the TV power button switch 72 have been operated at the same time.
When it is determined in step S5 that the operation signal sent from the operation section 161 does not request that a device code be set, the control section 162 sends a command code corresponding to the operation signal to the frame generating section 163, and the processing proceeds to step S6.
In step S6, the frame generating section 163 places the command code sent from the control section 162 and the device code stored in the device code storage section 168 to generate frame data having the format shown in Fig. 23, and sends this data to the transmission processing section 164. The processing proceeds to step S7.
In step S7, the transmission processing section 164 drives the light emitting section 165 according to the frame data sent from the frame generating section 163, and the processing returns to step S1. The light emitting section 165 emits infrared rays corresponding to the frame data.
When it is determined in step S5 that the operation signal sent from the operation section 161 requests that a device code be set, in other words, when the user has operated the menu button switch 54 and the TV power button switch 72 (Fig. 7) of the controller 15 at the same time, the processing proceeds to step S8. The control section 162 waits for an operation signal corresponding to a device code to be sent from the operation section 161, receives that operation signal, and sets (overwrites) the device code corresponding to the operation signal in the device code storage section 168. The processing then returns to step S1, and the same processing is repeated thereafter.
The device code can be, for example, a numeric value with a predetermined number of digits. In this case, the user enters the device code by operating, for example, the numeric button switches 58 (Fig. 7) of the controller 15.
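The flow of steps S1 to S8 can be summarized in the following event-loop sketch. The event shapes and helper names are assumptions made for this sketch, and build_frame and accept_frame refer to the frame-format sketch given earlier:
def controller_loop(events, store, send_ir, execute):
    """Mirror of steps S1-S8 in Fig. 26. Each event is a tuple whose shape is assumed:
    ("ir_frame", frame_bytes), ("key", command_code), or ("key", "set_device_code", digits)."""
    for event in events:
        if event[0] == "ir_frame":                                    # S2: frame data received
            command = accept_frame(event[1], store["device_code"])    # S3: device code check
            if command is not None:
                execute(command)                                      # S4
        elif event[0] == "key":                                       # S1: operation signal
            if event[1] == "set_device_code":                         # S5: menu + TV power together
                store["device_code"] = event[2]                       # S8: digits entered next
            else:
                send_ir(build_frame(store["device_code"], event[1]))  # S6-S7: emit IR frame

# Example: a key press is transmitted as an IR frame, then the device code is changed to 7.
store = {"device_code": 3}
sent = []
controller_loop([("key", 0x10), ("key", "set_device_code", 7)],
                store, sent.append, execute=lambda c: None)
assert sent == [build_frame(3, 0x10)] and store["device_code"] == 7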
Next, the processing of the IR interface 135 of the master unit 1 shown in Fig. 25 (IR interface processing) will be described with reference to the flowchart shown in Fig. 27. The IR interface 155 (Fig. 11) of the auxiliary unit 2 performs the same processing.
In step S21, the control section 182 determines whether it has received a command from the CPU 129.
When it is determined in step S21 that no command has been received, the processing proceeds to step S22, and the control section 182 determines whether it has received frame data from the reception processing section 187.
When it is determined in step S22 that no frame data has been received, the processing returns to step S21, and the same processing is repeated thereafter.
When it is determined in step S22 that frame data has been received, in other words, when the light receiving section 186 has received infrared rays sent from, for example, the controller 15, and the reception processing section 187 has sent frame data corresponding to the infrared rays to the control section 182, the processing proceeds to step S23, and the control section 182 determines whether the device code of the frame data matches the device code stored in the device code storage section 188.
When it is determined in step S23 that the device code of the frame data does not match the device code stored in the device code storage section 188, the processing returns to step S21, and the same processing is repeated thereafter.
When it is determined in step S23 that the device code of the frame data matches the device code stored in the device code storage section 188, the processing proceeds to step S24, where the control section 182 sends the command corresponding to the command code in the frame data to the CPU 129, and the processing returns to step S21.
In this case, the CPU 129 therefore executes the processing corresponding to the command sent from the IR interface 135 (the control section 182).
When it is determined in step S21 that a command has been received from the CPU 129, the processing proceeds to step S25, and the control section 182 determines whether the command requests that a device code be set.
The device code storage section 188 stores a default device code in the same way as the device code storage section 168 shown in Fig. 22, and the user can change this device code. More specifically, the device code of the IR interface 135 can be specified by a predetermined operation, for example, by the user operating at the same time the two push-button switches on the front panel 134 (Fig. 10) that correspond to the menu button switch 54 and the TV power button switch 72 (Fig. 7) of the controller 15. In step S25, whether the command sent from the CPU 129 requests that a device code be set is determined according to whether these two push-button switches have been operated at the same time.
When it is determined in step S25 that the command sent from the CPU 129 does not request that a device code be set, the control section 182 sends a command code corresponding to that command to the frame generating section 183, and the processing proceeds to step S26.
In step S26, the frame generating section 183 places the command code sent from the control section 182 and the device code stored in the device code storage section 188 to generate frame data having the format shown in Fig. 23, and sends this data to the transmission processing section 184. The processing proceeds to step S27.
In step S27, the transmission processing section 184 drives the light emitting section 185 according to the frame data sent from the frame generating section 183, and the processing returns to step S21. The light emitting section 185 emits infrared rays corresponding to the frame data. These infrared rays are received, for example, by the controller 15.
When it is determined in step S25 that the command sent from the CPU 129 requests that a device code be set, in other words, when the user has operated at the same time the two push-button switches on the front panel 134 (Fig. 10) corresponding to the menu button switch 54 and the TV power button switch 72 (Fig. 7) of the controller 15, the processing proceeds to step S28. The control section 182 waits for a device code to be sent from the CPU 129, receives that device code, and sets (overwrites) it in the device code storage section 188. The processing then returns to step S21, and the same processing is repeated thereafter.
In the master unit 1, the user can enter a device code by operating the push-button switches on the front panel 134 (Fig. 10) corresponding to the numeric button switches 58 of the controller 15.
As described above, the user can set device codes for the controllers 15 and 35, the IR interface 135 of the master unit 1, and the IR interface 155 of the auxiliary unit 2. Command codes are exchanged only between devices that have the same device code.
Therefore, for example, to control the master unit 1 with the controller 15, the device code of the controller 15 and the device code of the IR interface 135 of the master unit 1 must be set to the same value. Similarly, for example, to control the auxiliary unit 2 with the controller 15, the device code of the controller 15 and the device code of the IR interface 155 of the auxiliary unit 2 must be set to the same value. Furthermore, for example, when the device codes of the controller 15, the IR interface 135 of the master unit 1, and the IR interface 155 of the auxiliary unit 2 are made equal, operating the controller 15 causes the same processing to be performed in both the master unit 1 and the auxiliary unit 2.
For example, even when the user has only the controller 15, if different device codes are set in the master unit 1 and the auxiliary units 2_ij serving as the television receivers that form the scalable television system, the user can remotely control the master unit 1 and each auxiliary unit 2_ij individually with the single controller 15 by setting the device code of the controller 15 to the device code of the desired television receiver.
Next, the processing of the master unit 1 shown in Fig. 10 will be described with reference to the flowchart shown in Fig. 28.
First, in step S41, the CPU 129 determines whether an event has occurred, such as a unit being connected to the terminal panel 21 or a command being sent through the IEEE-1394 interface 133, the IR interface 135, or the like. When it is determined that no event has occurred, the processing returns to step S41.
When it is determined in step S41 that the event of a unit being connected to the terminal panel 21 has occurred, the processing proceeds to step S42, where the CPU 129 performs the authentication processing shown in Fig. 29, described later. The processing then returns to step S41.
To determine whether a unit has been connected to the terminal panel 21, the state in which a unit is connected to the terminal panel 21 must be detected. This detection is performed, for example, in the following manner.
When a unit is connected (by an IEEE-1394 cable) to an IEEE-1394 terminal 21_ij provided on the terminal panel 21 (Fig. 3F), the terminal voltage of the IEEE-1394 terminal 21_ij changes. The IEEE-1394 interface 133 is configured to report this change in terminal voltage to the CPU 129. The CPU 129 receives the report of the change in terminal voltage from the IEEE-1394 interface 133 and thereby detects that a new unit has been connected to the terminal panel 21. The CPU 129 also recognizes, for example in the same manner, that a unit has been disconnected from the terminal panel 21.
When it is determined in step S41 that the event of a command being sent through the IEEE-1394 interface 133, the IR interface 135, or the like has occurred, the processing proceeds to step S43, where the master unit 1 executes the processing corresponding to that command. The processing then returns to step S41.
Next, the authentication processing performed by the master unit 1 in step S42 shown in Fig. 28 will be described with reference to the flowchart shown in Fig. 29.
In the authentication processing performed by the master unit 1, two kinds of authentication are carried out: whether the unit newly connected to the terminal panel 21 (referred to as the connected unit where necessary) is an authorized IEEE-1394 unit, and whether that IEEE-1394 unit is a television receiver serving as a master unit or an auxiliary unit (a unit usable in the scalable system).
More specifically, in the authentication processing performed by the master unit 1, first, in step S51, the CPU 129 controls the IEEE-1394 interface 133 to send an authentication request command requesting mutual authentication to the connected unit, and the processing proceeds to step S52.
In step S52, the CPU 129 determines whether a response to the authentication request command has been returned from the connected unit. When it is determined in step S52 that no response to the authentication request command has yet been returned from the connected unit, the processing proceeds to step S53, and the CPU 129 determines whether a timeout has occurred, that is, whether a predetermined time has elapsed since the authentication request command was sent.
When it is determined in step S53 that a timeout has occurred, in other words, when no response to the authentication request command has been returned from the connected unit even though the predetermined time has elapsed since the authentication request command was sent to the connected unit, the processing proceeds to step S54. The CPU 129 determines that the connected unit is not an authorized IEEE-1394 unit and that authentication has failed, and sets the operation mode to a stand-alone mode in which no data is exchanged with the connected unit. The processing then returns.
Thereafter, the master unit 1 does not exchange any data with the connected unit, which is not an authorized IEEE-1394 unit, and does not perform IEEE-1394 communication with it.
When it is determined in step S53 that no timeout has occurred yet, the processing returns to step S52, and the same processing is repeated thereafter.
When it is determined in step S52 that a response to the authentication request command has been returned from the connected unit, in other words, when the response from the connected unit has been received by the IEEE-1394 interface 133 and sent to the CPU 129, the processing proceeds to step S55, where the CPU 129 generates a random number (pseudorandom number) R1 according to a predetermined algorithm and sends it to the connected unit through the IEEE-1394 interface 133.
The processing then proceeds to step S56, and the CPU 129 determines whether an encrypted random number E'(R1) has been sent from the connected unit, the encrypted random number E'(R1) being obtained by encrypting the random number R1 sent in step S55 according to a predetermined algorithm (a secret-key encryption method such as DES (Data Encryption Standard), FEAL (Fast Data Encipherment Algorithm), or RC5).
When it is determined in step S56 that the encrypted random number E'(R1) has not yet been sent, the processing proceeds to step S57, and the CPU 129 determines whether a timeout has occurred, that is, whether a predetermined time has elapsed since the random number R1 was sent.
When it is determined in step S57 that a timeout has occurred, in other words, when the encrypted random number E'(R1) has not been returned from the connected unit even though the predetermined time has elapsed since the random number R1 was sent to the connected unit, the processing proceeds to step S54. The CPU 129 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, sets the operation mode to the stand-alone mode, and the processing returns.
When it is determined in step S57 that no timeout has occurred yet, the processing returns to step S56, and the same processing is repeated thereafter.
When it is determined in step S56 that the encrypted random number E'(R1) has been returned from the connected unit, in other words, when the encrypted random number E'(R1) from the connected unit has been received by the IEEE-1394 interface 133 and sent to the CPU 129, the processing proceeds to step S58, where the CPU 129 encrypts the random number R1 generated in step S55 according to a predetermined encryption algorithm to generate an encrypted random number E(R1). The processing proceeds to step S59.
In step S59, the CPU 129 determines whether the encrypted random number E'(R1) sent from the connected unit equals the encrypted random number E(R1) generated by the CPU 129 in step S58.
When it is determined in step S59 that the encrypted random number E'(R1) does not equal the encrypted random number E(R1), in other words, when the encryption algorithm adopted by the connected unit (including, where necessary, the key used for encryption) differs from the encryption algorithm adopted by the CPU 129, the processing proceeds to step S54, where the CPU 129 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, sets the operation mode to the stand-alone mode, and the processing returns.
When it is determined in step S59 that the encrypted random number E'(R1) equals the encrypted random number E(R1), in other words, when the encryption algorithm adopted by the connected unit is the same as the encryption algorithm adopted by the CPU 129, the processing proceeds to step S60, and the CPU 129 determines whether a random number R2, used by the connected unit to authenticate the master unit 1, has been sent from the connected unit.
When it is determined in step S60 that the random number R2 has not yet been received, the processing proceeds to step S61, and the CPU 129 determines whether a timeout has occurred, that is, for example, whether a predetermined time has elapsed after it was determined in step S59 that the encrypted random number E'(R1) equals the encrypted random number E(R1).
When it is determined in step S61 that a timeout has occurred, in other words, when the random number R2 has not been sent from the connected unit even though the predetermined time has elapsed, the processing proceeds to step S54, where the CPU 129 determines, as described above, that the connected unit is not an authorized IEEE-1394 unit, sets the operation mode to the stand-alone mode, and the processing returns.
On the other hand, when it is determined in step S61 that no timeout has occurred yet, the processing returns to step S60, and the same processing is repeated thereafter.
When it is determined in step S60 that the random number R2 has been sent from the connected unit, in other words, when the random number R2 from the connected unit has been received by the IEEE-1394 interface 133 and sent to the CPU 129, the processing proceeds to step S62, where the CPU 129 encrypts the random number R2 according to a predetermined encryption algorithm to generate an encrypted random number E(R2), and sends it to the connected unit through the IEEE-1394 interface 133.
Note that when the random number R2 has been sent from the connected unit in step S60, the connected unit has already been successfully authenticated as an authorized IEEE-1394 unit.
Then, handle and enter step S63, and CPU 129 control IEEE-1394 interfaces 133 will be will be used to ask the unit ID of linkage unit and the function-information request command of function information and the unit ID and the function information of master unit 1 to send to linkage unit.
Unit ID is unique ID that sign is taken on the television receiver of master unit 1 or auxilliary unit 2.
Function information is the information relevant with having function, and the type that for example comprises the order that will receive from the external world (for example, to receive which of the order be used for controlling mains switch, volume adjusting, channel, brightness and acutance from the external world), whether allow screen to show (OSD demonstration), whether allow mute state, and whether allow sleep state.In addition, function information comprises also whether the unit has the function of master unit or the function of auxilliary unit.
Master unit 1 can for example be stored in unit ID and function information in vendor_dependent_information (manufacturer's relevant information) field of EEPROM 130 or configuration ROM shown in Figure 15.
Processing then proceeds to step S64, where the CPU 129 waits for the unit ID and the function information of the connected unit to arrive in response to the function-information request command sent at step S63. The CPU 129 receives the unit ID and the function information through the IEEE-1394 interface 133 and stores them in the EEPROM 130, and processing proceeds to step S65.
At step S65, the CPU 129 refers to the function information stored in the EEPROM 130 to judge whether the connected unit is a subsidiary unit. When it is judged at step S65 that the connected unit is a subsidiary unit, in other words, when the connected unit has successfully been authenticated as a subsidiary unit, processing skips steps S66 and S67 and proceeds to step S68. The CPU 129 sets the operation mode to a multi-viewpoint-display-possible mode, in which the virtual multi-viewpoint display function described later can be provided together with the connected unit serving as a subsidiary unit. Processing then returns.
On the other hand, when it is judged at step S65 that the connected unit is not a subsidiary unit, processing proceeds to step S66, where the CPU 129 refers to the function information stored in the EEPROM 130 to judge whether the connected unit is a master unit. When it is judged at step S66 that the connected unit is a master unit, in other words, when the connected unit has successfully been authenticated as a master unit, processing proceeds to step S67, where the CPU 129 performs master/subsidiary arbitration processing with the connected unit serving as a master unit.
Specifically, in this case, since the master unit 1 is connected to another master unit, two television receivers serving as master units exist among the television receivers forming the scalable television system. In the present embodiment, only one master unit is allowed in the scalable television system. Therefore, at step S67, master/subsidiary arbitration processing is performed to decide whether the master unit 1 or the connected unit serving as a master unit is to be used as the television receiver serving as the master unit.
More specifically, for example, the master unit that first became part of the scalable television system, that is, the master unit 1 in the present embodiment, is decided to be the television receiver serving as the master unit, and the master units other than the master unit 1 decided to be the master unit are used as subsidiary units.
After the master/subsidiary arbitration processing is performed at step S67, processing proceeds to step S68, where the CPU 129 sets the operation mode to the multi-viewpoint-display-possible mode as described above, and processing returns.
When it is judged at step S66 that the connected unit is not a master unit, in other words, when the connected unit is neither a master unit nor a subsidiary unit and therefore has not been authenticated as a master unit or a subsidiary unit, processing proceeds to step S69, where the CPU 129 sets the operation mode to a normal-command-transfer mode, in which the existing AV/C command set can be exchanged with the connected unit but the control commands used to provide the multi-viewpoint display function cannot be exchanged, and processing returns.
In other words, in this case, since the connected unit is neither a master unit nor a subsidiary unit, the multi-viewpoint display function is not provided even though the connected unit is connected to the master unit 1. However, since the connected unit is an authorized IEEE-1394 unit, the existing AV/C command set can be exchanged between the master unit 1 and the connected unit. Thus, in this case, either one of the master unit 1 and the connected unit (or another IEEE-1394 unit connected to the master unit 1) can control the other by the existing AV/C command set.
Next, the processing of the subsidiary unit 2 shown in Figure 11 is described with reference to the flowchart shown in Figure 30.
First, at step S71, the CPU 149 judges whether an event has occurred in which some unit is connected to the terminal panel 41 or some command is sent through the IEEE-1394 interface 153 or the IR interface 155. When it is judged that no event has occurred, processing returns to step S71.
When it is judged at step S71 that an event has occurred in which a unit is connected to the terminal panel 41, processing proceeds to step S72, where the CPU 149 performs the authentication processing shown in Figure 31, described later. Processing then returns to step S71.
To judge whether a unit is connected to the terminal panel 41, the state in which the unit is connected to the terminal panel 41 needs to be detected. This detection is performed, for example, in the same manner as described for step S41 shown in Figure 28.
When it is judged at step S71 that an event has occurred in which some command is sent through the IEEE-1394 interface 153 or the IR interface 155, processing proceeds to step S73, where the subsidiary unit 2 performs the processing corresponding to the command. Processing then returns to step S71.
Next, the authentication processing performed by the subsidiary unit 2 at step S72 shown in Figure 30 is described with reference to the flowchart shown in Figure 31.
In the authentication processing performed by the subsidiary unit 2, two kinds of authentication are performed: whether the unit newly connected to the terminal panel 41 (hereinafter called the connected unit where appropriate) is an authorized IEEE-1394 unit, and whether that IEEE-1394 unit is a master unit.
More specifically, in the authentication processing performed by the subsidiary unit 2, first, at step S81, the CPU 149 judges whether an authentication request command requesting mutual authentication has been sent from the connected unit. When it is judged that the command has not yet been sent, processing proceeds to step S82.
At step S82, the CPU 149 judges whether a time-out has occurred, that is, whether a predetermined time has elapsed since the authentication processing was started.
When it is judged at step S82 that a time-out has occurred, in other words, when the connected unit has not sent the authentication request command even though the predetermined time has elapsed since the authentication processing was started, processing proceeds to step S83. The CPU 149 judges that the connected unit is not an authorized IEEE-1394 unit and that the authentication has failed, and sets the operation mode to a stand-alone mode in which no data is exchanged with the connected unit. Processing then returns.
Therefore, in the same manner as the master unit 1, the subsidiary unit 2 does not exchange any data with a connected unit that is not an authorized IEEE-1394 unit, and does not perform IEEE-1394 communication with it.
When it is judged at step S82 that a time-out has not yet occurred, processing returns to step S81 and the same process is repeated.
When it is judged at step S81 that the authentication request command has been sent from the connected unit, in other words, when the authentication request command sent at step S51 shown in Figure 29 by the master unit 1 serving as the connected unit has been received by the IEEE-1394 interface 153 and passed to the CPU 149, processing proceeds to step S84, where the CPU 149 controls the IEEE-1394 interface 153 to send a response to the authentication request command to the connected unit.
In the present embodiment, the master unit 1 performs the processing of steps S51 to S53 shown in Figure 29, and the subsidiary unit 2 performs the processing of steps S81, S82, and S84 shown in Figure 31. Alternatively, the subsidiary unit 2 may perform the processing of steps S51 to S53 shown in Figure 29, and the master unit 1 may perform the processing of steps S81, S82, and S84 shown in Figure 31. In other words, either the master unit 1 or the subsidiary unit 2 may send the authentication request command.
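The exchange described with reference to Figures 29 and 31 amounts to a mutual challenge-response: each side sends the other a random number and checks that the returned ciphertext matches what it computes itself. The following is a minimal sketch of that idea, given for illustration only; the use of a shared key and of HMAC-SHA-256 is merely an assumed stand-in for the unspecified "predetermined encryption algorithm".

import hmac, hashlib, os

SHARED_KEY = b"shared-secret-of-authorized-units"   # assumption: both units hold the same key

def E(key: bytes, r: bytes) -> bytes:
    # Stand-in for the predetermined encryption algorithm of the embodiment.
    return hmac.new(key, r, hashlib.sha256).digest()

def challenge(responder_encrypt) -> bool:
    # One direction of the mutual authentication: send R, compare E'(R) with E(R).
    r = os.urandom(16)                   # random number R1 (or R2)
    e_prime = responder_encrypt(r)       # E'(R) returned by the other unit
    return hmac.compare_digest(e_prime, E(SHARED_KEY, r))

def mutual_auth(master_encrypt, subsidiary_encrypt) -> bool:
    # Authentication succeeds only if both directions succeed, mirroring
    # steps S55 to S62 (master side) and steps S85 to S93 (subsidiary side).
    return challenge(subsidiary_encrypt) and challenge(master_encrypt)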
Processing then proceeds to step S85, where the CPU 149 judges whether a random number R1 has been sent from the connected unit. When it is judged that the random number R1 has not yet been sent, processing proceeds to step S86.
At step S86, the CPU 149 judges whether a time-out has occurred, that is, whether a predetermined time has elapsed since the CPU 149 sent the response to the authentication request command at step S84.
When it is judged at step S86 that a time-out has occurred, in other words, when the connected unit has not sent the random number R1 even though the predetermined time has elapsed since the CPU 149 sent the response to the authentication request command, processing proceeds to step S83. The CPU 149 judges, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the stand-alone mode in which no data is exchanged with the connected unit. Processing then returns.
When it is judged at step S86 that a time-out has not yet occurred, processing returns to step S85 and the same process is repeated.
When it is judged at step S85 that the random number R1 has been sent from the connected unit, in other words, when the random number R1 sent at step S55 shown in Figure 29 by the master unit 1 serving as the connected unit has been received by the IEEE-1394 interface 153 and passed to the CPU 149, processing proceeds to step S87, where the CPU 149 encrypts the random number R1 according to a predetermined encryption algorithm to generate an encrypted random number E'(R1). Further, at step S87, the CPU 149 controls the IEEE-1394 interface 153 to send the encrypted random number E'(R1) to the connected unit. Processing proceeds to step S89.
At step S89, the CPU 149 generates a random number (pseudo-random number) R2 and controls the IEEE-1394 interface 153 to send the random number R2 to the connected unit. Processing proceeds to step S90.
At step S90, the CPU 149 judges whether an encrypted random number E(R2) has been sent from the connected unit, the encrypted random number E(R2) being generated by the master unit 1 serving as the connected unit by encrypting the random number R2 at step S62 shown in Figure 29.
When it is judged that the encrypted random number E(R2) has not yet been received, processing proceeds to step S91, where the CPU 149 judges whether a time-out has occurred, that is, whether a predetermined time has elapsed since the CPU 149 sent the random number R2 at step S89.
When it is judged at step S91 that a time-out has occurred, in other words, when the connected unit has not sent the encrypted random number E(R2) even though the predetermined time has elapsed since the CPU 149 sent the random number R2 to the connected unit, processing proceeds to step S83, where the CPU 149 judges, as described above, that the connected unit is not an authorized IEEE-1394 unit and sets the operation mode to the stand-alone mode. Processing then returns.
On the other hand, when it is judged at step S91 that a time-out has not yet occurred, processing returns to step S90 and the same process is repeated.
When it is judged at step S90 that the encrypted random number E(R2) has been sent from the connected unit, in other words, when the encrypted random number E(R2) from the connected unit has been received by the IEEE-1394 interface 153 and passed to the CPU 149, processing proceeds to step S92, where the CPU 149 encrypts the random number R2 generated at step S89 according to a predetermined encryption algorithm to generate an encrypted random number E'(R2). Processing proceeds to step S93.
At step S93, the CPU 149 judges whether the encrypted random number E(R2) sent from the connected unit is equal to the encrypted random number E'(R2) generated by the subsidiary unit 2 at step S92.
When it is judged at step S93 that the encrypted random number E(R2) is not equal to the encrypted random number E'(R2), in other words, when the encryption algorithm employed by the connected unit (including, if necessary, the key used for encryption) differs from the encryption algorithm employed by the CPU 149, processing proceeds to step S83, where the CPU 149 judges, as described above, that the connected unit is not an authorized IEEE-1394 unit and sets the operation mode to the stand-alone mode. Processing then returns.
When it is judged at step S93 that the encrypted random number E(R2) equals the encrypted random number E'(R2), in other words, when the encryption algorithm employed by the connected unit is identical to that employed by the CPU 149 and the connected unit has therefore successfully been authenticated as an authorized IEEE-1394 unit, processing proceeds to step S94, where the CPU 149 receives, through the IEEE-1394 interface 153, the unit ID and the function information sent together with the function-information request command at step S63 shown in Figure 29 by the master unit 1 serving as the connected unit, and stores them in the EEPROM 150.
Processing then proceeds to step S95, where the CPU 149 controls the IEEE-1394 interface 153 to send the unit ID and the function information of the subsidiary unit 2 to the connected unit in response to the function-information request command received at step S94. Processing then proceeds to step S96.
In the same manner as the master unit 1 described with reference to Figure 29, the subsidiary unit 2 can store the unit ID and the function information in the EEPROM 150 or in the vendor_dependent_information field of the configuration ROM shown in Figure 15.
At step S96, the CPU 149 refers to the function information stored in the EEPROM 150 to judge whether the connected unit is a master unit. When it is judged at step S96 that the connected unit is a master unit, in other words, when the connected unit has successfully been authenticated as a master unit, processing proceeds to step S97, where the CPU 149 sets the operation mode to the multi-viewpoint-display-possible mode, in which the virtual multi-viewpoint display function can be provided together with the connected unit serving as a master unit. Processing then returns.
On the other hand, when it is judged at step S96 that the connected unit is not a master unit, in other words, when the connected unit has not been authenticated as a master unit, processing proceeds to step S98, where the CPU 149 sets the operation mode to the normal-command-transfer mode, in which the existing AV/C command set can be exchanged with the connected unit but the control commands used to provide the multi-viewpoint display function cannot be exchanged, and processing returns.
In other words, in this case, since the connected unit is not a master unit, the multi-viewpoint display function is not provided even though the connected unit is connected to the subsidiary unit 2. Accordingly, simply connecting another subsidiary unit to the subsidiary unit 2 does not provide the multi-viewpoint display function. However, since the connected unit is an authorized IEEE-1394 unit, the existing AV/C command set can be exchanged between the subsidiary unit 2 and the connected unit. Thus, in this case, either one of the subsidiary unit 2 and the connected unit (including another subsidiary unit) can control the other by the existing AV/C command set.
Next, after the master unit 1 and the subsidiary unit 2 have each successfully completed the authentication processing described with reference to Figures 29 and 31 and have set their operation modes to the multi-viewpoint-display-possible mode, when the user operates the remote controller 15 (or the remote controller 35) to request multi-viewpoint display, the master unit 1 and the subsidiary unit 2 perform the virtual multi-viewpoint display processing described later.
The instruction to perform the virtual multi-viewpoint display processing can be issued, for example, from the menu screen.
More specifically, as described above, when the user operates the menu button switch 54 on the remote controller 15 (Figure 7) (or the menu button switch 84 on the remote controller 35 (Figure 8)), the menu screen is displayed on the CRT 11 of the master unit 1 (or the CRT 31 of the subsidiary unit 2). On this menu screen, an icon representing the virtual multi-viewpoint display processing (hereinafter called the virtual multi-viewpoint display icon where appropriate) is displayed. When the user operates the remote controller 15 to click the virtual multi-viewpoint display icon, the virtual multi-viewpoint display processing is performed in the master unit 1 and the subsidiary unit 2.
Figure 32 shows a first example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that performs the virtual multi-viewpoint display processing. When the virtual multi-viewpoint display icon is clicked, the CPU 129 of the master unit 1 controls the signal processing section 127 so that the DSP 127A executes a predetermined program stored in the EEPROM 127B. The functional structure shown in Figure 32 is implemented when the DSP 127A executes the program stored in the EEPROM 127B. The other functional structures of the signal processing section 127 described later are implemented in the same manner.
Frame memories 191, 192, and 193 temporarily store, in units of frames (or in units of fields), the luminance signal Y and the color-difference signals R-Y and B-Y of the image data output from the MPEG video decoder 125 (Figure 10). Specifically, the MPEG video decoder 125 performs MPEG decoding on the TS packets of the image data of the program of the predetermined channel output from the demultiplexer 124, and outputs, as the decoding result, image data formed of the luminance signal Y and the color-difference signals R-Y and B-Y. The frame memories 191, 192, and 193 store the luminance signal Y and the color-difference signals R-Y and B-Y output from the MPEG video decoder 125 in this way.
In the embodiment shown in Figure 32, the frame memories 191 to 193 each have a storage capacity capable of storing at least two frames (two fields) of image data. Specifically, the frame memories 191 to 193 each have two banks, each of which can store one frame of image data, and image data is stored alternately in the two banks.
Therefore, when the most recent frame stored in the frame memory 191 is called the current frame, the frame memory 191 always stores the image data of the current frame and of the frame immediately preceding it (hereinafter called the preceding frame where appropriate). The frame memories 192 and 193 store image data in the same manner.
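As a rough illustration of this double-bank arrangement (the class and method names below are assumptions made for the sketch, not part of the specification), alternating writes keep the current frame and the preceding frame simultaneously available:

import numpy as np

class DoubleBankFrameMemory:
    # Two banks of one frame each; writes alternate so the preceding frame stays readable.
    def __init__(self, height: int, width: int):
        self.banks = [np.zeros((height, width), dtype=np.uint8) for _ in range(2)]
        self.current = 0          # index of the bank holding the current frame

    def store(self, frame: np.ndarray) -> None:
        self.current ^= 1         # switch banks, overwriting the older of the two stored frames
        self.banks[self.current][:] = frame

    def current_frame(self) -> np.ndarray:
        return self.banks[self.current]

    def preceding_frame(self) -> np.ndarray:
        return self.banks[self.current ^ 1]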
Frame memories 194, 195, and 196 store the luminance signal Y and the color-difference signals R-Y and B-Y of one frame (or one field) of image data stored in the frame memories 191, 192, and 193 and sent from the memory control section 197, respectively.
The memory control section 197 is controlled by the system controller 201, sends the image data (the luminance signal Y and the color-difference signals R-Y and B-Y) of the current frame stored in the frame memories 191 to 193 to the frame memories 194 to 196, respectively, and stores the data there in an overwriting manner.
The difference detecting section 198 obtains the difference between the luminance signals Y of the image data of the current frame and of the preceding frame stored in the frame memory 191 as a feature of the current-frame image data, and sends the difference to the system controller 201. Specifically, for example, the difference detecting section 198 obtains the sum of the absolute differences between the luminance signals of the pixels of the current frame and the corresponding pixels of the preceding frame as the feature of the current-frame image, and sends the sum to the system controller 201.
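A minimal sketch of this feature computation is given below, assuming the luminance planes are available as 8-bit NumPy arrays of equal size; the function name is chosen here only for illustration.

import numpy as np

def sum_of_absolute_differences(current_y: np.ndarray, preceding_y: np.ndarray) -> int:
    # Feature used by the difference detecting section 198: the sum over all pixels of
    # |Y_current - Y_preceding|. A value near zero suggests a still frame; a large
    # value suggests a scene change.
    return int(np.abs(current_y.astype(np.int32) - preceding_y.astype(np.int32)).sum())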
The counter section 199 counts up to a predetermined value under the control of the system controller 201 and sends the count to the system controller 201. The counter section 199 also resets its count according to the control of the system controller 201.
The output control section 200 reads, under the control of the system controller 201, the luminance signal Y and the color-difference signals R-Y and B-Y of the frame of image data stored in the frame memories 194 to 196, and sends them to the CPU 129.
The system controller 201 controls the memory control section 197, the counter section 199, and the output control section 200.
More specifically, the system controller 201 compares the sum of absolute differences sent from the difference detecting section 198 as the feature of the current-frame image data with a predetermined threshold, and controls the counter section 199 according to the comparison result. The system controller 201 also controls the memory control section 197 and the output control section 200 according to the count of the counter section 199.
Here, the system controller 201 controls the counter section 199 according to the sum of absolute differences of the luminance signal Y. Alternatively, for example, the difference detecting section 198 may obtain the sum of absolute differences of the color-difference signal R-Y or B-Y, and the system controller 201 may also take the sum of absolute differences of the color-difference signal R-Y or B-Y into consideration when controlling the counter section 199.
The luminance signal Y and the color-difference signals R-Y and B-Y of the image data output from the MPEG video decoder 125 (Figure 10) are sent to the frame memories 191 to 193 and also to the matrix circuit 128 located downstream of the signal processing section 127. The matrix circuit 128 converts the luminance signal Y and the color-difference signals R-Y and B-Y sent in this way into RGB image data.
Figure 32 does not show the audio data output from the MPEG audio decoder 126 (Figure 10) of the master unit 1. The audio data output from the MPEG audio decoder 126 is sent, for example, as it is to the amplifier 137 at the subsequent stage.
Next, the virtual multi-viewpoint display processing in the master unit performed by the signal processing section 127 shown in Figure 32 is described with reference to the flowchart shown in Figure 33.
First, at step S101, the frame memories 191 to 193 wait for the luminance signal Y and the color-difference signals R-Y and B-Y of one frame of image data to arrive from the MPEG video decoder 125 (Figure 10), and store them. Processing proceeds to step S102.
At step S102, the difference detecting section 198 detects (obtains), as the feature of the current-frame image data, the sum of absolute differences between the luminance signal Y of the image data stored in the frame memory 191 at the immediately preceding execution of step S101, that is, the luminance signal Y of the current-frame image data, and the luminance signal Y of the image data stored in the frame memory 191 at the execution of step S101 before that, that is, the luminance signal Y of the preceding-frame image data (hereinafter called the sum of absolute differences of the current frame where appropriate), and sends it to the system controller 201. Processing proceeds to step S103.
At step S103, the system controller 201 judges whether the sum of absolute differences of the current frame is almost zero, that is, whether the sum is equal to or smaller than (or smaller than) a small positive threshold.
When it is judged at step S103 that the sum of absolute differences of the current frame is neither zero nor a value close to zero, processing skips steps S104 to S108 and proceeds to step S109.
When it is judged at step S103 that the sum of absolute differences of the current frame is zero or a value close to zero, in other words, when the current-frame image has little (or no) change from the preceding-frame image and the current-frame image can therefore be regarded as a still image, processing proceeds to step S104, where the system controller 201 controls the counter section 199 to increment the count by 1. Processing proceeds to step S105.
At step S105, the system controller 201 refers to the count of the counter section 199 to judge whether the count is greater than (or equal to or greater than) a predetermined threshold Thc (for example, 5).
When it is judged at step S105 that the count of the counter section 199 is not greater than the threshold Thc, processing skips steps S106 to S108 and proceeds to step S109.
When it is judged at step S105 that the count of the counter section 199 is greater than the threshold Thc, in other words, when the image data of a predetermined number of frames output from the MPEG video decoder 125 contains no motion, processing proceeds to step S106, where the system controller 201 controls the memory control section 197 to send the current-frame image data (the luminance signal Y and the color-difference signals R-Y and B-Y) stored in the frame memories 191 to 193 to the frame memories 194 to 196 and store it there in an overwriting manner. Processing proceeds to step S107.
At step S107, the system controller 201 resets the count of the counter section 199. Processing proceeds to step S108. At step S108, the system controller 201 controls the output control section 200 to read the luminance signal Y and the color-difference signals R-Y and B-Y of the frame of image data stored in the frame memories 194 to 196 and send these signals to the CPU 129. Further, at step S108, the system controller 201 sends to the CPU 129 a display request command instructing a predetermined subsidiary unit 2ij to display the image data. Processing proceeds to step S109.
When the CPU 129 receives the display request command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 to send the frame of image data (the luminance signal Y and the color-difference signals R-Y and B-Y) sent from the output control section 200, together with the display request command instructing the display of the image data, to the subsidiary unit 2ij. When the subsidiary unit 2ij, which is performing the virtual multi-viewpoint display processing, receives the image data and the display request command from the master unit 1, the subsidiary unit displays the image data, as described later.
Therefore, for example, when the above-described threshold Thc is 5, if the image data of six consecutive frames is almost identical, the image data of the sixth frame is sent from the master unit 1 to the subsidiary unit 2ij and displayed.
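The still-scene detection of steps S103 to S108 can be summarized by the following loop, given as a sketch only; the threshold values are illustrative, and the callback for sending a frame to a subsidiary unit is a hypothetical name.

import numpy as np

SAD_THRESHOLD = 1000   # "small positive threshold" of step S103 (illustrative value)
TH_C = 5               # threshold Thc of step S105

def detect_still_scene(frames, send_to_subsidiary):
    # frames: iterable of 8-bit luminance planes; send_to_subsidiary: callback taking a frame.
    count = 0
    preceding = None
    for current in frames:
        if preceding is not None:
            sad = int(np.abs(current.astype(np.int32) - preceding.astype(np.int32)).sum())
            if sad <= SAD_THRESHOLD:              # step S103: almost no change from the preceding frame
                count += 1                        # step S104
                if count > TH_C:                  # step S105: e.g. six nearly identical frames
                    send_to_subsidiary(current)   # steps S106 to S108: forward frame and display request
                    count = 0                     # step S107 (reset only after a frame is forwarded, as in Figure 33)
        preceding = current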
At step S109, the system controller 201 judges whether a termination command instructing the termination of the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is judged at step S109 that the termination command has not yet been received, processing returns to step S101 and the same process is repeated.
When it is judged at step S109 that the termination command has been received, in other words, when, for example, the user operates the remote controller 15 (Figure 7) to display the menu screen on the CRT 11 and clicks the virtual multi-viewpoint display icon in the menu screen again to instruct the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends the termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
Figure 34 shows an example functional structure of the signal processing section 147 (Figure 11) in the subsidiary unit 2ij that performs the virtual multi-viewpoint display processing. When the virtual multi-viewpoint display icon is clicked, the CPU 129 of the master unit 1 controls the IEEE-1394 interface 133 (Figure 10) as described above to send a start command, which instructs the subsidiary unit to perform the virtual multi-viewpoint display processing, to each subsidiary unit 2ij. In the subsidiary unit 2ij, when the CPU 149 (Figure 11) receives the start command through the IEEE-1394 interface 153, the CPU 149 causes the DSP 147A of the signal processing section 147 to execute a predetermined program stored in the EEPROM 147B. The functional structure shown in Figure 34 is implemented when the DSP 147A executes the program stored in the EEPROM 147B. The other functional structures of the signal processing section 147 described later are implemented in the same manner.
Frame memories 211, 212, and 213 temporarily store the luminance signal Y and the color-difference signals R-Y and B-Y of one frame (or one field) of image data sent from the CPU 149.
More specifically, according to the virtual multi-viewpoint display processing of the master unit described with reference to Figure 33, when the CPU 129 receives the display request command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 to send the frame of image data (the luminance signal Y and the color-difference signals R-Y and B-Y) sent from the output control section 200, together with the display request command, to the subsidiary unit 2ij. In the subsidiary unit 2ij, the CPU 149 receives the display request command and the frame of image data through the IEEE-1394 interface 153 (Figure 11). The CPU 149 sends the display request command to the system controller 219, described later, and sends the luminance signal Y and the color-difference signals R-Y and B-Y of the frame of image data to the frame memories 211, 212, and 213. The frame memories 211, 212, and 213 temporarily store the luminance signal Y and the color-difference signals R-Y and B-Y of the frame of image data sent from the CPU 149 in this way.
Frame memories 214, 215, and 216 store the luminance signal Y and the color-difference signals R-Y and B-Y of one frame (or one field) of image data stored in the frame memories 211, 212, and 213 and sent from the memory control section 217, respectively.
The memory control section 217 is controlled by the system controller 219, sends the frame of image data (the luminance signal Y and the color-difference signals R-Y and B-Y) stored in the frame memories 211 to 213 to the frame memories 214 to 216, respectively, and stores the data there in an overwriting manner.
The selector 218 selects, under the control of the system controller 219, either the luminance signal Y and the color-difference signals R-Y and B-Y of the frame of image data stored in the frame memories 214 to 216, or the luminance signal Y and the color-difference signals R-Y and B-Y of the image data output from the MPEG video decoder 145 (Figure 11) of the subsidiary unit 2ij, and sends them to the matrix circuit 148 (Figure 11) at the subsequent stage.
Therefore, when the selector 218 selects the image data output from the MPEG video decoder 145 (Figure 11), the image data of the program of the channel selected by the tuner 141 (Figure 11) is displayed on the CRT 31 of the subsidiary unit 2ij, and when the selector 218 selects the image data stored in the frame memories 214 to 216, that image data, that is, the image data sent from the master unit 1 as described above, is displayed on the CRT 31 of the subsidiary unit 2ij.
The system controller 219 controls the memory control section 217 and the selector 218 under the control of the CPU 149.
As in Figure 32, Figure 34 does not show the audio data output from the MPEG audio decoder 146 (Figure 11) of the subsidiary unit 2ij. The audio data output from the MPEG audio decoder 146 is sent, for example, as it is to the amplifier 157 at the subsequent stage.
Next, the virtual multi-viewpoint display processing in the subsidiary unit performed by the signal processing section 147 shown in Figure 34 is described with reference to the flowchart shown in Figure 35.
First, at step S121, the system controller 219 controls the selector 218 to select the luminance signal Y and the color-difference signals R-Y and B-Y of the frame of image data stored in the frame memories 214 to 216, and starts displaying the data. More specifically, the selector 218 thereafter repeatedly reads the luminance signal Y and the color-difference signals R-Y and B-Y of the frame of image data stored in the frame memories 214 to 216, and sends them to the matrix circuit 148 (Figure 11) at the subsequent stage. In this way, the image data sent from the master unit 1 in the manner described with reference to Figure 33 is displayed on the CRT 31 of the subsidiary unit 2ij.
When the virtual multi-viewpoint display processing is not being performed in the subsidiary unit 2, the selector 218 selects the image data output from the MPEG video decoder 145 (Figure 11) and outputs it to the matrix circuit 148 at the subsequent stage. Therefore, in this case, the image data of the program of the channel selected by the tuner 141 is displayed on the CRT 31 of the subsidiary unit 2.
Processing then proceeds to step S122, where the system controller 219 judges whether the CPU 149 has sent one frame of image data together with the display request command.
When it is judged at step S122 that the display request command and the image data have not been received, processing skips steps S123 and S124, proceeds to step S125, and the same process is repeated thereafter.
When it is judged at step S122 that the display request command and the image data have been received, in other words, when the master unit 1 has sent the display request command and the image data to the subsidiary unit 2ij by the virtual multi-viewpoint display processing of the master unit described with reference to Figure 33, processing proceeds to step S123, where the frame memories 211 to 213 store the luminance signal Y and the color-difference signals R-Y and B-Y of the image data, respectively. Processing proceeds to step S124.
At step S124, the system controller 219 controls the memory control section 217 to wait until the selector 218 has read one frame of image data from the frame memories 214 to 216, then to send the luminance signal Y and the color-difference signals R-Y and B-Y stored in the frame memories 211 to 213 at the immediately preceding execution of step S123 to the frame memories 214 to 216, respectively, and to store them there in an overwriting manner. Processing proceeds to step S125.
In this way, from the next frame display timing onward, the selector 218 reads the image data newly stored in the frame memories 214 to 216. The image data is sent to the CRT 31 through the matrix circuit 148 at the subsequent stage and displayed.
At step S125, the system controller 219 judges whether a termination command has been sent from the CPU 149 (Figure 11).
More specifically, as described with reference to Figure 33, when instructed to terminate the virtual multi-viewpoint display processing, the CPU 129 (Figure 10) of the master unit 1 sends the termination command to the system controller 201. At the same time, the CPU 129 also controls the IEEE-1394 interface 133 (Figure 10) to send the termination command to the subsidiary unit 2ij. In the subsidiary unit 2ij, the CPU 149 receives the termination command from the master unit 1 through the IEEE-1394 interface 153. When the CPU 149 receives the termination command, it sends the command to the system controller 219. At step S125, it is judged whether the termination command has been sent from the CPU 149 to the system controller 219 in this way.
When it is judged at step S125 that the termination command has not yet been sent from the CPU 149, processing returns to step S122 and the same process is repeated.
When it is judged at step S125 that the termination command has been sent from the CPU 149, processing proceeds to step S126, where the system controller 219 controls the selector 218 to return its selection state to the state used before it was changed at step S121, and the virtual multi-viewpoint display processing in the subsidiary unit is terminated.
According to the virtual multi-viewpoint display processing described with reference to Figures 32 to 35, while the master unit 1 is displaying the image data of a program as shown in Figure 36A, if no motion (or almost no motion) occurs over consecutive frames, the frame of image data containing no motion is sent from the master unit 1 to a subsidiary unit forming the scalable television system, for example the subsidiary unit 211, and is displayed on its CRT 31 as shown in Figure 36B.
Then, if consecutive frames with no motion again appear in the image data displayed on the master unit 1, the frame of image data containing no motion is sent from the master unit 1 to the subsidiary unit 211 and is displayed on its CRT 31 in place of the image data displayed so far, as shown in Figure 36C.
In Figure 36A, the master unit 1 displays the image data of a live television broadcast program of a professional baseball game. In Figure 36B, the master unit 1 continues to display the live television broadcast program of the professional baseball game, and the subsidiary unit 211 displays the image data of the scoreboard as a frame of image data containing no motion. Further, in Figure 36C, the master unit 1 continues to display the image data of the live television broadcast program of the professional baseball game, and the subsidiary unit 211 displays the image data of the players' bench as a frame of image data containing no motion.
In other words, in the embodiment shown in Figures 36A to 36C, the master unit 1 displays the image data of the live television broadcast program of the professional baseball game in the same manner as an ordinary television receiver. Since the camera is switched during the live broadcast of the professional baseball game so that the scoreboard is shown in close-up, and this scene continues for several frames, the master unit 1 sends the scoreboard image data to the subsidiary unit 211, and the subsidiary unit 211 displays it (Figure 36B). Then, since the camera is again switched during the live broadcast so that the players' bench is shown, and this scene also continues for several frames, the master unit 1 sends the bench image data to the subsidiary unit 211, and the subsidiary unit 211 displays it (Figure 36C).
As described above, according to the virtual multi-viewpoint display processing described with reference to Figures 32 to 35, when consecutive frames containing no motion appear in the program displayed on the master unit 1, the frame of image data is sent to the subsidiary unit 2 and displayed. Therefore, the user can view the image data displayed on the master unit 1 together with a scene different from that image data; in other words, the user can simultaneously view image data shot from a plurality of viewpoints.
In general, a live broadcast program of a professional baseball game shows the scoreboard in close-up at the beginning of each inning. In this case, according to the virtual multi-viewpoint display processing, even if the user did not see the scoreboard shown in close-up at the beginning of an inning, the user can know the score immediately, because that scene is displayed on the subsidiary unit 2.
In the embodiment shown in Figures 36A to 36C, image data is always sent from the master unit 1 to the subsidiary unit 211. The subsidiary unit 2ij to which image data is sent from the master unit 1 and on which it is displayed may be changed.
Specifically, for example, the image data of the first group of frames containing no motion may be sent from the master unit 1 to the subsidiary unit 211 as shown in Figure 37A, the image data of the second group of frames containing no motion may be sent from the master unit 1 to the subsidiary unit 212 as shown in Figure 37B, and the subsidiary unit 2ij to which the image data is sent and on which it is displayed may be changed in order in the same manner. In this case, as shown in Figure 37C, when image data has been sent to and displayed on all the subsidiary units 2ij, the image data of the next group of frames containing no motion can be sent, for example, to the subsidiary unit 211, to which image data was sent and displayed first, and displayed in place of the image data displayed so far.
In this case, the user can simultaneously view the image data displayed on the master unit 1 and many scenes different from that image data.
In the embodiment shown in Figures 37A to 37C, image data is sent from the master unit 1 to all the subsidiary units 2ij forming the scalable television system and displayed. Image data may instead be sent from the master unit 1 to only certain subsidiary units 2ij forming the scalable television system and displayed. The certain subsidiary units 2ij can be specified, for example, on the menu screen.
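A minimal sketch of the rotation of destination subsidiary units illustrated in Figures 37A to 37C is given below, assuming a list of destination identifiers and a send callback; both names are hypothetical and serve only to illustrate the round-robin idea.

from itertools import cycle

def distribute_still_frames(still_frames, subsidiary_ids, send):
    # Send each detected motionless frame to the next subsidiary unit in turn,
    # wrapping around to the first unit once every unit has received a frame.
    destinations = cycle(subsidiary_ids)          # e.g. ["2_11", "2_12", "2_13"]
    for frame in still_frames:
        send(next(destinations), frame)           # display request command plus frame data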
Next, Figure 38 shows a second example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that performs the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 32 are denoted by the same reference symbols as in Figure 32, and their descriptions are omitted where unnecessary. Specifically, the signal processing section 127 shown in Figure 38 has basically the same structure as that of Figure 32, except that the counter section 199 is not provided.
Next, the virtual multi-viewpoint display processing in the master unit performed by the signal processing section 127 shown in Figure 38 is described with reference to the flowchart shown in Figure 39.
At steps S131 and S132, the same processing as at steps S101 and S102 shown in Figure 33 is performed.
After the difference detecting section 198 detects the sum of absolute differences of the current frame as the feature of the current-frame image data at step S132 and sends it to the system controller 201, processing proceeds to step S133, where the system controller 201 judges whether the sum of absolute differences of the current frame is greater than (or equal to or greater than) a predetermined threshold Th1.
When it is judged at step S133 that the sum of absolute differences of the current frame is not greater than the threshold Th1, processing skips steps S134 and S135 and proceeds to step S136.
When it is judged at step S133 that the sum of absolute differences of the current frame is greater than the threshold Th1, in other words, when the current-frame image has changed greatly from the preceding-frame image and a scene change has therefore occurred at the current frame, processing proceeds to step S134, where the system controller 201 controls the memory control section 197 in the same manner as at step S106 shown in Figure 33, so that the luminance signal Y and the color-difference signals R-Y and B-Y of the current-frame image data stored in the frame memories 191 to 193 are sent to the frame memories 194 to 196 and stored there in an overwriting manner. Processing proceeds to step S135.
At step S135, the system controller 201 controls the output control section 200 to read the luminance signal Y and the color-difference signals R-Y and B-Y of the frame of image data stored in the frame memories 194 to 196 and send these signals to the CPU 129. Further, at step S135, the system controller 201 sends to the CPU 129 a display request command instructing a predetermined subsidiary unit 2ij to display the image data. Processing proceeds to step S136.
When the CPU 129 receives the display request command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 as described above to send the frame of image data (the luminance signal Y and the color-difference signals R-Y and B-Y) sent from the output control section 200, together with the display request command, to the subsidiary unit 2ij. In this case, the signal processing section 147 of the subsidiary unit 2ij is constructed as shown in Figure 34 and performs the virtual multi-viewpoint display processing described with reference to Figure 35. Therefore, as described with reference to Figures 36A to 36C or Figures 37A to 37C, the subsidiary unit 2ij displays the image data sent with the display request command from the master unit 1.
In the embodiment shown in Figure 39, since the first frame obtained after a scene change is sent from the master unit 1 to the subsidiary unit 2ij, the subsidiary unit 2ij displays a digest of the program shown on the master unit 1.
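In contrast to the still-scene case, the condition of step S133 forwards a frame whenever the difference feature exceeds a large threshold. The following is a minimal sketch of this scene-change variant, given for illustration only; the threshold value and the callback name are assumptions.

import numpy as np

SCENE_CHANGE_THRESHOLD = 500_000   # threshold Th1 of step S133 (illustrative value)

def forward_scene_changes(frames, send_to_subsidiary):
    # Send the first frame after each detected scene change to a subsidiary unit,
    # so that the subsidiary unit shows a digest of the program (Figure 39).
    preceding = None
    for current in frames:
        if preceding is not None:
            sad = int(np.abs(current.astype(np.int32) - preceding.astype(np.int32)).sum())
            if sad > SCENE_CHANGE_THRESHOLD:       # step S133: scene change detected
                send_to_subsidiary(current)        # steps S134 and S135: store and forward the frame
        preceding = current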
At step S136, the system controller 201 judges whether a termination command instructing the termination of the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is judged at step S136 that the termination command has not yet been received, processing returns to step S131 and the same process is repeated.
When it is judged at step S136 that the termination command has been received, in other words, when, for example, the user operates the remote controller 15 (Figure 7) to display the menu screen on the CRT 11 and clicks the virtual multi-viewpoint display icon in the menu screen again to instruct the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends the termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
Next, Figure 40 shows a third example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that performs the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 32 are denoted by the same reference symbols as in Figure 32, and their descriptions are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 40 has basically the same structure as that of Figure 32, except that the frame memories 192 to 196, the memory control section 197, and the output control section 200 are not provided.
Next, the virtual multi-viewpoint display processing in the master unit performed by the signal processing section 127 shown in Figure 40 is described with reference to the flowchart shown in Figure 41.
At steps S141 to S145, the same processing as at steps S101 to S105 of Figure 33 is performed. At step S141, however, only the luminance signal Y of the image data output from the MPEG video decoder 125 (Figure 10) is stored in the frame memory 191.
When it is judged at step S145 that the count of the counter section 199 is not greater than the threshold Thc, processing skips steps S146 and S147 and proceeds to step S148.
When it is judged at step S145 that the count of the counter section 199 is greater than the threshold Thc, in other words, when the image data of a predetermined number of frames output from the MPEG video decoder 125 contains no motion, processing proceeds to step S146, where the system controller 201 sends to the CPU 129 a freeze command instructing the subsidiary unit 2ij to tune to the channel selected by the tuner 121 of the master unit 1 (the current channel) and to freeze and display a frame of the image data of the program broadcast on that channel. Processing proceeds to step S147.
When the CPU 129 receives the freeze command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 to send the freeze command to the subsidiary unit 2ij. When the subsidiary unit 2ij, which is performing the virtual multi-viewpoint display processing, receives the freeze command from the master unit 1, the subsidiary unit receives the channel specified by the freeze command, stores a frame of the image data of the program on that channel, and displays it, as described later.
In the embodiments shown in Figures 32 and 33, when the image data displayed on the master unit 1 does not change (hardly changes) over a plurality of consecutive frames, the unchanging image data is sent from the master unit 1 to the subsidiary unit 2ij and displayed. In the embodiments shown in Figures 40 and 41, however, the image data itself is not sent from the master unit 1 to the subsidiary unit 2ij; instead, a freeze command simply containing the channel on which the image data is being broadcast is sent. In the subsidiary unit 2ij, as described later, the tuner 141 selects the channel contained in the freeze command, and the image data of the program broadcast on that channel is stored and displayed.
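As a rough sketch of this alternative, given under the assumption of a simplified command record and hypothetical handler names (the actual command structure is not specified here), the master unit sends only a small command rather than frame data, and the subsidiary unit tunes and freezes a frame on its own:

from dataclasses import dataclass

@dataclass
class FreezeCommand:
    channel: int          # channel currently selected by the master unit's tuner 121

# Master unit side (step S146): no image data crosses the IEEE-1394 bus, only the command.
def send_freeze_command(ieee1394_send, current_channel: int) -> None:
    ieee1394_send(FreezeCommand(channel=current_channel))

# Subsidiary unit side (steps S152 to S154): tune to the channel, grab the latest decoded
# frame, and hold it for display until the next freeze command arrives.
def handle_freeze_command(cmd: FreezeCommand, tuner, decoder, display) -> None:
    tuner.select(cmd.channel)            # hypothetical tuner interface
    display.hold(decoder.latest_frame()) # hypothetical decoder/display interfaces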
At step S147, the system controller 201 resets the count of the counter section 199 to zero. Processing proceeds to step S148.
At step S148, the system controller 201 judges whether a termination command instructing the termination of the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is judged at step S148 that the termination command has not yet been received, processing returns to step S141 and the same process is repeated.
When it is judged at step S148 that the termination command has been received, in other words, when, for example, the user operates the remote controller 15 (Figure 7) to display the menu screen on the CRT 11 and clicks the virtual multi-viewpoint display icon in the menu screen again to instruct the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends the termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
Next, Figure 42 shows an example functional structure of the signal processing section 147 (Figure 11) in the subsidiary unit 2ij corresponding to the case in which the signal processing section 127 of the master unit 1 is constructed as shown in Figure 40. In Figure 42, parts corresponding to those shown in Figure 34 are denoted by the same reference symbols as in Figure 34, and their descriptions are omitted where unnecessary. Specifically, the signal processing section 147 shown in Figure 42 has basically the same structure as that of Figure 34.
In the embodiment shown in Figure 42, however, it is not the image data output from the CPU 149 (Figure 11) but the image data output from the MPEG video decoder 145 (Figure 11) that is sent to the frame memories 211 to 213.
Next, the virtual multi-viewpoint display processing in the subsidiary unit performed by the signal processing section 147 shown in Figure 42 is described with reference to the flowchart shown in Figure 43.
First, at step S151, the system controller 219 controls the selector 218 to select the luminance signal Y and the color-difference signals R-Y and B-Y of the image data stored in the frame memories 214 to 216, and starts displaying the data. More specifically, the selector 218 thereafter repeatedly reads the luminance signal Y and the color-difference signals R-Y and B-Y of the frame of image data stored in the frame memories 214 to 216, and sends them to the matrix circuit 148 (Figure 11) at the subsequent stage. In this way, the image data stored in the frame memories 214 to 216 is displayed on the CRT 31 of the subsidiary unit 2ij.
In the embodiment shown in Figure 43, it is assumed that before the virtual multi-viewpoint display processing is started, the frame memories 214 to 216 store, for example, image data having a black level. In this case, immediately after the processing of step S151 is performed, the image data having the black level is displayed on the CRT 31 of the subsidiary unit 2ij.
Processing then proceeds to step S152, where the system controller 219 judges whether a freeze command has been received. When it is judged that a freeze command has not yet been received, processing skips steps S153 and S154 and proceeds to step S155.
When it is judged at step S152 that a freeze command has been received, in other words, when the freeze command sent by the master unit 1 at step S146 shown in Figure 41 has been received by the CPU 149 through the IEEE-1394 interface 153 (Figure 11) and passed to the system controller 219, processing proceeds to step S153, where the system controller 219 requests the CPU 149 to receive, by the tuner 141, the channel contained in the freeze command. According to the request from the system controller 219, the CPU 149 controls the tuner 141 so as to receive the channel contained in the freeze command.
In this way, the tuner 141 receives the channel contained in the freeze command, and the received data is sent to the signal processing section 147 through the QPSK demodulation circuit 142, the error correction circuit 143, the demultiplexer 144, the MPEG video decoder 145, and the MPEG audio decoder 146.
The frame memories 211 to 213 of the signal processing section 147 start storing the image data of the channel contained in the freeze command sent as described above, and processing proceeds to step S154.
The frame memories 211 to 213 sequentially store the frames of image data supplied to them in an overwriting manner.
At step S154, the system controller 219 controls the memory control section 217 to wait until the most recent frame of image data has been stored in the frame memories 211 to 213, then to send the luminance signal Y and the color-difference signals R-Y and B-Y of the image data to the frame memories 214 to 216, respectively, and to store them there in an overwriting manner. Processing proceeds to step S155.
In this way, the selector 218 reads the image data newly stored in the frame memories 214 to 216 at step S154. Therefore, the image data newly stored in the frame memories 214 to 216, that is, the image data obtained, on the same channel as the channel being received by the master unit 1, immediately after the image data which the master unit 1 displayed with no (or little) motion over a plurality of consecutive frames, is displayed on the CRT 31 of the subsidiary unit 2ij.
At step S155, the system controller 219 judges whether a termination command has been sent from the CPU 149 (Figure 11).
More specifically, as described above, the CPU 129 (Figure 10) of the master unit 1 sends the termination command to the system controller 201 and, at the same time, also sends the termination command to the subsidiary unit 2ij. In the subsidiary unit 2ij, the CPU 149 receives the termination command from the master unit 1 through the IEEE-1394 interface 153. When the CPU 149 receives the termination command, it sends the command to the system controller 219. At step S155, it is judged whether the termination command has been sent from the CPU 149 to the system controller 219 in this way.
When it is judged at step S155 that the termination command has not yet been sent from the CPU 149, processing returns to step S152 and the same process is repeated.
When it is judged at step S155 that the termination command has been sent from the CPU 149, processing proceeds to step S156, where the system controller 219 controls the tuner 141 to return its tuning state to the state used immediately before the virtual multi-viewpoint display processing was started. Processing proceeds to step S157.
At step S157, the system controller 219 controls the selector 218 to return its selection state to the state used before it was changed at step S151, and the virtual multi-viewpoint display processing in the subsidiary unit is terminated.
Also according to the virtual multi-viewpoint display processing described with reference to Figures 40 to 43, in the same manner as the virtual multi-viewpoint display processing described with reference to Figures 32 to 35, when a plurality of consecutive frames containing no (or little) motion appear in the image data of the program being displayed on the master unit 1, the subsidiary unit 2ij displays the image data of a frame containing no motion, as described with reference to Figures 36A to 36C or Figures 37A to 37C.
Next, Figure 44 shows a fourth example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 that performs the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 38 are denoted by the same reference symbols as in Figure 38, and their descriptions are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 44 has basically the same structure as that of Figure 38, except that the frame memories 192 to 196, the memory control section 197, and the output control section 200 are not provided.
In the embodiments shown in Figures 38 and 39, when a scene change occurs in the image data displayed on the master unit 1, the image data of the frame immediately after the scene change is sent from the master unit 1 to the subsidiary unit 2ij and displayed. In the embodiment shown in Figure 44, however, as in the case shown in Figures 40 and 41, the image data itself is not sent from the master unit 1 to the subsidiary unit 2ij; instead, a freeze command simply containing the channel being received by the master unit 1 is sent. In the subsidiary unit 2ij, as described with reference to Figures 42 and 43, the tuner 141 selects the channel contained in the freeze command sent from the master unit 1, and the image data of the program broadcast on that channel is immediately stored and displayed.
Specifically, Figure 45 is a flowchart showing the virtual multi-viewpoint display processing in the master unit performed by the signal processing section 127 shown in Figure 44.
At steps S161 to S163, the same processing as at steps S131 to S133 shown in Figure 39 is performed.
When it is judged at step S163 that the sum of absolute differences of the current frame is not greater than the threshold Th1, processing skips step S164 and proceeds to step S165.
When it is judged at step S163 that the sum of absolute differences of the current frame is greater than the threshold Th1, in other words, when the current-frame image has changed greatly from the preceding-frame image and a scene change has therefore occurred at the current frame, processing proceeds to step S164, where the system controller 201 sends a freeze command to the CPU 129 in the same manner as at step S146 shown in Figure 41. Processing proceeds to step S165.
When the CPU 129 receives the freeze command from the system controller 201, the CPU 129 controls the IEEE-1394 interface 133 to send the freeze command to the subsidiary unit 2ij.
In this case, the signal processing section 147 of the subsidiary unit 2ij is constructed as shown in Figure 42 and performs the virtual multi-viewpoint display processing described with reference to Figure 43. Therefore, when the subsidiary unit 2ij receives the freeze command from the master unit 1, it immediately starts receiving the channel that has been received by the master unit 1, and immediately stores and displays the image data of the program on that channel. In other words, in this case as well, the subsidiary unit 2ij displays a digest of the program shown on the master unit 1, in the same manner as in the embodiments shown in Figures 38 and 39.
At step S165, the system controller 201 judges whether a termination command instructing the termination of the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is judged at step S165 that the termination command has not yet been received, processing returns to step S161 and the same process is repeated.
When it is judged at step S165 that the termination command has been received, in other words, when, for example, the user operates the remote controller 15 (Figure 7) to display the menu screen on the CRT 11 and clicks the virtual multi-viewpoint display icon in the menu screen again to instruct the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends the termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
In the embodiments shown in Figures 40 to 45, the freeze command is sent from the master unit 1 to the subsidiary unit 2ij by IEEE-1394 communication. The freeze command may instead be transferred by infrared communication through the remote controller 15 (or 35), as shown in Figure 46.
More specifically, in the embodiment shown in Figure 46, the CPU 129 of the master unit 1 instructs the IR interface 135 to send the freeze command to the subsidiary unit 2ij. In response to the instruction from the CPU 129, the IR interface 135 emits infrared rays corresponding to a transfer command instructing that the freeze command be transferred to the subsidiary unit 2ij. The infrared rays are received by the remote controller 15, and in response to the transfer command corresponding to the received infrared rays, the remote controller 15 emits infrared rays corresponding to the freeze command toward the subsidiary unit 2ij. The infrared rays corresponding to the freeze command are received by the IR interface 155 of the subsidiary unit 2ij, and the IR interface 155 sends the freeze command corresponding to the infrared rays to the CPU 149.
The IR interface 135 of the master unit 1 and the controller 15 send frame data having the format described with reference to Figure 23 by infrared rays. The frame data sent from the IR interface 135 is called F1, and the frame data sent from the controller 15 is called F2. The frame data F1 contains the device codes assigned to the IR interface 135 and the controller 15, so that the frame data F1 sent from the IR interface 135 is received by the controller 15.
In the present case, the frame data F1 sent from the IR interface 135 to the controller 15 requests that the freeze command be transferred to the auxiliary unit 2_ij. Therefore, the frame data F1 needs to contain a transfer command indicating the transfer to the auxiliary unit 2_ij, the freeze command to be transferred, and the device code of the auxiliary unit 2_ij serving as the transfer destination.
In the command code of the frame data F1, the transfer command is placed as the so-called operation code, and the freeze command and the device code of the transfer destination to which the freeze command is to be transferred (in this case, the device code of the IR interface 155 of the auxiliary unit 2_ij) are placed as the so-called operands.
In this case, the master unit 1 that transmits the frame data F1 needs to recognize the device code of the auxiliary unit 2_ij, which is the transfer destination to which the freeze command is to be transferred by the transfer command. The device codes can be exchanged, for example, immediately after the auxiliary unit 2_ij is connected to the master unit 1 by the IEEE-1394 cable, by reporting the device code of the auxiliary unit 2_ij to the master unit 1 through IEEE-1394 communication in the above-described authentication processing (Figure 31). In this way, the master unit 1 can recognize the device codes of all the auxiliary units 2_ij forming the scalable television system.
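As a rough illustration of how the transfer command, the freeze command, and the destination device code might be packed into the command code of frame data F1, here is a hedged Python sketch. The exact bit layout of Figure 23 is not reproduced; the field names and the byte encoding below are assumptions for illustration only.

```python
from dataclasses import dataclass


@dataclass
class FrameData:
    device_code: int        # device the frame is addressed to
    command_code: bytes     # operation code followed by operands


TRANSFER_CMD = 0x10         # illustrative opcode values, not from the patent
FREEZE_CMD = 0x21


def build_f1(controller_device_code: int, aux_ir_device_code: int) -> FrameData:
    """Frame F1 (master unit -> controller 15): opcode = transfer command,
    operands = freeze command and the destination device code."""
    command_code = bytes([TRANSFER_CMD, FREEZE_CMD, aux_ir_device_code])
    return FrameData(device_code=controller_device_code, command_code=command_code)
```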
When the controller 15 receives the frame data F1 described above, the controller 15 performs, in step S4 of the controller processing described with reference to Figure 26, the processing corresponding to the command code placed in the frame data F1: it generates frame data F2 in which the freeze command and the transfer-destination device code contained in the frame data F1 are placed, and sends it to the auxiliary unit 2_ij.
More specifically, in this case, the controller 15 shown in Figure 22 responds to the transfer command placed in the frame data F1 and executes command transfer processing according to the flowchart shown in Figure 47, as the processing corresponding to the command code performed in step S4 of Figure 26.
In the command transfer processing, first, in step S171, when the control section 162 receives the frame data F1 from the reception processing section 167, the control section 162 controls the frame generating section 163 so as to place the transfer-destination device code contained in the command code of the frame data F1 into the device code of the frame data F2. The processing proceeds to step S172.
In step S172, the control section 162 controls the frame generating section 163 so as to place the freeze command contained in the command code of the frame data F1 into the command code of the frame data F2. The processing proceeds to step S173.
In step S173, the frame generating section 163 sends the frame data F2, in which the transfer-destination device code and the command code containing the freeze command have been placed as described above, to the transmission processing section 164, which outputs the frame data F2 by infrared rays, and the processing ends.
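The command transfer processing of steps S171 to S173 essentially re-addresses the payload: the destination device code carried in the operands of F1 becomes the device code of F2, and the freeze command becomes F2's command code. A minimal sketch, reusing the illustrative FrameData layout introduced above (an assumption, not the actual frame format):

```python
def command_transfer(f1: FrameData) -> FrameData:
    """Controller 15 side: build frame F2 from a received frame F1 (steps S171-S173)."""
    opcode = f1.command_code[0]
    payload_cmd = f1.command_code[1]        # the command to be relayed (freeze command)
    dest_device_code = f1.command_code[2]   # device code of the transfer destination
    assert opcode == TRANSFER_CMD
    # S171: the destination device code from F1's operands becomes F2's device code
    # S172: the relayed command becomes F2's command code
    return FrameData(device_code=dest_device_code, command_code=bytes([payload_cmd]))

# S173: the resulting F2 would then be handed to the transmission section for infrared output.
```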
In this case, the frame data F2 contains the device code of the transfer destination, that is, the device code of the auxiliary unit 2_ij. Therefore, in the auxiliary unit 2_ij, the IR interface 155 receives the frame data F2 and sends the command corresponding to its command code, namely the freeze command, to the CPU 149.
The command sent from the master unit 1 to the auxiliary unit 2_ij by infrared communication is not limited to the freeze command; other commands can also be transferred in the same way.
Next, Figure 48 shows a fifth example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 which performs the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 32 are denoted by the same reference symbols used in Figure 32, and descriptions thereof are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 48 has basically the same structure as that of Figure 32, except that the difference detecting section 198 and the counter section 199 are not provided, and a circular buffer 221, an audio comparison section 222, and an audio pattern storage section 223 are newly provided.
The circular buffer 221 receives the audio data output from the MPEG audio decoder 126 (Figure 10) and sequentially stores it.
The audio data output from the MPEG audio decoder 126 is sent, as it is, to the circular buffer 221 and to the amplifier 137 in the subsequent stage.
The audio comparison section 222 uses the audio data stored in the circular buffer 221 as an input pattern, compares the input pattern with the audio data stored in the audio pattern storage section 223 as a reference pattern, and sends the comparison result to the system controller 201.
The audio pattern storage section 223 stores audio data serving as the reference pattern.
The audio data output from the MPEG audio decoder 126 (Figure 10) and stored in the circular buffer 221 is also sent to the audio pattern storage section 223, and under the control of the system controller 201, the audio pattern storage section 223 can store the audio data held in the circular buffer 221 as a new reference pattern. In other words, the audio reference pattern stored in the audio pattern storage section 223 can be updated.
Next, virtual multi-viewpoint display processing performed in the master unit 1 by the signal processing section 127 shown in Figure 48 is described with reference to the flowchart shown in Figure 49.
The frame memories 191 to 193 sequentially store, in an overwriting manner, the luminance signal Y and the chrominance signals R-Y and B-Y of the image data sent from the MPEG video decoder 125 (Figure 10).
The circular buffer 221 also sequentially stores, in an overwriting manner, the audio data sent from the MPEG audio decoder 126 (Figure 10).
In the virtual multi-viewpoint display processing, first, in step S181, the system controller 201 determines whether an audio-pattern registration request has been sent from the CPU 129.
As described above, the audio pattern storage section 223 can store the audio data held in the circular buffer 221 as a new reference pattern; in other words, a new reference pattern can be registered. This registration request can be made, for example, by clicking an audio-pattern-registration-request icon on the menu screen displayed when the menu button switch 84 on the controller 15 (Fig. 7) is operated.
In step S181, it is therefore determined whether the audio-pattern-registration-request icon has been clicked.
When it is determined in step S181 that an audio-pattern registration request has not been received, the processing skips step S182 and proceeds to step S183.
When it is determined in step S181 that an audio-pattern registration request has been received, in other words, when the user has clicked the audio-pattern-registration-request icon and the CPU 129 has thereby sent to the system controller 201 an audio-pattern registration request for registering a new reference pattern, the processing proceeds to step S182, and the system controller 201 stores, for example, the audio data from the most recent sample stored in the circular buffer 221 back to the sample a predetermined time earlier into the audio pattern storage section 223 as a new reference pattern.
Therefore, when audio that the user wants to use as a reference pattern is output while the user is listening to the program sound from the loudspeaker units 12L and 12R, the user can register that audio data as a reference pattern by operating the controller 15.
The audio pattern storage section 223 may store a single reference pattern, that is, store a new reference pattern so as to overwrite the reference pattern already stored in the audio pattern storage section 223, or it may store a plurality of reference patterns, that is, store a new reference pattern in addition to the reference patterns already stored in the audio pattern storage section 223.
After the new reference pattern is stored in the audio pattern storage section 223 in step S182, the processing proceeds to step S183, and the audio comparison section 222 reads, for example, all the audio data stored in the circular buffer 221 as an input pattern. The processing proceeds to step S184.
In step S184, the audio comparison section 222 reads the reference pattern stored in the audio pattern storage section 223 and compares it with the input pattern. More specifically, the audio comparison section 222 obtains the distance between the input pattern and the reference pattern (hereinafter called the inter-audio-pattern distance where appropriate) while elongating or shortening the time axis at predetermined ratios, obtains the minimum inter-audio-pattern distance as a feature of the input pattern (with respect to the reference pattern), and sends it to the system controller 201.
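One simple way to realize the comparison of step S184, in which the distance is taken while the time axis of the input pattern is stretched or compressed at predetermined ratios and the minimum is kept, is sketched below. Resampling by linear interpolation, the particular ratio set, and the use of a mean absolute difference are assumptions made for illustration; the patent does not fix them.

```python
import numpy as np

RATIOS = (0.9, 1.0, 1.1)   # illustrative time-axis scale factors


def resample(x: np.ndarray, ratio: float) -> np.ndarray:
    """Stretch or compress a 1-D audio pattern along the time axis by linear interpolation."""
    n = max(2, int(round(len(x) * ratio)))
    return np.interp(np.linspace(0, len(x) - 1, n), np.arange(len(x)), x)


def inter_audio_pattern_distance(input_pat: np.ndarray, ref_pat: np.ndarray) -> float:
    """Minimum mean absolute difference over the assumed set of time-axis ratios."""
    best = np.inf
    for r in RATIOS:
        warped = resample(input_pat.astype(np.float64), r)
        m = min(len(warped), len(ref_pat))
        d = np.abs(warped[:m] - ref_pat[:m].astype(np.float64)).mean()
        best = min(best, d)
    return float(best)

# step S185 would then compare this distance against a predetermined threshold.
```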
The processing then proceeds to step S185, and the system controller 201 determines whether the inter-audio-pattern distance obtained as the feature of the input pattern is equal to or less than (or less than) a predetermined threshold.
When it is determined in step S185 that the inter-audio-pattern distance is greater than the predetermined threshold, the processing skips steps S186 and S187 and proceeds to step S188.
When it is determined in step S185 that the inter-audio-pattern distance is not greater than the predetermined threshold, in other words, when the input pattern can be regarded as matching the reference pattern, the processing proceeds to steps S186 and S187 to perform the same processing as that performed in steps S106 and S108 of Figure 33. The processing then proceeds to step S188.
In this way, in the master unit 1, when audio data identical or similar to the reference pattern is output from the MPEG audio decoder 126, the frame of image data output from the MPEG video decoder 125 at that time is sent to the auxiliary unit 2_ij.
In this case, the signal processing section 147 of the auxiliary unit 2_ij is structured as shown in Figure 34 and performs the virtual multi-viewpoint display processing shown in Figure 35. Therefore, in the auxiliary unit 2_ij, the frame of image data sent from the master unit 1 is displayed in the manner described above.
In step S188, the system controller 201 determines whether a termination command for instructing termination of the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is determined in step S188 that a termination command has not been received, the processing returns to step S181, and the same process is repeated.
When it is determined in step S188 that a termination command has been received, in other words, when, for example, the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and clicks the virtual-multi-viewpoint-display icon in the menu screen once more, thereby instructing the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends a termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
According to the virtual multi-viewpoint display processing shown in Figure 49, when audio data identical or similar to the reference pattern is output from the MPEG audio decoder 126, the frame of image data output from the MPEG video decoder 125 at that time is displayed in the auxiliary unit 2_ij. Therefore, for example, when the audio data output while a special-news telop is broadcast is stored as the reference pattern, the image data being broadcast when that audio data is output, that is, image data including the special-news telop, is displayed in the auxiliary unit 2_ij.
When the audio pattern storage section 223 stores a plurality of reference patterns, the input pattern is compared with each of the plurality of reference patterns one by one in step S184 of Figure 49. For example, if the input pattern can be regarded as matching at least one of the plurality of reference patterns, the processing of steps S186 and S187 is performed.
In the case described above, the inter-audio-pattern distance is obtained as the feature of the audio data serving as the input pattern. Alternatively, the power (or amplitude level) of the audio data serving as the input pattern can be obtained as its feature. In that case, the power of the audio data serving as the input pattern can be compared with a predetermined threshold, and the frame of image data output from the MPEG video decoder 125 immediately after the power of the audio data becomes greater (or smaller) than (or equal to or greater than, or equal to or less than) the predetermined threshold can be displayed in the auxiliary unit 2_ij.
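For the power-based alternative just mentioned, a brief sketch follows, under the assumption that the audio data serving as the input pattern is available as a NumPy array; the power definition and threshold handling are illustrative.

```python
import numpy as np


def audio_power(samples: np.ndarray) -> float:
    """Mean squared amplitude of the audio data serving as the input pattern."""
    return float(np.mean(samples.astype(np.float64) ** 2))


def should_capture_frame(samples: np.ndarray, threshold: float) -> bool:
    """True when the power exceeds the predetermined threshold (illustrative trigger)."""
    return audio_power(samples) > threshold
```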
In addition, the image data to be displayed in the auxiliary unit 2_ij can be transferred from the master unit 1 to the auxiliary unit 2_ij as described above, or it can instead be received by the auxiliary unit 2_ij itself when a freeze command is sent from the master unit 1 to the auxiliary unit 2_ij.
Next, Figure 50 shows a sixth example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 which performs the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 32 are denoted by the same reference symbols used in Figure 32, and descriptions thereof are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 50 has basically the same structure as that of Figure 32, except that the difference detecting section 198 and the counter section 199 are not provided, and an image comparison section 232 and an image pattern storage section 233 are newly provided.
The image comparison section 232 uses the image data stored in the frame memory 191 as an input pattern, compares the input pattern with the image data stored in the image pattern storage section 233 as a reference pattern, and sends the comparison result to the system controller 201.
The image pattern storage section 233 stores image data serving as the reference pattern.
The image data (its luminance signal Y) output from the MPEG video decoder 125 (Figure 10) and stored in the frame memory 191 is also sent to the image pattern storage section 233, and under the control of the system controller 201, the image pattern storage section 233 can store the image data held in the frame memory 191 as a new reference pattern. In other words, the image reference pattern stored in the image pattern storage section 233 can be updated.
Next, virtual multi-viewpoint display processing performed in the master unit 1 by the signal processing section 127 shown in Figure 50 is described with reference to the flowchart shown in Figure 51.
The frame memories 191 to 193 sequentially store, in an overwriting manner, the luminance signal Y and the chrominance signals R-Y and B-Y of the image data sent from the MPEG video decoder 125 (Figure 10).
In the virtual multi-viewpoint display processing, first, in step S191, the system controller 201 determines whether an image-pattern registration request has been sent from the CPU 129.
As described above, the image pattern storage section 233 can store the image data held in the frame memory 191 as a new reference pattern; in other words, a new reference pattern can be registered. This registration request can be made, for example, by clicking an image-pattern-registration-request icon on the menu screen displayed when the menu button switch 84 on the controller 15 (Fig. 7) is operated.
In step S191, it is therefore determined whether the image-pattern-registration-request icon has been clicked.
When it is determined in step S191 that an image-pattern registration request has not been received, the processing skips step S192 and proceeds to step S193.
When it is determined in step S191 that an image-pattern registration request has been received, in other words, when the user has clicked the image-pattern-registration-request icon and the CPU 129 has thereby sent to the system controller 201 an image-pattern registration request for registering a new reference pattern, the processing proceeds to step S192, and the system controller 201 stores the most recent frame of image data held in the frame memory 191 into the image pattern storage section 233 as a new reference pattern.
Therefore, when image data that the user wants to use as a reference pattern is displayed while the user is watching the program image shown on the CRT 11, the user can register that image data as a reference pattern by operating the controller 15.
The image pattern storage section 233 may store a single reference pattern, or it may store a plurality of reference patterns, in the same manner as the audio pattern storage section 223.
After the new reference pattern is stored in the image pattern storage section 233 in step S192, the processing proceeds to step S193, and the image comparison section 232 reads, for example, the most recent frame of image data stored in the frame memory 191 as an input pattern. The processing proceeds to step S194.
In step S194, the image comparison section 232 reads the reference pattern stored in the image pattern storage section 233 and compares it with the input pattern. More specifically, the image comparison section 232 obtains the distance between the input pattern and the reference pattern (hereinafter called the inter-image-pattern distance where appropriate) as a feature of the input pattern (with respect to the reference pattern) and sends it to the system controller 201.
The inter-image-pattern distance can be, for example, the sum of the absolute differences between the pixel values of the pixels of the image data serving as the input pattern and the pixel values of the corresponding pixels of the image data serving as the reference pattern.
Here, the input pattern and the reference pattern are each one frame of image data. A portion of one frame of image data may also be used as the input pattern and the reference pattern.
Further, one frame of image data may be used as the input pattern while a portion of one frame of image data is used as the reference pattern. In this case, the inter-image-pattern distance can be obtained a plurality of times while changing the positional correspondence between the frame of image data serving as the input pattern and the partial frame serving as the reference pattern, and the minimum value obtained can be used as the final inter-image-pattern distance.
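The following Python sketch illustrates the inter-image-pattern distance just described, including the case where a partial reference pattern is slid over the input frame and the minimum sum of absolute differences is kept. It assumes luminance-only 2-D arrays, and the coarse step size is an illustrative shortcut, not part of the patent.

```python
import numpy as np


def inter_image_pattern_distance(frame: np.ndarray, ref: np.ndarray, step: int = 8) -> float:
    """Minimum sum of absolute differences (SAD) between a reference pattern and a frame.

    When the reference pattern has the same size as the frame, this reduces to a single
    SAD.  When it is a partial pattern, the reference is slid over the frame and the
    smallest SAD over all tried positions is returned, as described above.
    """
    fh, fw = frame.shape
    rh, rw = ref.shape
    ref_i = ref.astype(np.int32)
    best = np.inf
    for y in range(0, fh - rh + 1, step):
        for x in range(0, fw - rw + 1, step):
            window = frame[y:y + rh, x:x + rw].astype(np.int32)
            sad = np.abs(window - ref_i).sum()
            best = min(best, sad)
    return float(best)

# step S195 would then compare this distance against a predetermined threshold.
```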
When the inter-image-pattern distance is obtained in step S194, the processing proceeds to step S195, and the system controller 201 determines whether the inter-image-pattern distance obtained as the feature of the input pattern is equal to or less than (or less than) a predetermined threshold.
When it is determined in step S195 that the inter-image-pattern distance is greater than the predetermined threshold, the processing skips steps S196 and S197 and proceeds to step S198.
When it is determined in step S195 that the inter-image-pattern distance is not greater than the predetermined threshold, in other words, when the input pattern can be regarded as matching the reference pattern, the processing proceeds sequentially to steps S196 and S197 to perform the same processing as that performed in steps S106 and S108 of Figure 33. The processing then proceeds to step S198.
In this way, in the master unit 1, when image data identical or similar to the reference pattern is output from the MPEG video decoder 125, that frame of image data is sent to the auxiliary unit 2_ij.
In this case, the signal processing section 147 of the auxiliary unit 2_ij is structured as shown in Figure 34 and performs the virtual multi-viewpoint display processing shown in Figure 35. Therefore, in the auxiliary unit 2_ij, the frame of image data sent from the master unit 1 is displayed in the manner described above.
In step S198, the system controller 201 determines whether a termination command for instructing termination of the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is determined in step S198 that a termination command has not been received, the processing returns to step S191, and the same process is repeated.
When it is determined in step S198 that a termination command has been received, in other words, when, for example, the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and clicks the virtual-multi-viewpoint-display icon in the menu screen once more, thereby instructing the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends a termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
According to the virtual multi-viewpoint display processing shown in Figure 51, when image data identical or similar to the reference pattern is output from the MPEG video decoder 125, that frame of image data is displayed in the auxiliary unit 2_ij.
Therefore, when image data of a scoreboard displayed in close-up during a live broadcast of a professional baseball game is stored as the reference pattern, image data broadcast thereafter that has a pattern identical or similar to the image data serving as the reference pattern, that is, image data of the scoreboard displayed in close-up, is displayed in the auxiliary unit 2_ij.
More specifically, when image data of the scoreboard displayed in close-up, as shown in Figure 52A, is stored as the reference pattern while the master unit 1 is receiving a live broadcast of a professional baseball game, then if the scoreboard is again broadcast in close-up in a later inning, the close-up scoreboard image data is displayed, for example, in the auxiliary unit 2_11, as shown in Figure 52B. Further, when the scoreboard is broadcast in close-up once more in the following inning, the close-up scoreboard image data is displayed, for example, in the auxiliary unit 2_12, as shown in Figure 52C.
When the scoreboard image is broadcast in close-up at the beginning of each inning of a live television broadcast of a professional baseball game, the close-up scoreboard images are thus displayed in order in the auxiliary units 2_ij forming the scalable television system, as described above.
Therefore, in this case, the user can see how the score changes in each inning by looking at the displays of the auxiliary units 2_ij.
In an election returns program, image data showing the face of a candidate (projected winner) in close-up, together with an indication of the candidate's political party, is broadcast. When such image data is stored in the image pattern storage section 233 as the reference pattern, then if the master unit 1 receives an election returns program, the image data showing candidates' faces in close-up broadcast in that program is displayed in order in the auxiliary units 2_ij forming the scalable television system, as shown in Figure 53.
Therefore, in this case, the user can find out who the winning candidates are by looking at the displays of the auxiliary units 2_ij.
As a further example, broadcasting stations frequently broadcast weather forecasts during the day. When the image data of the map of Japan (or of a map of a certain area, such as the Kanto area) used in weather forecast programs is stored in the image pattern storage section 233 as the reference pattern, then if the master unit 1 receives a weather forecast program, the weather maps broadcast in that forecast are displayed in order in the auxiliary units 2_ij forming the scalable television system, as shown in Figure 54.
Therefore, in this case, the user can easily compare the weather forecasts broadcast on the same channel at different times or on different channels by looking at the displays of the auxiliary units 2_ij.
The image data to be displayed in the auxiliary unit 2_ij can be sent from the master unit 1 to the auxiliary unit 2_ij. In addition, as described above, the image data can also be received by the auxiliary unit 2_ij itself when a freeze command is sent from the master unit 1 to the auxiliary unit 2_ij.
Next, Figure 55 shows a seventh example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 which performs the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 32, Figure 48, or Figure 50 are denoted by the same reference symbols used in those figures, and descriptions thereof are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 55 has basically the same structure as that of Figure 32, except that the difference detecting section 198 and the counter section 199 shown in Figure 32 are not provided, and the circular buffer 221, the audio comparison section 222, and the audio pattern storage section 223 shown in Figure 48, as well as the image comparison section 232 and the image pattern storage section 233 shown in Figure 50, are newly provided.
Next, virtual multi-viewpoint display processing performed in the master unit 1 by the signal processing section 127 shown in Figure 55 is described with reference to the flowchart shown in Figure 56.
The frame memories 191 to 193 sequentially store, in an overwriting manner, the luminance signal Y and the chrominance signals R-Y and B-Y of the image data sent from the MPEG video decoder 125 (Figure 10).
The circular buffer 221 also sequentially stores, in an overwriting manner, the audio data sent from the MPEG audio decoder 126 (Figure 10).
In the virtual multi-viewpoint display processing, first, in step S201, the system controller 201 determines, in the same manner as in step S181 of Figure 49, whether an audio-pattern registration request has been sent from the CPU 129. When it is determined that an audio-pattern registration request has not been received, the processing skips step S202 and proceeds to step S203.
When it is determined in step S201 that an audio-pattern registration request has been received, the processing proceeds to step S202, and the system controller 201 stores the audio data held in the circular buffer 221 for a predetermined period into the audio pattern storage section 223 as a new reference pattern, in the same manner as in step S182 of Figure 49. The processing proceeds to step S203.
In step S203, the system controller 201 determines, in the same manner as in step S191 of Figure 51, whether an image-pattern registration request has been sent from the CPU 129. When it is determined that an image-pattern registration request has not been received, the processing skips step S204 and proceeds to step S205.
When it is determined in step S203 that an image-pattern registration request has been received, the processing proceeds to step S204, and the system controller 201 stores the most recent frame of image data held in the frame memory 191 into the image pattern storage section 233 as a new reference pattern, in the same manner as in step S192 of Figure 51. The processing proceeds to step S205.
In step S205, the audio comparison section 222 reads the audio data stored in the circular buffer 221 as an audio input pattern, in the same manner as in step S183 of Figure 49. Also in step S205, the image comparison section 232 reads the image data stored in the frame memory 191 as an image input pattern, in the same manner as in step S193 of Figure 51. The processing proceeds to step S206.
In step S206, the audio comparison section 222 compares, in the same manner as in step S184 of Figure 49, the audio reference pattern stored in the audio pattern storage section 223 with the audio input pattern, obtains the inter-audio-pattern distance as a feature of the audio data serving as the audio input pattern, and sends it to the system controller 201. Also in step S206, the image comparison section 232 compares, in the same manner as in step S194 of Figure 51, the image reference pattern stored in the image pattern storage section 233 with the image input pattern, obtains the inter-image-pattern distance as a feature of the image data serving as the image input pattern, and sends it to the system controller 201.
The processing then proceeds to step S207, and the system controller 201 determines, in the same manner as in step S185 of Figure 49, whether the inter-audio-pattern distance obtained as the feature of the audio input pattern is equal to or less than (or less than) a predetermined threshold, or determines, in the same manner as in step S195 of Figure 51, whether the inter-image-pattern distance obtained as the feature of the image input pattern is equal to or less than (or less than) a predetermined threshold.
When it is determined in step S207 that the inter-audio-pattern distance is greater than the predetermined threshold and the inter-image-pattern distance is also greater than the predetermined threshold, the processing skips steps S208 and S209 and proceeds to step S210.
When it is determined in step S207 that the inter-audio-pattern distance is equal to or less than the predetermined threshold or the inter-image-pattern distance is equal to or less than the predetermined threshold, in other words, when the audio input pattern can be regarded as matching the audio reference pattern or the image input pattern can be regarded as matching the image reference pattern, the processing proceeds sequentially to steps S208 and S209 to perform the same processing as that performed in steps S106 and S108 of Figure 33. The processing then proceeds to step S210.
In this way, in the master unit 1, when audio data identical or similar to the audio reference pattern is output from the MPEG audio decoder 126, the frame of image data output from the MPEG video decoder 125 at that time is sent to the auxiliary unit 2_ij, or when image data identical or similar to the image reference pattern is output from the MPEG video decoder 125, that frame of image data is sent to the auxiliary unit 2_ij.
In step S210, the system controller 201 determines whether a termination command for instructing termination of the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is determined in step S210 that a termination command has not been received, the processing returns to step S201, and the same process is repeated.
When it is determined in step S210 that a termination command has been received, in other words, when, for example, the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and clicks the virtual-multi-viewpoint-display icon in the menu screen once more, thereby instructing the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends a termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
When the signal processing section 127 of the master unit 1 is structured as shown in Figure 55, the signal processing section 147 of the auxiliary unit 2_ij is structured as shown in Figure 34 and performs the virtual multi-viewpoint display processing shown in Figure 35. Therefore, in the auxiliary unit 2_ij, the frame of image data sent from the master unit 1 is displayed in the manner described above. More specifically, in the auxiliary unit 2_ij, when audio data identical or similar to the audio reference pattern is output from the MPEG audio decoder 126, the frame of image data output from the MPEG video decoder 125 at that time is displayed, or when image data identical or similar to the image reference pattern is output from the MPEG video decoder 125, that frame of image data is displayed.
In the case described above, the processing of steps S208 and S209 is performed when the inter-audio-pattern distance is equal to or less than the predetermined threshold or the inter-image-pattern distance is equal to or less than the predetermined threshold. Alternatively, the processing of steps S208 and S209 may be performed only when, for example, the inter-audio-pattern distance is equal to or less than the predetermined threshold and the inter-image-pattern distance is equal to or less than the predetermined threshold, in other words, only when the audio input pattern can be regarded as matching the audio reference pattern and the image input pattern can be regarded as matching the image reference pattern.
In that case, in the auxiliary unit 2_ij, a frame of image data is displayed only if image data identical or similar to the image reference pattern is output from the MPEG video decoder 125 at the same time as audio data identical or similar to the audio reference pattern is output from the MPEG audio decoder 126.
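The difference between the two variants just described lies only in how the audio and image decisions are combined in step S207. A hedged sketch (threshold values and names are assumptions for illustration):

```python
def should_send_frame(audio_dist: float, image_dist: float,
                      audio_th: float, image_th: float,
                      require_both: bool = False) -> bool:
    """Step S207 decision: OR-combination by default, AND-combination when
    require_both is True (the stricter variant described above)."""
    audio_match = audio_dist <= audio_th
    image_match = image_dist <= image_th
    return (audio_match and image_match) if require_both else (audio_match or image_match)
```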
Next, Figure 57 shows an eighth example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 which performs the virtual multi-viewpoint display processing. In the figure, parts corresponding to those shown in Figure 38 are denoted by the same reference symbols used in Figure 38, and descriptions thereof are omitted where unnecessary. In other words, the signal processing section 127 shown in Figure 57 has basically the same structure as the signal processing section 127 shown in Figure 38.
Next, virtual multi-viewpoint display processing performed in the master unit 1 by the signal processing section 127 shown in Figure 57 is described with reference to the flowchart shown in Figure 58.
First, in step S221, the system controller 201 designates a default still-frame auxiliary unit and a default scene-change auxiliary unit among the auxiliary units 2 forming the scalable television system.
The still-frame auxiliary unit is the auxiliary unit used to display, out of the image data displayed in the master unit 1, image data of a still frame that can be regarded as containing no (or little) motion. The scene-change auxiliary unit is the auxiliary unit used to display, out of the image data displayed in the master unit 1, image data obtained immediately after a scene change.
In step S221, the system controller 201 sets, for example, the auxiliary unit 2_21 adjacent to the left of the master unit 1 as the default still-frame auxiliary unit, and sets the auxiliary unit 2_23 adjacent to the right of the master unit 1 as the default scene-change auxiliary unit.
The processing then proceeds to step S222, and the system controller 201 determines whether the CPU 129 has designated a still-frame auxiliary unit.
More specifically, an auxiliary unit 2_ij other than the auxiliary unit 2_21 already designated as the default still-frame auxiliary unit can be designated as the still-frame auxiliary unit. For example, the user can operate the menu button switch 84 on the controller 15 (Fig. 7) to display the menu screen, and then click an icon for designating the still-frame auxiliary unit to designate one.
In step S222, it is therefore determined whether the icon for designating the still-frame auxiliary unit has been clicked.
When it is determined in step S222 that a still-frame auxiliary unit has not been designated, the processing skips step S223 and proceeds to step S224.
When it is determined in step S222 that a still-frame auxiliary unit has been designated, in other words, when the user has operated the controller 15, clicked the icon for designating the still-frame auxiliary unit, and designated an auxiliary unit 2_ij as the still-frame auxiliary unit, so that the CPU 129 has sent a command designating that auxiliary unit 2_ij to the system controller 201, the processing proceeds to step S223, and the system controller 201 sets the designated auxiliary unit 2_ij as the still-frame auxiliary unit (recognizes it as the still-frame auxiliary unit). The processing then proceeds to step S224.
In step S224, the system controller 201 determines whether the CPU 129 has designated a scene-change auxiliary unit.
More specifically, an auxiliary unit 2_ij other than the auxiliary unit 2_23 already designated as the default scene-change auxiliary unit can be designated as the scene-change auxiliary unit. For example, the user can operate the menu button switch 84 on the controller 15 (Fig. 7) to display the menu screen, and then click an icon for designating the scene-change auxiliary unit to designate one.
In step S224, it is therefore determined whether the icon for designating the scene-change auxiliary unit has been clicked.
When it is determined in step S224 that a scene-change auxiliary unit has not been designated, the processing skips step S225 and proceeds to step S226.
When it is determined in step S224 that a scene-change auxiliary unit has been designated, in other words, when the user has operated the controller 15, clicked the icon for designating the scene-change auxiliary unit, and designated an auxiliary unit 2_ij as the scene-change auxiliary unit, so that the CPU 129 has sent a command designating that auxiliary unit 2_ij to the system controller 201, the processing proceeds to step S225, and the system controller 201 sets the designated auxiliary unit 2_ij as the scene-change auxiliary unit. The processing then proceeds to step S226.
In step S226, the frame memories 191 to 193 wait for the luminance signal Y and the chrominance signals R-Y and B-Y of one frame of image data to arrive from the MPEG video decoder 125 (Figure 10), and store the luminance signal Y and the chrominance signals R-Y and B-Y. The processing proceeds to step S227.
In step S227, the difference detecting section 198 obtains the sum of the absolute differences between the luminance signal Y of the image data stored in the frame memory 191 immediately before (the luminance signal Y of the current frame) and the luminance signal Y of the image data stored in the frame memory 191 in the previous iteration (the luminance signal Y of the preceding frame), that is, the sum of absolute differences of the current frame, as a feature of the current-frame image data, and sends the sum to the system controller 201.
The processing then proceeds to step S228, and the system controller 201 determines whether the sum of absolute differences of the current frame is almost zero, that is, less than (or equal to or less than) a small positive threshold Th2.
When it is determined in step S228 that the sum of absolute differences of the current frame is less than the threshold Th2, that is, when the image of the current frame has changed very little (or not at all) from the image of the preceding frame and the current frame can therefore be regarded as a still frame, the processing proceeds to step S229, and the system controller 201 controls the memory control section 197 so as to send the image data of the current frame stored in the frame memories 191 to 193 (its luminance signal Y and chrominance signals R-Y and B-Y) to the frame memories 194 to 196 and store it there in an overwriting manner. The processing proceeds to step S230.
In step S230, the system controller 201 controls the output control section 200 so as to read the luminance signal Y and the chrominance signals R-Y and B-Y stored in the frame memories 194 to 196 and send them to the CPU 129. Also in step S230, the system controller 201 sends to the CPU 129 a display request command instructing that the image data be displayed in the still-frame auxiliary unit. The processing proceeds to step S234.
When the CPU 129 receives from the system controller 201 the display request command instructing display in the still-frame auxiliary unit, the CPU 129 controls the IEEE-1394 interface 133 so as to send the frame of image data sent from the output control section 200 (its luminance signal Y and chrominance signals R-Y and B-Y), together with a display request command instructing display of the image data, to the still-frame auxiliary unit. The signal processing section 147 of the auxiliary unit 2_ij serving as the still-frame auxiliary unit is structured as shown in Figure 34 and performs the virtual multi-viewpoint display processing shown in Figure 35. Therefore, out of the image data displayed in the master unit 1, the image data of a current frame that has changed very little from the preceding frame is sent to the still-frame auxiliary unit and displayed there.
On the other hand, when it is determined in step S228 that the sum of absolute differences of the current frame is not less than the threshold Th2, the processing proceeds to step S231, and the system controller 201 determines whether the sum of absolute differences of the current frame is greater than (or equal to or greater than) a threshold Th1, where the threshold Th1 is sufficiently larger than the threshold Th2.
When it is determined in step S231 that the sum of absolute differences of the current frame is not greater than the threshold Th1, the processing skips steps S232 and S233 and proceeds to step S234.
When it is determined in step S231 that the sum of absolute differences of the current frame is greater than the threshold Th1, in other words, when the image of the current frame has changed greatly from the image of the preceding frame and a scene change has therefore occurred at the current frame, the processing proceeds to step S232, and the system controller 201 controls the memory control section 197 in the same manner as in step S229 so as to send the luminance signal Y and the chrominance signals R-Y and B-Y of the current-frame image data stored in the frame memories 191 to 193 to the frame memories 194 to 196 and store them there in an overwriting manner. The processing proceeds to step S233.
In step S233, the system controller 201 controls the output control section 200 so as to read the luminance signal Y and the chrominance signals R-Y and B-Y of the frame of image data stored in the frame memories 194 to 196 and send them to the CPU 129. Also in step S233, the system controller 201 sends to the CPU 129 a display request command instructing that the image data be displayed in the scene-change auxiliary unit. The processing proceeds to step S234.
When the CPU 129 receives from the system controller 201 the display request command instructing display in the scene-change auxiliary unit, the CPU 129 controls the IEEE-1394 interface 133 so as to send the frame of image data sent from the output control section 200 (its luminance signal Y and chrominance signals R-Y and B-Y), together with the display request command, to the scene-change auxiliary unit. The signal processing section 147 of the auxiliary unit 2_ij serving as the scene-change auxiliary unit is structured as shown in Figure 34 and performs the virtual multi-viewpoint display processing shown in Figure 35. Therefore, out of the image data displayed in the master unit 1, the image data obtained immediately after a scene change is sent to the scene-change auxiliary unit and displayed there.
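The two-threshold classification of steps S227 to S233 can be summarized in a short sketch. The threshold values, the use of a per-pixel (normalized) difference, and the callback names are assumptions for illustration only.

```python
import numpy as np

TH2 = 1.0    # "almost zero" still-frame threshold (illustrative)
TH1 = 30.0   # scene-change threshold, sufficiently larger than TH2 (illustrative)


def route_frame(curr: np.ndarray, prev: np.ndarray, send_to_still, send_to_scene_change):
    """Steps S227-S233: classify the current frame by its difference from the previous
    frame and forward it to the still-frame or scene-change auxiliary unit."""
    diff = float(np.abs(curr.astype(np.int32) - prev.astype(np.int32)).mean())
    if diff < TH2:            # almost no change: still frame
        send_to_still(curr)
    elif diff > TH1:          # large change: scene change
        send_to_scene_change(curr)
    # otherwise: neither condition holds, and nothing is sent (steps S232/S233 skipped)
```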
In step S234, the system controller 201 determines whether a termination command for instructing termination of the virtual multi-viewpoint display processing has been received from the CPU 129.
When it is determined in step S234 that a termination command has not been received, the processing returns to step S221, and the same process is repeated thereafter.
When it is determined in step S234 that a termination command has been received, in other words, when, for example, the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and clicks the virtual-multi-viewpoint-display icon in the menu screen once more, thereby instructing the CPU 129 to terminate the virtual multi-viewpoint display processing, and the CPU 129 sends a termination command to the system controller 201, the virtual multi-viewpoint display processing in the master unit is terminated.
As described above, according to the embodiment shown in Figures 57 and 58, image data containing no motion in the program being received by the master unit 1 is displayed in the still-frame auxiliary unit, and image data obtained after a scene change in the program being received by the master unit 1 is displayed in the scene-change auxiliary unit.
In the case described above, one auxiliary unit serves as the still-frame auxiliary unit. A plurality of auxiliary units may instead serve as still-frame auxiliary units, and the plurality of auxiliary units may display the image data sent from the master unit 1 in order, as described with reference to Figures 37A to 37C. The same also applies to the scene-change auxiliary unit.
Next, Figure 59 shows a ninth example functional structure of the signal processing section 127 (Figure 10) in the master unit 1 which performs the virtual multi-viewpoint display processing.
The frame memory 241 receives the image data output from the MPEG video decoder 125 (Figure 10) and temporarily stores it. More specifically, the frame memory 241 has a storage capacity that allows it to store, for example, at least two frames of image data, and sequentially stores the image data such that the most recent image data overwrites the oldest stored frame.
In the present embodiment, as described above, the MPEG video decoder 125 outputs the luminance signal Y and the chrominance signals R-Y and B-Y as image data. In Figure 59, the luminance signal Y and the chrominance signals R-Y and B-Y are collectively referred to as image data.
The image data output from the MPEG video decoder 125 is sent, as it is, to the matrix circuit 128 (Figure 10) in the subsequent stage and to the frame memory 241.
In the embodiment shown in Figure 59, the audio data output from the MPEG audio decoder 126 is sent through the signal processing section 127 to the amplifier 137 in the subsequent stage, but the audio data path is not shown in Figure 59.
The N frame memories 242_1 to 242_N temporarily store image data that is held in the frame memory 241 and sent from the memory control section 243.
The memory control section 243 is controlled by the system controller 247; it sends the image data of the current frame stored in the frame memory 241 (its luminance signal Y and chrominance signals R-Y and B-Y) to one of the N frame memories 242_1 to 242_N and stores it there in an overwriting manner.
The still-frame detecting section 244 detects a still frame (image data that can be regarded as a still frame) in the image data stored in the frame memory 241, reads that image data from the frame memory 241, and sends it to the comparison section 245.
More specifically, the still-frame detecting section 244 obtains, for example, the sum of the absolute differences between the image data of the most recent frame (the current frame) stored in the frame memory 241 and the image data of the preceding frame, and when the sum of absolute differences is zero or close to zero, the still-frame detecting section 244 reads the image data of the current frame from the frame memory 241 as a still frame (containing no (or little) motion) and sends it to the comparison section 245.
The still-frame detecting section 244 may also detect the image data of the last frame as a still frame only when the sums of absolute differences of several consecutive frames are each zero or close to zero.
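The consecutive-frame variant just mentioned can be sketched as follows. The window length, the threshold, and the use of a per-pixel (normalized) difference are assumptions made for illustration.

```python
from collections import deque
import numpy as np


class StillFrameDetector:
    """Declares a still frame only when the differences of several consecutive frames
    are all (close to) zero, as in the variant described above."""

    def __init__(self, eps: float = 1.0, window: int = 3):
        self.eps = eps
        self.recent_diffs = deque(maxlen=window)
        self.prev = None

    def feed(self, frame: np.ndarray):
        """Returns the frame if it is judged to be a still frame, otherwise None."""
        if self.prev is not None:
            diff = float(np.abs(frame.astype(np.int32) - self.prev.astype(np.int32)).mean())
            self.recent_diffs.append(diff)
        self.prev = frame
        if (len(self.recent_diffs) == self.recent_diffs.maxlen
                and all(d <= self.eps for d in self.recent_diffs)):
            return frame
        return None
```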
The comparison section 245 compares the still-frame image data sent from the still-frame detecting section 244 with the image data stored in each of the frame memories 242_1 to 242_N (determines whether the two sets of image data match), and sends the comparison result to the system controller 247.
More specifically, the comparison section 245 obtains, for example, the sum of the absolute differences between the still-frame image data sent from the still-frame detecting section 244 and the image data stored in each of the frame memories 242_1 to 242_N, and sends the comparison result to the system controller 247.
Under the control of the system controller 247, the output control section 246 reads the frame of image data stored in a frame memory 242_n and sends the data to the CPU 129.
The system controller 247 controls the memory control section 243 and the output control section 246 in accordance with control from the CPU 129 and the output of the comparison section 245.
Next, virtual multi-viewpoint display processing performed in the master unit 1 by the signal processing section 127 shown in Figure 59 is described with reference to the flowchart of Figure 60.
First, in step S241, the system controller 247 sets the maximum number N of still-frame auxiliary units to the total number of auxiliary units 2 forming the scalable television system. Therefore, in the present embodiment, the maximum number N of still-frame auxiliary units is set to 8 in step S241.
The processing then proceeds to step S242, and the system controller 247 sets the storage flags flg(1) to flg(N), which indicate whether the frame memories 242_1 to 242_N respectively store image data, to a value indicating that no image data is stored yet, for example, "0". The processing proceeds to step S243.
The storage flags flg(1) to flg(N) are stored in a memory (not shown) built into the system controller 247.
In step S243, the system controller 247 determines whether the CPU 129 has designated still-frame auxiliary units.
More specifically, in the embodiment shown in Figure 60, all the auxiliary units 2 forming the scalable television system are designated as still-frame auxiliary units by default. The user can designate which auxiliary units 2 serve as still-frame auxiliary units. For example, the user can operate the menu button switch 84 on the controller 15 (Fig. 7) to display the menu screen, and click an icon for designating still-frame auxiliary units to designate them. In step S243, it is therefore determined whether the icon for designating still-frame auxiliary units has been clicked.
When it is determined in step S243 that still-frame auxiliary units have been designated, in other words, when the user has operated the controller 15, clicked the icon for designating still-frame auxiliary units, and designated one or more auxiliary units 2 as still-frame auxiliary units, so that the CPU 129 has output to the system controller 247 a command designating the one or more auxiliary units 2 as still-frame auxiliary units, the processing proceeds to step S244, and the system controller 247 sets the designated one or more auxiliary units 2 as still-frame auxiliary units (recognizes them as still-frame auxiliary units). The processing then proceeds to step S245.
In step S245, the system controller 247 sets the maximum number N of still-frame auxiliary units again, this time to the number of auxiliary units designated as still-frame auxiliary units by the CPU 129. The processing proceeds to step S248.
On the other hand, when it is determined in step S243 that still-frame auxiliary units have not been designated, the processing proceeds to step S246, and the system controller 247 determines whether the CPU 129 has sent a clear request for clearing the image data stored in a frame memory 242_n.
As described later, a frame memory 242_n stores still-frame image data received by the master unit 1 in the past; thereafter, when the master unit 1 newly receives still-frame image data similar to the still-frame image data stored in the frame memory 242_n, the stored contents of the frame memory 242_n are updated. Therefore, while a frame memory 242_n stores still-frame image data, only still-frame image data similar to the stored still-frame image data is stored in that frame memory 242_n.
In the embodiment shown in Figure 60, the user can clear the stored contents of a frame memory 242_n by operating the controller 15. In step S246, it is determined whether the user has operated the controller 15 to clear a frame memory 242_n, so that the CPU 129 has sent to the system controller 247 a clear request for clearing the image data stored in that frame memory 242_n.
The request for clearing a frame memory 242_n can be made, for example, on the menu screen.
When it is determined in step S246 that a request for clearing a frame memory 242_n has not been sent, the processing skips step S247 and proceeds to step S248.
When it is determined in step S246 that a request for clearing a frame memory 242_n has been sent, in other words, when the system controller 247 receives from the CPU 129 a command instructing that the frame memory 242_n be cleared, the processing proceeds to step S247, and the system controller 247 sets the storage flag flg(n) to zero. The processing then proceeds to step S248.
In step S248, the frame memory 241 waits for a new frame of image data to be sent from the MPEG video decoder 125 (Figure 10) and stores the image data. The processing proceeds to step S249.
In step S249, the still-frame detecting section 244 determines whether the image data of the current frame stored in the frame memory 241 in the immediately preceding step S248 shows a still frame. When it is determined that the image data does not show a still frame, the processing skips steps S250 to S259 and proceeds to step S260.
When it is determined in step S249 that the image data of the current frame shows a still frame, the still-frame detecting section 244 reads the still-frame image data of the current frame from the frame memory 241 and sends it to the comparison section 245. The processing proceeds to step S250. In step S250, the system controller 247 sets a variable n, which identifies the frame memory 242_n to be processed among the frame memories 242_1 to 242_N, to zero. The processing proceeds to step S251. In step S251, the system controller 247 increments the variable n by 1. The processing proceeds to step S252, and the system controller 247 determines whether the storage flag flg(n) is zero.
When it is determined in step S252 that the storage flag flg(n) is zero, in other words, when the frame memory 242_n does not yet store image data, the processing proceeds to step S253, and the system controller 247 sets the storage flag flg(n) to a value indicating that the frame memory 242_n stores still-frame image data, for example, "1". The processing proceeds to step S254.
In step S254, the system controller 247 controls the memory control section 243 so as to send the image data of the current frame stored in the frame memory 241 (its luminance signal Y and chrominance signals R-Y and B-Y) to the frame memory 242_n and store it there in an overwriting manner. The processing proceeds to step S258.
In step S258, the system controller 247 controls the output control section 246 so as to read the frame of image data from the frame memory 242_n and send it to the CPU 129. Also in step S258, the system controller 247 sends to the CPU 129 a display request command instructing that the image data stored in the frame memory 242_n be displayed in the corresponding still-frame auxiliary unit. The processing proceeds to step S259.
When the CPU 129 receives from the system controller 247 the display request command instructing display of the image data stored in the frame memory 242_n, the CPU 129 controls the IEEE-1394 interface 133 so as to send the frame of image data sent from the output control section 246 (its luminance signal Y and chrominance signals R-Y and B-Y), together with a display request command instructing display of the image data, to the auxiliary unit 2_ij which serves as a still-frame auxiliary unit and corresponds to the frame memory 242_n.
More specifically, the number N of the frame memories 242_1 to 242_N is equal to the number (maximum number) N of the auxiliary units 2 serving as still-frame auxiliary units. Immediately after step S241 or step S244, the system controller 247 assigns one of the auxiliary units 2 serving as still-frame auxiliary units to each frame memory 242_n, so that one frame memory 242_n corresponds to one auxiliary unit 2 serving as a still-frame auxiliary unit.
When the CPU 129 receives from the system controller 247 the display request command instructing display of the image data stored in the frame memory 242_n, the CPU 129 therefore sends the display request command to the auxiliary unit corresponding to that frame memory 242_n.
The signal processing section 147 of an auxiliary unit 2 serving as a still-frame auxiliary unit is structured as shown in Figure 34 and performs the virtual multi-viewpoint display processing shown in Figure 35. Therefore, when the auxiliary unit 2 receives the image data stored in the frame memory 242_n together with the display request command, it displays the image data stored in the frame memory 242_n.
On the other hand, when it is determined in step S252 that the storage flag flg(n) is not zero, in other words, when the frame memory 242_n already stores image data, the processing proceeds to step S255, and the comparison section 245 compares the still-frame image data of the current frame sent from the still-frame detecting section 244 with the image data stored in the frame memory 242_n, obtains the sum of absolute differences (the sum of absolute differences of the current frame) as a feature of the current-frame image data, and sends the sum to the system controller 247.
When the system controller 247 receives the sum of absolute differences of the current frame from the comparison section 245, the system controller 247 determines in step S256 whether the sum of absolute differences of the current frame is almost zero, that is, equal to or less than (or less than) a small positive threshold.
When it is determined in step S256 that the sum of absolute differences of the current frame is neither zero nor close to zero, the processing skips steps S257 and S258 and proceeds to step S259.
When it is determined in step S256 that the sum of absolute differences of the current frame is zero or close to zero, in other words, when the still-frame image data of the current frame received by the master unit 1 has a pattern similar to the image data stored in the frame memory 242_n, the processing proceeds to step S257, and the system controller 247 controls the memory control section 243 so as to send the still-frame image data of the current frame stored in the frame memory 241 to the frame memory 242_n and store it there in an overwriting manner, thereby updating the stored contents of the frame memory 242_n.
The processing then proceeds to step S258, and, as described above, the system controller 247 controls the output control section 246 so as to read the frame of image data stored in the frame memory 242_n and send it to the CPU 129. Also in step S258, the image data stored in the frame memory 242_n and the display request command instructing that it be displayed in the still-frame auxiliary unit are sent to the CPU 129. The processing proceeds to step S259.
In this way, as described above, the image data of the current frame newly stored in the frame memory 242_n is sent from the master unit 1 to the auxiliary unit 2 corresponding to that frame memory 242_n and displayed there.
At step S259, whether system controller 247 decision variables " n " equal the maximum number N of the auxilliary unit of still frame.When judging that they are unequal, handle and turn back to step S251, and repeat identical process later on.
When equaling the maximum number N of the auxilliary unit of still frame at step S259 decision variable " n ", in other words, when rating unit 245 is finished the current frame image data of the demonstration still frame that is stored in the frame memory 241 and is stored in each frame memory 242 1To 242 NIn view data between comparison the time, handle to enter step S260, and whether decision-making system controller 247 receives to be used to indicate and stops virtual many viewpoints and show the termination order of handling from CPU 129.
When judging that at step S260 not receiving termination as yet orders, handle and turn back to step S243, and repeat identical process later on.
When judging that at step S260 receiving termination orders, in other words, for example, when user's operation control 15 (Fig. 7) thus with at display menu screen on the CRT 11 and in menu screen, click virtual many viewpoints display icon indication CPU 129 once more and stop virtual many viewpoints and show and handle and CPU 129 will stop order and send to system controller 247 time, in master unit, stop virtual many viewpoints and show processing.
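The loop of steps S251 through S260 can be summarized compactly. The following Python sketch is only an illustration of that flow under assumed data types; the helper names (frame_memories, send_to_subsidiary) stand in for the frame memories 242_1 to 242_N and for the display request command sent over the IEEE-1394 interface, and the threshold value is an arbitrary placeholder.

```python
import numpy as np

SAD_THRESHOLD = 1000  # the "small positive threshold" of step S256 (assumed value)

def absolute_difference_sum(frame_a, frame_b):
    """Feature used by the comparison section 245: sum of absolute pixel differences."""
    return int(np.abs(frame_a.astype(np.int32) - frame_b.astype(np.int32)).sum())

def update_still_frame_memories(current_still_frame, frame_memories, send_to_subsidiary):
    """One pass over frame memories 242_1 .. 242_N (steps S251 to S259).

    frame_memories: list of N entries, each None (flg(n) == 0) or a stored frame.
    send_to_subsidiary(n, frame): stands in for the display request command sent
    to the still-frame subsidiary unit assigned to frame memory 242_n.
    """
    for n, stored in enumerate(frame_memories):
        if stored is None:
            continue  # flg(n) == 0: nothing stored yet; handled by the other branch
        sad = absolute_difference_sum(current_still_frame, stored)   # step S255
        if sad <= SAD_THRESHOLD:                                     # step S256
            frame_memories[n] = current_still_frame.copy()           # step S257: overwrite
            send_to_subsidiary(n, frame_memories[n])                 # step S258: redisplay
```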
In the virtual multi-viewpoint display processing shown in Figure 58, as shown in Figure 61A, for example, after the master unit 1 starts displaying a live broadcast of an MLB baseball game, when the master unit 1 displays still-picture image data showing the scoreboard in close-up, the image data of the scoreboard is stored in the frame memory 242_1, sent to the still-frame subsidiary unit corresponding to the frame memory 242_1, and displayed.
Specifically, for example, when the still-frame subsidiary unit corresponding to the frame memory 242_1 is the subsidiary unit 2_11, the frame of scoreboard image data stored in the frame memory 242_1 is displayed in the subsidiary unit 2_11, as shown in Figure 61B.
Further, after that, for example, if the master unit 1 displays still-picture image data showing a baseball player being interviewed in close-up, the still-picture image data showing the player in close-up is stored in the frame memory 242_2, sent to the subsidiary unit corresponding to the frame memory 242_2, and displayed.
Specifically, for example, when the still-frame subsidiary unit corresponding to the frame memory 242_2 is the subsidiary unit 2_12, the frame of the player's image data stored in the frame memory 242_2 is displayed in the subsidiary unit 2_12, as shown in Figure 61B.
Then, after that, for example, when the master unit 1 again displays still-picture image data showing the scoreboard in close-up, the contents of the frame memory 242_1 are updated with that scoreboard image data, and the updated image data in the frame memory 242_1 is sent to the still-frame subsidiary unit 2_11 corresponding to the frame memory 242_1 and displayed. In this case, the scoreboard image data displayed in the subsidiary unit 2_11 is updated to the most recent version.
Therefore, the user can view the most recent image data of the various still pictures displayed in the master unit 1.
Next, Figure 62 shows another example electrical structure of the master unit 1. In the figure, parts corresponding to those shown in Figure 10 are denoted by the same symbols as used in Figure 10, and descriptions thereof are omitted where unnecessary.
The master unit 1 shown in Figure 10 is a television receiver for receiving digital broadcasts, whereas the master unit 1 shown in Figure 62 is a television receiver for receiving analog broadcasts.
A tuner 251 detects and demodulates the analog television broadcast signal on a predetermined channel. An A/D conversion section 252 applies A/D conversion to the output of the tuner 251, sends the image data in the A/D conversion result to a Y/C separation section 253, and sends the audio data to the signal processing section 127.
The Y/C separation section 253 separates the luminance signal Y and the chrominance signals R-Y and B-Y from the output of the tuner 251 and sends them to the signal processing section 127.
Even a television receiver for receiving analog broadcasts, configured in the manner described above, can serve as the master unit 1 of a scalable television system.
Like the television receiver shown in Figure 62 serving as the master unit, a television receiver for receiving analog broadcasts can also be used as a television receiver serving as a subsidiary unit 2.
In the above-described embodiments, the selling price of a television receiver that forms the scalable television system may be set differently according to, for example, whether the television receiver can serve as a master unit or only as a subsidiary unit and, when it serves as a subsidiary unit, how many subsidiary units are already in use before it.
Since, as described above, the virtual multi-viewpoint display function cannot be provided in the scalable television system without a master unit, the master unit has high value, and its selling price can be set high.
It can be expected that the user will additionally buy a plurality of subsidiary units as needed after buying the master unit. The selling price of the first few subsidiary units can be set, for example, lower than that of the master unit but higher than that of an ordinary television receiver. The selling price of subsidiary units bought later can be set even lower.
In the above-described embodiments, a television receiver serving as the master unit in the scalable television system can be constructed, for example, by adding the signal processing section 127 to an ordinary digital television receiver and changing the program executed by the CPU 129. Since the television receiver serving as the master unit in the scalable television system can therefore be manufactured relatively easily from an ordinary digital television receiver, it offers high cost efficiency considering that it provides advanced functions such as the virtual multi-viewpoint display of the scalable television system described above. The same applies to a television receiver serving as a subsidiary unit.
In addition, in the above-described embodiments, the image data sent from the master unit 1 (or the image data received by the tuner 141 (Figure 11) according to the freeze command sent from the master unit 1) is stored in the frame memories 214 to 216 (Figure 34) of the subsidiary unit 2 by overwriting. Alternatively, the subsidiary unit 2 may, for example, be provided with a hard disk or the like, record the image data sent from the master unit 1 on the hard disk, and reproduce the image data later according to a user instruction.
In the above-described embodiments, the master unit 1 sends one frame of image data to the subsidiary unit 2, and it is displayed in the subsidiary unit 2. Alternatively, the master unit 1 may send a plurality of frames of image data to the subsidiary unit 2, and the subsidiary unit 2 may repeatedly display the scene formed by the plurality of frames of image data. Likewise, when the image data received by the tuner (Figure 11) of the subsidiary unit 2 according to the freeze command sent from the master unit 1 is displayed in the subsidiary unit 2, a scene formed by a plurality of frames of image data may be repeatedly displayed in the same manner. A scene may be one frame of image data or one field of image data, or may be the plurality of frames of image data from the frame immediately after a scene change to the frame of the next scene change.
Furthermore, in the above-described embodiments, the virtual multi-viewpoint display processing is applied to a television program received by the master unit 1. The virtual multi-viewpoint display processing can also be applied, for example, to image data and audio data sent to the master unit 1 from a VTR or other external equipment for display.
Next, Figures 63A and 63B are perspective views showing an example structure of another embodiment of a scalable television system to which the present invention is applied. In the figures, parts corresponding to those shown in Figures 1A and 1B are denoted by the same symbols as used in Figures 1A and 1B, and descriptions thereof are omitted where unnecessary. In other words, the scalable television systems shown in Figures 63A and 63B basically have the same structure as the scalable television system shown in Figures 1A and 1B.
In the embodiment shown in Figure 63A, the scalable television system is formed by nine television receivers 1, 2_11, 2_12, 2_13, 2_21, 2_23, 2_31, 2_32, and 2_33. In the embodiment shown in Figure 63B, the scalable television system is formed by 25 television receivers 1, 2_11, 2_12, 2_13, 2_14, 2_15, 2_21, 2_22, 2_23, 2_24, 2_25, 2_31, 2_32, 2_34, 2_35, 2_41, 2_42, 2_43, 2_44, 2_45, 2_51, 2_52, 2_53, 2_54, and 2_55.
The number of television receivers forming the scalable television system is not limited to 9 or 25. In other words, the scalable television system can be formed by any plural number of television receivers. The arrangement of the television receivers forming the scalable television system is not limited to the 3 × 3 or 5 × 5 horizontal-by-vertical arrangements shown in Figure 63A or Figure 63B. In other words, the arrangement may be another horizontal-by-vertical arrangement, such as 1 × 2, 2 × 1, or 2 × 3. Furthermore, the arrangement of the television receivers forming the scalable television system is not limited to the matrix shapes shown in Figures 63A and 63B, and may be a pyramid shape.
Since the scalable television system can thus be formed by any plural number of television receivers in any horizontal-by-vertical arrangement, it can be called a "scalable" system.
Among the television receivers forming the scalable television system, there are two types: the main television receiver (called the master unit where necessary), which can control other television receivers, and the subsidiary television receivers (called subsidiary units where necessary), which can be controlled by another television receiver but cannot control other television receivers.
In any of the scalable television systems, as described later, full-screen display is possible, in which an image is displayed on all the display screens of all the television receivers forming the scalable television system.
To perform full-screen display in a scalable television system, the scalable television system needs to have television receivers compatible with the scalable television system (hereinafter called scalable-compatible units), and one of them needs to be a master unit. Therefore, in each of the embodiments shown in Figures 63A and 63B, among the television receivers forming the scalable television system, the television receiver located at the center, for example, serves as the master unit 1.
For the above reasons, when a television receiver that is not a scalable-compatible unit is used among the television receivers forming the scalable television system, full-screen display cannot be provided. In addition, even when all the television receivers forming the scalable television system are scalable-compatible units, if they are all subsidiary units, full-screen display cannot be provided either.
Therefore, to use the full-screen display function of the scalable television system, the user needs to buy at least one or more master units, or one master unit and one or more subsidiary units.
Since a master unit can also serve as a subsidiary unit, a plurality of master units may be used among the television receivers forming the scalable television system.
In the embodiment shown in Figure 63A, the television receiver 1 located at the center (second from the left and second from the top) of the 3 × 3 television receivers serves as the master unit, and the other eight television receivers 2_11, 2_12, 2_13, 2_21, 2_23, 2_31, 2_32, and 2_33 serve as subsidiary units. In the embodiment shown in Figure 63B, the television receiver 1 located at the center (third from the left and third from the top) of the 5 × 5 television receivers serves as the master unit, and the other 24 television receivers 2_11, 2_12, 2_13, 2_14, 2_15, 2_21, 2_22, 2_23, 2_24, 2_25, 2_31, 2_32, 2_34, 2_35, 2_41, 2_42, 2_43, 2_44, 2_45, 2_51, 2_52, 2_53, 2_54, and 2_55 serve as subsidiary units.
Therefore, in each of the embodiments shown in Figures 63A and 63B, the master unit 1 is located at the center of the television receivers forming the scalable television system. The position of the master unit 1, however, is not limited to the center; the master unit 1 may be located at any position, such as the upper left corner or the lower right corner.
To simplify the following description, it is assumed that the scalable television system is formed by the 3 × 3 television receivers shown in Figure 63A, and that the master unit 1 is located at the center of the television receivers forming the scalable television system.
The suffixes i and j of a subsidiary unit 2_ij forming the scalable television system indicate that the subsidiary unit 2_ij is located in the i-th row and j-th column of the scalable television system (the i-th row from the top and the j-th column from the left).
Hereinafter, when there is no need to distinguish the subsidiary units 2_ij, they are collectively referred to as subsidiary units 2.
Since the external structures of the master unit 1 and the subsidiary units 2_ij forming the scalable television systems shown in Figures 63A and 63B are the same as those described with reference to Fig. 2, Figs. 3A to 3F, Fig. 4, and Figs. 5A to 5F, descriptions thereof are omitted. The master unit 1 and the subsidiary units 2_ij are provided with the controller 15 (Fig. 2) and the controller 35 (Fig. 4), respectively, as accessories; since their structures are the same as those described with reference to Figs. 7 to 9, descriptions thereof are omitted.
Next, Figure 64 shows an example electrical structure of the master unit 1 used in Figures 63A and 63B.
A television broadcast signal received by an antenna (not shown) is sent to a tuner 1121 and is detected and demodulated under the control of a CPU 1129. The output of the tuner 1121 is sent to a QPSK (quadrature phase shift keying) demodulation circuit 1122 and is QPSK-demodulated under the control of the CPU 1129. The output of the QPSK demodulation circuit 1122 is sent to an error correction circuit 1123, where errors are detected and corrected under the control of the CPU 1129, and the output is sent to a demultiplexer 1124.
The demultiplexer 1124 descrambles the output of the error correction circuit 1123 where necessary under the control of the CPU 1129, and extracts TS (transport stream) packets of a predetermined channel. The demultiplexer 1124 then sends image-data (video) TS packets to an MPEG (Moving Picture Experts Group) video decoder 1125 and sends audio-data TS packets to an MPEG audio decoder 1126. The demultiplexer 1124 also sends TS packets included in the output of the error correction circuit 1123 to the CPU 1129 where necessary. The demultiplexer 1124 further receives image data or audio data (including data in TS packets) sent from the CPU 1129 and sends it to the MPEG video decoder 1125 or the MPEG audio decoder 1126.
The MPEG video decoder 1125 MPEG-decodes the image-data TS packets sent from the demultiplexer 1124 and sends the result to a selector 1127. The MPEG audio decoder 1126 MPEG-decodes the audio-data TS packets sent from the demultiplexer 1124. The left-channel and right-channel audio data obtained by the decoding in the MPEG audio decoder 1126 are sent to the selector 1127.
Under the control of the CPU 1129, the selector 1127 selects the image data output from the MPEG video decoder 1125, the image data output from the CPU 1129, or the image data sent from a security system section 1137, and sends it to an NTSC (National Television System Committee) encoder 1128. The NTSC encoder 1128 converts the image data sent from the selector 1127 into NTSC image data and sends it to the CRT 11 for display. Under the control of the CPU 1129, the selector 1127 also selects the left-channel and right-channel audio data sent from the MPEG audio decoder 1126, the audio data sent from the CPU 1129, or the audio data sent from the security system section 1137, and sends it to an amplifier 1138.
The CPU 1129 performs various types of processing according to programs stored in an EEPROM (electrically erasable programmable read-only memory) 1130 and a ROM (read-only memory) 1131 to control, for example, the tuner 1121, the QPSK demodulation circuit 1122, the error correction circuit 1123, the demultiplexer 1124, the selector 1127, an IEEE-1394 interface 1133, a modem 1136, and the security system section 1137. In addition, the CPU 1129 sends data sent from the demultiplexer 1124 to the IEEE-1394 interface 1133, and sends data sent from the IEEE-1394 interface 1133 to the demultiplexer 1124 and the selector 1127. The CPU 1129 also performs processing corresponding to commands sent from a front panel 1134 or an IR receiving section 1135. Furthermore, the CPU 1129 controls the modem 1136 to access a server (not shown) through a telephone line and acquire updated programs and necessary data.
The EEPROM 1130 stores data and programs that need to be preserved even after the power is turned off. The ROM 1131 stores, for example, an IPL (initial program loader) program. The data and programs stored in the EEPROM 1130 can be updated by overwriting.
The RAM 1132 temporarily stores data and programs required for the operation of the CPU 1129.
The IEEE-1394 interface 1133 is connected to the terminal panel 21 (its IEEE-1394 terminals 21_11 to 21_33 (Fig. 3F)) and serves as an interface for performing communication conforming to the IEEE 1394 standard. Thus, the IEEE-1394 interface 1133 sends data received from the CPU 1129 to the outside according to the IEEE 1394 standard, and receives data sent from the outside according to the IEEE 1394 standard and sends it to the CPU 1129.
The front panel 1134 is provided at a certain position on the front of the master unit 1. The front panel 1134 includes some of the push-button switches provided for the controller 15 (Fig. 7). When a push-button switch on the front panel 1134 is operated, an operation signal corresponding to the operation is sent to the CPU 1129. In this case, the CPU 1129 performs processing corresponding to the operation signal sent from the front panel 1134.
The IR receiving section 1135 receives infrared light emitted from the controller 15 in response to an operation on the controller 15. The IR receiving section 1135 photoelectrically converts the received infrared light and sends the resulting signal to the CPU 1129. In this case, the CPU 1129 performs processing corresponding to the signal sent from the IR receiving section 1135, that is, processing corresponding to the operation on the controller 15.
The modem 1136 controls communication through a telephone line, thereby sending data received from the CPU 1129 through the telephone line, and receiving data sent through the telephone line and sending it to the CPU 1129.
The security system section 1137 is formed by a security controller 1137A, a wireless interface 1137B, a data processing section 1137C, and a warning processing section 1137D, and performs, under the control of the CPU 1129, various types of processing for warning the user of the occurrence of an emergency (an unusual event), described later.
More specifically, the security controller 1137A controls the wireless interface 1137B, the data processing section 1137C, and the warning processing section 1137D under the control of the CPU 1129.
The wireless interface 1137B serves as an interface for performing wireless communication; it receives image (moving-image) data and audio data sent from a video camera 1162 (Figure 67), described later, and sends them to the selector 1127 and the data processing section 1137C. For example, a NIC (network interface card) that performs communication by a so-called wireless LAN specified in IEEE 802.11 can be adopted as the wireless interface 1137B.
A unit that performs wireless communication according to a standard other than IEEE 802.11 can also be adopted as the wireless interface 1137B. However, the wireless interface 1137B preferably conforms to a standard having a transmission band sufficient for sending and receiving moving-image data.
The data processing section 1137C detects, at predetermined intervals, a feature quantity of the image data or audio data sent from the wireless interface 1137B, and sends it to the warning processing section 1137D.
The warning processing section 1137D analyzes the temporal fluctuation of the feature quantity of the image data or audio data sent from the data processing section 1137C. In addition, the warning processing section 1137D judges whether the temporal fluctuation of the feature quantity of the image data or audio data satisfies a predetermined condition, and requests the security controller 1137A to warn the user according to the judgment result.
The amplifier 1138 amplifies the audio data sent from the selector 1127 where necessary and sends it to the speaker units 12L and 12R. The amplifier 1138 has a D/A (digital-to-analog) converter, applies D/A conversion to the audio data where necessary, and outputs it.
The master unit 1 structured as described above outputs the image and sound of a television program (displays the image and outputs the sound) in the following manner.
A transport stream serving as the television broadcast signal received by the antenna is sent to the demultiplexer 1124 through the tuner 1121, the QPSK demodulation circuit 1122, and the error correction circuit 1123. The demultiplexer 1124 extracts the TS packets of a predetermined program from the transport stream, sends the image-data TS packets to the MPEG video decoder 1125, and sends the audio-data TS packets to the MPEG audio decoder 1126.
The MPEG video decoder 1125 MPEG-decodes the TS packets sent from the demultiplexer 1124. The image data thus obtained is sent from the MPEG video decoder 1125 to the CRT 11 through the selector 1127 and the NTSC encoder 1128 and displayed.
The MPEG audio decoder 1126 MPEG-decodes the TS packets sent from the demultiplexer 1124. The audio data thus obtained is sent from the MPEG audio decoder 1126 to the speaker units 12L and 12R through the selector 1127 and the amplifier 1138 and output.
In the master unit 1, the IEEE-1394 interface 1133 receives TS packets sent from another unit. Among these TS packets, image-data TS packets and audio-data TS packets are sent to the MPEG video decoder 1125 and the MPEG audio decoder 1126 through the CPU 1129 and the demultiplexer 1124, and the image data and audio data are output (displayed) in the same way as those of the television broadcast signal.
In the master unit 1, the wireless interface 1137B of the security system section 1137 receives image data and audio data sent from another unit. The image data received by the wireless interface 1137B is sent to the CRT 11 through the selector 1127 and the NTSC encoder 1128 and displayed. The audio data received by the wireless interface 1137B is sent to the speaker units 12L and 12R through the selector 1127 and the amplifier 1138 and output.
Therefore, in the embodiment shown in Figure 64, the master unit 1 has three input systems for receiving image data and audio data: the tuner 1121, the IEEE-1394 interface 1133, and the wireless interface 1137B. Although not shown in Figure 64, the input terminal 23 is provided on the master unit shown in Fig. 3F. Therefore, when the input terminal 23 is included, the master unit 1 has four input systems.
The number of input systems provided for the master unit 1 is not limited to the above.
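As a rough illustration only, the selector 1127 can be modeled as a multiplexer that routes one of these input systems to the display path under CPU control. The source names in the sketch below are taken from the description above, but the selection logic itself is an assumption, not the actual circuit behavior.

```python
from enum import Enum

class VideoSource(Enum):
    TUNER = "mpeg_video_decoder"    # broadcast path through tuner 1121
    IEEE1394 = "cpu"                # data arriving over the IEEE-1394 interface 1133
    WIRELESS = "security_system"    # camera data through the wireless interface 1137B

class Selector:
    """Conceptual model of selector 1127: the CPU chooses which source feeds the NTSC encoder."""
    def __init__(self):
        self.current = VideoSource.TUNER

    def select(self, source: VideoSource):
        # Corresponds to the CPU 1129 switching the selected input system.
        self.current = source

    def route(self, frames_by_source):
        # Forward only the currently selected source's frame to the display path.
        return frames_by_source.get(self.current)
```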
Next, Figure 65 shows an example electrical structure of a subsidiary unit 2 used in Figures 63A and 63B.
Since the subsidiary unit 2 is formed by a tuner 1141 through an amplifier 1158 having the same structures as the tuner 1121 through the amplifier 1138 shown in Figure 64, descriptions thereof are omitted.
Next, Figure 66 shows an example structure of a security system using the scalable television system shown in Figures 63A and 63B.
In the embodiment shown in Figure 66, the security system is formed by a scalable television system 1161 having a plurality of television receivers and three video cameras 1162_1, 1162_2, and 1162_3.
The scalable television system 1161 is formed, for example, in the same manner as the scalable television system shown in Figure 63A.
The video cameras 1162_1, 1162_2, and 1162_3 are, for example, digital video cameras; each captures images, picks up sound, and wirelessly sends the resulting image data and audio data to a predetermined television receiver forming the scalable television system 1161.
The security system in the embodiment shown in Figure 66 is formed by the three video cameras 1162_1 to 1162_3. The number of video cameras included in the security system is not limited to three; the security system may be provided with one or more video cameras.
In the security system, the maximum number of television receivers that can receive image data and audio data from the video cameras 1162 equals the number of television receivers forming the scalable television system 1161. Therefore, when the security system is provided with more video cameras 1162 than there are television receivers forming the scalable television system 1161, the scalable television system 1161 cannot receive the image data and audio data sent from the video cameras 1162 that do not correspond to any television receiver. Each television receiver of the scalable television system 1161 can also switch the video camera from which it receives image data and audio data.
Hereinafter, when there is no need to distinguish the video cameras 1162_1 to 1162_3, they are collectively referred to as video cameras 1162.
Next, Figure 67 shows an example structure of the video camera 1162 shown in Figure 66.
Light from an object is incident on an optical system 1171 formed by a lens, a focus adjusting mechanism, an aperture adjusting mechanism, and the like, and forms an image on the light-receiving surface of a CCD (charge-coupled device) 1172. The CCD 1172 photoelectrically converts the light sent from the optical system 1171 and sends the resulting electric signal, as image data, to an amplifier 1173. The amplifier 1173 amplifies the image data sent from the CCD 1172 and sends it to an A/D (analog/digital) conversion section 1174. The A/D conversion section 1174 samples and quantizes (A/D-converts) the analog image data sent from the amplifier 1173 to convert it into digital image data, and sends it to a memory 1175. The memory 1175 temporarily stores the image data sent from the A/D conversion section 1174.
A microphone 1176 converts sound, that is, vibrations of the air around the microphone, into an electric audio signal and sends it to an amplifier 1177. The amplifier 1177 amplifies the audio data sent from the microphone 1176 and sends it to an A/D conversion section 1178. The A/D conversion section 1178 applies A/D conversion to the analog audio data sent from the amplifier 1177 to obtain digital audio data, and sends it to a memory 1179. The memory 1179 temporarily stores the audio data sent from the A/D conversion section 1178.
A wireless interface 1180 is an interface of the same type as the wireless interface 1137B described with reference to Figure 64, and wirelessly sends the image data and audio data stored in the memories 1175 and 1179 to a predetermined television receiver forming the scalable television system 1161.
The wireless communication between the wireless interfaces 1137B and 1180 can be performed, for example, by TCP/IP (Transmission Control Protocol/Internet Protocol). In this case, IP addresses can be used to specify the television receivers of the scalable television system to which the video cameras 1162_1 to 1162_3 send their image data and audio data.
The video camera 1162 may be a portable camera; in that case, the user can easily install the video camera 1162 so as to capture a desired location.
The video camera 1162 may also encode the image data and audio data by MPEG or another method and send the encoded data.
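Conceptually, such a camera digitizes video and sound, buffers them, and pushes them over the wireless LAN to the television receiver identified by an IP address. The sketch below assumes plain TCP sockets and a simple length-prefixed framing, neither of which is specified here; capture_image and capture_audio are placeholders for the digitizing chain described above.

```python
import socket
import struct

def send_chunk(sock, payload: bytes):
    """Length-prefixed framing (an assumption; the actual on-air format is not specified)."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def camera_loop(capture_image, capture_audio, receiver_ip: str, port: int = 5000):
    """Conceptual loop for video camera 1162: digitize, buffer, and send the data to one
    television receiver of the scalable television system, selected by its IP address."""
    with socket.create_connection((receiver_ip, port)) as sock:
        while True:
            image = capture_image()   # stands in for optical system -> CCD -> A/D -> memory 1175
            audio = capture_audio()   # stands in for microphone -> amplifier -> A/D -> memory 1179
            if image is None:
                break                 # no more frames to send
            send_chunk(sock, image)
            send_chunk(sock, audio)
```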
Next, the processing of the master unit 1 (Figure 64) serving as a television receiver forming the scalable television system 1161 shown in Figure 66 will be described with reference to the flowchart shown in Figure 68.
First, at step S1041, the CPU 1129 judges whether an event has occurred in which a unit is connected to the terminal panel 21 or a command is sent through the IEEE-1394 interface 1133 or the IR receiving section 1135. When it is judged that no event has occurred, the processing returns to step S1041.
When it is judged at step S1041 that an event has occurred in which a unit is connected to the terminal panel 21, the processing proceeds to step S1042, and the CPU 1129 performs the authentication processing shown in Figure 69, described later. Then, the processing returns to step S1041.
To judge whether a unit has been connected to the terminal panel 21, the state in which a unit is connected to the terminal panel 21 needs to be detected. This detection is performed, for example, in the following manner.
When a unit is connected (by an IEEE-1394 cable) to an IEEE-1394 terminal 21_ij provided on the terminal panel 21 (Fig. 3F), the terminal voltage of the IEEE-1394 terminal 21_ij changes. The IEEE-1394 interface 1133 is configured to report this change in the terminal voltage to the CPU 1129. The CPU 1129 receives the report of the terminal voltage change from the IEEE-1394 interface 1133 and thereby detects that a new unit has been connected to the terminal panel 21. The CPU 1129 also recognizes, for example by the same method, that a unit has been disconnected from the terminal panel 21.
When it is judged at step S1041 that an event has occurred in which a command is sent through the IEEE-1394 interface 1133 or the IR receiving section 1135, the processing proceeds to step S1043, and the master unit 1 performs processing corresponding to the command. Then, the processing returns to step S1041.
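The processing of Figure 68 is, in effect, an event loop that waits for either a connection event or a command and then dispatches it. The following is a minimal sketch under assumed event objects; wait_for_event, authenticate, and execute_command are placeholders for the mechanisms described above, not defined interfaces.

```python
def master_unit_loop(wait_for_event, authenticate, execute_command):
    """Event loop corresponding to steps S1041 to S1043 of Figure 68.

    wait_for_event(): blocks until a unit is connected to terminal panel 21 or a
    command arrives through the IEEE-1394 interface or the IR receiving section.
    """
    while True:
        event = wait_for_event()                 # step S1041
        if event is None:
            continue                             # no event: keep waiting
        if event["type"] == "unit_connected":
            authenticate(event["unit"])          # step S1042 (authentication of Figure 69)
        elif event["type"] == "command":
            execute_command(event["command"])    # step S1043
```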
Next, the authentication processing performed by the master unit 1 at step S1042 shown in Figure 68 will be described with reference to the flowchart shown in Figure 69.
In the authentication processing performed by the master unit 1, two kinds of authentication are performed: whether the unit newly connected to the terminal panel 21 (called the connected unit where necessary) is an authorized IEEE-1394 unit, and whether that IEEE-1394 unit is a television receiver serving as a master unit or a subsidiary unit (a scalable-compatible unit).
More specifically, in the authentication processing performed by the master unit 1, first, at step S1051, the CPU 1129 controls the IEEE-1394 interface 1133 to send an authentication request command requesting mutual authentication to the connected unit, and the processing proceeds to step S1052.
At step S1052, the CPU 1129 judges whether a response to the authentication request command has been returned from the connected unit. When it is judged at step S1052 that a response to the authentication request command has not yet been returned, the processing proceeds to step S1053, and the CPU 1129 judges whether a timeout has occurred, that is, whether a predetermined time has elapsed since the authentication request command was sent.
When it is judged at step S1053 that a timeout has occurred, in other words, when the connected unit has not returned a response to the authentication request command even though the predetermined time has elapsed since the command was sent, the processing proceeds to step S1054. The CPU 1129 judges that the connected unit is not an authorized IEEE-1394 unit and that authentication has failed, and sets the operation mode to a standalone mode in which no data is exchanged with the connected unit. The processing then returns.
Therefore, the master unit 1 thereafter exchanges no data with a connected unit that is not an authorized IEEE-1394 unit, and does not perform IEEE-1394 communication with it.
When it is judged at step S1053 that a timeout has not yet occurred, the processing returns to step S1052, and the same processes are repeated thereafter.
When it is judged at step S1052 that a response to the authentication request command has been returned from the connected unit, in other words, when the response from the connected unit has been received by the IEEE-1394 interface 1133 and sent to the CPU 1129, the processing proceeds to step S1055, and the CPU 1129 generates a random number (pseudo-random number) R1 according to a predetermined algorithm and sends it to the connected unit through the IEEE-1394 interface 1133.
Then, the processing proceeds to step S1056, and the CPU 1129 judges whether an encrypted random number E'(R1) has been sent from the connected unit, where E'(R1) is obtained by encrypting the random number R1 sent at step S1055 according to a predetermined algorithm (a secret-key encryption method such as DES (Data Encryption Standard), FEAL (Fast Data Encipherment Algorithm), or RC5).
When it is judged at step S1056 that the encrypted random number E'(R1) has not yet been sent from the connected unit, the processing proceeds to step S1057, and the CPU 1129 judges whether a timeout has occurred, that is, whether a predetermined time has elapsed since the random number R1 was sent.
When it is judged at step S1057 that a timeout has occurred, in other words, when the connected unit has not returned the encrypted random number E'(R1) even though the predetermined time has elapsed since the random number R1 was sent to it, the processing proceeds to step S1054. The CPU 1129 judges, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the standalone mode. The processing then returns.
When it is judged at step S1057 that a timeout has not yet occurred, the processing returns to step S1056, and the same processes are repeated thereafter.
When it is judged at step S1056 that the encrypted random number E'(R1) has been returned from the connected unit, in other words, when the encrypted random number E'(R1) from the connected unit has been received by the IEEE-1394 interface 1133 and sent to the CPU 1129, the processing proceeds to step S1058, and the CPU 1129 encrypts the random number R1 generated at step S1055 according to the predetermined encryption algorithm to generate an encrypted random number E(R1). The processing proceeds to step S1059.
At step S1059, the CPU 1129 judges whether the encrypted random number E'(R1) sent from the connected unit equals the encrypted random number E(R1) generated by the CPU 1129 at step S1058.
When it is judged at step S1059 that the encrypted random number E'(R1) does not equal the encrypted random number E(R1), in other words, when the encryption algorithm adopted by the connected unit (including, where necessary, the key used for encryption) differs from the encryption algorithm adopted by the CPU 1129, the processing proceeds to step S1054, and the CPU 1129 judges, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the standalone mode. The processing then returns.
When it is judged at step S1059 that the encrypted random number E'(R1) equals the encrypted random number E(R1), in other words, when the encryption algorithm adopted by the connected unit is identical to the encryption algorithm adopted by the CPU 1129, the processing proceeds to step S1060, and the CPU 1129 judges whether a random number R2, used by the connected unit to authenticate the master unit 1, has been sent from the connected unit.
When it is judged at step S1060 that the random number R2 has not yet been received, the processing proceeds to step S1061, and the CPU 1129 judges whether a timeout has occurred, that is, for example, whether a predetermined time has elapsed since it was judged at step S1059 that the encrypted random number E'(R1) equals the encrypted random number E(R1).
When it is judged at step S1061 that a timeout has occurred, in other words, when the random number R2 has not been sent from the connected unit even though the predetermined time has elapsed, the processing proceeds to step S1054, and the CPU 1129 judges, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the standalone mode. The processing then returns.
On the other hand, when it is judged at step S1061 that a timeout has not yet occurred, the processing returns to step S1060, and the same processes are repeated thereafter.
When it is judged at step S1060 that the random number R2 has been sent from the connected unit, in other words, when the random number R2 from the connected unit has been received by the IEEE-1394 interface 1133 and sent to the CPU 1129, the processing proceeds to step S1062, and the CPU 1129 encrypts the random number R2 according to the predetermined encryption algorithm to generate an encrypted random number E(R2), and sends it to the connected unit through the IEEE-1394 interface 1133.
When it is judged at step S1060 that the random number R2 has been sent from the connected unit, the connected unit has been successfully authenticated as an authorized IEEE-1394 unit.
Then, the processing proceeds to step S1063, and the CPU 1129 controls the IEEE-1394 interface 1133 to send to the connected unit a function-information request command requesting the unit ID and function information of the connected unit, together with the unit ID and function information of the master unit 1.
A unit ID is a unique ID identifying a television receiver serving as the master unit 1 or a subsidiary unit 2.
Function information is information on the functions a unit has, and includes, for example, the types of commands the unit accepts from the outside (for example, which of the commands for controlling the power switch, volume adjustment, channel, brightness, and sharpness it accepts), whether on-screen display (OSD display) is allowed, whether a mute state is allowed, and whether a sleep state is allowed. In addition, the function information includes whether the unit has the functions of a master unit or the functions of a subsidiary unit.
The master unit 1 can store the unit ID and function information, for example, in the EEPROM 1130 or in the vendor_dependent_information field of the configuration ROM shown in Figure 15.
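Function information of this kind amounts to a small capability record exchanged together with the unit ID. The record below is only illustrative; the field names and default values are assumptions based on the items listed above, not a defined data format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FunctionInfo:
    """Illustrative capability record returned in response to a function-information request."""
    unit_id: str                      # unique ID of the master or subsidiary unit
    accepted_commands: List[str] = field(default_factory=lambda: [
        "power", "volume", "channel", "brightness", "sharpness"])
    osd_supported: bool = True        # whether on-screen display is allowed
    mute_supported: bool = True       # whether a mute state is allowed
    sleep_supported: bool = True      # whether a sleep state is allowed
    is_master: bool = False           # whether the unit has master-unit functions
    is_subsidiary: bool = True        # whether the unit has subsidiary-unit functions
```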
Then, the processing proceeds to step S1064, and the CPU 1129 waits for the unit ID and function information of the connected unit to arrive in response to the function-information request command sent to the connected unit at step S1063. The CPU 1129 receives the unit ID and function information through the IEEE-1394 interface 1133 and stores them in the EEPROM 1130, and the processing proceeds to step S1065.
At step S1065, the CPU 1129 refers to the function information stored in the EEPROM 1130 to judge whether the connected unit is a subsidiary unit. When it is judged at step S1065 that the connected unit is a subsidiary unit, in other words, when the connected unit has been successfully authenticated as a subsidiary unit, the processing skips steps S1066 and S1067 and proceeds to step S1068. The CPU 1129 sets the operation mode to a full-screen-display-enabled mode in which full-screen display is allowed. The processing then returns.
On the other hand, when it is judged at step S1065 that the connected unit is not a subsidiary unit, the processing proceeds to step S1066, and the CPU 1129 refers to the function information stored in the EEPROM 1130 to judge whether the connected unit is a master unit. When it is judged at step S1066 that the connected unit is a master unit, in other words, when the connected unit has been successfully authenticated as a master unit, the processing proceeds to step S1067, and the CPU 1129 performs master/subsidiary arbitration processing with the connected unit serving as a master unit.
Specifically, in this case, since the master unit 1 is connected to another master unit, two television receivers serving as master units exist among the television receivers forming the scalable television system. In the present embodiment, only one master unit is allowed in the scalable television system. Therefore, at step S1067, master/subsidiary arbitration processing is performed, in which it is decided whether the master unit 1 or the connected unit serving as a master unit is to be used as the television receiver serving as the master unit.
More specifically, for example, the master unit that became part of the scalable television system earliest, that is, the master unit 1 in the present embodiment, is used as the television receiver serving as the master unit. The other master units not selected as the master unit are used as subsidiary units.
After the master/subsidiary arbitration processing is performed at step S1067, the processing proceeds to step S1068, and the CPU 1129 sets the operation mode to the full-screen-display-enabled mode as described above, and the processing returns.
When it is judged at step S1066 that the connected unit is not a master unit, in other words, when the connected unit is neither a master unit nor a subsidiary unit and is therefore not authenticated as either, the processing proceeds to step S1069, and the CPU 1129 sets the operation mode to an ordinary-command send/receive mode in which the existing AV/C command set can be exchanged with the connected unit but the control commands for providing the full-screen display function cannot be exchanged, and the processing returns.
In other words, in this case, since the connected unit is neither a master unit nor a subsidiary unit, the full-screen display function is not provided even though the connected unit is connected to the master unit 1. In this case, however, since the connected unit is an authorized IEEE-1394 unit, exchange of the existing AV/C command set between the master unit 1 and the connected unit is allowed. Thus, in this case, between the master unit 1 and the connected unit (or another IEEE-1394 unit connected to the master unit 1), one can control the other by means of the existing AV/C command set.
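The exchange of Figures 69 and 71 is a symmetric challenge-response handshake: each side encrypts the other's random number with a shared secret-key algorithm and compares the result with its own computation. The sketch below condenses that handshake into Python, using a keyed HMAC as a stand-in for the unspecified DES/FEAL/RC5 cipher; the shared key and the send/recv helpers are assumptions, and the timeout handling of the flowcharts is omitted.

```python
import hashlib
import hmac
import os

SHARED_KEY = b"example-shared-secret"   # assumption: both units hold the same secret key

def encrypt(value: bytes) -> bytes:
    # Stand-in for the secret-key encryption E(.) (DES, FEAL, or RC5 in the description).
    return hmac.new(SHARED_KEY, value, hashlib.sha256).digest()

def master_side(send, recv) -> bool:
    """Challenge-response as seen from the master unit (Figure 69)."""
    r1 = os.urandom(16)
    send(r1)                                  # step S1055: send challenge R1
    if recv() != encrypt(r1):                 # steps S1056 to S1059: check E'(R1) == E(R1)
        return False                          # connected unit is not an authorized unit
    r2 = recv()                               # step S1060: challenge R2 from the other side
    send(encrypt(r2))                         # step S1062: return E(R2)
    return True

def subsidiary_side(send, recv) -> bool:
    """Challenge-response as seen from the subsidiary unit (Figure 71)."""
    r1 = recv()                               # step S1085: challenge R1 from the master unit
    send(encrypt(r1))                         # step S1087: return E'(R1)
    r2 = os.urandom(16)
    send(r2)                                  # step S1089: send challenge R2
    return recv() == encrypt(r2)              # steps S1090 to S1093: check E(R2) == E'(R2)
```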
Next, the processing of a subsidiary unit 2 (Figure 65) serving as a television receiver forming the scalable television system shown in Figure 66 will be described with reference to the flowchart shown in Figure 70.
First, at step S1071, the CPU 1149 judges whether an event has occurred in which a unit is connected to the terminal panel 41 or a command is sent through the IEEE-1394 interface 1153 or the IR receiving section 1155. When it is judged that no event has occurred, the processing returns to step S1071.
When it is judged at step S1071 that an event has occurred in which a unit is connected to the terminal panel 41, the processing proceeds to step S1072, and the CPU 1149 performs the authentication processing shown in Figure 71, described later. Then, the processing returns to step S1071.
To judge whether a unit has been connected to the terminal panel 41, the state in which a unit is connected to the terminal panel 41 needs to be detected. This detection is performed, for example, in the same manner as described for step S1041 shown in Figure 68.
When it is judged at step S1071 that an event has occurred in which a command is sent through the IEEE-1394 interface 1153 or the IR receiving section 1155, the processing proceeds to step S1073, and the subsidiary unit 2 performs processing corresponding to the command. Then, the processing returns to step S1071.
Next, the authentication processing performed by the subsidiary unit 2 at step S1072 shown in Figure 70 will be described with reference to the flowchart shown in Figure 71.
In the authentication processing performed by the subsidiary unit 2, two kinds of authentication are performed: whether the unit newly connected to the terminal panel 41 (called the connected unit where necessary) is an authorized IEEE-1394 unit, and whether that IEEE-1394 unit is a master unit.
More specifically, in the authentication processing performed by the subsidiary unit 2, first, at step S1081, the CPU 1149 judges whether an authentication request command requesting mutual authentication has been sent from the connected unit. When it is judged that the command has not yet been sent, the processing proceeds to step S1082.
At step S1082, the CPU 1149 judges whether a timeout has occurred, that is, whether a predetermined time has elapsed since the authentication processing was started.
When it is judged at step S1082 that a timeout has occurred, in other words, when the authentication request command has not been sent from the connected unit even though the predetermined time has elapsed since the authentication processing was started, the processing proceeds to step S1083. The CPU 1149 judges that the connected unit is not an authorized IEEE-1394 unit and that authentication has failed, and sets the operation mode to a standalone mode in which no data is exchanged with the connected unit. The processing then returns.
Therefore, like the master unit 1, the subsidiary unit 2 exchanges no data with a connected unit that is not an authorized IEEE-1394 unit, and does not perform IEEE-1394 communication with it.
When it is judged at step S1082 that a timeout has not yet occurred, the processing returns to step S1081, and the same processes are repeated thereafter.
When it is judged at step S1081 that the authentication request command has been sent from the connected unit, in other words, when the authentication request command sent at step S1051 shown in Figure 69 from the master unit 1 serving as the connected unit has been received by the IEEE-1394 interface 1153 and sent to the CPU 1149, the processing proceeds to step S1084, and the CPU 1149 controls the IEEE-1394 interface 1153 to send a response to the authentication request command to the connected unit.
In the present embodiment, the master unit 1 performs the processing of steps S1051 to S1053 shown in Figure 69, and the subsidiary unit 2 performs the processing of steps S1081, S1082, and S1084 shown in Figure 71. Alternatively, the subsidiary unit 2 may perform the processing of steps S1051 to S1053 shown in Figure 69, and the master unit 1 may perform the processing of steps S1081, S1082, and S1084 shown in Figure 71.
Then, the processing proceeds to step S1085, and the CPU 1149 judges whether the random number R1 has been sent from the connected unit. When it is judged that the random number R1 has not yet been sent, the processing proceeds to step S1086.
At step S1086, the CPU 1149 judges whether a timeout has occurred, that is, whether a predetermined time has elapsed since the CPU 1149 sent the response to the authentication request command at step S1084.
When it is judged at step S1086 that a timeout has occurred, in other words, when the random number R1 has not been sent from the connected unit even though the predetermined time has elapsed since the CPU 1149 sent the response to the authentication request command, the processing proceeds to step S1083. The CPU 1149 judges, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the standalone mode in which no data is exchanged with the connected unit. The processing then returns.
When it is judged at step S1086 that a timeout has not yet occurred, the processing returns to step S1085, and the same processes are repeated thereafter.
When it is judged at step S1085 that the random number R1 has been sent from the connected unit, in other words, when the random number R1 sent at step S1055 shown in Figure 69 from the master unit 1 serving as the connected unit has been received by the IEEE-1394 interface 1153 and sent to the CPU 1149, the processing proceeds to step S1087, and the CPU 1149 encrypts the random number R1 according to a predetermined encryption algorithm to generate the encrypted random number E'(R1). In addition, at step S1087, the CPU 1149 controls the IEEE-1394 interface 1153 to send the encrypted random number E'(R1) to the connected unit. The processing proceeds to step S1089.
At step S1089, the CPU 1149 generates a random number (pseudo-random number) R2 and controls the IEEE-1394 interface 1153 to send the random number R2 to the connected unit. The processing proceeds to step S1090.
At step S1090, the CPU 1149 judges whether the encrypted random number E(R2) has been sent from the connected unit, where E(R2) is generated by the master unit 1 serving as the connected unit by encrypting the random number R2 at step S1062 shown in Figure 69.
When it is judged that the encrypted random number E(R2) has not yet been received, the processing proceeds to step S1091, and the CPU 1149 judges whether a timeout has occurred, that is, whether a predetermined time has elapsed since the random number R2 was sent at step S1089.
When it is judged at step S1091 that a timeout has occurred, in other words, when the encrypted random number E(R2) has not been sent from the connected unit even though the predetermined time has elapsed since the CPU 1149 sent the random number R2 to the connected unit, the processing proceeds to step S1083, and the CPU 1149 judges, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the standalone mode. The processing then returns.
On the other hand, when it is judged at step S1091 that a timeout has not yet occurred, the processing returns to step S1090, and the same processes are repeated thereafter.
When it is judged at step S1090 that the encrypted random number E(R2) has been sent from the connected unit, in other words, when the encrypted random number E(R2) from the connected unit has been received by the IEEE-1394 interface 1153 and sent to the CPU 1149, the processing proceeds to step S1092, and the CPU 1149 encrypts the random number R2 generated at step S1089 according to the predetermined encryption algorithm to generate an encrypted random number E'(R2). The processing proceeds to step S1093.
At step S1093, the CPU 1149 judges whether the encrypted random number E(R2) sent from the connected unit equals the encrypted random number E'(R2) generated by the subsidiary unit 2 at step S1092.
When it is judged at step S1093 that the encrypted random number E(R2) does not equal the encrypted random number E'(R2), in other words, when the encryption algorithm adopted by the connected unit (including, where necessary, the key used for encryption) differs from the encryption algorithm adopted by the CPU 1149, the processing proceeds to step S1083, and the CPU 1149 judges, as described above, that the connected unit is not an authorized IEEE-1394 unit, and sets the operation mode to the standalone mode. The processing then returns.
When it is judged at step S1093 that the encrypted random number E(R2) equals the encrypted random number E'(R2), in other words, when the encryption algorithm adopted by the connected unit is identical to the encryption algorithm adopted by the CPU 1149 and the connected unit is therefore successfully authenticated as an authorized IEEE-1394 unit, the processing proceeds to step S1094, and the CPU 1149 receives, through the IEEE-1394 interface 1153, the unit ID and function information sent together with the function-information request command by the master unit 1 serving as the connected unit at step S1063 shown in Figure 69, and stores them in the EEPROM 1150.
Then, the processing proceeds to step S1095, and the CPU 1149 controls the IEEE-1394 interface 1153 to send the unit ID and function information of the subsidiary unit 2 to the connected unit in response to the function-information request command received from the connected unit at step S1094. The processing then proceeds to step S1096.
The subsidiary unit 2 can store the unit ID and function information in the EEPROM 1150 or in the vendor_dependent_information field of the configuration ROM shown in Figure 15, in the same manner as described for the master unit 1 with reference to Figure 69.
At step S1096, the CPU 1149 refers to the function information stored in the EEPROM 1150 to judge whether the connected unit is a master unit. When it is judged at step S1096 that the connected unit is a master unit, in other words, when the connected unit has been successfully authenticated as a master unit, the processing proceeds to step S1097, and the CPU 1149 sets the operation mode to the full-screen-display-enabled mode described for step S1068 of Figure 69. The processing then returns.
On the other hand, when it is judged at step S1096 that the connected unit is not a master unit, in other words, when the connected unit is not authenticated as a master unit, the processing proceeds to step S1098, and the CPU 1149 sets the operation mode to the ordinary-command send/receive mode in which the existing AV/C command set can be exchanged with the connected unit but the control commands for providing the full-screen display function cannot be exchanged, and the processing returns.
In other words, in this case, since the connected unit is not a master unit, the full-screen display function is not provided even though the connected unit is connected to the subsidiary unit 2. Therefore, the full-screen display function is not provided merely by connecting another subsidiary unit to the subsidiary unit 2. In this case, however, since the connected unit is an authorized IEEE-1394 unit, exchange of the existing AV/C command set between the subsidiary unit 2 and the connected unit is allowed. Thus, in this case, between the subsidiary unit 2 and the connected unit (including another subsidiary unit), one can control the other by means of the existing AV/C command set.
Next, when the master unit 1 and the subsidiary units 2 serving as the television receivers forming the scalable television system shown in Figure 66 each successfully complete the authentication processing described with reference to Figures 69 and 71 and set their operation modes to the full-screen-display-enabled mode, the scalable television system 1161 can perform the full-screen display shown in Figures 72A and 72B.
In other words, for example, when the master unit 1 is displaying image data as shown in Figure 72A, if the controller 15 (or 35) is operated to instruct full-screen display, the scalable television system 1161 displays the image data displayed by the master unit 1 on the entire set of display screens of the television receivers forming the scalable television system, as shown in Figure 72B.
Specifically, while the master unit 1 is outputting the image and sound of a program (displaying the image and outputting the sound), if, for example, the user operates the controller 15 (Fig. 7) to turn on the menu button switch 54, the controller 15 emits infrared light corresponding to the user's operation. The infrared light is received by the IR receiving section 1135 of the master unit 1 (Figure 64), and the menu screen is displayed on the CRT 11 of the master unit 1. An icon representing full-screen display (called the full-screen display icon where necessary) is displayed, for example, on the menu screen. When the user operates the controller 15 to click the full-screen display icon, the master unit 1 and the subsidiary units 2 perform full-screen display.
In other words, in this case, the CPU 1129 (Figure 64) of the master unit 1 determines, within the image data displayed on the CRT 11 of the master unit 1, the region of the image data to be displayed by the master unit 1 and the region of the image data to be displayed by each subsidiary unit 2_ij. In addition, the CPU 1129 of the master unit 1 controls the IEEE-1394 interface 1133 to send the image data to be displayed in each subsidiary unit 2_ij, together with a full-screen display command instructing full-screen display, to that subsidiary unit 2_ij. The CPU 1129 of the master unit 1 converts the image data of the region to be displayed by the master unit 1 into image data of the full size of the CRT 11, for example by interpolation, sends it to the CRT 11 through the selector 1127 and the NTSC encoder 1128, and displays it there. According to the full-screen display command sent from the master unit 1, each subsidiary unit 2_ij performs the same processing as the master unit 1 to display the image data sent from the master unit 1 on the whole of its CRT 31.
By the above operations, the television receivers forming the scalable television system 1161 display the image data on the entire set of their display screens, as shown in Figure 72B.
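In effect, the master unit cuts the source frame into a grid of regions, keeps the region at its own position, and sends every other region to the corresponding subsidiary unit, each receiver then enlarging its region to its full screen size. The following sketch illustrates that partitioning for the 3 × 3 arrangement; the helper name and the 1-based indexing convention are assumptions made for illustration.

```python
def split_into_regions(frame, rows=3, cols=3):
    """Divide one frame (a 2-D or 3-D array-like with .shape and slicing) into rows x cols regions.

    Returns a dict mapping (i, j) grid positions (1-based, as in subsidiary unit 2_ij)
    to the sub-image each television receiver would enlarge to its full screen size.
    """
    height, width = frame.shape[0], frame.shape[1]
    regions = {}
    for i in range(rows):
        for j in range(cols):
            top, bottom = i * height // rows, (i + 1) * height // rows
            left, right = j * width // cols, (j + 1) * width // cols
            regions[(i + 1, j + 1)] = frame[top:bottom, left:right]
    return regions

# The master unit (at the centre, position (2, 2) in Figure 63A) would keep
# regions[(2, 2)] for itself and send each remaining region to subsidiary unit 2_ij
# together with the full-screen display command.
```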
When the image data displayed on the entire set of display screens of the television receivers forming the scalable television system 1161 is called full-screen image data, the scalable television system 1161 does not display part of the full-screen image data. Since the television receivers forming the scalable television system 1161 actually have cabinets, the cabinets lie at the boundaries between adjacent television receivers, and no image is displayed at these boundaries. In Figures 72A and 72B, to simplify the figures, the cabinets located at the boundaries between adjacent television receivers are omitted. In reality, cabinets are present at the boundaries between adjacent television receivers; therefore, although the cabinets are part of the television receivers, the full-screen image data is not displayed on the cabinets, and the image data is divided.
However, since human vision has an interpolation function that fills in, from the surrounding parts of an image, a part of the image hidden by thin lines, the above problem is not a serious one when viewing the full-screen image data.
After the full-screen image data is displayed as described above, when the user operates the controller 15 (Fig. 7) to display the menu screen on the CRT 11 and clicks the full-screen display icon displayed on the menu screen, so that infrared light representing a full-screen display termination command instructing termination of the full-screen display, corresponding to the operation on the controller 15, is emitted from the controller 15, received by the IR receiving section 1135, and sent to the CPU 1129, the display of the master unit 1 returns, for example, to the normal-size display shown in Figure 72A. In this case, the full-screen display termination command is sent from the master unit 1 through the IEEE-1394 interface 1133 to each subsidiary unit 2_ij, so that the display of each subsidiary unit 2_ij returns to its original state.
In the present embodiment, the full-screen display function is provided only when the authentication described with reference to Figures 69 and 71 has succeeded. The full-screen display function may, however, also be provided even when the authentication fails.
Next, the processing performed by the security system section 1137 (Fig. 64) of the master unit 1 for warning the user that an emergency has occurred will be described.
The security system section 1157 (Fig. 65) of the subsidiary unit 2 also performs processing for warning the user that an emergency has occurred. Since this processing is the same as that performed by the security system section 1137 of the master unit 1, its description is omitted.
The processing for warning the user that an emergency has occurred can be performed independently by a television receiver serving as the master unit 1 or as a subsidiary unit. In the same way as for the full-screen display function, this processing may be performed only when the authentication described with reference to Fig. 69 and Fig. 71 has been carried out successfully.
Fig. 73 shows an example structure of the data processing section 1137C of the security system section 1137 shown in Fig. 64.
In the security system section 1137, the image data and audio data sent from the video camera 1162 and received by the wireless interface 1137B are sent to an image processing section 1191 and an audio processing section 1192.
The image processing section 1191 detects a feature quantity of the image data sent from the wireless interface 1137B in units of frames or fields, for example, and sends it to a fluctuation calculating section 1194.
More specifically, the image processing section 1191 is formed by a frame memory 1201, a difference arithmetic section 1202, and a difference memory 1203. The frame memory 1201 temporarily stores the image data sent from the wireless interface 1137B under the control of a memory controller 1193. The difference arithmetic section 1202 sequentially sets each frame of the image data stored in the frame memory 1201 as a designated frame in time order, calculates the difference between the image data of the designated frame and the image data of the frame one frame earlier than the designated frame (called the preceding frame where necessary), and sends the difference image data formed by those differences to the difference memory 1203. The difference memory 1203 temporarily stores, in units of frames, the difference image data sent from the difference arithmetic section 1202.
The audio processing section 1192 detects a feature quantity of the audio data sent from the wireless interface 1137B for every frame consisting of a predetermined number of samples, and sends it to the fluctuation calculating section 1194.
More specifically, the audio processing section 1192 is formed by a circular buffer 1206, an integral arithmetic section 1207, and an integration memory 1208. The circular buffer 1206 temporarily stores the audio data sent from the wireless interface 1137B under the control of the memory controller 1193. The integral arithmetic section 1207 sequentially sets, as a designated frame in time order, each of the frames each formed of a predetermined number of time-sequential audio data samples stored in the circular buffer 1206, integrates (adds up) the audio data samples forming the designated frame, and sends the integrated value to the integration memory 1208. The integration memory 1208 temporarily stores, in units of frames, the integrated values of the audio data sent from the integral arithmetic section 1207.
The integral arithmetic section 1207 calculates the integrated value of the audio data in units of frames; alternatively, it may, for example, extract a predetermined sample from the audio data forming each frame.
The memory controller 1193 controls the write addresses and read addresses of the frame memory 1201 of the image processing section 1191 and of the circular buffer 1206 of the audio processing section 1192.
The fluctuation calculating section 1194 reads the difference image data in units of frames from the difference memory 1203 of the image processing section 1191 as the feature quantity of the frame image data, and reads the integrated values in units of frames from the integration memory 1208 of the audio processing section 1192 as the feature quantity of the frame audio data. The fluctuation calculating section 1194 further obtains the fluctuation component of the feature quantity of the image data and the fluctuation component of the feature quantity of the audio data, and sends them as fluctuation information to the warning processing section 1137D (Fig. 64) at the following stage.
In the data processing section 1137C constructed as described above, image data processing and audio data processing are performed to process the image data and the audio data sent from the wireless interface 1137B.
The image data processing and the audio data processing performed by the data processing section 1137C will be described with reference to the flowcharts shown in Fig. 74A and Fig. 74B.
The image data processing will be described first with reference to the flowchart of Fig. 74A.
In the image data processing, first, in step S1101, the frame memory 1201 stores the frame image data sent from the wireless interface 1137B. The processing proceeds to step S1102. In step S1102, the difference arithmetic section 1202 sets the image data frame stored in the frame memory 1201 in the immediately preceding processing of step S1101 as the designated frame, reads the image data of the designated frame, and reads the image data of the preceding frame, which was stored in the frame memory 1201 in the processing of step S1101 one iteration earlier. Further, in step S1102, the difference arithmetic section 1202 subtracts the pixel value of each corresponding pixel forming the image data of the preceding frame from the pixel value of each pixel forming the image data of the designated frame, and obtains the absolute values of the resulting differences to generate difference image data having those absolute values as its pixel values. The difference image data of the designated frame is sent to the difference memory 1203 and stored there.
Then, the processing proceeds to step S1103, and the fluctuation calculating section 1194 reads the difference image data of the designated frame from the difference memory 1203 and binarizes it. More specifically, the fluctuation calculating section 1194 compares the pixel value of each pixel forming the difference image data with a predetermined threshold. When the pixel value is greater than (or equal to or greater than) the predetermined threshold, the fluctuation calculating section 1194 sets the pixel value to "1", for example; when the pixel value is equal to or less than (or less than) the predetermined threshold, the fluctuation calculating section 1194 sets the pixel value to "0", for example. The processing proceeds to step S1104.
In step S1104, the fluctuation calculating section 1194 counts specific pixels of the image data of the designated frame having the pixel values "0" and "1" as described above (called binary image data where necessary). More specifically, the fluctuation calculating section 1194 counts the pixels having the pixel value "1", of the values "0" and "1", in the binary image data, and outputs the count to the warning processing section 1137D as the fluctuation information of the image data of the designated frame.
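A minimal Python sketch of steps S1102 to S1104, assuming 8-bit greyscale frames held as NumPy arrays; the threshold value of 16 is an illustrative assumption, since the specification only calls it a predetermined threshold.

    import numpy as np

    def image_fluctuation(prev_frame, frame, threshold=16):
        # Steps S1102 to S1104 for one designated frame: absolute
        # frame-to-frame difference, binarisation against a threshold,
        # and a count of the "1" pixels, which is the fluctuation
        # information of the frame.
        diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
        binary = diff > threshold        # "1" where the difference is large
        return int(binary.sum())         # number of changed pixels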
Thereafter, the processing waits for the image data of the next frame to be sent from the wireless interface 1137B, returns to step S1101, and repeats the same procedure.
Next, the audio data processing will be described with reference to the flowchart shown in Fig. 74B.
In the audio data processing, first, in step S1111, the circular buffer 1206 stores the frame audio data sent from the wireless interface 1137B. The processing proceeds to step S1112. In step S1112, the integral arithmetic section 1207 sets the audio data frame stored in the circular buffer 1206 in the immediately preceding processing of step S1111 as the designated frame, reads the audio data of the designated frame, and calculates the sum, that is, the integral, of the audio data samples of the designated frame. The integrated value of the audio data is sent to the integration memory 1208 and stored there.
Then, the processing proceeds to step S1113, and the fluctuation calculating section 1194 reads the integrated value of the audio data of the designated frame and the integrated value of the audio data of the frame one frame earlier than the designated frame (the preceding frame), and obtains the difference between them. The fluctuation calculating section 1194 outputs this difference to the warning processing section 1137D as the fluctuation information of the audio data of the designated frame.
Thereafter, the processing waits for the audio data of the next frame to be sent from the wireless interface 1137B, returns to step S1111, and repeats the same procedure.
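A minimal Python sketch of steps S1112 and S1113, assuming each audio frame is a NumPy array of samples; following the text literally, the samples are summed as they are, although a real implementation might instead sum absolute values or energy, which is a design choice not taken from the specification.

    import numpy as np

    def frame_integral(samples):
        # Step S1112: integrate (sum) the samples forming one audio frame.
        return float(np.sum(samples))

    def audio_fluctuation(prev_samples, samples):
        # Step S1113: the fluctuation information of the designated frame
        # is the difference between its integrated value and that of the
        # preceding frame.
        return frame_integral(samples) - frame_integral(prev_samples)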
Next, Fig. 75 shows an example structure of the warning processing section 1137D shown in Fig. 64.
A fluctuation information storage section 1211 temporarily stores the fluctuation information of the image data and of the audio data sent from the data processing section 1137C, as described with reference to Fig. 73, Fig. 74A, and Fig. 74B.
A fluctuation information analysis section 1212 analyzes the fluctuation information of the image data and of the audio data stored in the fluctuation information storage section 1211 to obtain the temporal fluctuation components of the feature quantities of the image data and the audio data, and sends them to an abnormal-event determination section 1213.
The abnormal-event determination section 1213 determines whether the temporal fluctuation components of the feature quantities of the image data and the audio data sent from the fluctuation information analysis section 1212 satisfy an abnormal-event condition stored in an abnormal-event condition storage section 1214, and sends the determination result to a warning processing request section 1215.
In accordance with the determination result sent from the abnormal-event determination section 1213, the warning processing request section 1215 requests the security controller 1137A (Fig. 64) to perform warning processing for warning the user that an emergency (abnormal event) has occurred.
The abnormal-event condition storage section 1214 stores the abnormal-event condition that the temporal fluctuation components of the feature quantities of the image data and the audio data should satisfy when the abnormal-event determination section 1213 determines that an abnormal event has occurred.
In the abnormal-event condition storage section 1214, the abnormal-event condition may be specified in advance, or an abnormal-event condition specified by the user may be stored. Specifically, the user can operate the controller 15, for example, to input an abnormal-event condition. The abnormal-event condition input by operating the controller 15 is sent to the abnormal-event condition storage section 1214 of the warning processing section 1137D through the IR receiving section 1135, the CPU 1129, and the security controller 1137A, and is stored there.
In the warning processing section 1137D constructed as described above, whether an abnormal event has occurred is determined according to the fluctuation information of the image data and the audio data, and when an abnormal event has occurred, abnormal-event determination/warning processing for warning the user of the abnormal event is performed.
The abnormal-event determination/warning processing performed by the warning processing section 1137D will be described with reference to the flowchart shown in Fig. 76.
The fluctuation information storage section 1211 continuously receives the fluctuation information of the image data and the audio data output from the data processing section 1137C at the preceding stage, and temporarily stores the fluctuation information.
In step S1121, the fluctuation information analysis section 1212 analyzes the fluctuation information of the image data and the audio data stored in the fluctuation information storage section 1211 to obtain the temporal fluctuation components of the feature quantities of the image data and the audio data, and sends them to the abnormal-event determination section 1213. The processing proceeds to step S1122.
In step S1122, the abnormal-event determination section 1213 determines whether the temporal fluctuation components of the feature quantities of the image data and the audio data sent from the fluctuation information analysis section 1212 satisfy the abnormal-event condition stored in the abnormal-event condition storage section 1214.
When it is determined in step S1122 that the temporal fluctuation components of the feature quantities of the image data and the audio data sent from the fluctuation information analysis section 1212 do not satisfy the abnormal-event condition, the processing waits for the fluctuation information of the next image data and audio data to be stored in the fluctuation information storage section 1211, and returns to step S1121.
When it is determined in step S1122 that the temporal fluctuation components of the feature quantities of the image data and the audio data sent from the fluctuation information analysis section 1212 satisfy the abnormal-event condition, the abnormal-event determination section 1213 sends a determination result indicating that the abnormal-event condition is satisfied to the warning processing request section 1215. The processing proceeds to step S1123.
In step S1123, in response to the determination result indicating that the abnormal-event condition is satisfied, sent from the abnormal-event determination section 1213, the warning processing request section 1215 requests the security system section 1137 (Fig. 64) to perform the warning processing for warning the user that an emergency has occurred. The processing then waits for the fluctuation information of the next image data and audio data to be stored in the fluctuation information storage section 1211, and returns to step S1121.
Next, the processing of the abnormal-event determination section 1213 will be described further with reference to Fig. 77A to Fig. 77D and Fig. 78A to Fig. 78I.
The abnormal-event determination section 1213 determines that the abnormal-event condition is satisfied when, for example, the fluctuation information of the image data and the audio data changes in a manner different from before.
Specifically, for example, when fluctuation information that has not been changing greatly suddenly changes greatly, or when fluctuation information that has been changing to some extent suddenly stops changing, it is determined that the abnormal-event condition is satisfied.
For example, whether fluctuation information that has not been changing greatly has suddenly changed greatly can be determined by having the fluctuation information analysis section 1212 differentiate the fluctuation information (sequentially calculate the differences between temporally adjacent pieces of fluctuation information) and having the abnormal-event determination section 1213 determine whether the absolute value of the derivative is equal to or greater than a predetermined threshold. In this case, the predetermined threshold is stored in the abnormal-event condition storage section 1214 as the abnormal-event condition.
For example, whether fluctuation information that has been changing to some extent has suddenly stopped changing can be determined by having the abnormal-event determination section 1213 determine whether the fluctuation information has kept a value close to zero for a period equal to or longer than a predetermined time. In this case, the predetermined time is stored in the abnormal-event condition storage section 1214 as the abnormal-event condition.
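A minimal Python sketch of these two abnormal-event conditions, applied to a rolling history of fluctuation values; the threshold, near-zero margin, window length, and sample values are all illustrative assumptions.

    from collections import deque

    def sudden_change(history, threshold):
        # First condition: the most recent change in the fluctuation
        # information (a crude derivative) is at or above a threshold.
        if len(history) < 2:
            return False
        return abs(history[-1] - history[-2]) >= threshold

    def stopped_changing(history, near_zero, min_frames):
        # Second condition: the fluctuation information has stayed close
        # to zero for at least min_frames consecutive frames.
        if len(history) < min_frames:
            return False
        return all(abs(v) <= near_zero for v in list(history)[-min_frames:])

    # Usage sketch: keep a rolling window of fluctuation values.
    history = deque(maxlen=300)
    for value in [1200, 1100, 9000, 10, 5, 3, 2, 1, 0, 0]:
        history.append(value)
        if sudden_change(history, threshold=5000) or \
           stopped_changing(history, near_zero=10, min_frames=5):
            print("abnormal-event condition satisfied")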
Fig. 77A to Fig. 77D show examples of image data captured by the video camera 1162 and its fluctuation information.
For example, as shown in Fig. 77A, when the video camera 1162 is capturing an image of an elderly person walking indoors, the fluctuation information of the image data changes moderately, as shown in Fig. 77B. As shown in Fig. 77C, when the video camera 1162 captures an image of the elderly person, who has been walking, suddenly falling, the fluctuation information of the image data changes greatly and then becomes almost zero, as shown in Fig. 77D.
Therefore, when a state in which the fluctuation information of the image data suddenly exceeds a predetermined threshold, that is, a state in which the derivative of the fluctuation information of the image data becomes equal to or greater than a predetermined threshold, is set as the abnormal-event condition, the abnormal state of the elderly person falling can be detected. By warning the user of this abnormal state, the elderly person can be attended to (rescued) quickly.
As shown in Fig. 77C, if a person falls suddenly, the fluctuation information of the image data suddenly exceeds the predetermined threshold and thereafter becomes close to zero. In other words, the derivative of the fluctuation information of the image data becomes equal to or greater than the predetermined threshold, and thereafter the fluctuation information of the image data becomes close to zero. When a state in which the fluctuation information stays close to zero for the predetermined time or longer is set as the abnormal-event condition, the abnormal state of a person falling can also be detected.
Fig. 78A to Fig. 78I show examples of image data and audio data captured by the video camera 1162 and their fluctuation information.
For example, as shown in Fig. 78A, when the video camera 1162 is capturing an image of a baby crawling indoors, the fluctuation information of the image data and the audio data changes moderately, as shown in Fig. 78B and Fig. 78C. As shown in Fig. 78D, when the video camera 1162 captures an image of the baby, who has been crawling indoors, falling asleep, the fluctuation information of the image data and the audio data becomes almost zero and changes very little, as shown in Fig. 78E and Fig. 78F.
Therefore, in this case, when a state in which the fluctuation information of the image data and the audio data becomes close to zero and stays close to zero for a predetermined time is set as the abnormal-event condition, the state of the baby falling asleep can be detected. By warning the user of this state, the user can quickly take some action, such as covering the baby with a blanket.
As shown in Fig. 78G, if the sleeping baby suddenly wakes up and starts crying, the fluctuation information of the image data and the audio data captured by the video camera 1162 becomes as shown in Fig. 78H and Fig. 78I.
In this case, since the crying baby moves in bed but moves less than when crawling, the fluctuation information of the image data does not change very much, as shown in Fig. 78H.
However, when the sleeping baby starts crying, since the baby's crying is intermittent, the fluctuation information of the audio data suddenly takes a large value and continues to have a large value for a certain time, as shown in Fig. 78I.
Therefore, in this case, when a state in which the fluctuation information of the audio data suddenly takes a large value and continues to have a large value for a certain time is set as the abnormal-event condition, the state of the baby waking up and crying can be detected. By warning the user of this state, the fact that the baby is awake can be reported quickly.
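A minimal Python sketch of this third condition, in the same style as the two conditions sketched above; the "large" value and window length are illustrative assumptions.

    def sustained_large(history, large, min_frames):
        # Condition for the crying-baby example: the audio fluctuation
        # information has stayed at or above a large value for at least
        # min_frames consecutive frames.
        if len(history) < min_frames:
            return False
        return all(abs(v) >= large for v in list(history)[-min_frames:])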
Next, as described with reference to Fig. 75 and Fig. 76, when an abnormal-event state satisfying the abnormal-event condition occurs, the warning processing section 1137D requests the security controller 1137A (Fig. 64) to perform the warning processing for warning the user that an emergency has occurred. The warning processing performed by the security controller 1137A when it receives a warning processing request will be described with reference to Fig. 79A and Fig. 79B and Fig. 80A and Fig. 80B.
When the security controller 1137A receives a warning processing request, for example, the security controller 1137A requests the CPU 1129 to make the selector 1127 select the image data and audio data output from the wireless interface 1137B.
Then, the selector 1127 selects the image data and audio data output from the wireless interface 1137B and sends them to the NTSC encoder 1128 and the amplifier 1138. The image data sent to the NTSC encoder 1128 is sent to the CRT 11 and displayed there. The audio data sent to the amplifier 1138 is sent to the speaker units 12L and 12R and output from them.
As a result, the CRT 11 displays the image data sent from the video camera 1162 and received by the wireless interface 1137B, and the speaker units 12L and 12R output the audio data sent from the video camera 1162 and received by the wireless interface 1137B.
Specifically, for example, when the master unit 1 and the subsidiary units 2_ij making up the scalable television system 1161 are outputting the image data and audio data of a television program on a specific channel, as shown in Fig. 79A, if the warning processing section 1157D of the subsidiary unit 2_13 outputs a warning processing request to the security controller 1157A, the display on the CRT 31 of the subsidiary unit 2_13 is switched from the image data of the television program currently being received to the image data sent from the video camera 1162 that is sending image data and audio data to the subsidiary unit 2_13, as shown in Fig. 79B. In the embodiment shown in Fig. 79A and Fig. 79B, the display on the CRT 31 of the subsidiary unit 2_13 is switched from the image data of the television program to image data, sent from the video camera 1162, showing a person falling.
In addition, in this case, the subsidiary unit 2_13 outputs the audio data sent from the video camera 1162 from the speaker units 32L and 32R (Fig. 65).
In this case, even while watching the television program, the user can immediately know that an abnormal event has occurred in the situation being captured by the video camera 1162 that is sending image data and audio data to the subsidiary unit 2_13.
Instead of the audio data sent from the video camera 1162, the speaker units 32L and 32R may output a predetermined warning tone (beep-beep-beep ...), as shown in Fig. 79B.
In the case described above, the display on the CRT 31 of the subsidiary unit 2_13 is switched from the image data of the television program being received to the image data sent from the video camera 1162, as shown in Fig. 79B. When the power of the subsidiary unit 2_13 is off, for example, the power of the subsidiary unit 2_13 may first be turned on, and then the image data sent from the video camera 1162 may be displayed on the CRT 31.
In the scalable television system 1161 described above, the image data sent from the video camera 1162 capturing the situation in which the abnormal event has occurred is displayed on the television receiver that has been receiving the image data and audio data sent from that video camera 1162. Alternatively, as shown in Fig. 80A and Fig. 80B, the image data sent from the video camera 1162 may be displayed in full-screen mode.
Specifically, for example, when the master unit 1 and the subsidiary units 2_ij making up the scalable television system 1161 are outputting the image data and audio data of a television program on a specific channel, as shown in Fig. 80A, if any of the television receivers making up the scalable television system 1161 outputs a warning processing request, the displays of all the television receivers making up the scalable television system 1161 may be switched so that the image data sent from the video camera 1162, which the television receiver that output the warning processing request has been receiving, is displayed in full-screen mode, as shown in Fig. 80B.
Also in this case, the user can immediately know that an abnormal event has occurred.
When full-screen display is performed, since it cannot be told in the situation captured by which video camera 1162 the abnormal event has occurred, it is preferable to display a message or the like indicating that the abnormal event has occurred on all or part of the display screen of the television receiver in the scalable television system 1161 that has been receiving the image data and audio data sent from the video camera 1162 capturing the situation in which the abnormal event has occurred.
Full-screen display may be performed, for example, when an abnormal event of a high emergency level occurs. Specifically, when an abnormal event whose emergency level is not high occurs, only the television receiver that has been receiving the image data and audio data sent from the video camera 1162 capturing the situation in which the abnormal event occurred switches its display, as described with reference to Fig. 79A and Fig. 79B; when an abnormal event of a high emergency level occurs, the displays of all the television receivers making up the scalable television system 1161 are switched and full-screen display is performed, as shown in Fig. 80A and Fig. 80B. Since the image is displayed conspicuously in full-screen display, the user can know that an abnormal event of a high emergency level has occurred.
The size at which the image data sent from the video camera 1162 capturing the situation in which the abnormal event has occurred (called emergency image data where necessary) is displayed on the scalable television system 1161 may also be changed according to the emergency level of the abnormal event. Specifically, when the emergency level is low, the emergency image data may be displayed at the size of the display screen of one television receiver making up the scalable television system 1161; when the emergency level is medium, it may be displayed at the size of the display screens of 2 x 2 adjacent television receivers; and when the emergency level is high, it may be displayed at the size of the display screens of 3 x 3 adjacent television receivers.
The emergency level can be stored in the abnormal-event condition storage section 1214 (Fig. 75) together with the abnormal-event condition. In this case, the emergency level can be identified according to which abnormal-event condition is satisfied.
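As an illustrative aside, the mapping from emergency level to the block of adjacent receivers could be sketched as follows; the level names and the helper name display_span are assumptions made for this sketch.

    def display_span(emergency_level):
        # Map an emergency level to the block of adjacent receivers
        # (rows, cols) used to display the emergency image data.
        spans = {"low": (1, 1), "medium": (2, 2), "high": (3, 3)}
        return spans[emergency_level]

    print(display_span("medium"))   # (2, 2)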
Next, Fig. 81 shows another example structure of a television receiver serving as the master unit 1 of the scalable television system 1161. In the figure, the parts corresponding to those shown in Fig. 64 are assigned the same symbols as used in Fig. 64, and descriptions thereof are omitted where unnecessary. In other words, the master unit 1 shown in Fig. 81 is constructed basically in the same way as in Fig. 64, except that a warning display section 1139 is newly provided.
The warning display section 1139 is turned on or made to blink under the control of the CPU 1129.
The television receiver serving as the subsidiary unit 2 shown in Fig. 65 can also be equipped with a warning display section, in the same way as the master unit 1 shown in Fig. 81.
When the master unit 1 in the scalable television system 1161 is equipped with the warning display section 1139 and the subsidiary units 2 are also equipped with warning display sections, if an abnormal event occurs, the warning display section 1139 can be turned on or made to blink, as shown in Fig. 82A and Fig. 82B.
Specifically, for example, when the master unit 1 and the subsidiary units 2_ij making up the scalable television system 1161 are outputting the image data and audio data of a television program on a specific channel, as shown in Fig. 82A, if the warning processing section 1137D of the master unit 1 outputs a warning processing request to the security controller 1137A, the display on the CRT 11 of the master unit 1 is switched from the image data of the television program currently being received to the image data (emergency image data) sent from the video camera 1162 that is sending image data and audio data to the master unit 1, as shown in Fig. 82B.
In addition, the security controller 1137A requests the CPU 1129 to turn on or blink the warning display section 1139, so that the warning display section 1139, which is normally off, is turned on or made to blink, as shown in Fig. 82B.
Also in this case, even while watching the television program, the user can immediately know that an abnormal event has occurred in the situation being captured by the video camera 1162 that is sending image data and audio data to the master unit 1.
When the emergency level is low, it is also possible to only turn on or blink the warning display section 1139, or to only output a warning tone, without switching to the display of the emergency image data.
Next, Fig. 83 shows another example structure of a television receiver serving as the master unit 1 of the scalable television system 1161. In the figure, the parts corresponding to those shown in Fig. 64 are assigned the same symbols as used in Fig. 64, and descriptions thereof are omitted where unnecessary.
Specifically, the master unit 1 shown in Fig. 64 is a television receiver for receiving digital broadcasting, whereas the master unit 1 shown in Fig. 83 is a television receiver for receiving analog broadcasting.
A tuner 1221 detects and demodulates an analog television broadcast signal on a predetermined channel. The tuner 1221 sends the image data obtained by demodulation to a Y/C separation section 1222, and sends the audio data to the selector 1127.
The Y/C separation section 1222 separates the luminance signal Y and the chrominance signals R-Y and B-Y from the output of the tuner 1221, and sends them to the selector 1127.
A matrix circuit 1223 converts the color space of the image data sent from the selector 1127 where necessary, and sends it to the CRT 11.
A television receiver for receiving analog broadcasting, constructed as described above, can also be used to make up the scalable television system 1161.
A television receiver serving as a subsidiary unit 2 can likewise be constructed as a television receiver for receiving analog broadcasting, in the same way as the television receiver serving as the master unit 1 shown in Fig. 83.
The processing sequences described above can be implemented by hardware or by software. When the processing sequences are implemented by software, a program constituting the software is installed in a general-purpose computer or the like.
Fig. 84 shows an example structure of a computer, according to an embodiment, in which the program for executing the processing sequences described above is installed.
The program can be recorded in advance in a hard disk 305 or a ROM 303 serving as a recording medium built into the computer.
Alternatively, the program can be temporarily or permanently recorded in a removable recording medium 311, such as a floppy disk, a CD-ROM (compact disc read-only memory), an MO (magneto-optical) disc, a DVD (digital versatile disc), a magnetic disk, or a semiconductor memory. Such a removable recording medium 311 can be provided as so-called packaged software.
The program can be installed in the computer from the removable recording medium 311 described above. In addition, the program can be transferred to the computer wirelessly from a download site through an artificial satellite for digital satellite broadcasting, or transferred to the computer by wire through a network such as a LAN (local area network) or the Internet; the program transferred in this way can be received by a communication section 308 and installed in the hard disk 305 built into the computer.
The computer includes a CPU (central processing unit) 302. The CPU 302 is connected to an input/output interface 310 through a bus 301. When the user operates an input section 307 formed by a keyboard, a mouse, a microphone, and the like to input a command through the input/output interface 310, the CPU 302 executes a program stored in a ROM (read-only memory) 303 according to the command. Alternatively, the CPU 302 loads into a RAM (random access memory) 304 and executes a program stored in the hard disk 305; a program transferred through a satellite or a network, received by the communication section 308, and installed in the hard disk 305; or a program read from the removable recording medium 311 and installed in the hard disk 305. In this way, the CPU 302 performs the processing according to the flowcharts described above or the processing performed by the structures of the block diagrams described above. Then, as required, the CPU 302 outputs the processing result from an output section 306 formed by an LCD (liquid crystal display), a speaker, and the like through the input/output interface 310, sends it from the communication section 308, or records it in the hard disk 305, for example.
In this specification, the processing steps describing the program for making the computer perform various kinds of processing do not necessarily need to be processed in a time-sequential manner in the order described in the flowcharts, and also include processing executed in parallel or individually (such as parallel processing or object-based processing).
The program may be processed by one computer, or may be processed in a distributed manner by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed there.
The selling price of a television receiver making up the scalable television system may be set differently depending on, for example, whether the television receiver serves as a master unit or a subsidiary unit and, if it serves as a subsidiary unit, how many subsidiary units are already in use before that subsidiary unit.
Specifically, since the full-screen display function cannot be provided in the scalable television system without a master unit, as described above, the master unit has a high value, and the selling price of the master unit can be set high.
It can be expected that the user will additionally buy a plurality of subsidiary units as necessary after buying the master unit. The selling prices of the first few subsidiary units can be set, for example, lower than the selling price of the master unit but higher than the price of an ordinary television receiver. The selling prices of subsidiary units bought later can be set even lower.
A television receiver serving as the master unit in the scalable television system can be constructed, for example, by adding the security system section 1137 to an ordinary digital television receiver and changing the program executed by the CPU 1129. Therefore, since a television receiver serving as the master unit in the scalable television system can be manufactured relatively easily by using an ordinary digital television receiver, it offers high cost-effectiveness in view of the full-screen display function and the warning function described above. The same applies to a television receiver serving as a subsidiary unit.
The present invention can also be applied to a display apparatus without a tuner, used to output images and sound sent from the outside, as well as to a television receiver, which is a display apparatus with a built-in tuner.
In addition, in the security system shown in Fig. 66, the image data and audio data may be sent from the video camera 1162 to the television receivers making up the scalable television system 1161 not wirelessly but by wire (for example, by IEEE-1394 communication or USB (universal serial bus) communication).
As the video camera 1162, besides a video camera prepared for the security system, it is also possible to use, for example, a video camera installed for a so-called door intercom (door phone) or a surveillance camera.
In addition, the security system can be used to watch over infants and elderly people, to monitor the amount of hot water supplied to a bathtub, and to monitor the boiling of water in a kettle and sound an alarm when the water boils.
In the present embodiment, the processing is applied to image data captured by an ordinary video camera 1162. For example, a thermographic camera may be used as the video camera 1162, and the processing may be applied to image data representing a temperature distribution obtained from that camera.
In addition, it is also possible to monitor temperature changes by sensing infrared rays and to issue a warning.
Industrial applicability
As described above, according to the present invention, when a large number of display apparatuses are connected and used, a higher-level function is implemented than when a display apparatus is used alone.
In addition, according to the present invention, the user can easily be made aware of an abnormal event (emergency) through an existing television receiver or the like, so that the existing television receiver can be used effectively.

Claims (4)

1. A display apparatus connected to another display apparatus, characterized by comprising:
storage means for storing, under the control of the other display apparatus, one scene of image data included in input data that includes at least image data;
display means for displaying the image data stored in the storage means; and
authentication means for performing authentication with the other display apparatus,
further characterized in that, when the authentication is performed successfully, the storage means stores one scene of the image data included in the input data under the control of the other display apparatus; and the display means displays the image data stored in the storage means.
2. The display apparatus according to claim 1, characterized by further comprising:
communication means for communicating with the other display apparatus; and
receiving means for receiving the input data, and characterized in that,
when the communication means receives, from the other display apparatus, a command instructing the apparatus to receive the input data and to display one scene of the image data included in the input data,
the receiving means receives the input data;
the storage means stores one scene of the image data included in the input data received by the receiving means; and
the display means displays the image data stored in the storage means.
3. The display apparatus according to claim 1, characterized by further comprising:
communication means for communicating with the other display apparatus, and characterized in that,
when the communication means receives one scene of the image data included in the input data,
the storage means stores one scene of the image data included in the input data received by the communication means; and
the display means displays the image data stored in the storage means.
4. A control method for a display apparatus, wherein the display apparatus is connected to another display apparatus, the control method being characterized by comprising:
a storing step of storing, under the control of the other display apparatus, one scene of image data in input data that includes at least image data;
a display step of displaying the image data stored in the storing step; and
an authentication step of performing authentication with the other display apparatus,
further characterized in that, when the authentication is performed successfully, one scene of the image data included in the input data is stored under the control of the other display apparatus, and the image data stored in the storing step is displayed.
CN 200610092547 2001-12-28 2002-12-12 Display apparatus and control method Expired - Fee Related CN1921632B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP399867/01 2001-12-28
JP399865/01 2001-12-28
JP2001399865A JP3624883B2 (en) 2001-12-28 2001-12-28 Display control apparatus, display control method, recording medium, program, and display system
JP2001399867A JP2003199092A (en) 2001-12-28 2001-12-28 Display device and method for controlling, program and recording medium as well as display system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN028074300A Division CN1500345B (en) 2001-12-28 2002-12-12 Display apparatus and control method

Publications (2)

Publication Number Publication Date
CN1921632A true CN1921632A (en) 2007-02-28
CN1921632B CN1921632B (en) 2013-05-08

Family

ID=27604720

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200610092547 Expired - Fee Related CN1921632B (en) 2001-12-28 2002-12-12 Display apparatus and control method

Country Status (2)

Country Link
JP (1) JP3624883B2 (en)
CN (1) CN1921632B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4736659B2 (en) * 2005-09-15 2011-07-27 ソニー株式会社 Multi-screen television receiver remote control system, remote controller and operation method, multi-screen television receiver and operation method, recording medium, and program
CN102724424B (en) * 2011-11-29 2017-09-12 新奥特(北京)视频技术有限公司 A kind of method that teletext packaging scene switching is carried out using data file

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0749243A1 (en) * 1995-06-13 1996-12-18 Matsushita Electric Industrial Co., Ltd. Video monitor system
JP2000324465A (en) * 1999-05-13 2000-11-24 Sony Corp Pay tv broadcasting reception system, tv receiver and pay tv broadcasting reception method

Also Published As

Publication number Publication date
CN1921632B (en) 2013-05-08
JP3624883B2 (en) 2005-03-02
JP2003198972A (en) 2003-07-11


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20130508
Termination date: 20191212