CN100584003C - Transmitting apparatus and method - Google Patents
Transmitting apparatus and method
- Publication number
- CN100584003C
- Authority
- CN
- China
- Prior art keywords
- data
- click
- image
- consideration
- transmitting apparatus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Information Transfer Between Computers (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Image Processing (AREA)
Abstract
On the transmitting side, a background picture and objects 1 to 3 are transmitted at a transmission rate of R/4 each. On the receiving side, a picture composed of the objects 1 to 3 and the background picture is displayed with a particular spatial resolution and a particular temporal resolution. When, on the receiving side, the object 1 is dragged at a particular time t1, the transmitting side stops the transmission of the background picture and the objects 2 and 3 and transmits only the object 1, at the full transmission rate R of the transmission path. Thus a picture in which the spatial resolution of the dragged object 1 is improved is displayed, at the sacrifice of the temporal resolution of the picture.
Description
This application is a divisional of the Chinese invention patent application No. 00802048.5, filed on August 9, 2000 and entitled "Transmitting, receiving, and transmitting and receiving apparatus and method, recording medium, and signal".
Technical field
The present invention relates to a transmitting apparatus, a transmitting method, a receiving apparatus, a receiving method, a transmitting and receiving apparatus, a transmitting and receiving method, a recording medium, and a signal. In particular, the present invention relates to a transmitting apparatus, a transmitting method, a receiving apparatus, a receiving method, a transmitting and receiving apparatus, a transmitting and receiving method, a recording medium, and a signal that allow image data to be transmitted at a limited transmission rate (over a limited transmission bandwidth) and an image to be displayed with a high spatial resolution.
Background art
As a related art reference, Japanese Patent Laid-Open Publication No. Hei 10-112856 discloses an image transmitting apparatus. In this apparatus, the sender transmits the image data of a particular region of an image and the image data of the other regions with different amounts of information, in response to commands issued by the receiver. The receiver displays the region containing a specified point with a higher spatial resolution (resolution in the spatial direction) and the other regions with a lower spatial resolution.
In other words, when the sender transmits image data to the receiver through a transmission path, it cannot transmit the data at a rate higher than that of the transmission path. Thus, when the receiver is to display images in real time, the sender must transmit the image data to the receiver at a transmission rate matching the transmission path. Consequently, if the transmission rate is insufficient, the spatial resolution of the image displayed by the receiver deteriorates.
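As a rough illustration (the numbers below are illustrative and not taken from the patent), the bit rate required to transmit uncompressed video in real time is

$$R_{\mathrm{req}} = W \times H \times b \times f,$$

where W × H is the frame size in pixels, b is the number of bits per pixel, and f is the frame rate. Even a small 176 × 144 frame at 8 bits per pixel and 30 frames/second requires about 6.1 Mbps, far more than, for example, the 128 kbps PHS transmission path described later; the resolution of the displayed image must therefore be reduced in some direction.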
To solve this problem, in the image transmitting apparatus disclosed in Japanese Patent Laid-Open Publication No. Hei 10-112856, the image data of one region of the image and the image data of the other regions are transmitted with different amounts of information, and the receiver displays the region containing the specified point with a high spatial resolution and the other regions with a low spatial resolution.
In other words, in the image transmitting apparatus disclosed in Japanese Patent Laid-Open Publication No. Hei 10-112856, the spatial resolution of the part the user wants to watch closely is improved at the cost of (a reduction of) the spatial resolution of the other parts.
On the other hand, since the improvement of the spatial resolution of the part the user wants to watch closely is obtained at the cost of the other parts, the spatial resolution of that part can be improved only by as much spatial resolution as is sacrificed elsewhere.
In addition, when the transmission rate of the transmission path is very low, improving the spatial resolution of the part the user wants to watch closely at the cost of the other parts causes the spatial resolution of the other parts to deteriorate severely. In the worst case, the user cannot clearly see the other parts at all.
The present invention has been made from the above point of view. An object of the present invention is to allow the spatial resolution of an image to be improved in accordance with the user's preference.
Summary of the invention
A first transmitting apparatus of the present invention is a transmitting apparatus that transmits data to a receiving apparatus, comprising: receiving means for receiving control information transmitted from the receiving apparatus; control means for controlling, corresponding to the control information, the resolution of the data transmitted to the receiving apparatus in at least two of the temporal direction, the spatial direction, and the level direction; and transmitting means for transmitting, to the receiving apparatus, the data whose resolution has been controlled in at least two directions corresponding to the control information.
A receiving apparatus of the present invention is a receiving apparatus that receives data transmitted from a transmitting apparatus, comprising: transmitting means for transmitting control information to the transmitting apparatus so that, corresponding to the control information, the resolution of the data is controlled in at least two of the temporal direction, the spatial direction, and the level direction; receiving means for receiving the data transmitted from the transmitting apparatus, whose resolution has been controlled in at least two directions; and output means for outputting the data received by the receiving means.
A transmitting and receiving apparatus of the present invention is a transmitting and receiving apparatus comprising a transmitting apparatus for transmitting data and a receiving apparatus for receiving data, wherein the transmitting apparatus comprises: control information receiving means for receiving control information transmitted from the receiving apparatus; control means for controlling, corresponding to the control information, the resolution of the data transmitted to the receiving apparatus in at least two of the temporal direction, the spatial direction, and the level direction; and data transmitting means for transmitting, to the receiving apparatus, the data whose resolution has been controlled in at least two directions corresponding to the control information; and wherein the receiving apparatus comprises: control information transmitting means for transmitting the control information; data receiving means for receiving the data transmitted from the transmitting apparatus, whose resolution has been controlled in at least two directions; and output means for outputting the data received by the data receiving means.
A second transmitting apparatus of the present invention is a transmitting apparatus that transmits data to a receiving apparatus, comprising: receiving means for receiving control information transmitted from the receiving apparatus; classifying means for classifying the data corresponding to the control information; and transmitting means for transmitting the data to the receiving apparatus corresponding to the result of the classification.
A first transmitting method of the present invention is a transmitting method for transmitting data to a receiving apparatus, comprising the steps of: receiving control information transmitted from the receiving apparatus; controlling, corresponding to the control information, the resolution of the data transmitted to the receiving apparatus in at least two of the temporal direction, the spatial direction, and the level direction; and transmitting, to the receiving apparatus, the data whose resolution has been controlled in at least two directions corresponding to the control information.
A receiving method of the present invention is a receiving method for receiving data transmitted from a transmitting apparatus, comprising the steps of: transmitting control information to the transmitting apparatus so that, corresponding to the control information, the resolution of the data is controlled in at least two of the temporal direction, the spatial direction, and the level direction; receiving the data transmitted from the transmitting apparatus, whose resolution has been controlled in at least two directions; and outputting the data received in the receiving step.
A transmitting and receiving method of the present invention is a transmitting and receiving method comprising a process of a transmitting apparatus that transmits data and a process of a receiving apparatus that receives data, wherein the process of the transmitting apparatus comprises the steps of: receiving control information transmitted from the receiving apparatus; controlling, corresponding to the control information, the resolution of the data transmitted to the receiving apparatus in at least two of the temporal direction, the spatial direction, and the level direction; and transmitting, to the receiving apparatus, the data whose resolution has been controlled in at least two directions corresponding to the control information; and wherein the process of the receiving apparatus comprises the steps of: transmitting the control information; receiving the data transmitted from the transmitting apparatus, whose resolution has been controlled in at least two directions; and outputting the data received in the data receiving step.
A second transmitting method of the present invention is a transmitting method for transmitting data to a receiving apparatus, comprising the steps of: receiving control information transmitted from the receiving apparatus; classifying the data corresponding to the control information; and transmitting the data to the receiving apparatus corresponding to the result of the classification.
A first recording medium of the present invention is a recording medium on which a program is recorded, the program causing a computer to perform a transmission process for transmitting data to a receiving apparatus, the transmission process comprising the steps of: receiving control information transmitted from the receiving apparatus; controlling, corresponding to the control information, the resolution of the data transmitted to the receiving apparatus in at least two of the temporal direction, the spatial direction, and the level direction; and transmitting, to the receiving apparatus, the data whose resolution has been controlled in at least two directions corresponding to the control information.
A second recording medium of the present invention is a recording medium on which a program is recorded, the program causing a computer to perform a reception process for receiving data transmitted from a transmitting apparatus, the reception process comprising the steps of: transmitting control information to the transmitting apparatus so that, corresponding to the control information, the resolution of the data is controlled in at least two of the temporal direction, the spatial direction, and the level direction; receiving the data transmitted from the transmitting apparatus, whose resolution has been controlled in at least two directions; and outputting the data received in the receiving step.
A third recording medium of the present invention is a recording medium on which a program is recorded, the program causing a computer to perform a transmission process of a transmitting apparatus that transmits data and a reception process of a receiving apparatus that receives data, wherein the transmission process comprises the steps of: receiving control information transmitted from the receiving apparatus; controlling, corresponding to the control information, the resolution of the data transmitted to the receiving apparatus in at least two of the temporal direction, the spatial direction, and the level direction; and transmitting, to the receiving apparatus, the data whose resolution has been controlled in at least two directions corresponding to the control information; and wherein the reception process comprises the steps of: transmitting the control information; receiving the data transmitted from the transmitting apparatus, whose resolution has been controlled in at least two directions; and outputting the data received in the data receiving step.
A fourth recording medium of the present invention is a recording medium on which a program is recorded, the program causing a computer to perform a transmission process for transmitting data to a receiving apparatus, the transmission process comprising the steps of: receiving control information transmitted from the receiving apparatus; classifying the data corresponding to the control information; and transmitting the data to the receiving apparatus corresponding to the result of the classification.
A first signal of the present invention is a signal containing a program that causes a computer to perform a transmission process for transmitting data to a receiving apparatus, the transmission process comprising the steps of: receiving control information transmitted from the receiving apparatus; controlling, corresponding to the control information, the resolution of the data transmitted to the receiving apparatus in at least two of the temporal direction, the spatial direction, and the level direction; and transmitting, to the receiving apparatus, the data whose resolution has been controlled in at least two directions corresponding to the control information.
A second signal of the present invention is a signal containing a program that causes a computer to perform a reception process for receiving data transmitted from a transmitting apparatus, the reception process comprising the steps of: transmitting control information to the transmitting apparatus so that, corresponding to the control information, the resolution of the data is controlled in at least two of the temporal direction, the spatial direction, and the level direction; receiving the data transmitted from the transmitting apparatus, whose resolution has been controlled in at least two directions; and outputting the data received in the receiving step.
A third signal of the present invention is a signal containing a program that causes a computer to perform a transmission process of a transmitting apparatus that transmits data and a reception process of a receiving apparatus that receives data, wherein the transmission process comprises the steps of: receiving control information transmitted from the receiving apparatus; controlling, corresponding to the control information, the resolution of the data transmitted to the receiving apparatus in at least two of the temporal direction, the spatial direction, and the level direction; and transmitting, to the receiving apparatus, the data whose resolution has been controlled in at least two directions corresponding to the control information; and wherein the reception process comprises the steps of: transmitting the control information; receiving the data transmitted from the transmitting apparatus, whose resolution has been controlled in at least two directions; and outputting the data received in the data receiving step.
A fourth signal of the present invention is a signal containing a program that causes a computer to perform a transmission process for transmitting data to a receiving apparatus, the transmission process comprising the steps of: receiving control information transmitted from the receiving apparatus; classifying the data corresponding to the control information; and transmitting the data to the receiving apparatus corresponding to the result of the classification.
According to the first transmitting apparatus, the first transmitting method, the first recording medium, and the first signal of the present invention, control information transmitted from a receiving apparatus is received. Corresponding to the control information, the resolution of the data is controlled in at least two of the temporal direction, the spatial direction, and the level direction, and the data whose resolution has been controlled in at least two directions is transmitted to the receiving apparatus.
According to the receiving apparatus, the receiving method, the second recording medium, and the second signal of the present invention, control information is transmitted to a transmitting apparatus that controls, corresponding to the control information, the resolution of the data in at least two of the temporal direction, the spatial direction, and the level direction. The data whose resolution has been controlled in at least two directions corresponding to the control information is then received from the transmitting apparatus and output.
According to the transmitting and receiving apparatus, the transmitting and receiving method, the third recording medium, and the third signal of the present invention, the transmitting apparatus receives control information transmitted from the receiving apparatus, controls, corresponding to the control information, the resolution of the data in at least two of the temporal direction, the spatial direction, and the level direction, and transmits the data whose resolution has been controlled in at least two directions to the receiving apparatus. The receiving apparatus transmits the control information to the transmitting apparatus, and receives and outputs the data whose resolution has been controlled in at least two directions corresponding to the control information.
According to the second transmitting apparatus, the second transmitting method, the fourth recording medium, and the fourth signal of the present invention, control information transmitted from a receiving apparatus is received, the data is classified corresponding to the control information, and the data is transmitted to the receiving apparatus corresponding to the result of the classification.
Brief description of the drawings
Fig. 1 is a schematic diagram showing an example of the structure of a transmission system according to an embodiment of the present invention;
Fig. 2 is a detailed diagram showing a first example of the structure of the transmission system shown in Fig. 1;
Fig. 3 is a block diagram showing an example of the structure of the transmitting apparatus (terminal unit) 1 shown in Fig. 1;
Fig. 4 is a flow chart explaining the processing of the transmitting apparatus 1 shown in Fig. 3;
Fig. 5 is a block diagram showing an example of the structure of the receiving apparatus (terminal unit) 2 shown in Fig. 1;
Fig. 6 is a flow chart explaining the processing of the receiving apparatus 2 shown in Fig. 5;
Fig. 7 is a block diagram showing an example of the structure of the transmission processing section 16 shown in Fig. 3;
Fig. 8 is a block diagram showing an example of the structure of the encoding section 31 shown in Fig. 7;
Figs. 9A, 9B and 9C are schematic diagrams explaining hierarchical encoding/decoding;
Fig. 10 is a flow chart explaining the transmission processing of the transmission processing section 16 shown in Fig. 7;
Fig. 11 is a block diagram showing an example of the structure of the reception processing section 21 shown in Fig. 5;
Fig. 12 is a block diagram showing an example of the structure of the decoding section 53 shown in Fig. 11;
Fig. 13 is a block diagram showing an example of the structure of the combining processing section 22 shown in Fig. 5;
Fig. 14 is a flow chart explaining the combining processing of the combining processing section 22 shown in Fig. 13;
Figs. 15A, 15B and 15C are schematic diagrams showing examples of images displayed on the image output section 23 shown in Fig. 5;
Figs. 16A and 16B are schematic diagrams explaining the relation between the spatial resolution and the temporal resolution of an image transmitted from the transmitting apparatus 1 to the receiving apparatus 2 shown in Fig. 1;
Fig. 17 is a block diagram showing an example of the structure of the object extracting section 14 shown in Fig. 3;
Figs. 18A and 18B are schematic diagrams explaining the processing of the initial area dividing section 83 shown in Fig. 17;
Figs. 19A, 19B, 19C and 19D are schematic diagrams explaining the processing of the area merging section 84 shown in Fig. 17;
Fig. 20 is a schematic diagram explaining the processing of the merged area processing section 85 and the separated area processing section 86 shown in Fig. 17;
Fig. 21 is a flow chart explaining the object extraction processing of the object extracting section 14 shown in Fig. 17;
Fig. 22 is a flow chart explaining the details of the area merging processing at step S43 shown in Fig. 21;
Fig. 23 is a flow chart explaining the details of the merged area processing at step S44 shown in Fig. 21;
Fig. 24 is a flow chart explaining the details of the separated area processing at step S44 shown in Fig. 21;
Fig. 25 is a block diagram showing an example of the structure of the control section 35 shown in Fig. 7;
Fig. 26 is a schematic diagram explaining the feature amount of an object;
Fig. 27 is a flow chart explaining the details of the processing of the control section 35 shown in Fig. 25;
Fig. 28 is a detailed diagram showing a second example of the structure of the transmission system shown in Fig. 1;
Fig. 29 is a block diagram showing an example of the structure of the transmitting apparatus 1 shown in Fig. 28;
Fig. 30 is a flow chart explaining the processing of the transmitting apparatus 1 shown in Fig. 29;
Fig. 31 is a block diagram showing an example of the structure of the receiving apparatus 2 shown in Fig. 28;
Fig. 32 is a flow chart explaining the processing of the receiving apparatus 2 shown in Fig. 31;
Fig. 33 is a block diagram showing an example of the structure of the transmission processing section 1016 shown in Fig. 29;
Fig. 34 is a flow chart explaining the transmission processing of the transmission processing section 1016 shown in Fig. 33;
Fig. 35 is a block diagram showing an example of the structure of the combining processing section shown in Fig. 31;
Fig. 36 is a flow chart explaining the combining processing of the combining processing section shown in Fig. 35;
Fig. 37 is a schematic diagram showing a practical example of the structure of the object extracting section 1014 shown in Fig. 29;
Fig. 38 is a flow chart explaining the extraction processing of a moving object image or a still object image;
Fig. 39 is a flow chart explaining the determination processing of a still area or a moving area;
Figs. 40A, 40B, 40C, 40D and 40E are schematic diagrams explaining the calculation method of frame differences;
Fig. 41 is a flow chart explaining the continuous-click determination processing;
Figs. 42A, 42B, 42C, 42D, 42E and 42F are schematic diagrams explaining the object number assigning method;
Fig. 43 is a flow chart explaining the still object combining processing;
Fig. 44 is a flow chart explaining the moving object combining processing;
Fig. 45 is a flow chart explaining the object extraction processing;
Figs. 46A, 46B, 46C, 46D and 46E are schematic diagrams explaining the object extraction method;
Fig. 47 is a block diagram showing another example of the structure of the combining processing section 1022 shown in Fig. 31;
Fig. 48 is a block diagram showing another example of the structure of the transmitting apparatus 1 shown in Fig. 28;
Fig. 49 is a block diagram showing an example of the structure of the change determining and classifying section 240 shown in Fig. 48;
Figs. 50A and 50B are flow charts explaining the interest object area change processing and the interest object area classification processing of the change determining and classifying section 240;
Fig. 51 is a flow chart explaining the interest change determination processing;
Fig. 52 is a schematic diagram showing a detailed third example of the structure of the transmission system shown in Fig. 1;
Fig. 53 is a block diagram showing an example of the structure of the charging server 4 shown in Fig. 52;
Fig. 54 is a flow chart explaining the processing of the charging server 4 shown in Fig. 53;
Fig. 55 is a block diagram showing an example of the structure of a computer according to an embodiment of the present invention.
Best mode for carrying out the invention
Fig. 1 shows the structure of an image transmission system according to an embodiment of the present invention. (The term "system" refers to an entity in which a plurality of devices are logically combined, regardless of whether the individual devices are housed in the same enclosure.)
The transmission system is composed of at least two terminal units 1 and 2. In the relation between the terminal units 1 and 2, one is a transmitting apparatus and the other is a receiving apparatus. An image (image data) is transmitted from the transmitting apparatus to the receiving apparatus through a transmission path 3.
According to the embodiments of the present invention, it is assumed that, for example, the terminal unit 1 is the transmitting apparatus and the terminal unit 2 is the receiving apparatus, and that image data is transmitted and received between them. Hereinafter, the terminal unit 1 and the terminal unit 2 are referred to as the transmitting apparatus 1 and the receiving apparatus 2, respectively.
In this case, the transmitting apparatus 1 transmits image data to the receiving apparatus 2 through the transmission path 3. The receiving apparatus 2 receives the image data from the transmitting apparatus 1 and displays it on an image output section 23 (see Fig. 5, described later) composed of, for example, an LCD or a CRT (cathode ray tube). In addition, the receiving apparatus 2 transmits to the transmitting apparatus 1, through the transmission path 3, control information for controlling the spatial resolution in the spatial direction and the temporal resolution in the temporal direction of the image displayed on the image output section 23.
The transmitting apparatus 1 receives the control information from the receiving apparatus 2 and controls the transmission of the image data corresponding to the control information, so that the spatial resolution and the temporal resolution of the image displayed by the receiving apparatus 2 are changed while a predetermined condition is satisfied.
As the transmitting apparatus 1 and the receiving apparatus 2, portable terminal units such as PHS (Personal Handy-phone System) (trademark) units and portable telephone units can be used. When PHS units are used, the transmission path 3 uses the 1895.1500 to 1905.9500 MHz band and its transmission rate is 128 kbps (bits per second).
Fig. 2 shows a first example of the structure of the image transmission system shown in Fig. 1, in which portable terminal units such as PHS units and portable telephone units are used as the transmitting apparatus 1 and the receiving apparatus 2 shown in Fig. 1.
In the embodiment shown in Fig. 2, the transmission path 3 shown in Fig. 1 is composed of radio base stations 3-1 and 3-2 and an exchange station 3-3, which is, for example, an exchange that connects the radio base stations 3-1 and 3-2. Each radio base station 3-1 or 3-2 transmits a radio signal to and receives a radio signal from the terminal unit 1 or 2. Each of the terminal units 1 and 2 can transmit a signal to the other and receive a signal from the other through the transmission path 3 composed of the radio base stations 3-1 and 3-2, the exchange station 3-3, a charging server 4, and so forth. The base stations 3-1 and 3-2 may be of the same type or of different types.
Referring to Fig. 2, the terminal unit 1 comprises a video camera section 1-1, a display section 1-2, a key section 1-3, a speaker 1-4, and a microphone 1-5. The video camera section 1-1 has an image pickup device and an optical system capable of capturing moving images. The display section 1-2 can display characters, symbols, images, and so forth. The key section 1-3 is operated when telephone numbers, characters, commands, and so forth are input. The speaker 1-4 outputs sound, and the microphone 1-5 inputs sound. Likewise, the terminal unit 2 comprises a video camera section 2-1, a display section 2-2, a key section 2-3, a speaker 2-4, and a microphone 2-5, which have the same structures as the video camera section 1-1, the display section 1-2, the key section 1-3, the speaker 1-4, and the microphone 1-5, respectively.
Between the terminal units 1 and 2, the audio signals collected by the microphones 1-5 and 2-5 are transmitted and received, and so is the image data captured by the video camera sections 1-1 and 2-1. Thus, the display section 1-2 of the terminal unit 1 can display the image data obtained by the video camera section 2-1 of the terminal unit 2, and the display section 2-2 of the terminal unit 2 can display the image data obtained by the video camera section 1-1 of the terminal unit 1.
In other words, the image data captured by the video camera section 1-1 of the transmitting apparatus 1 is transmitted, together with necessary information such as the frame rate, to the receiving apparatus 2 through the transmission path 3 composed of the base stations 3-1 and 3-2 and the exchange station 3-3. The receiving apparatus 2 displays the received image data (the corresponding image) on the display section 2-2 composed of, for example, an LCD. On the other hand, the receiving apparatus 2 transmits to the transmitting apparatus 1, through the transmission path 3, control information for controlling the spatial resolution and the temporal resolution of the image displayed on the display section 2-2. The transmitting apparatus 1 receives the control information from the receiving apparatus 2 and controls the transmission of the image data corresponding to the control information, so that the spatial resolution and the temporal resolution of the image displayed by the receiving apparatus 2 are changed while a predetermined condition is satisfied.
Fig. 3 shows an example of the structure of the transmitting apparatus 1 shown in Fig. 2.
The image output from an image input section 11 is supplied to a preprocessing section 12. In the preprocessing section 12, a background image extracting section 13 extracts the so-called background image from the image supplied from the image input section 11 and supplies the extracted background image to a transmission processing section 16. The background image extracted by the background image extracting section 13 is also supplied to an object extracting section 14 and an additional information calculating section 15.
As a method of extracting the background image, the frequency of occurrence of the pixel values at the same spatial position in a plurality of successive frames (for example, the current frame and the past 10 frames) is obtained, and the pixel value with the highest frequency is treated as the background at that position. Alternatively, the average value (a moving average) of the pixel values at the same position over a plurality of frames can be obtained and used as the background image at that position.
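The patent describes this extraction only in prose; the following is a minimal Python sketch of both variants (the per-pixel mode and the per-pixel average), under the assumption that the frames are grayscale NumPy arrays:

```python
import numpy as np

def extract_background_mode(frames):
    """Per-pixel mode: for each position, the pixel value that occurs most
    frequently across the supplied frames (e.g. the current frame and the
    10 preceding frames) is treated as the background at that position."""
    stack = np.stack(frames)                  # shape (T, H, W), e.g. uint8
    T, H, W = stack.shape
    background = np.empty((H, W), dtype=stack.dtype)
    for y in range(H):
        for x in range(W):
            values, counts = np.unique(stack[:, y, x], return_counts=True)
            background[y, x] = values[counts.argmax()]
    return background

def extract_background_mean(frames):
    """Alternative mentioned in the text: per-pixel (moving) average."""
    return np.stack(frames).mean(axis=0).astype(frames[0].dtype)
```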
The additional information calculating section 15 detects a background image motion vector and object motion vectors. The background image motion vector represents the motion of the background image extracted by the background image extracting section 13 (corresponding to the motion of the shooting direction of the image input section 11); an object motion vector represents the motion of an object extracted by the object extracting section 14. In addition, the additional information calculating section 15 supplies to the transmission processing section 16, as part of the additional information, information supplied from the object extracting section 14 such as the position of each object within the frame. In other words, when the object extracting section 14 extracts an object, it also extracts information about the object, such as its position, and supplies that information to the additional information calculating section 15, which outputs the positional information and the like as additional information.
The transmission processing section 16 multiplexes the background image extracted by the background image extracting section 13, the objects extracted by the object extracting section 14, and the additional information obtained by the additional information calculating section 15, and transmits the resulting multiplexed data to the receiving apparatus 2 through the transmission path 3 at a data rate the path can carry. The transmission processing section 16 also receives the control information transmitted from the receiving apparatus 2 through the transmission path 3 and controls the transmission of the background image, the objects, and the additional information corresponding to the control information, so that the spatial resolution and the temporal resolution of the image displayed by the receiving apparatus 2 are changed while a predetermined condition is satisfied.
The processing of the transmitting apparatus 1 shown in Fig. 3 will now be briefly described with reference to the flow chart shown in Fig. 4.
The image output from the image input section 11 is supplied to the preprocessing section 12. At step S1, the preprocessing section 12 preprocesses the image: the background image extracting section 13 and the object extracting section 14 extract the background image and the objects, respectively, from the image input from the image input section 11 and supply them to the transmission processing section 16, and the additional information calculating section 15 obtains the above-described additional information from the image input from the image input section 11 and supplies it to the transmission processing section 16. The transmission processing section 16 transmits the background image, the objects, and the additional information through the transmission path 3. Thereafter, the flow returns to step S1 and the transmitting apparatus 1 repeats similar processing.
Fig. 5 shows an example of the structure of the receiving apparatus 2 shown in Fig. 2.
The multiplexed data transmitted from the transmitting apparatus 1 through the transmission path 3 is received by a reception processing section 21. The reception processing section 21 separates the multiplexed data into the background image, the objects, and the additional information and supplies them to a combining processing section 22.
When a control information input section 24 supplies control information to a control information transmitting section 25, the control information transmitting section 25 transmits the control information to the transmitting apparatus 1 through the transmission path 3.
The processing of the receiving apparatus 2 shown in Fig. 5 will now be briefly described with reference to the flow chart shown in Fig. 6.
In the receiving apparatus 2, the reception processing section 21 receives the multiplexed data from the transmitting apparatus 1 through the transmission path 3. At step S11, the reception processing section 21 performs reception processing in which the multiplexed data is separated into the background image, the objects, and the additional information, which are supplied to the combining processing section 22 as the result of the reception processing. At step S12, the combining processing section 22 combines the background image, the objects, and the additional information into the original image and supplies the combined image to the image output section 23, which displays it. Thereafter, the flow returns to step S11 and the receiving apparatus 2 repeats similar processing.
In the receiving apparatus 2, when the user designates a consideration point on the image displayed on the image output section 23 with the control information input section 24, the coordinates of the consideration point are placed in the control information, which is supplied to the control information transmitting section 25. The control information transmitting section 25 transmits the control information to the transmitting apparatus 1 through the transmission path 3. When the transmitting apparatus 1 receives such control information, the transmission processing section 16 controls the transmission of the background image, the objects, and the additional information so that the spatial resolution and the temporal resolution of the image displayed by the receiving apparatus 2 are changed while a predetermined condition is satisfied. Thereafter, since the background image, the objects, and the additional information controlled in this manner are transmitted from the transmitting apparatus 1 to the receiving apparatus 2, an image whose spatial resolution and temporal resolution have been changed while the predetermined condition is satisfied is displayed by the receiving apparatus 2.
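The patent does not specify how the control information is encoded on the wire; purely as an illustration, a sketch of packing the consideration-point coordinates into a control-information message might look as follows (the field layout, type code, and timestamp field are assumptions):

```python
import struct
import time

CONSIDERATION_POINT = 0x01   # hypothetical message-type code

def make_control_info(x, y):
    """Pack the consideration-point coordinates into a control message:
    type (uint32), x (uint32), y (uint32), timestamp (float64)."""
    return struct.pack(">IIId", CONSIDERATION_POINT, x, y, time.time())

def parse_control_info(payload):
    msg_type, x, y, timestamp = struct.unpack(">IIId", payload)
    return {"type": msg_type, "x": x, "y": y, "time": timestamp}
```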
Fig. 7 shows an example of the structure of the transmission processing section 16 of the transmitting apparatus 1 shown in Fig. 3.
The background image, the objects, and the additional information are supplied from the preprocessing section 12 (see Fig. 3) to an encoding section 31 and a control section 35. The encoding section 31 encodes the background image, the objects, and the additional information and supplies the resulting encoded data to a MUX (multiplexer) 32. Under the control of the control section 35, the MUX 32 selects from the encoded data of the background image, the objects, and the additional information received from the encoding section 31 and supplies the selected encoded data to a transmitting section 33 as multiplexed data. The transmitting section 33 modulates the multiplexed data received from the MUX 32 and transmits the modulated data to the receiving apparatus 2 through the transmission path 3. A data amount calculating section 34 monitors the multiplexed data that the MUX 32 outputs to the transmitting section 33, calculates its data rate, and supplies the calculated data rate to the control section 35.
The control section 35 controls the output of multiplexed data by the MUX 32 so that the data rate calculated by the data amount calculating section 34 does not exceed the transmission rate of the transmission path 3. In addition, the control section 35 receives the control information transmitted from the receiving apparatus 2 through the transmission path 3 and controls the selection of encoded data by the MUX 32 corresponding to the control information.
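As a sketch of this rate control, the following chooses how many hierarchical levels of encoded data to multiplex so that the resulting rate stays within the transmission path; the greedy coarse-to-fine policy is an assumption, since the patent requires only that the output rate not exceed that of the path:

```python
def select_levels(level_bits_per_frame, link_rate, frame_rate):
    """Pick hierarchy levels, coarsest first, until the per-frame bit
    budget imposed by the link rate would be exceeded.

    level_bits_per_frame: bits per frame for each level, coarsest first.
    Returns (selected level indices, resulting bit rate)."""
    budget = link_rate / frame_rate          # bits available per frame
    selected, used = [], 0
    for level, bits in enumerate(level_bits_per_frame):
        if used + bits > budget:
            break
        selected.append(level)
        used += bits
    return selected, used * frame_rate

# e.g. select_levels([4_000, 12_000, 48_000], 128_000, 30) -> ([0], 120000):
# only the coarsest level fits when displaying at 30 frames/second.
```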
Fig. 8 shows an example of the structure of the encoding section 31 shown in Fig. 7.
The background image is supplied to a difference calculating section 41B. The difference calculating section 41B subtracts, from the background image of the frame currently being processed (hereinafter referred to as the considered frame), the just-processed background image of the preceding frame, and supplies the resulting difference data of the background image to a hierarchical encoding section 42B. The hierarchical encoding section 42B hierarchically encodes the difference data of the background image received from the difference calculating section 41B and supplies the encoding result to a storage section 43B. The storage section 43B temporarily stores the hierarchical encoding result received from the hierarchical encoding section 42B, and the stored result is supplied to the MUX 32 (see Fig. 7) as the encoded data of the background image.
The hierarchical encoding result stored in the storage section 43B is also supplied to a local decoder 44B. The local decoder 44B decodes the hierarchical encoding result back into the original background image and supplies the decoded background image to the difference calculating section 41B, which uses this locally decoded background image to obtain the difference data of the background image of the next frame.
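A minimal sketch of this closed-loop difference coding (lossless integer differences are a simplification; in the patent the differences are further hierarchically encoded before transmission):

```python
import numpy as np

class DifferenceEncoder:
    """Difference coding with a local decoder, as in Fig. 8: differences
    are taken against the locally decoded previous frame, so the encoder
    stays in step with the receiver's decoder."""
    def __init__(self):
        self.prev_decoded = None

    def encode(self, frame):
        frame = frame.astype(np.int16)
        if self.prev_decoded is None:
            diff = frame.copy()               # first frame is sent as-is
            self.prev_decoded = diff.copy()
        else:
            diff = frame - self.prev_decoded
            # Local decoding: reconstruct exactly what the receiver holds.
            self.prev_decoded = self.prev_decoded + diff
        return diff

class DifferenceDecoder:
    def __init__(self):
        self.prev = None

    def decode(self, diff):
        self.prev = diff.copy() if self.prev is None else self.prev + diff
        return self.prev
```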
An object is supplied to a difference calculating section 41F. The difference calculating section 41F, a hierarchical encoding section 42F, a storage section 43F, and a local decoder 44F perform the same processing as the difference calculating section 41B, the hierarchical encoding section 42B, the storage section 43B, and the local decoder 44B described above. Thus, in the same manner as the background image, the object is hierarchically encoded and supplied to the MUX 32 (see Fig. 7). When there are a plurality of objects, the difference calculating section 41F, the hierarchical encoding section 42F, the storage section 43F, and the local decoder 44F hierarchically encode each of the objects in the above-described manner.
The additional information is supplied to a VLC (variable length coding) section 45. The VLC section 45 encodes the additional information with a variable length code and supplies the encoded additional information to the MUX 32 (see Fig. 7).
The hierarchical encoding/decoding performed by the encoding section 31 shown in Fig. 8 will now be described with reference to Figs. 9A to 9C.
Suppose that the average value of four pixels arranged as 2 × 2 pixels (horizontal × vertical) of a lower level is treated as one pixel (pixel value) of the next higher level, and that an image is encoded with three hierarchical levels. In this case, for the 4 × 4 pixels of the lowest (first) level shown in Fig. 9A, the average value m0 of the four pixels h00, h01, h02, and h03 of the upper-left 2 × 2 block is calculated and treated as the upper-left pixel of the second level. Likewise, the average value m1 of the four pixels h10, h11, h12, and h13 of the upper-right block, the average value m2 of the four pixels h20, h21, h22, and h23 of the lower-left block, and the average value m3 of the four pixels h30, h31, h32, and h33 of the lower-right block of the lowest level are calculated and treated as the upper-right, lower-left, and lower-right pixels of the second level, respectively. In addition, the average value q of the four 2 × 2 pixels m0, m1, m2, and m3 of the second level is calculated and treated as the single pixel of the third, highest level.
With the hierarchical encoding described above, the spatial resolution of the image at the highest level is the lowest. The spatial resolution of an image decreases as its hierarchical level rises, and the spatial resolution of the image at the lowest level is the highest.
If all of the pixels h00 to h03, h10 to h13, h20 to h23, h30 to h33, m0 to m3, and q are transmitted, the amount of data (21 values) is larger than in the case where only the lowest-level pixels (16 values) are transmitted, by the amount of the higher-level pixels m0 to m3 and q.
Therefore, as shown in Fig. 9B, the third-level pixel q is transmitted in place of the lower-right pixel m3 of the second-level pixels m0 to m3.
In addition, as shown in Fig. 9C, the second-level pixel m0 is transmitted in place of the lower-right pixel h03 of the lowest-level pixels h00 to h03. Likewise, the remaining second-level pixels m1 and m2 and the pixel q are transmitted in place of the first-level pixels h13, h23, and h33. Although q is not a second-level pixel, it is transmitted in place of the m3 that would be obtained directly from h30 to h33; that is, q is transmitted in place of the m3 that would replace h33.
Thus, as shown in Fig. 9C, the number of values transmitted is 4 × 4 = 16, the same as in the case where only the lowest-level pixels shown in Fig. 9A are transmitted. Therefore the amount of data to be transmitted does not increase.
The pixel m3 replaced by q and the pixels h03, h13, h23, and h33 replaced by m0 to m3 can be decoded in the following manner.
Since q is the average of m0 to m3, the equation q = (m0 + m1 + m2 + m3) / 4 holds. Therefore, using the equation m3 = 4 × q - (m0 + m1 + m2), the second-level pixel m3 can be obtained (decoded) from the third-level pixel q and the second-level pixels m0 to m2.
In addition, since m0 is the average of h00 to h03, the equation m0 = (h00 + h01 + h02 + h03) / 4 holds. Therefore, using the equation h03 = 4 × m0 - (h00 + h01 + h02), the first-level pixel h03 can be obtained from the second-level pixel m0 and the first-level pixels h00 to h02. In the same manner, h13, h23, and h33 can be obtained.
Thus, a pixel of a particular level that is not transmitted can be decoded from the transmitted pixels of that level and a pixel of the next higher level.
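A minimal Python sketch of the scheme of Figs. 9A to 9C for one 4 × 4 block; floating-point averages are assumed so that the reconstruction is exact (the text does not specify a rounding rule):

```python
import numpy as np

def encode_hierarchy(block):
    """Three-level hierarchical encoding of a 4x4 block: each second-level
    pixel m is the mean of a 2x2 block of h's, and q is the mean of the
    m's. Only 16 values are sent: the lower-right pixel of each 2x2 block
    is replaced by its m, and h33's slot carries q instead of m3."""
    h = block.astype(np.float64)
    m = h.reshape(2, 2, 2, 2).mean(axis=(1, 3))   # m[i, j]: mean of block (i, j)
    q = m.mean()
    sent = h.copy()
    sent[1, 1], sent[1, 3], sent[3, 1] = m[0, 0], m[0, 1], m[1, 0]
    sent[3, 3] = q                                # q stands in for m3
    return sent                                   # still 16 values

def decode_hierarchy(sent):
    """Recover the omitted pixels: m3 = 4q - (m0 + m1 + m2), then the
    lower-right h of each 2x2 block is 4m - (the other three h's)."""
    m = np.empty((2, 2))
    m[0, 0], m[0, 1], m[1, 0] = sent[1, 1], sent[1, 3], sent[3, 1]
    q = sent[3, 3]
    m[1, 1] = 4 * q - (m[0, 0] + m[0, 1] + m[1, 0])
    h = sent.copy()
    for i in range(2):
        for j in range(2):
            blk = h[2 * i:2 * i + 2, 2 * j:2 * j + 2]
            blk[1, 1] = 4 * m[i, j] - (blk[0, 0] + blk[0, 1] + blk[1, 0])
    return h

# Round trip: decode_hierarchy(encode_hierarchy(img)) reproduces img exactly.
```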
The transmission processing of the transmission processing section 16 shown in Fig. 7 will now be described with reference to the flow chart shown in Fig. 10.
At step S21, in the transmission processing section 16, the control section 35 determines whether control information has been transmitted from the receiving apparatus 2. When the determination result at step S21 shows that no control information has been transmitted (that is, the control section 35 has not received control information), the flow advances to step S22. At step S22, the control section 35 controls the MUX 32 to select and multiplex the encoded data of the background image, the objects, and the additional information so that the receiving apparatus 2 can display an image with the normal temporal resolution (for example, the default temporal resolution).
When the normal temporal resolution is, for example, 30 frames/second, the MUX 32 selects the encoded data of the background image, the objects, and the additional information, multiplexes the selected encoded data, and outputs the multiplexed data so that the image is displayed with the highest temporal resolution at which the multiplexed data can be transmitted at the transmission rate of the transmission path 3.
In practice, for example, in the case of hierarchical encoding with three levels, when only the data of the third level can be transmitted at the transmission rate of the transmission path 3 for display at 30 frames/second, the encoded data of the background image, the objects, and the additional information for displaying the third-level image is selected. In this case, the receiving apparatus 2 displays an image whose spatial resolution in each of the horizontal and vertical directions is 1/4 that of the original (first-level) image, with a temporal resolution of 30 frames/second.
Thereafter, the flow advances to step S23. At step S23, the transmitting section 33 transmits the multiplexed data output from the MUX 32 through the transmission path 3. The flow then returns to step S21.
When the determination result at step S21 shows that control information has been transmitted from the receiving apparatus 2 (that is, the control section 35 has received the control information), the flow advances to step S24. At step S24, the control section 35 recognizes the consideration point designated by the user's operation of the control information input section 24 on the basis of the control information. Thereafter, the flow advances to step S25.
At step S25, the control section 35 designates, as a priority range within which the spatial resolution is to be improved with priority, an area of a predetermined size surrounding the consideration point (for example, a rectangle whose sides are parallel to the horizontal and vertical directions of the frame and whose center of gravity is the consideration point), and detects the background image, the objects, and the additional information that compose the image within the priority range.
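Illustratively, the priority range can be computed as below; the clipping at the frame edges is an assumption, since the patent specifies only a rectangle of predetermined size centered on the consideration point:

```python
def priority_range(cx, cy, box_w, box_h, frame_w, frame_h):
    """Rectangle of predetermined size (box_w x box_h) whose sides are
    parallel to the frame edges and whose center of gravity is the
    consideration point (cx, cy), clipped to the frame."""
    left = max(0, cx - box_w // 2)
    top = max(0, cy - box_h // 2)
    right = min(frame_w, left + box_w)
    bottom = min(frame_h, top + box_h)
    return left, top, right, bottom

def in_priority_range(x, y, rect):
    left, top, right, bottom = rect
    return left <= x < right and top <= y < bottom
```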
Thereafter, the flow advances to step S26. At step S26, the control section 35 controls the MUX 32 to select and multiplex the encoded data of the background image, the objects, and the additional information so that the receiving apparatus 2 can display the image in the priority range with a higher spatial resolution.
In other words, when the control section 35 receives control information from the receiving apparatus 2, it controls the MUX 32 so that the spatial resolution of the image in the priority range is improved at the cost of the temporal resolution.
Thus, the MUX 32 preferentially selects the encoded data of the background image, the objects, and the additional information required to display the second-level image as well as the third-level image for the priority range, multiplexes the selected encoded data, and outputs the multiplexed data.
In addition, at step S26, the control section 35 controls the MUX 32 to insert information such as the position and size of the priority range into the additional information selected as multiplexed data. Thereafter, the flow advances to step S23.
At step S23, as described above, the transmitting section 33 transmits the multiplexed data output from the MUX 32 through the transmission path 3. The flow then returns to step S21.
For simplicity, suppose that at step S26 the encoded data of the background image, the objects, and the additional information for displaying the third-level image continues to be selected for the image outside the priority range; then the amount of multiplexed data obtained at step S26 exceeds the amount obtained at step S22 by the second-level data for the image in the priority range. According to this embodiment, to display an image at 30 frames/second, only the third-level data can be transmitted at the transmission rate of the transmission path 3 in the above-described manner; thus the multiplexed data obtained at step S26 cannot be transmitted so that the image is displayed at 30 frames/second. In the extreme case, the multiplexed data obtained at step S26 is transmitted at step S23 with a temporal resolution of 0 frames/second.
Therefore, in this case, for the image in the priority range the receiving apparatus 2 displays an image whose spatial resolution in each of the horizontal and vertical directions is 1/2 that of the original (first-level) image; in other words, an image (the second-level image) whose horizontal and vertical spatial resolutions are each twice those of the third-level image. Its temporal resolution, however, is 0 frames/second; that is, the receiving apparatus 2 displays a still image.
After the second-level data for the image in the priority range has been transmitted, when the determination result at step S21 shows that control information has been transmitted from the receiving apparatus 2 again (that is, the user keeps operating the control information input section 24 and designates the same consideration point), the flow advances to step S24, where the control section 35 recognizes the same consideration point, and then to step S25, where the control section 35 designates the same priority range. The flow then advances to step S26.
At step S26, the control section 35 controls the MUX 32 to select and multiplex the encoded data of the background image, the objects, and the additional information so that the receiving apparatus 2 can display the image in the priority range with a still higher spatial resolution.
In this case, for the image in the priority range, the encoded data of the background image, the objects, and the additional information of the third and second levels has already been preferentially selected; now the first-level encoded data is preferentially selected as well and output as multiplexed data. In addition, at step S26, as described above, the high-resolution information is inserted into the additional information, and the flow advances to step S23. At step S23, the transmitting section 33 transmits the multiplexed data output from the MUX 32 through the transmission path 3, and the flow returns to step S21.
Therefore, in this case, for the image in the priority range the receiving apparatus 2 displays an image with the same spatial resolution as the original (first-level) image; in other words, an image whose horizontal and vertical spatial resolutions are each four times those of the third-level image. Its temporal resolution, however, is, for example, 0 frames/second; that is, the receiving apparatus 2 displays a still image.
Thus, when the user keeps operating the control information input section 24 to designate the same consideration point, data that improves the spatial resolution of the image in the priority range containing the consideration point is transmitted with priority, at the cost of the temporal resolution of the image. Although the temporal resolution of the image deteriorates, the spatial resolution of the image in the priority range containing the consideration point is improved, and that part of the image is displayed more clearly. In other words, the part of the image the user is considering is displayed more clearly.
As described above, the transmission of the image data is controlled so that the spatial resolution and the temporal resolution of the image within the priority range containing the consideration point are changed within the range of resolutions allowed by the transmission rate of the transmission path 3; that is, the transmission is controlled so that the spatial resolution of the image in the priority range is improved while the temporal resolution of the image deteriorates. As a result, within a limited transmission rate, the spatial resolution of the image displayed by the receiving apparatus 2 can be improved further.
Fig. 11 shows an example of the structure of the reception processing section 21 shown in Fig. 5.
A receiving section 51 receives the multiplexed data through the transmission path 3, demodulates it, and supplies it to a DMUX (demultiplexer) 52. The DMUX 52 separates the multiplexed data received from the receiving section 51 into the encoded data of the background image, the objects, and the additional information, and supplies the separated encoded data to a decoding section 53. The decoding section 53 decodes the encoded data of the background image, the objects, and the additional information back into the original data and supplies the decoded data to the combining processing section 22 (see Fig. 5).
Fig. 12 shows an example of the structure of the decoding section 53 shown in Fig. 11.
The hierarchically encoded difference data serving as the encoded data of the background image is supplied to an adder 61B. The adder 61B is also supplied with the already decoded background image of the preceding frame, stored in a storage section 62B. The adder 61B adds the difference data of the background image to the background image of the preceding frame received from the storage section 62B, thereby decoding the background image of the required hierarchical level. The decoded background image is supplied to the storage section 62B, stored there, and then supplied to the adder 61B and to the combining processing section 22 (see Fig. 5).
The hierarchically encoded difference data serving as the encoded data of an object is supplied to an adder 61F. The adder 61F and a storage section 62F perform processing similar to that of the adder 61B and the storage section 62B. Thus, in the same manner as the background image, the difference data of an object is decoded into the object of the required hierarchical level and then supplied to the combining processing section 22 (see Fig. 5). When there are a plurality of objects, the adder 61F and the storage section 62F decode (hierarchically decode) the difference data of each of the objects.
The variable-length-coded additional information serving as the encoded data of the additional information is supplied to an inverse VLC section 63. The inverse VLC section 63 decodes the additional information with the variable length code, whereby the original additional information is obtained and supplied to the combining processing section 22.
Figure 13 shows the structure example of combined treatment part 22 shown in Figure 5.
The background image (referring to Fig. 1) of decoded portion 53 outputs is supplied to background image and writes part 71.The object of decoded portion 53 outputs is supplied to object and writes part 72.The additional information of decoded portion 53 output is supplied to that background image writes part 71, object writes part 72 and built-up section 77.
The background image writing portion 71 successively writes the supplied background images to a background image memory 73. When an image has been shot while the video camera was panning or tilting, the background image moves. In that case, the background image writing portion 71 aligns the background image and then writes the aligned background image to the background image memory 73. The alignment is performed in accordance with a background image motion vector contained in the additional information. Thus, the background image memory 73 can store an image that is spatially wider than the image of one frame.
When the background image writing portion 71 writes a background image having a high spatial resolution to the background image memory 73, it changes the background image flags, stored at the corresponding addresses of a background image flag memory 74, from 0 to 1 for the pixels of that background image. When writing a background image to the background image memory 73, the background image writing portion 71 checks the background image flag memory 74 and does not write a background image having a low spatial resolution to a pixel whose background image flag is 1 (indicating that the pixel already stores a background image of high spatial resolution). Thus, basically, a background image is written to the background image memory 73 whenever one is supplied to the background image writing portion 71; however, a background image of low spatial resolution is never written over a pixel that stores a background image of high spatial resolution. As a result, each time a background image of high spatial resolution is supplied to the background image writing portion 71, the range of high spatial resolution in the background image memory 73 widens.
The object writing portion 72 successively writes the supplied objects to an object memory 75. When there are a plurality of objects, the object writing portion 72 writes each of them to the object memory 75. When the object writing portion 72 writes the same object (an object to which the same label has been assigned) to the object memory 75, it replaces the old object with the new object (the object most recently supplied to the object writing portion 72).
In addition, when the object writing portion 72 writes an object having a high spatial resolution to the object memory 75, it changes the object flags, stored at the corresponding addresses of an object flag memory 76, from 0 to 1 for the pixels of that object. When writing an object to the object memory 75, the object writing portion 72 checks the object flag memory 76 and does not write an object of low spatial resolution over an object of high spatial resolution stored in the object memory 75 (object flag = 1). Thus, in the same manner as with the background image memory 73, an object is basically written to the object memory 75 whenever one is supplied to the object writing portion 72; however, an object of low spatial resolution is never written over a pixel that stores an object of high spatial resolution. As a result, each time an object of high spatial resolution is supplied to the object writing portion 72, the number of objects of high spatial resolution stored in the object memory 75 increases.
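The flag-guarded writing rule is the same for both memory pairs and can be sketched as follows (a hypothetical Python illustration, with numpy arrays standing in for the background image memory 73 / flag memory 74 and the object memory 75 / flag memory 76; none of these names come from the patent):

```python
import numpy as np

def write_with_flag(memory, flags, data, mask, high_resolution):
    """Write newly supplied pixels into a memory guarded by one-bit
    resolution flags, as done by writing portions 71 and 72.

    memory, flags   : stored pixels and their one-bit flags (2-D arrays)
    data, mask      : newly supplied pixels and the region they cover
    high_resolution : True if the supplied data has high spatial resolution
    """
    if high_resolution:
        # High-resolution data always overwrites, and sets the flag to 1.
        memory[mask] = data[mask]
        flags[mask] = 1
    else:
        # Low-resolution data is written only where no high-resolution
        # data has been stored yet (flag still 0).
        writable = mask & (flags == 0)
        memory[writable] = data[writable]
```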
The background image memory 73 stores the background images supplied from the background image writing portion 71. The background image flag memory 74 stores, for each address of the background image memory 73, the above-described one-bit background image flag, which indicates whether a background image of high spatial resolution is stored at that address. The object memory 75 comprises at least one memory portion that stores each object supplied from the object writing portion 72. The object flag memory 76 stores the above-described one-bit object flags, which indicate whether objects of high spatial resolution are stored in the object memory 75.
In this case, for the sake of simplicity, the background image flags and the object flags are one-bit flags. Alternatively, they may be multi-bit flags, in which case they can represent many levels of resolution; whereas a one-bit flag can represent only the two levels of high resolution and low resolution, a multi-bit flag can represent more levels.
The combining portion 77 reads the background image of the currently displayed frame (this frame is referred to as the considered frame) from the background image memory 73 in accordance with the background image motion vector contained in the additional information. In addition, the combining portion 77 combines that background image with the objects stored in the object memory 75 in accordance with the object motion vectors contained in the additional information. As a result, the combining portion 77 generates the image of the considered frame and supplies it to a display memory 78.
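As a rough sketch of this combining step (hypothetical Python; the object layout, the meaning of the vectors, and the absence of clipping at the frame edges are all simplifying assumptions):

```python
import numpy as np

def combine_frame(background_memory, bg_vector, objects, frame_shape):
    """Cut the considered frame out of the wide stored background using the
    background motion vector, then paste each object at the position given
    by its object motion vector, as done by combining portion 77."""
    by, bx = bg_vector            # offset of the frame in the wide background
    h, w = frame_shape
    frame = background_memory[by:by + h, bx:bx + w].copy()
    for obj in objects:           # obj: dict with 'pixels', 'mask', 'vector'
        oy, ox = obj['vector']    # top-left position of the object in frame
        ph, pw = obj['pixels'].shape
        region = frame[oy:oy + ph, ox:ox + pw]
        # Overwrite the background only where the object mask is set.
        region[obj['mask']] = obj['pixels'][obj['mask']]
    return frame
```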
In addition, when the combining portion 77 receives control information from the control information input portion 24 (see Figure 5), the combining portion 77 reads from the object memory 75 the object at the consideration point contained in the control information and supplies that object to a sub-window memory 79.
The display memory 78 functions as a so-called VRAM (video RAM) and temporarily stores the image of the considered frame supplied from the combining portion 77.
The sub-window memory 79 temporarily stores the object supplied from the combining portion 77.
A superimposing portion 80 reads the stored contents of the display memory 78 and supplies them to the image output portion 23 (see Figure 5), which displays them. When necessary, the superimposing portion 80 opens a sub-window (described later) on the image output portion 23, reads the stored contents of the sub-window memory 79, and displays them in the sub-window.
Next, the processing (combining process) performed by the combining processing portion 22 shown in Figure 13 will be described with reference to Figure 14.
First, at step S31, the background image writing portion 71 and the object writing portion 72 write the background image and the objects supplied from the decoding portion 53 (see Figure 12) to their respective memories in the above-described manner, in accordance with the background image flags stored in the background image flag memory 74 and the object flags stored in the object flag memory 76, respectively.
In other words, the background image writing portion 71 checks the background image flag memory 74 and writes the supplied background image to an address of the background image memory 73 when the background image flag for the pixel corresponding to that address is 0. When the background image flag for the pixel corresponding to that address is 1, the background image writing portion 71 writes the supplied background image to that address only if the spatial resolution of the supplied background image is high.
Likewise, when the object flag in the object flag memory 76 is 0, the object writing portion 72 writes the supplied object to the object memory 75. When the object flag is 1, the object writing portion 72 writes the supplied object to the object memory 75 only if the spatial resolution of the supplied object is high.
When the background image writing portion 71 writes a background image to a particular address of the background image memory 73 at which a background image is already stored, it overwrites that address with the new background image. The same applies to the object memory 75.
Thereafter, the flow advances to step S32. At step S32, the background image writing portion 71 and the object writing portion 72 determine whether the additional information contains high-resolution information. When the determination result at step S32 indicates that the additional information contains high-resolution information (that is, the user has operated the control information input portion 24 (see Figure 5), the resulting control information has been sent to the transmitting apparatus 1, and the transmitting apparatus 1 has accordingly transmitted a background image and objects of high spatial resolution for the image within the priority range), the flow advances to step S33. At step S33, the background image writing portion 71 and the object writing portion 72 change the relevant background image flags in the background image flag memory 74 and the relevant object flags in the object flag memory 76 to 1.
In other words, when the transmitting apparatus 1 has transmitted a background image and objects of high spatial resolution for the image within the priority range, at step S31 the background image writing portion 71 and the object writing portion 72 write that background image and those objects of high spatial resolution to the background image memory 73 and the object memory 75, respectively. Thus, at step S33, the background image flags and the object flags of the pixels composing the background image and the objects of high spatial resolution are changed to 1.
Thereafter, the flow advances to step S34. At step S34, the combining portion 77 reads the objects within the priority range from the object memory 75 and writes them to the sub-window memory 79.
In other words, when the determination result at step S32 indicates that the additional information contains high-resolution information, the user has operated the control information input portion 24 (see Figure 5), the resulting control information has been sent to the transmitting apparatus 1, and the transmitting apparatus 1 has accordingly transmitted a background image and objects of high spatial resolution for the image within the priority range, as described above. The control information sent to the transmitting apparatus 1 is also supplied to the combining portion 77. Thus, when the combining portion 77 receives the control information, at step S34 it recognizes the priority range from the coordinates of the consideration point contained in the control information, reads from the object memory 75 the objects of high spatial resolution within the priority range transmitted by the transmitting apparatus 1, and writes those objects to the sub-window memory 79.
Thereafter, the flow advances to step S35. At step S35, the combining portion 77 reads the background image of the currently displayed frame (referred to as the considered frame) from the background image memory 73 in accordance with the background image motion vector contained in the additional information. In addition, the combining portion 77 reads the objects to be displayed for the considered frame from the object memory 75, combines the background image of the considered frame with the objects read from the object memory 75 in accordance with the object motion vectors contained in the additional information, and writes the combined image of the considered frame to the display memory 78. In other words, the combining portion 77 writes the background image to, for example, the display memory 78 and then overwrites the objects on it. The combining portion 77 thereby writes to the display memory 78 an image in which the background image and the objects are combined into the image of the considered frame.
In this manner, the image of the considered frame written to the display memory 78 and the objects written to the sub-window memory 79 are supplied to the image output portion 23 (see Figure 5), which displays them.
In the transmitting apparatus 1, a so-called soft key may, for example, be contained in the additional information. In this case, the combining portion 77 combines the objects with the background image using that soft key.
On the contrary, when the determination result at step S32 indicates that the additional information does not contain high-resolution information (that is, the user has not operated the control information input portion 24 (see Figure 5)), the flow advances to step S35, skipping steps S33 and S34. At step S35, as described above, the combining portion 77 reads the background image of the considered frame from the background image memory 73, reads the required objects from the object memory 75, and combines the background image of the considered frame with the objects read from the object memory 75 in accordance with the additional information. As a result, the image of the considered frame is generated and written to the display memory 78. Thereafter, the flow returns to step S31, and the combining processing portion 22 repeats similar processing.
In the combining process described above, when the user is not operating the control information input portion 24 (see Figure 5) (that is, when the control information input portion 24, composed of a pointing device such as a mouse, is not being dragged (or clicked)), an image of low spatial resolution is displayed on the image output portion 23 (see Figure 5) at the default temporal resolution, as shown in Figure 15(A). In Figure 15(A), an object of low spatial resolution is moving over a background image of low spatial resolution.
When the user operates the control information input portion 24 (see Figure 5) to move the cursor onto the object and performs a drag operation at the position of the object, control information is sent to the transmitting apparatus 1 as described above. The transmitting apparatus 1 then transmits, at the sacrifice of the temporal resolution, the data required to display an image of high spatial resolution for the image within the priority range. As a result, as shown in Figure 15(B), although the temporal resolution is, for example, 0 frames/second, an image in which the spatial resolution of the object and the background image within the priority range around the dragged position gradually improves is displayed on the image output portion 23 (see Figure 5). In other words, the spatial resolution of the image within the priority range improves gradually in accordance with the dragging time (or the number of clicks).
In this case, as shown in Figure 15(B), a sub-window is opened on the image output portion 23 (see Figure 5), and the object within the priority range around the dragged position is displayed in the sub-window in such a manner that its spatial resolution gradually improves.
Thereafter, when the user stops the drag operation with the control information input portion 24 (see Figure 5), the combining portion 77, as described above, reads the background image of the considered frame from the background image memory 73 at step S35, reads the objects from the object memory 75, combines the background image of the considered frame with the objects read from the object memory 75 in accordance with the additional information, and writes the combined image to the display memory 78. Since the object whose spatial resolution became high as a result of the drag operation remains stored in the object memory 75, the image output portion 23 (see Figure 5) displays, at the regular temporal resolution, an image in which that object of high spatial resolution is pasted at the proper position of the considered frame, as shown in Figure 15(C).
Thereafter, the image output portion 23 (see Figure 5) displays, at the regular temporal resolution, an image in which the object of high spatial resolution moves in accordance with the additional information.
Thus, when the user performs a drag operation at the position of an object that he or she wants to watch closely, he or she can watch that object at a high spatial resolution; in other words, the user can watch the object in detail.
When the user performs a drag operation, the background image of high spatial resolution within the priority range is also stored in the background image memory 73. Thus, even after the user stops the drag operation, the background image of high spatial resolution stored in the background image memory 73 continues to be displayed. Therefore, since each drag operation improves the background image within the priority range around the dragged position, every time the user drags at a position on the display screen of the image output portion 23 (see Figure 5), the area of high spatial resolution in the background image widens gradually in a mosaic-like fashion. Eventually, the image output portion 23 displays the entire background image at the high spatial resolution.
In addition, according to this embodiment, since the background image is stored in the background image memory 73 as described above, the transmitting apparatus 1 does not need to retransmit a background image of low spatial resolution that has already been sent. Thus, the transmission bandwidth (transfer rate) for the background image can instead be allocated to objects and background images of high spatial resolution.
In the case described above, the object whose spatial resolution became high as a result of the drag operation is stored in the object memory 75, and after the drag operation stops, that object of high spatial resolution is pasted on the background image. Thus, the object displayed by the receiving apparatus 2 has a high spatial resolution, but the displayed object no longer reflects changes in the state of the object captured by the transmitting apparatus 1.
To address this problem, after the drag operation stops, the object of high spatial resolution stored in the object memory 75 may be replaced, ignoring the object flag, by the object stored in the storage portion 62F of the decoding portion 53 (see Figure 12). In other words, the objects transmitted by the transmitting apparatus 1 are successively stored in the storage portion 62F of the decoding portion 53 (see Figure 12). Thus, by writing those objects to the object memory 75, the objects of the image displayed on the image output portion 23 follow the state of the objects captured by the transmitting apparatus 1 (although the spatial resolution of the displayed objects falls).
The MUX 32 (see Figure 7) of the transmission processing portion 16 of the transmitting apparatus 1, under the control of the control portion 35, places the presence/absence of the high-resolution information, the frame rate (temporal resolution) and the spatial resolution within the priority range corresponding to the high-resolution information, and the frame rate and the spatial resolution outside the priority range in the header or the like of the multiplexed data. From the information placed in this header (hereinafter referred to as the header information), the receiving apparatus 2 recognizes the presence/absence of the high-resolution information, the frame rate and the spatial resolution within the priority range, and the frame rate and the spatial resolution outside the priority range.
In the transmitting apparatus 1, the header information may also contain, for example, the (coordinates of the) consideration point contained in the control information sent from the receiving apparatus 2. In this case, the combining portion 77 shown in Figure 13 can recognize the object at the position of the consideration point from the header information sent from the transmitting apparatus 1. That is, in the case described above, the combining portion 77 recognizes the object at the position of the consideration point contained in the control information supplied from the control information input portion 24 (see Figure 5), reads that object from the object memory 75, and supplies it to the sub-window memory 79. When the header information contains the consideration point, however, the combining portion 77 can recognize the object at the position of the consideration point from the header information instead. In that case, in Figure 5, the control information input portion 24 need not supply the control information to the combining portion 77.
Next, the relation between the spatial resolution and the temporal resolution of the image that the transmitting apparatus 1 sends to the receiving apparatus 2 through the transfer path 3 will be described with reference to Figure 16.
Assume that the transfer rate of the transfer path 3 is R (bps) and that an image composed of a background image and three objects #1 to #3 is sent. For simplicity, the additional information is not considered here. In addition, assume that in order to be displayed at a particular spatial resolution, the background image and the objects #1 to #3 each require the same amount of data.
In this case, when the user does not perform a drag operation, the transmitting apparatus 1 sends the background image and the objects #1 to #3 each at the rate R/4 [bps], obtained by dividing the transfer rate R by 4, as shown in Figure 16(A). When the regular temporal resolution is 1/T frames/second, the transmitting apparatus 1 can send one frame of data of each of the background image and the objects #1 to #3 in at most T seconds. Thus, in this case, the receiving apparatus 2 displays the background image and the objects #1 to #3 at the spatial resolution obtained with T × R/4 bits of data per frame.
When the user performs a drag operation at, for example, the position of the object #1 at a particular time t1, the transmitting apparatus 1 stops sending the background image and the objects #2 and #3 and sends only the object #1, using the entire transfer rate R of the transfer path 3, as shown in Figure 16(A). Thereafter, when the user stops the drag operation at a time t2, which is a period 4T after the time t1, the transmitting apparatus 1 again sends the background image and the objects #1 to #3 each at the transfer rate R/4.
Thus, while the drag operation is performed, the transmitting apparatus 1 sends 4T × R bits of data for the object #1. Therefore, assuming that the temporal resolution during the drag operation is 0 frames/second, the receiving apparatus 2 displays the object #1 at the spatial resolution obtained with 4T × R bits of data per frame. This is 16 times (= 4T × R/(T × R/4)) the amount of data per frame of the case where the user does not perform a drag operation, so that, when the horizontal and vertical spatial resolutions are improved by the same factor, the receiving apparatus 2 displays the object #1 at four times the spatial resolution in each of the horizontal and vertical directions, although the temporal resolution becomes 0 frames/second.
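The arithmetic behind these figures can be summarized as follows (restating the assumptions above: transfer rate R, frame period T, four streams of equal data amount, and a drag lasting 4T):

```latex
\begin{align*}
\text{no drag:}    \quad & \text{data per frame per stream} = T \cdot \tfrac{R}{4}\ \text{bits} \\
\text{during drag:}\quad & \text{data for object \#1} = 4T \cdot R\ \text{bits} \\
\text{ratio:}      \quad & \frac{4TR}{TR/4} = 16,
  \qquad \sqrt{16} = 4\ \text{times the resolution horizontally and vertically}
\end{align*}
```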
In other words, the spatial resolution of an object can be improved at the sacrifice of the temporal resolution, and compared with the case where the temporal resolution is not sacrificed, the spatial resolution of the part that the user is considering can be improved more quickly.
In the case described above, while the drag operation is performed for the object #1, none of the data of the background image and the other objects #2 and #3 is sent. However, as shown in Figure 16(B), a higher transfer rate may instead be allocated to the data of the object #1 while a low transfer rate is still allocated to the data of the background image and the other objects #2 and #3.
Alternatively, the transfer rate allocated to each of the background image and the objects #1 to #3 may be left at R/4 even when a drag operation is performed. In other words, since the spatial resolution is improved at the sacrifice of the temporal resolution in this case, the spatial resolution can be improved without changing the allocated transfer rates, although displaying the data takes a longer time, for example.
According to this embodiment, as described above, the object whose spatial resolution became high as a result of the drag operation is stored in the object memory 75, and after the drag operation stops, that object of high spatial resolution is pasted on the background image. The position on the background image at which the object of high spatial resolution is pasted is determined by the object motion vector contained in the additional information about that object sent from the transmitting apparatus 1.
Thus, the receiving apparatus 2 needs to recognize which object of a particular frame corresponds to which object of the adjacent frame. Therefore, when the object extracting portion 14 (see Figure 3) of the transmitting apparatus 1 extracts an object, it adds to the additional information the information that allows the receiving apparatus 2 to perform such recognition.
Figure 17 shows an example of the structure of the object extracting portion 14 shown in Figure 3.
An image output from the image input portion 11 and the background image output from the background image extracting portion 13 are supplied to a subtracting portion 81. The subtracting portion 81 subtracts the background image output from the background image extracting portion 13 from the image output from the image input portion 11 and obtains a foreground image composed of the objects. The foreground image obtained by the subtracting portion 81 is supplied to a frame memory 82 and to an initial region dividing portion 83.
The frame memory 82 temporarily stores the foreground image supplied from the subtracting portion 81. The initial region dividing portion 83 performs an initial region dividing process using the foreground image of the current frame being processed (referred to as the considered frame) supplied from the subtracting portion 81 and the foreground image of the immediately preceding frame stored in the frame memory 82 (the foreground image of the preceding frame).
In other words, the initial region dividing portion 83 classifies each pixel composing the foreground image of the considered frame into one of a number of classes according to its pixel value. Specifically, when a pixel value is represented by RGB (red, green, blue), the initial region dividing portion 83 classifies each pixel according to which of a plurality of small regions of the RGB space the vector composed of its R, G, and B values (this vector may be referred to as a color vector) belongs to. In addition, the initial region dividing portion 83 classifies each pixel composing the foreground image of the preceding frame stored in the frame memory 82 in the same manner.
Then, when the considered frame is, for example, the n-th frame, the initial region dividing portion 83 divides the foreground image of the n-th frame and the foreground image of the (n-1)-th frame, which is the preceding frame, into regions each composed of pixels that are temporally and spatially adjacent and classified into the same class.
For example, suppose that each pixel composing the foreground image of the n-th frame and each pixel composing the foreground image of the (n-1)-th frame are classified as shown in Figure 18(A). In Figure 18 (and also in Figures 19 and 20), the pair of the character c and a numeral written in a cell represents the class of the pixel.
In the situation shown in Figure 18(A), when the foreground image of the n-th frame and the foreground image of the (n-1)-th frame are divided into regions each composed of adjacent pixels classified into the same class, the initial regions represented by the dotted lines shown in Figure 18(B) are formed.
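The division can be sketched as follows (hypothetical Python; the classification of each pixel into a color class is assumed to have been done already, and scipy's connected-component labeling stands in for the grouping of temporally and spatially adjacent pixels):

```python
import numpy as np
from scipy.ndimage import label

def initial_regions(class_map_prev, class_map_cur, num_classes):
    """Form initial regions from the class maps of the (n-1)-th and n-th
    foreground frames: for each class, temporally/spatially adjacent
    pixels of that class are grouped into one region."""
    stacked = np.stack([class_map_prev, class_map_cur])   # shape (2, H, W)
    regions = np.zeros_like(stacked, dtype=np.int32)
    next_id = 1
    for c in range(num_classes):
        # Connected components of class c; the default 6-connectivity in
        # 3-D includes the frame axis, so regions span both frames.
        labeled, count = label(stacked == c)
        regions[labeled > 0] = labeled[labeled > 0] + next_id - 1
        next_id += count
    return regions
```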
The initial regions obtained by the initial region dividing portion 83 are supplied to a region merging portion 84 shown in Figure 17.
The region merging portion 84 performs a region merging process that merges the initial regions supplied from the initial region dividing portion 83.
In other words, the region merging portion 84 reads, from the object information stored in an object information memory 88, information about an object contained in the (n-1)-th frame, and recognizes the position and the range of that object. In addition, the region merging portion 84 recognizes the pixels composing the object contained in the (n-1)-th frame and merges the initial regions containing those pixels.
Specifically, suppose for example that a particular object Obj is contained in the range (the square frame) represented by the solid line shown in Figure 19(A) and that this object Obj is composed of pixels classified into the classes c2, c3, c4, and c5. The initial regions containing these pixels are merged. Thus, in this example, the initial regions composed of the pixels of the classes c2, c3, c4, and c5 (hereinafter, an initial region composed of pixels of class c#i is referred to as initial region c#i) are merged into the shaded region shown in Figure 19(B).
In addition, the region merging portion 84 calculates the distance between the merged initial regions (hereinafter referred to as the merged region) and each initial region adjacent to the merged region. Here, the distance between the merged region and an adjacent initial region (hereinafter referred to as an adjacent initial region) may be, for example, the distance (in the RGB space) between the mean values of the pixel values (colors) of the pixels of the two regions (the merged region and the adjacent initial region), or the continuity of the pixel values (colors) of the pixels near the boundary of the two regions.
When the distance between the merged region and the adjacent initial region is smaller than (or equal to) a predetermined threshold value, the region merging portion 84 merges the adjacent initial region into the merged region and generates a new merged region. The region merging portion 84 repeats calculating the above-described distance and merging adjacent initial regions into the merged region until no adjacent initial region that can be merged into the merged region remains.
Thus, from the merged region shown in Figure 19(B), a merged region such as the one shown in Figure 19(C) is formed. In Figure 19(C), since the distances of the initial regions c7 and c8 from the merged region shown in Figure 19(B) are small, those initial regions have been merged. On the contrary, since the distances of the initial regions c1 and c6 are large, they have not been merged.
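Under the mean-color definition of the distance, the merging loop can be sketched as follows (hypothetical Python; the region bookkeeping is simplified to dictionaries):

```python
import numpy as np

def merge_regions(seed_ids, initial_regions, adjacency, threshold):
    """Region merging loop of portion 84: starting from the initial regions
    covering the previous frame's object, keep absorbing adjacent initial
    regions whose mean color lies within a threshold distance (in RGB
    space) of the merged region's mean color.

    seed_ids        : ids of the initial regions covering the object
    initial_regions : dict id -> array of the RGB vectors of its pixels
    adjacency       : dict id -> set of ids of adjacent initial regions
    """
    merged = set(seed_ids)
    changed = True
    while changed:
        changed = False
        mean_color = np.mean(
            np.concatenate([initial_regions[i] for i in merged]), axis=0)
        # Regions adjacent to the merged region and not yet merged.
        candidates = set().union(*(adjacency[i] for i in merged)) - merged
        for cand in candidates:
            cand_mean = np.mean(initial_regions[cand], axis=0)
            if np.linalg.norm(mean_color - cand_mean) <= threshold:
                merged.add(cand)   # distance small enough: merge it in
                changed = True
                break              # recompute the mean before testing more
    return merged
```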
When no more adjacent initial regions can be merged into the merged region, as shown in Figure 19(C), the region merging portion 84 extracts the part of the merged region composed of the pixels of the foreground image of the n-th frame as the object corresponding to the object Obj of the (n-1)-th frame (this object may be referred to as the corresponding object), assigns to it the same label as the label of the object Obj, and outputs it to a merged region processing portion 85 and a separated region processing portion 86.
In other words, according to this embodiment, the same label is assigned to the corresponding objects of each frame. From this label, the receiving apparatus 2 recognizes which object of a frame corresponds to which object of the adjacent frame.
The label assigned to the object Obj of the (n-1)-th frame is contained in the object information and stored in the object information memory 88. By checking the object information memory 88, the region merging portion 84 recognizes the label assigned to the object Obj of the (n-1)-th frame.
In addition, after the region merging portion 84 has extracted, in the above-described manner, the objects of the n-th frame corresponding to all the objects of the (n-1)-th frame, the region merging portion 84 extracts, as an object, each region formed by merging each remaining initial region of the n-th frame with its adjacent initial regions, assigns a new label (different from any label assigned to the objects of the (n-1)-th frame) to each such object, and outputs the extracted objects to the merged region processing portion 85 and the separated region processing portion 86.
The merged region processing portion 85 and the separated region processing portion 86 perform a merged region process and a separated region process, respectively. The merged region process is performed when objects have merged, and the separated region process is performed when merged objects have separated.
For example, consider three successive frames, the (n-1)-th frame, the n-th frame, and the (n+1)-th frame, as shown in Figure 20. In the (n-1)-th frame, objects A and B are approaching each other. In the n-th frame, the objects A and B overlap; in other words, in the n-th frame, the objects A and B may be extracted as one merged object. The objects A and B merged into one object continue to move in their respective directions, and in the (n+1)-th frame the merged object separates into the two objects A and B again.
In this case, in the region merging process of the region merging portion 84, the merged object of the n-th frame corresponds to both the object A and the object B of the (n-1)-th frame, and conversely, the separated objects A and B of the (n+1)-th frame both correspond to the single merged object of the n-th frame. According to this embodiment, however, one object of a particular frame is assumed to correspond to one object of the preceding frame. Thus, as described above, two objects should not be related to one object, and conversely one object should not be related to two objects.
Therefore, the merged region processing portion 85 performs the merged region process by relating the merged object of the n-th frame to one of the objects A and B of the (n-1)-th frame, and the separated region processing portion 86 performs the separated region process by relating one of the two separated objects A and B of the (n+1)-th frame to the merged object of the n-th frame.
In other words, the merged region processing portion 85 assigns, to the merged object of the n-th frame, the same label as the label of one of the objects A and B of the (n-1)-th frame. On the other hand, the separated region processing portion 86 assigns the same label as the label of the merged object of the n-th frame to one of the two separated objects A and B of the (n+1)-th frame, and assigns a new label to the other separated object of the (n+1)-th frame.
The object extraction result of the region merging portion 84, after undergoing the merged region process of the merged region processing portion 85 and the separated region process of the separated region processing portion 86, is supplied to a new region processing portion 87.
When the object extraction results of the merged region processing portion 85 and the separated region processing portion 86 contain a new object, the new region processing portion 87 performs a new region process for the new object.
There are three types of objects to which new labels are assigned in the object extraction results of the merged region processing portion 85 and the separated region processing portion 86. The first type is an object for which the region merging portion 84 could not extract the object of the considered frame corresponding to an object of the preceding frame, because the object moves fast and does not spatially overlap between the considered frame and the preceding frame. The second type is an object whose corresponding object of the preceding frame has merged with another object, so that the separated region processing portion 86 could not relate it to the corresponding object of the preceding frame. The third type is an object that has newly appeared in the considered frame and to which a new label is therefore assigned.
Among these three types, a new label should truly be assigned only to an object of the third type. Thus, for objects of the first and second types, the new region processing portion 87 detects the corresponding object from a past frame and reassigns the label of that object to the object of the considered frame.
Specifically, the new region processing portion 87 checks the object information memory 88, recognizes the objects of the several frames preceding the considered frame, and obtains the distance between each recognized object and the considered object of the considered frame, to which a new label has been assigned. As the distance between objects, the distance between the feature quantities of the objects can be used. Examples of the feature quantities of an object are the area of the object, a histogram of the tangential directions of the contour line composed of the pixels of the object (for example, a histogram of the eight directions up, down, left, right, upper left, lower left, upper right, and lower right), and the motion vector of the object.
The new region processing portion 87 obtains the minimum value of the distances between the recognized objects and the considered object. When the minimum value is smaller than (or equal to) a predetermined threshold value, the new region processing portion 87 treats the object at the minimum distance from the considered object as the object corresponding to the considered object, assigns the label of that object to the considered object, and outputs the considered object. When the minimum value of the distances between the recognized objects and the considered object is not smaller than the predetermined threshold value (that is, no object at a small distance from the considered object exists in the past frames), the new region processing portion 87 treats the considered object as an object that has newly appeared in the considered frame, assigns a new label to it, and outputs it.
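This decision can be sketched as follows (hypothetical Python; the feature quantities are assumed to have been concatenated into one vector per object, and all names are invented):

```python
import numpy as np

def assign_label(candidate, past_objects, threshold, next_label):
    """New region process of portion 87: inherit the label of the nearest
    past object if it is close enough, otherwise confirm a new label.

    candidate    : dict with a 'features' vector (e.g. area, contour
                   direction histogram, and motion vector concatenated)
    past_objects : list of dicts with 'features' and 'label'
    Returns (label for the candidate, updated next_label counter).
    """
    if past_objects:
        distances = [np.linalg.norm(candidate['features'] - p['features'])
                     for p in past_objects]
        best = int(np.argmin(distances))
        if distances[best] <= threshold:
            # Close enough: the candidate is this past object, relabeled.
            return past_objects[best]['label'], next_label
    # No sufficiently close past object: the object is genuinely new.
    return next_label, next_label + 1
```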
The output of the new region processing portion 87 is supplied to the additional information calculating portion 15 (see Figure 3) and to the transmission processing portion 16 (see Figure 3). In addition, the output of the new region processing portion 87 is supplied to the object information memory 88. The object information memory 88 temporarily stores, as the object information, the objects (the positions and sizes (contours) of the objects, the pixel values of the pixels composing the objects, and so on) together with their assigned labels.
Next, the object extracting process, in which objects are extracted from an image, will be described with reference to the flowchart shown in Figure 21. The object extracting process is performed by the object extracting portion 14 shown in Figure 17.
An image output from the image input portion 11 and the background image output from the background image extracting portion 13 are supplied to the subtracting portion 81. At step S41, the subtracting portion 81 subtracts the background image output from the background image extracting portion 13 from the image output from the image input portion 11 and obtains a foreground image composed of the objects. The foreground image obtained by the subtracting portion 81 is supplied to the frame memory 82 and to the initial region dividing portion 83. The frame memory 82 stores the foreground image output from the subtracting portion 81.
Meanwhile, at step S42, the initial region dividing portion 83 checks the foreground image of the frame preceding the considered frame stored in the frame memory 82, performs the initial region dividing process described with reference to Figure 18, obtains the initial regions, and supplies the obtained initial regions to the region merging portion 84. At step S43, the region merging portion 84 checks the object information of the preceding frame stored in the object information memory 88, performs the region merging process on the initial regions output from the initial region dividing portion 83 in the manner described with reference to Figure 19, and extracts the objects of the considered frame.
The objects extracted by the region merging portion 84 are supplied to the merged region processing portion 85 and the separated region processing portion 86. At step S44, the merged region processing portion 85 and the separated region processing portion 86 perform the merged region process and the separated region process, respectively, in the manner described with reference to Figure 20, and output the results to the new region processing portion 87.
At step S45, the new region processing portion 87 performs the above-described new region process on the outputs of the merged region processing portion 85 and the separated region processing portion 86. The new region processing portion 87 thereby outputs the final object extraction result of the considered frame. The extraction result of the objects is supplied to the additional information calculating portion 15 (see Figure 3) and to the transmission processing portion 16 (see Figure 3). In addition, the extraction result of the objects is supplied to and stored in the object information memory 88.
Thereafter, the flow returns to step S41, the next frame is designated as the new considered frame, and the object extracting portion 14 repeats similar processing.
Next, the region merging process performed by the region merging portion 84 at step S43 of Figure 21 will be described in detail with reference to the flowchart shown in Figure 22.
In the region merging process, first, at step S51, the region merging portion 84 checks the object information about the objects contained in the frame preceding the considered frame (the preceding frame) and designates one object of the preceding frame as the considered object. Using the considered object, at step S51 the region merging portion 84 merges the initial regions output from the initial region dividing portion 83, as shown in Figure 19(B), and forms a merged region.
Thereafter, the flow advances to step S52. At step S52, the region merging portion 84 searches for an initial region adjacent to the merged region (an adjacent initial region). Thereafter, the flow advances to step S53. At step S53, the region merging portion 84 calculates the distance between the merged region and the considered initial region. Thereafter, the flow advances to step S54. At step S54, the region merging portion 84 determines whether the distance between the regions is smaller than a predetermined threshold value.
When the determination result at step S54 indicates that the distance between the merged region and the considered initial region is smaller than the predetermined threshold value, the flow advances to step S55. At step S55, the region merging portion 84 merges the considered initial region into the merged region, thereby forming a new merged region. Thereafter, the flow advances to step S56.
On the contrary, when the determination result at step S54 indicates that the distance between the merged region and the considered initial region is not smaller than the predetermined threshold value, the flow advances to step S56, skipping step S55; in other words, the region merging portion 84 does not merge the considered initial region into the merged region. At step S56, the region merging portion 84 determines whether all the initial regions adjacent to the merged region have been searched. When the determination result at step S56 indicates that not all the adjacent initial regions have been searched, the flow returns to step S52. At step S52, the region merging portion 84 searches for an adjacent initial region that has not been searched yet, and thereafter repeats similar processing.
When the determination result at step S56 indicates that all the initial regions adjacent to the merged region have been searched, the flow advances to step S57. At step S57, the region merging portion 84 determines whether each of all the objects contained in the preceding frame has been designated as the considered object. When the determination result at step S57 indicates that not all the objects contained in the preceding frame have been designated as the considered object, the flow returns to step S51. At step S51, the region merging portion 84 designates, as the new considered object, one of the objects contained in the preceding frame that has not yet been designated, and thereafter repeats similar processing.
On the contrary, when the determination result at step S57 indicates that each of all the objects contained in the preceding frame has been designated as the considered object, the flow returns to the calling process.
Next, the merged region process performed by the merged region processing portion 85 at step S44 shown in Figure 21 will be described in detail with reference to the flowchart shown in Figure 23.
First, at step S61, the merged region processing portion 85 designates the frame being processed as the considered frame, checks the object information memory 88, recognizes the number of objects of the preceding frame (the frame immediately preceding the considered frame) that spatially overlap with the considered object of the considered frame, and sets this number of objects to a variable N.
Thereafter, the flow advances to step S62. At step S62, the merged region processing portion 85 determines whether the variable N is 2 or greater. When the variable N is smaller than 2 (that is, the preceding frame contains no object that spatially overlaps with the considered object, or contains only one such object), the flow advances to step S65, skipping steps S63 and S64.
On the contrary, when the determination result at step S62 indicates that the variable N is 2 or greater (that is, the preceding frame contains two or more objects that spatially overlap with the considered object), the flow advances to step S63. At step S63, the merged region processing portion 85 calculates the distance between the considered object and each of the two or more objects of the preceding frame that spatially overlap with the considered object. Thereafter, the flow advances to step S64.
At step S64, the merged region processing portion 85 selects, from the objects obtained at step S63, the object at the minimum distance from the considered object, and assigns the same label as the label of the selected object to the considered object.
Thereafter, the flow advances to step S65. At step S65, the merged region processing portion 85 determines whether each of all the objects contained in the considered frame has been designated as the considered object. When the determination result at step S65 indicates that not all the objects contained in the considered frame have been designated as the considered object, the merged region processing portion 85 designates one of those objects as the considered object, the flow returns to step S61, and the merged region processing portion 85 repeats similar processing.
On the other hand, when the determination result at step S65 indicates that each of all the objects contained in the considered frame has been designated as the considered object, the flow returns to the calling process.
Next, the separated region process performed by the separated region processing portion 86 at step S44 shown in Figure 21 will be described in detail with reference to the flowchart shown in Figure 24.
The separated region processing portion 86 checks the object information memory 88 and designates one of the objects contained in the frame preceding the considered frame (the preceding frame) as the considered object. In addition, at step S71, the separated region processing portion 86 recognizes the number of objects of the considered frame that correspond to the considered object (the recognized objects are referred to as corresponding objects) and sets this number of objects to the variable N.
Thereafter, the flow advances to step S72. At step S72, the separated region processing portion 86 determines whether the variable N is 2 or greater.
When the determination result at step S72 indicates that the variable N is smaller than 2 (that is, the considered frame contains no object that spatially overlaps with the considered object, or contains only one such object), the flow advances to step S76, skipping steps S73 to S75.
On the contrary, when the determination result at step S72 indicates that the variable N is 2 or greater (that is, the considered frame contains two or more objects (objects corresponding to the considered object) that spatially overlap with the considered object), the flow advances to step S73. At step S73, the separated region processing portion 86 calculates the distance between each of those objects and the considered object. Thereafter, the flow advances to step S74.
At step S74, the separated region processing portion 86 selects, from those objects, the object at the minimum distance from the considered object, and assigns the same label as the label of the considered object to the selected object.
Thereafter, the flow advances to step S75. At step S75, the separated region processing portion 86 assigns a new label to each of the objects not selected at step S74 (that is, the objects other than the object at the minimum distance from the considered object). Thereafter, the flow advances to step S76.
At step S76, the separated region processing portion 86 determines whether each of all the objects contained in the preceding frame has been designated as the considered object. When the determination result at step S76 indicates that not all the objects contained in the preceding frame have been designated as the considered object, the separated region processing portion 86 designates one of those objects as the considered object, the flow returns to step S71, and the separated region processing portion 86 repeats similar processing.
When the determination result at step S76 indicates that each of all the objects contained in the preceding frame has been designated as the considered object, the flow returns to the calling process.
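The two label rules of Figures 23 and 24 can be sketched together as follows (hypothetical Python; each object is a dict with 'pixels', a set of (y, x) coordinates, and 'label', and distance() is any object distance such as the feature-quantity distance described earlier):

```python
def merged_region_labels(cur_objects, prev_objects, distance):
    """Merged region process: a current object that spatially overlaps two
    or more previous objects inherits the label of the nearest one."""
    for cur in cur_objects:
        overlaps = [p for p in prev_objects if p['pixels'] & cur['pixels']]
        if len(overlaps) >= 2:
            nearest = min(overlaps, key=lambda p: distance(p, cur))
            cur['label'] = nearest['label']

def separated_region_labels(cur_objects, prev_objects, distance, next_label):
    """Separated region process: when two or more current objects
    correspond to one previous object, the nearest keeps that object's
    label and the others receive new labels."""
    for prev in prev_objects:
        # Correspondence here is read off the labels assigned by the
        # region merging step.
        matches = [c for c in cur_objects if c['label'] == prev['label']]
        if len(matches) >= 2:
            nearest = min(matches, key=lambda c: distance(prev, c))
            for c in matches:
                if c is not nearest:
                    c['label'] = next_label
                    next_label += 1
    return next_label
```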
In the case described above, when the user specifies a consideration point with the control information input portion 24, the transmitting apparatus 1 controls the transmission of the data so that the spatial resolution of the image within the priority range containing the consideration point is improved at the sacrifice of the temporal resolution of the image. Alternatively, the transmitting apparatus 1 may learn the user's preferences, detect from the learning result an object or the like that the user is expected to want to watch at a high spatial resolution, and control the transmission of the data so that that object is displayed at a high spatial resolution.
Figure 25 shows an example of the structure of the control portion 35 shown in Figure 7 in the case where the transmitting apparatus 1 performs such control.
A priority range designating portion 91 receives the control signal sent from the receiving apparatus 2, designates the priority range in the above-described manner, and supplies the designated priority range to a selection controlling portion 92 and a feature quantity extracting portion 93.
The selection controlling portion 92 controls the selection of the data of the background image, the objects, and the additional information by the MUX 32 (see Figure 7). In other words, when the selection controlling portion 92 receives the priority range from the priority range designating portion 91, it controls the MUX 32 (see Figure 7) so that the spatial resolution of the image within the priority range is improved at the sacrifice of the temporal resolution of the image. In addition, when the selection controlling portion 92 receives a label from an object detecting portion 95, it controls the MUX 32 (see Figure 7) so that the spatial resolution of the object having that label is improved at the sacrifice of the temporal resolution of the image.
The data amount calculating portion 34 (see Figure 7) supplies the data rate of the multiplexed data output from the MUX 32 to the selection controlling portion 92. The selection controlling portion 92 controls the data selection of the MUX 32 so that this data rate does not exceed the transfer rate of the transfer path 3.
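A minimal sketch of such a rate-constrained selection, assuming the simple allocation scheme of Figure 16 (the function and its parameters are invented for illustration):

```python
def allocate_rates(streams, total_rate, boosted=None, boost_share=1.0):
    """Divide the transfer rate of the path among the streams; a boosted
    stream (the priority range or a preferred object) may receive most or
    all of the rate, the rest being shared by the other streams."""
    rates = {}
    if boosted is None:
        for s in streams:                 # regular case: equal shares
            rates[s] = total_rate / len(streams)
    else:
        rates[boosted] = total_rate * boost_share
        others = [s for s in streams if s != boosted]
        for s in others:                  # remainder shared by the others
            rates[s] = total_rate * (1 - boost_share) / max(len(others), 1)
    # The sum never exceeds total_rate, matching the constraint that the
    # multiplexed data rate must not exceed the rate of the transfer path.
    assert sum(rates.values()) <= total_rate + 1e-9
    return rates
```

For example, allocate_rates(['bg', 'obj1', 'obj2', 'obj3'], R) yields R/4 for each stream; boost_share = 1.0 for 'obj1' corresponds to the drag case of Figure 16(A), and a value below 1.0 to the allocation of Figure 16(B).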
The background image, the objects, and the additional information output from the preprocessing portion 12 (see Figure 3), as well as the priority range output from the priority range designating portion 91, are supplied to the feature quantity extracting portion 93. The feature quantity extracting portion 93 extracts the feature quantities of the image within the priority range output from the priority range designating portion 91. In other words, the feature quantity extracting portion 93 extracts, for the object contained in the priority range, feature quantities that reflect the tendency of the images that the user considers.
Specifically, as shown in Figure 26 for example, for an object representing a particular person, the feature quantity extracting portion 93 extracts feature quantities representing that the object is a person, that the motion of the object is uniform, that the position of the object in the depth direction (the depth) is the foreground, that the displayed position of the object is the center, that the object moves (the object is a moving part), that the region of the object includes eyes, a nose, and a mouth (the region of the object is composed of eyes, a nose, and a mouth), that the pattern of the object is a striped pattern (the object is a striped part), that the color of the object is red (the object is a red part), and so on.
A histogram storage portion 94 stores a histogram of the feature quantity vectors as the learning result of the user's preferences.
Next, the control process of the MUX 32 (see Figure 7) performed by the control portion 35 shown in Figure 25 will be described with reference to the flowchart shown in Figure 27.
First, at step S81, the priority range designating portion 91 determines whether a control signal has been sent from the receiving apparatus 2. When the determination result at step S81 indicates that a control signal has been sent from the receiving apparatus 2, the flow advances to step S82. At step S82, the priority range designating portion 91 designates the priority range in the above-described manner in accordance with the control signal and supplies the priority range to the selection controlling portion 92 and the feature quantity extracting portion 93.
At step S83, the selection controlling portion 92 controls the MUX 32 (see Figure 7) so that the spatial resolution of the image (the objects and the background image) within the priority range supplied from the priority range designating portion 91 is improved at the sacrifice of the temporal resolution of the image.
At step S84, the feature quantity extracting portion 93 extracts the feature quantities of the image within the priority range supplied from the priority range designating portion 91 and obtains a feature quantity vector whose elements are the individual feature quantities of the object. At step S85, the feature quantity extracting portion 93 increments by 1 the frequency of that feature quantity vector in the histogram stored in the histogram storage portion 94. The flow then returns to step S81.
By repeating the loop from step S81 to step S85, the histogram storage portion 94 forms a histogram of the feature quantity vectors of the objects that the user tends to consider. In other words, the histogram storage portion 94 learns the user's preferences.
In addition, the feature quantity extracting portion 93 may vector-quantize the obtained feature quantity vector and increment the frequency of the code corresponding to the quantization result. In that case, the histogram storage portion 94 stores a histogram of the codes.
On the contrary, when the determination result at step S81 indicates that no control signal has been sent from the receiving apparatus 2, the flow advances to step S86. At step S86, the object detecting portion 95 obtains the feature quantity vector of each object supplied from the preprocessing portion 12 (see Figure 3) in the same manner as the feature quantity extracting portion 93. In addition, at step S87, the object detecting portion 95 checks the histogram stored in the histogram storage portion 94 and determines whether the feature quantity vector of the object supplied from the preprocessing portion 12 (see Figure 3) is contained within a predetermined range of the feature quantity vector space around the feature quantity vector having the highest frequency. In other words, at step S87, the object detecting portion 95 determines whether the distance between the feature quantity vector having the highest frequency and the feature quantity vector of the object supplied from the preprocessing portion 12 is equal to or smaller than a predetermined value.
As described above, when the histogram storage portion 94 stores a histogram of the codes obtained as the vector quantization results, the object detecting portion 95 vector-quantizes the obtained feature quantity vector. In that case, at step S87, the object detecting portion 95 determines whether the code obtained as the vector quantization result matches the code having the highest frequency in the histogram stored in the histogram storage portion 94.
When the determination result at step S87 indicates that the distance between the feature quantity vector having the highest frequency and the feature quantity vector of the object supplied from the preprocessing portion 12 is not smaller than the predetermined value (that is, judging from the user's preferences, the object supplied from the preprocessing portion 12 is not an object that the user tends to consider), the flow advances to step S88. At step S88, the selection controlling portion 92 controls the MUX 32 (see Figure 7) so that the receiving apparatus 2 displays the image at the regular spatial resolution and the regular temporal resolution. Thereafter, the flow returns to step S81.
On the contrary, when the determination result at step S87 indicates that the distance between the feature quantity vector having the highest frequency and the feature quantity vector of the object supplied from the preprocessing portion 12 is equal to or smaller than the predetermined value (that is, judging from the user's preferences, the object supplied from the preprocessing portion 12 is an object that the user tends to consider), the object detecting portion 95 outputs the label of the object supplied from the preprocessing portion 12 to the selection controlling portion 92, and the flow advances to step S89.
At step S89, the selection controlling portion 92 controls the MUX 32 (see Figure 7) so that the spatial resolution of the object having the label supplied from the object detecting portion 95 is improved at the sacrifice of the temporal resolution. Thereafter, the flow returns to step S81.
Thereby in this case, receiving equipment 2 shows an object that has from the sign of object detection part 95 outputs, so that be that cost improves its spatial resolution with the temporal resolution.After this, receiving equipment 2 shows the object with high spatial resolution continuously.
As a result, receiving equipment 2 is the object of explicit user tendency consideration automatically, thereby does not need the user to control the spatial resolution that (intervention) control information input unit 24 just can improve object.After this, receiving equipment 2 shows the object (yet in the case, as the above, the temporal resolution of image worsens) of high spatial resolution continuously.
In bar chart storage area 94 as the bar chart of the characteristic quantity vector of the learning outcome of user's preferences storage can be periodically, aperiodicity ground or ask to reset corresponding to the user of receiving equipment 2.
In these cases, its characteristic quantity vector equals (coupling) or improves less than the spatial resolution of the object of the characteristic quantity vector of bar chart highest frequency.On the other hand, the spatial resolution with all objects of a kind of like this characteristic quantity vector also improves: its characteristic quantity vector equals (coupling) or surpasses the characteristic quantity vector of a predetermined value less than the frequency in its bar chart.
The priority range is a predetermined range whose center of gravity is at the consideration point (in the above case, the priority range is a rectangular range). However, an image region containing the consideration point, such as the image region within that rectangular range, can be said to be the image region that the user is interested in watching (hereinafter, this region is referred to as the object-of-interest region).
On the other hand, a moving picture displayed by the receiving apparatus 2 contains image regions that move (hereinafter, such a region may be referred to as a moving region) and image regions that are still (hereinafter, such a region may be referred to as a still region).
To improve the spatial resolution of the object-of-interest region, the object-of-interest region that the user is considering must be recognized (designated), whether it lies in a moving region or in a still region of the image. When the object-of-interest region that the user is considering can be designated, the image regions that the user is not considering (for example, the background image region) can also be obtained.
Even if the object-of-interest region that the user is considering can be designated at a particular time, the user's consideration may change thereafter. Thus, when the image region that the user considers changes to another image region, the new image region must be recognized as the object-of-interest region.
In addition, there may be a plurality of object-of-interest regions that the user considers. In this case, these object-of-interest regions must be recognized separately.
FIG. 28 shows an example of a second structure of the image transmission system shown in FIG. 1, in the case where the transmitting apparatus 1 and the receiving apparatus 2 are portable terminal units. In FIG. 28, parts similar to those in FIG. 2 are denoted by the same reference numerals, and their description is omitted. In other words, the structure of the image transmission system shown in FIG. 28 is basically the same as the structure of the image transmission system shown in FIG. 2.
In the embodiment shown in FIG. 2, all the information required for controlling the spatial resolution and the temporal resolution, such as the coordinates of the consideration point that the user considers, the spatial resolution, the temporal resolution, and the transmission rate, is transmitted from the receiving apparatus 2 to the transmitting apparatus 1 as control information. In contrast, according to the embodiment shown in FIG. 28, as will be described below, information obtained when the user operates (clicks) the key section 2-3 of the receiving apparatus 2 to designate a consideration point of the image displayed on the displaying section 2-2 (hereinafter, this information is referred to as click data) is transmitted as control information.
When the transmitting apparatus 1 receives click data from the receiving apparatus 2, the transmitting apparatus 1 designates, corresponding to the click data, the image region (object-of-interest region) that the user is considering in the image displayed by the receiving apparatus 2 (this image is photographed by the video camera section 1-1 of the transmitting apparatus 1), and controls the amount of information of the image data transmitted to the receiving apparatus 2 so that the spatial resolution and the temporal resolution of the designated image region are changed while a predetermined condition is satisfied.
FIG. 29 shows an example of the structure of the transmitting apparatus 1 shown in FIG. 28. In FIG. 29, parts similar to those in FIG. 3 are denoted by the same reference numerals, and their description is omitted. In the transmitting apparatus 1 shown in FIG. 29, a background image extracting section 1013, an object extracting section 1014, and a transmission processing section 1016 are provided in place of the background image extracting section 13, the object extracting section 14, and the transmission processing section 16, respectively. In addition, the click data transmitted from the receiving apparatus 2 is supplied not only to the transmission processing section 1016, which corresponds to the transmission processing section 16, but also to the preprocessing section 12. Except for these points, the structure of the transmitting apparatus 1 shown in FIG. 29 is basically the same as the structure of the transmitting apparatus 1 shown in FIG. 3. However, in the transmitting apparatus 1 shown in FIG. 29, the output of the background image extracting section 1013 is not supplied to the object extracting section 1014, whereas in the transmitting apparatus 1 shown in FIG. 3 the output of the background image extracting section 13 is supplied to the object extracting section 14. In addition, in the transmitting apparatus 1 shown in FIG. 29, the output of the object extracting section 1014 is supplied to the background image extracting section 1013, whereas in the transmitting apparatus 1 shown in FIG. 3 the output of the object extracting section 14 is not supplied to the background image extracting section 13.
The click data is supplied to the preprocessing section 12. In the preprocessing section 12, the click data is supplied to the object extracting section 1014. The object extracting section 1014 extracts (designates) the image region (object-of-interest region) that the user of the receiving apparatus 2 is considering from the image photographed by the image input section 11, and supplies the image data corresponding to the extracted (designated) object-of-interest region to the transmission processing section 1016. When there are a plurality of object-of-interest regions that the user of the receiving apparatus 2 considers in the image photographed by the image input section 11, the object extracting section 1014 supplies the image data of the plurality of object-of-interest regions to the transmission processing section 1016. In addition, the image data of the object-of-interest region extracted by the object extracting section 1014 is also supplied to the additional information calculating section 15.
As an example, the object-of-interest region that the user considers is an object such as a material body. In the following, the case where the object extracting section 1014 extracts an object as an example of the object-of-interest region (hereinafter referred to as an object image) will be described. It should be noted that the object-of-interest region is not limited to an object; it may be an image region other than an object, an image region within an object, a background image portion (described below), and so forth. However, in this embodiment, the case where the object-of-interest region is an object will be described as an example. The object extraction process (object-of-interest region designating process) performed by the object extracting section 1014 will be described later.
The background image extracting section 1013 extracts, corresponding to the object extraction result of the object extracting section 1014, a signal corresponding to the background image portion of the image (the image region excluding the object-of-interest region; hereinafter this portion is referred to as the background image) from the image data supplied from the image input section 11 (this signal is hereinafter referred to as background image data), and supplies the extracted background image data to the transmission processing section 1016 and the additional information calculating section 15. In this example, a flat image region whose activity is low and which has no particular meaning as an image is treated as a background image. Of course, the background image may be not only an image without meaning but also an object that the user is not interested in. In this example, for simplicity, the flat image region described above will be referred to as the background image.
The additional information calculating section 15 detects, corresponding to the background image data supplied from the background image extracting section 1013, a background image motion vector representing the motion of the background image (the motion of the background image corresponds to the motion of the photographing direction of the image input section 11). In addition, the additional information calculating section 15 detects, corresponding to the image data of the object image supplied from the object extracting section 1014 (hereinafter, this image data is referred to as object image data), an object motion vector representing the motion of the object. The additional information calculating section 15 supplies the detected motion vectors, as part of the additional information, to the transmission processing section 1016. In addition, corresponding to the object image data supplied from the object extracting section 1014, the additional information calculating section 15 supplies information about the object, such as its position and contour in the image (frame) photographed by the image input section 11, to the transmission processing section 1016 as additional information. In other words, when the object extracting section 1014 extracts an object image from the image data, it also extracts the information about the object, such as its position and contour, and supplies this information to the additional information calculating section 15. The additional information calculating section 15 outputs the information about the object as additional information.
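The description leaves the motion vector detection method open; one conventional possibility is exhaustive block matching by the sum of absolute differences, sketched below under the simplifying assumptions that frames are single-channel NumPy arrays and that the shift wraps around the frame edges. The function name and search radius are assumptions, not part of this description.

```python
# A rough block-matching sketch for the motion vectors computed by the
# additional information calculating section 15; the search radius is assumed.
import numpy as np

def estimate_motion_vector(past_frame, current_frame, search_radius=7):
    """Return the (dy, dx) shift of past_frame that best matches current_frame."""
    best_sad, best_vec = np.inf, (0, 0)
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            shifted = np.roll(past_frame, (dy, dx), axis=(0, 1))
            sad = np.abs(current_frame.astype(int) - shifted.astype(int)).sum()
            if sad < best_sad:
                best_sad, best_vec = sad, (dy, dx)
    return best_vec
```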
Corresponding to the click data supplied from the receiving apparatus 2, the transmission processing section 1016 encodes the object image data supplied from the object extracting section 1014, the background image data supplied from the background image extracting section 1013, and the additional information supplied from the additional information calculating section 15, so that the spatial resolution of the object image in the image displayed by the receiving apparatus 2 is improved while the condition of the data rate of the transmission path 3 is satisfied. Thereafter, the transmission processing section 1016 multiplexes the encoded object image data, the encoded background image data, and the encoded additional information, and transmits the multiplexed data, together with frame rate information and so forth, to the receiving apparatus 2 through the transmission path 3.
Next, the process performed by the transmitting apparatus 1 shown in FIG. 29 will be described with reference to the flowchart shown in FIG. 30.
At step S91, the transmitting apparatus 1 inputs the image data obtained by the image input section 11 to the preprocessing section 12.
Thereafter, at step S92, the transmitting apparatus 1 receives the click data transmitted from the receiving apparatus 2 and inputs the click data to the preprocessing section 12.
At step S93, the preprocessing section 12, having received the image data and the click data, performs preprocessing for extracting the background image, the object, and the additional information, and supplies the background image data, the object image data, and the additional information obtained in the preprocessing to the transmission processing section 1016.
At step S94, the transmission processing section 1016 calculates the data amounts of the object image data, the background image data, and the additional information so that the condition of the data rate of the transmission path 3 is satisfied. Thereafter, the transmission processing section 1016 encodes the object image data, the background image data, and the additional information corresponding to the calculated data amounts, and multiplexes them. Thereafter, the transmission processing section 1016 transmits the multiplexed data, together with the frame rate information, to the receiving apparatus 2 through the transmission path 3.
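The per-frame loop of steps S91 through S94 can be summarized as follows; every object and method name here is a placeholder standing in for the corresponding block of the transmitting apparatus 1, not an API defined by this description.

```python
# An illustrative outline of the transmitting-side loop of FIG. 30.
def transmit_loop(image_input, click_receiver, preprocessor, tx, path_rate):
    while True:
        frame = image_input.capture()                 # step S91: image data
        click = click_receiver.poll()                 # step S92: click data
        background, objects, extra = preprocessor.run(frame, click)  # step S93
        # Step S94: size the encoded streams to the data rate of path 3,
        # multiplex them, and send the result with the frame rate information.
        packet = tx.encode_and_mux(background, objects, extra, budget=path_rate)
        tx.send(packet)
```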
Thereafter, the flow returns to step S91, and the transmitting apparatus 1 repeats similar processing.
FIG. 31 shows an example of the structure of the receiving apparatus 2 shown in FIG. 28. In FIG. 31, parts similar to those in FIG. 5 are denoted by the same reference numerals, and their description is omitted. In other words, in the receiving apparatus 2 shown in FIG. 31, a combining processing section 1022 is provided in place of the combining processing section 22. In addition, a click data input section 1024 and a click data transmitting section 1025 are provided in place of the control information input section 24 and the control information transmitting section 25, respectively. Except for these points, the structure of the receiving apparatus 2 shown in FIG. 31 is basically the same as the structure of the receiving apparatus 2 shown in FIG. 5.
The multiplexed data transmitted from the transmitting apparatus 1 through the transmission path 3 is received by the receiving processing section 21. The receiving processing section 21 separates the received multiplexed data into the encoded background image data, the encoded object image data, and the encoded additional information, and decodes the separated data. Thereafter, the receiving processing section 21 supplies the decoded background image data, object image data, and additional information to the combining processing section 1022.
The combining processing section 1022 combines the decoded background image data, object image data, and additional information, and supplies the combined image signal to the image output section 23. In addition, the combining processing section 1022 controls the spatial resolution and the temporal resolution of the combined image corresponding to the click data supplied from the click data input section 1024.
The click data input section 1024 generates click data representing a click position (coordinate position) and a click time corresponding to the user's operation of the key section 2-3, which functions as a pointing device for designating a coordinate position on the image displayed on the image output section 23, corresponding to the displaying section 2-2 of the receiving apparatus 2 (see FIG. 28). In other words, when the user clicks the key section 2-3 at a desired image position (object-of-interest region) of the image displayed on the image output section 23, the click data input section 1024 generates click data representing the coordinate information of the click position and the click time. The click data generated by the click data input section 1024 is sent to the combining processing section 1022 and the click data transmitting section 1025.
When the click data transmitting section 1025 receives the click data from the click data input section 1024, the click data transmitting section 1025 transmits the click data to the transmitting apparatus 1 through the transmission path 3.
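A click data record needs only a coordinate pair and a click time. The sketch below is a plausible minimal representation of it and of its two destinations; all field and function names are assumptions made for illustration.

```python
# A minimal representation of click data and where it is routed.
from dataclasses import dataclass

@dataclass
class ClickData:
    x: int          # click position on the displaying section 2-2
    y: int
    time_ms: int    # click time in milliseconds

def on_click(x, y, clock_ms, combiner, sender):
    # Click data input section 1024: one record goes both to the combining
    # processing section 1022 and to the click data transmitting section 1025.
    click = ClickData(x, y, clock_ms)
    combiner.receive_click(click)
    sender.send(click)
```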
Next, the process performed by the receiving apparatus 2 shown in FIG. 31 will be briefly described with reference to the flowchart shown in FIG. 32.
First, at step S101, the receiving apparatus 2 receives the multiplexed data transmitted from the transmitting apparatus 1 through the transmission path 3.
At step S102, the receiving processing section 21 separates the multiplexed data into the encoded background image data, the encoded object image data, and the encoded additional information, and decodes the separated encoded data. The decoded background image data, object image data, and additional information are sent to the combining processing section 1022.
At step S103, in the receiving apparatus 2, the click data input section 1024 obtains the click data corresponding to the click operation of the key section 2-3 performed by the user, and supplies the click data to the combining processing section 1022 and the click data transmitting section 1025. The click data is then transmitted from the click data transmitting section 1025 to the transmitting apparatus 1.
At step S104, the combining processing section 1022 combines an image corresponding to the background image data, the object image data, and the additional information supplied from the receiving processing section 21 and to the click data supplied from the click data input section 1024, and controls the spatial resolution and the temporal resolution of the combined image. The transmitting apparatus 1 may place the click data transmitted from the receiving apparatus 2 in the header information of the multiplexed data and transmit the resulting header information to the receiving apparatus 2. In this case, the combining processing section 1022 of the receiving apparatus 2 can obtain the click data from the header information, so that the click data need not be supplied from the click data input section 1024 to the combining processing section 1022.
At step S105, the image combined by the combining processing section 1022 is displayed on the liquid crystal display or the like of the image output section 23.
Thereafter, the flow returns to step S101, and the receiving apparatus 2 repeats similar processing.
FIG. 33 shows an example of the actual structure of the transmission processing section 1016 of the transmitting apparatus 1 shown in FIG. 29. In FIG. 33, parts similar to those in FIG. 7 are denoted by the same reference numerals, and their description is omitted. In other words, the structure of the transmission processing section 1016 shown in FIG. 33 is basically the same as the structure of the transmission processing section 16 shown in FIG. 7, except that click data, rather than the control information as a whole, is supplied to the controlling section 35.
In FIG. 33, the background image data, the object image data, and the additional information are supplied from the preprocessing section 12 shown in FIG. 29 to the transmission processing section 1016, where they are input to the encoding section 31 and the controlling section 35. The encoding section 31 encodes the supplied background image data, object image data, and additional information in the manner described above, and supplies the resulting encoded data to the MUX 32. Under the control of the controlling section 35, the MUX 32 selects among the encoded background image data, the encoded object image data, and the encoded additional information, and supplies the selected data to the transmitting section 33 as multiplexed data. The transmitting section 33 modulates the multiplexed data supplied from the MUX 32 in accordance with the transmission standard of the downstream direction of the transmission path 3, and transmits the modulated data to the receiving apparatus 2 through the transmission path 3.
On the other hand, the controlling section 35 controls the output of the multiplexed data from the MUX 32 so that the data rate supplied from the data amount calculating section 34 does not exceed the transmission rate of the transmission path 3. In addition, the controlling section 35 receives the click data transmitted from the receiving apparatus 2 through the transmission path 3, and controls, corresponding to the click data, the selecting and multiplexing of the encoded data by the MUX 32.
Next, the transmission process performed by the transmission processing section 1016 shown in FIG. 33 will be described with reference to the flowchart shown in FIG. 34.
First, at step S111, the controlling section 35 of the transmission processing section 1016 determines whether click data has been transmitted from the receiving apparatus 2. When the determined result at step S111 shows that no click data has been transmitted from the receiving apparatus 2 (that is, the controlling section 35 has not received click data), the flow advances to step S112. At step S112, as in the case of step S22 shown in FIG. 10, the controlling section 35 controls the MUX 32 to select the encoded background image data, the encoded object data, and the encoded additional information, and to multiplex the selected data, so that the receiving apparatus 2 can display an image with the regular temporal resolution.
Thereafter, the flow advances to step S113. At step S113, the transmission processing section 1016 transmits the multiplexed data supplied from the MUX 32 from the transmitting section 33 through the transmission path 3. Thereafter, the flow returns to step S111.
When the determined result at step S111 shows that click data has been transmitted from the receiving apparatus 2 (that is, the controlling section 35 has received click data), the flow advances to step S114. At step S114, corresponding to the click data, the controlling section 35 recognizes the coordinates (click position) and the click time of the consideration point that the user has designated by operating the key section 2-3 of the receiving apparatus 2.
Thereafter, at step S115, the controlling section 35 designates, in a manner that will be described later, the object-of-interest region that the user of the receiving apparatus 2 side is considering, corresponding to the coordinates (click position) of the consideration point and the click time, designates the object-of-interest region as the priority range of the image whose spatial resolution is preferentially improved, and detects the image in the priority range and its additional information. In this case, the image in the priority range is an object image. An image outside the priority range is, for example, an image of a region of no interest, such as a background image.
Thereafter, at step S116, the controlling section 35 controls the MUX 32 to select the encoded data of the image in the priority range (the object image), the image outside the priority range (the background image), and the additional information, and to multiplex them. In other words, when the controlling section 35 receives click data from the receiving apparatus 2, as in the case of step S26 shown in FIG. 10, the controlling section 35 controls the MUX 32 so that the spatial resolution of the image in the priority range is improved at the cost of the temporal resolution.
In addition, at step S116, the controlling section 35 controls the MUX 32 so that high resolution information, namely the position and size information of the priority range, is inserted into the additional information selected as the multiplexed data. Thereafter, the flow advances to step S113.
At step S113, the transmitting section 33 transmits the multiplexed data output from the MUX 32 through the transmission path 3. Thereafter, the flow returns to step S111.
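The branch structure of FIG. 34 reduces to a single decision per transmission cycle. The sketch below assumes a square priority range of 64 x 64 pixels and a simple mux interface; both are illustrative assumptions rather than details given in this description.

```python
# A condensed sketch of one cycle of the transmission process of FIG. 34.
def transmission_step(mux, click):
    if click is None:
        # Steps S112-S113: no click, so keep the regular spatial and
        # temporal resolution for the whole image.
        mux.select_all(quality="normal")
    else:
        # Steps S114-S115: recognize the consideration point and designate a
        # priority range around it (assumed here to be a 64 x 64 rectangle).
        rng = (click.x - 32, click.y - 32, click.x + 32, click.y + 32)
        # Step S116: improve the spatial resolution inside the range at the
        # cost of temporal resolution, and record the range's position and
        # size in the additional information as high resolution information.
        mux.select_priority(rng, quality="high")
        mux.annotate(high_resolution=True, priority_range=rng)
    return mux.output()   # step S113: hand the multiplexed data to section 33
```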
As described above, in the transmission process shown in FIG. 34, processing similar to the process shown in FIG. 10 is performed. Thus, when the user of the receiving apparatus 2 continuously operates (clicks) the click data input section 1024 for a consideration point contained in an image (for example, he or she continuously designates the same consideration point), data that improves the spatial resolution of the image in the priority range containing the consideration point is preferentially transmitted. Thus, the spatial resolution of the image in the priority range containing the consideration point is gradually improved, and as a result the image in the priority range is displayed more clearly. In other words, the image that the user of the receiving apparatus 2 side is considering (the object-of-interest region, an object image) is displayed more clearly.
As described above, the transmission of the image data is controlled corresponding to the transmission rate of the transmission path 3 so that the spatial resolution and the temporal resolution of the image in the priority range (the object-of-interest region, an object image) are changed within the range of the resolutions. Thus, within the limited transmission rate, the spatial resolution of the object image corresponding to the consideration point displayed by the receiving apparatus 2 can be further improved. In other words, since the spatial resolution of the object image in the priority range is improved at the cost of the temporal resolution of the image, the object image displayed by the receiving apparatus 2 can be displayed more clearly with the limited transmission rate (that is, the spatial resolution can be further improved).
FIG. 35 shows an example of the structure of the combining processing section 1022 shown in FIG. 31. In FIG. 35, parts identical to those shown in FIG. 13 are denoted by the same reference numerals, and their description is omitted. In other words, in the combining processing section 1022, the background image flag memory 74 is not provided, and a combining section 1077 is provided in place of the combining section 77. In addition, click data, rather than control information, is supplied to the combining section 1077. Except for these points, the structure of the combining processing section 1022 shown in FIG. 35 is basically the same as the structure of the combining processing section 22 shown in FIG. 13.
Referring to FIG. 35, the background image data output from the receiving processing section 21 (see FIG. 31) is input to the background image writing section 71. The object image data output from the receiving processing section 21 is input to the object writing section 72. The additional information output from the receiving processing section 21 is input to the background image writing section 71, the object writing section 72, and the combining section 1077.
The background image writing section 71 successively writes the supplied background image data to the background image memory 73. In the embodiment shown in FIG. 35, however, the background image flag memory 74 of the embodiment shown in FIG. 13 is not provided. Thus, in FIG. 35, when the background image writing section 71 writes background image data to the background image memory 73, it does not check a background image flag.
Corresponding to a background image motion vector contained in the additional information, the combining section 1077 reads, out of the background image data stored in the background image memory 73, the background image of the frame displayed at the current time (the current frame), combines an object image stored in the object memory 75 with this background image corresponding to an object motion vector contained in the additional information, and supplies the combined image of the current frame to the display memory 78.
In addition, when the combining section 1077 receives click data from the click data input section 1024 shown in FIG. 31, the combining section 1077 reads, from the object memory 75, the object image data containing the coordinate position of the consideration point contained in the click data, and supplies the obtained object image data to the sub-window memory 79.
Next, the process (combining process) performed by the combining processing section 1022 shown in FIG. 35 will be described with reference to the flowchart shown in FIG. 36.
First, at step S121, the object writing section 72 writes the object image data supplied from the decoding section 53 shown in FIG. 35 to the object memory 75 in the manner described above, corresponding to the object flags stored in the object flag memory 76.
Thereafter, the flow advances to step S122. At step S122, the object writing section 72 determines whether the additional information contains high resolution information. When the determined result at step S122 shows that the additional information contains high resolution information (that is, the user of the receiving apparatus 2 has operated the key section 2-3, the click data has been transmitted to the transmitting apparatus 1, and the transmitting apparatus 1 has subsequently transmitted object image data with a high spatial resolution for the image in the priority range), the flow advances to step S123. At step S123, the object writing section 72 sets the relevant object flags in the object flag memory 76 to "1".
In other words, when the transmitting apparatus 1 has transmitted object image data with a high spatial resolution for the image in the priority range, that object image data with the high spatial resolution has been written to the object memory 75 at step S121. Thus, at step S123, the object flags of the pixels composing the object with the high spatial resolution are set to "1".
Thereafter, the flow advances to step S124. At step S124, the combining section 1077 reads the object image data in the priority range from the object memory 75, and writes the obtained object image data to the sub-window memory 79.
In other words, when the determined result at step S122 shows that the additional information contains high resolution information, then, as described above, the user has operated the key section 2-3, the click data has been transmitted to the transmitting apparatus 1, and the transmitting apparatus 1 has subsequently transmitted object image data with a high spatial resolution for the image in the priority range. The click data transmitted to the transmitting apparatus 1 is also supplied to the combining section 1077. When the combining section 1077 receives the click data, at step S124 it recognizes the priority range corresponding to the coordinates of the consideration point and the click time contained in the click data, reads from the object memory 75 the object with the high spatial resolution in the priority range transmitted from the transmitting apparatus 1, and writes the obtained object to the sub-window memory 79. In addition, as described above, when the header information transmitted from the transmitting apparatus 1 contains the click data, the combining section 1077 can recognize the priority range corresponding to the click data contained in the header information.
Thereafter, the flow advances to step S125. At step S125, the combining section 1077 reads the background image data of the current frame from the background image memory 73 corresponding to the background image motion vector contained in the additional information. In addition, the combining section 1077 reads the object image data to be displayed in the current frame from the object memory 75. Thereafter, the combining section 1077 combines the background image data of the current frame with the object image data read from the object memory 75, corresponding to the object motion vector contained in the additional information, and writes the combined image of the current frame to the display memory 78. In other words, the combining section 1077 writes the background image data to the display memory 78 and then overwrites it with the object image data. As a result, the combining section 1077 writes to the display memory 78 the image data of the current frame in which the background image and the object image are combined.
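Steps S124 and S125 amount to a motion-compensated paste of each object over the background. A simplified compositing pass is sketched below, assuming NumPy image arrays, integer motion vectors, boolean object masks, and in-bounds object positions; bounds handling and all names are illustrative assumptions.

```python
# A simplified version of the compositing of steps S124-S125 of FIG. 36.
import numpy as np

def compose_frame(background, bg_vector, objects, display_memory):
    # Step S125: position the background of the current frame using the
    # background image motion vector from the additional information.
    frame = np.roll(background, bg_vector, axis=(0, 1)).copy()
    for obj in objects:
        dy, dx = obj["motion_vector"]        # object motion vector
        y, x = obj["position"]
        patch, mask = obj["pixels"], obj["mask"]
        h, w = mask.shape
        region = frame[y + dy:y + dy + h, x + dx:x + dx + w]
        region[mask] = patch[mask]           # overwrite background with object
    display_memory[:] = frame                # write the combined current frame
    return frame
```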
In the above-described manner, the image data of the current frame written to the display memory 78 and the object image data written to the sub-window memory 79 are supplied to the image output section 23 shown in FIG. 31 and displayed thereon.
In contrast, when the determined result at step S122 shows that the additional information does not contain high resolution information (that is, the user of the receiving apparatus 2 has not operated the key section 2-3), the flow advances to step S125, skipping steps S123 and S124. As described above, at step S125, the combining section 1077 reads the background image data of the current frame from the background image memory 73, reads the required object image data from the object memory 75, and combines the background image of the current frame with the object image data read from the object memory 75 corresponding to the additional information. As a result, the image data of the current frame is composed and written to the display memory 78. Thereafter, the flow returns to step S121, and the combining section 1077 repeats similar processing.
According to the above-described combining process, in the same manner as the case described with reference to FIG. 15, an object that the user is considering is displayed as an image with a high spatial resolution. However, the combining processing section 1022 shown in FIG. 35 does not have the background image flag memory 74 shown in FIG. 13, so the background image writing section 71 shown in FIG. 35 always writes the supplied background image data to the background image memory 73. Thus, in the combining process shown in FIG. 36, the spatial resolution of the background image is not improved, unlike the cases described with reference to FIGS. 13 to 15.
Next, the method for extracting an object image (object-of-interest region) corresponding to the click data supplied from the receiving apparatus 2 will be described. This method is performed by the object extracting section 1014 shown in FIG. 29.
FIG. 37 shows an example of the structure of the object extracting section 1014 of the preprocessing section 12 shown in FIG. 29.
In FIG. 37, the image data supplied from the image input section 11 shown in FIG. 29 is stored in an image memory 201. The image data is read from the image memory 201 and supplied to the static/moving determining section 203, to the object image extracting section 213, and to a common terminal of the selector switch 207. The image memory 201 stores image data of at least the several frames required for the static/moving determination performed by the static/moving determining section 203, which is a downstream section of the image memory 201.
In addition, the click data transmitted from the receiving apparatus 2 through the transmission path 3 is stored in a click data memory 202. The click data is read from the click data memory 202 and supplied to the static/moving determining section 203, to the continuous click determining section 204, and to a common terminal of the selector switch 206. The click data memory 202 stores click data for at least a predetermined time period (for example, 500 to 700 milliseconds or more) required for the continuous click determination performed by the continuous click determining section 204, which is a downstream section of the click data memory 202.
The static/moving determining section 203 determines whether the image region of a small local block (for example, 16 x 16 pixels) around the click position (coordinate values on the image) represented by the current click data transmitted from the receiving apparatus 2 is a moving region or a still region. In other words, for the 16 x 16 pixels around the click position, the static/moving determining section 203 obtains the difference between the image region of the current frame and the image region of a past frame that is several frames earlier than the current frame (hereinafter, this frame is referred to as the past frame). When the inter-frame difference is equal to or smaller than a predetermined threshold value, the static/moving determining section 203 determines that the image region is a still region; when the difference is larger than the predetermined threshold value, it determines that the image region is a moving region. When a color image is handled, the static/moving determining section 203 obtains the difference between the 16 x 16 pixel image regions of the current frame and the past frame for each of R, G, and B. When the average of the absolute values of the frame differences obtained for R, G, and B is equal to or smaller than a predetermined threshold value (for example, 10), the static/moving determining section 203 determines that the image region is a still region; when the average is larger than the threshold value, it determines that the image region is a moving region. When the static/moving determining section 203 determines that the image region is a still region, it treats the current click data output from the click data memory 202 as a static click (a click in a still region); when it determines that the image region is a moving region, it treats the current click data as a moving click (a click in a moving region). Thereafter, the static/moving determining section 203 sends information representing the static click or the moving click to the processing determining section 205 as the static/moving determination result.
The continuous click determining section 204 determines whether the user of the receiving apparatus 2 has performed a continuous click operation, corresponding to the click times of the click data transmitted from the receiving apparatus 2. In other words, the continuous click determining section 204 obtains the time difference (click interval) between the click time of the current click data transmitted from the receiving apparatus 2 and the click time of the preceding click data. When the time difference is equal to or smaller than a predetermined threshold value, the continuous click determining section 204 determines that the user has performed a continuous click operation, and treats the current click data output from the click data memory 202 as a continuous click. In contrast, when the continuous click determining section 204 determines that the user has not performed a continuous click operation (that is, the time difference between the current click time and the preceding click time is larger than the predetermined threshold value), it treats the current click data output from the click data memory 202 as a discontinuous click. Thereafter, the continuous click determining section 204 sends information representing the continuous click or the discontinuous click to the processing determining section 205 as the continuous click determination result.
The processing determining section 205 controls the selector switch 206, the selector switch 207, and the selector switch 208, corresponding to the static/moving determination result of the static/moving determining section 203 and the continuous click determination result of the continuous click determining section 204.
In other words, when the static/moving determination result and the continuous click determination result show that the current click data output from the click data memory 202 is a static click and a continuous click, the processing determining section 205 controls the selector switch 206 so that the current click data output from the click data memory 202 is sent to the stationary object connection processing section 211. In addition, the processing determining section 205 controls the selector switch 207 so that the image data output from the image memory 201 is sent to the stationary object connection processing section 211. Furthermore, the processing determining section 205 controls the selector switch 208 so that the preceding click data output from the object extraction result memory 214 (described later), the object number assigned to that click data (the object number corresponds to the label described above and categorizes (labels) an object), and the object image data corresponding to the object number are sent to the stationary object connection processing section 211.
In addition, when the static/moving determination result and the continuous click determination result show that the current click data output from the click data memory 202 is a moving click and a continuous click, the processing determining section 205 controls the selector switch 206 so that the current click data output from the click data memory 202 is sent to the moving object connection processing section 210. In addition, the processing determining section 205 controls the selector switch 207 so that the image data output from the image memory 201 is sent to the moving object connection processing section 210. Furthermore, the processing determining section 205 controls the selector switch 208 so that the preceding click data output from the object extraction result memory 214, the object number assigned to that click data, and the object image data corresponding to the object number are sent to the moving object connection processing section 210.
In addition, when the static/moving determination result and the continuous click determination result show that the current click data output from the click data memory 202 is a static click and a discontinuous click (that is, the time difference between the current click time and the preceding click time is equal to or larger than the predetermined threshold value), the processing determining section 205 controls the selector switch 206 so that the current click data output from the click data memory 202 is sent to the object number assigning section 209. In addition, the processing determining section 205 controls the selector switch 207 so that the image data output from the image memory 201 is sent to the stationary object connection processing section 211. At this point, the processing determining section 205 controls the selector switch 208 so that the preceding click data, the object number, and the object image data output from the object extraction result memory 214 are not sent to the stationary object connection processing section 211 (in this case, for example, the selector switch 208 is opened).
In addition, when the static/moving determination result and the continuous click determination result show that the current click data output from the click data memory 202 is a moving click and a discontinuous click (that is, the time difference between the current click time and the preceding click time is equal to or larger than the predetermined threshold value), the processing determining section 205 controls the selector switch 206 so that the current click data output from the click data memory 202 is sent to the object number assigning section 209. In addition, the processing determining section 205 controls the selector switch 207 so that the image data output from the image memory 201 is sent to the moving object connection processing section 210. At this point, the processing determining section 205 controls the selector switch 208 so that the preceding click data, the object number, and the object image data output from the object extraction result memory 214 are not sent to the moving object connection processing section 210 (for example, the selector switch 208 is opened).
The object number assigning section 209 assigns a new object number to click data that is a discontinuous click, as opposed to a continuous click handled by the connection processing of the stationary object connection processing section 211 or the moving object connection processing section 210 (described below), and sends the object number and the click data to an object number memory 212.
When the processing determining section 205 has determined that the current click data output from the click data memory 202 is a moving click and a continuous click, the moving object connection processing section 210 determines whether the preceding click data is a moving click and whether the image feature in the vicinity of the current click position is contained in, or similar to, the feature of the region of the moving object image with the object number assigned to the preceding click data. When the determined result is affirmative, the moving object connection processing section 210 determines that the current click is a click on the same object image. Thus, the moving object connection processing section 210 performs a moving object connection process, assigning the same object number as the preceding click data to the current click data, and sends the object number and the click data to the object number memory 212.
When the determined result of the processing determining section 205 shows that the current click data output from the click data memory 202 is a static click and a continuous click, the stationary object connection processing section 211 determines whether the preceding click is a static click and whether the current click position is contained in, or in the vicinity of, the region of the stationary object image with the object number assigned to the preceding click data. When the determined result of the stationary object connection processing section 211 is affirmative, the stationary object connection processing section 211 determines that the current click is a click on the same object as the preceding click. Thus, the stationary object connection processing section 211 performs a stationary object connection process, assigning the same object number as the preceding click data to the current click data, and sends the object number and the click data to the object number memory 212.
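The bookkeeping of sections 209 through 211 can be pictured with the following sketch, in which the spatial test "inside or near the previous object's region" is reduced, by assumption, to a bounding-box check with a fixed margin; neither the margin value nor any of these names come from this description.

```python
# A schematic of object-number assignment and connection processing.
def assign_object_number(click, prev, object_regions, next_number, margin=16):
    """Return (number for this click, updated next_number)."""
    if prev is None or not click.continuous:
        # Discontinuous click: section 209 hands out a fresh number (S134).
        return next_number, next_number + 1
    x0, y0, x1, y1 = object_regions[prev.object_number]
    if (x0 - margin <= click.x <= x1 + margin and
            y0 - margin <= click.y <= y1 + margin):
        # Connection processing (sections 210/211): same object as before.
        return prev.object_number, next_number
    return next_number, next_number + 1
```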
The object number memory 212 stores click data of a plurality of past frames to which object numbers have been assigned by the object number assigning section 209, the moving object connection processing section 210, and the stationary object connection processing section 211, and sends the stored click data and object numbers to the object image extracting section 213.
Corresponding to the click data of the plurality of past frames to which object numbers have been assigned, supplied from the object number memory 212, the object image extracting section 213 extracts a stationary object image, a moving object image, a background image, and so forth from the image data supplied from the image memory 201, and supplies the extracted result to the object extraction result memory 214.
In other words, corresponding to the click data of the plurality of past frames to which object numbers have been assigned, supplied from the object number memory 212, the object image extracting section 213 obtains the dominant object number in an image portion in which the density of click data treated as static clicks is high. The object image extracting section 213 then forms the shape of an object corresponding to the distribution of the click data to which the dominant object number has been assigned, and extracts the image within the formed shape from the image data as an object image.
In addition, for click data determined to be moving clicks, the object image extracting section 213 performs a pattern matching operation between the images of frames in the vicinity of click positions to which the same object number has been assigned, and performs motion compensation for the images corresponding to the matched result. The object image extracting section 213 then obtains the dominant object number in an image region that has been determined to be a similar image region (that is, an image region aligned by the motion compensation) and in which the click density is high. The object image extracting section 213 forms the shape of an object corresponding to the distribution of the click data to which the dominant object number has been assigned, and extracts the image within the formed shape from the image region as an object image.
Thereafter, the object image extracting section 213 treats an image portion in which both the density of static clicks and the density of moving clicks are low as a background image.
The object extraction result memory 214 stores the object image data extracted by the object image extracting section 213, together with the click data, the object numbers, and so forth. When necessary, the object extraction result memory 214 supplies the object image data to the background image extracting section 1013, the additional information calculating section 15, and the transmission processing section 1016 shown in FIG. 29.
Next, the process for extracting, from a photographed image, the object image (object-of-interest region) that the user of the receiving apparatus 2 is considering, corresponding to the click data transmitted from the receiving apparatus 2, will be described with reference to the flowchart shown in FIG. 38. This process is performed by the object extracting section 1014 shown in FIG. 37.
First, at step S131, the image memory 201 stores the image data of a frame input from the image input section 11 (frame image data is stored whenever it is input). The image memory 201 stores the image data of at least the several frames required for the static/moving determination process performed at step S133.
Also at step S131, when click data is transmitted from the receiving apparatus 2 through the transmission path 3, the click data memory 202 stores the click data. The click data memory 202 stores click data for at least the predetermined time period (for example, 500 to 700 milliseconds or more) required for the continuous click determination process performed at step S133 (described below).
Thereafter, the flow advances to step S132. At step S132, the object extracting section 1014 determines whether click data that has been transmitted from the receiving apparatus 2 but not yet been processed is stored. When the determined result at step S132 shows that no unprocessed click data is stored in the click data memory 202, the flow returns to step S131, and the object extracting section 1014 waits for input image data and click data. In contrast, when the determined result at step S132 shows that unprocessed click data is stored in the click data memory 202, the flow advances to step S133. At step S133, the object extracting section 1014 designates the oldest unprocessed click data as the current click data, and the static/moving determining section 203, the continuous click determining section 204, and the processing determining section 205 perform the static/moving determination process and the continuous click determination process for the current click data.
In other words, at step S133, using the click position information (image coordinate values) contained in the current click data transmitted from the receiving apparatus 2, the static/moving determining section 203 performs the static/moving determination process that determines whether the image region of a small local block around the click position is a moving region or a still region.
The static/moving determination process performed by the static/moving determining section 203 at step S133 shown in FIG. 38 will now be described more practically. As shown in the flowchart of FIG. 39, at step S141, the static/moving determining section 203 reads the image data and the click data of several frames from the image memory 201 and the click data memory 202, respectively. When a color image is handled, as shown in FIG. 40(A), the static/moving determining section 203 reads the image data of several frames for R (red), G (green), and B (blue). In other words, the static/moving determining section 203 reads, from the image memory 201 and the click data memory 202 respectively, the image data of several past frames up to and including the frame corresponding to the current click data, and the click data corresponding to the clicks performed on those several frames.
Thereafter, at step S142, for the small local block composed of 16 x 16 pixels in the horizontal and vertical directions around the click position of the current click data, the static/moving determining section 203 calculates the difference between the image region of the current frame and the image region of a past frame several frames earlier (hereinafter referred to as the past frame). When a color image is handled, at step S142, as shown in FIG. 40(B), the static/moving determining section 203 obtains the frame differences of the 16 x 16 pixel images for each of R, G, and B, and obtains the average of the absolute values of the differences for each of R, G, and B.
Thereafter, at step S143, the static/moving determining section 203 determines whether the frame difference calculated at step S142 is equal to or smaller than a predetermined threshold value. Thereafter, the flow advances to step S144. When the frame difference is equal to or smaller than the predetermined threshold value, the static/moving determining section 203 determines that the small block containing the click position of the current click data (hereinafter, this block may be referred to as the current block) is a still region. In contrast, when the frame difference is larger than the predetermined threshold value, the static/moving determining section 203 determines that the current block is a moving region. In addition, when the determined result of the static/moving determining section 203 shows that the current block is a still region, the static/moving determining section 203 designates the click data corresponding to the image region of the current block as a static click. In contrast, when the determined result shows that the current block is a moving region, the static/moving determining section 203 designates the click data corresponding to the image region of the current block as a moving click. The static/moving determining section 203 outputs the static click or the moving click as the static/moving determination result.
When a color image is handled, at step S144, as shown in FIG. 40(D), when the average of the absolute values of the frame differences of each 16 x 16 pixel block for R, G, and B is equal to or smaller than a predetermined threshold value (for example, "10"), the static/moving determining section 203 sets a predetermined flag to, for example, "0". In contrast, when the average is larger than the threshold value, the static/moving determining section 203 sets the predetermined flag to, for example, "1". At step S144, as shown in FIG. 40(E), when all the flags for R, G, and B of the 16 x 16 pixel current block are "0", the static/moving determining section 203 determines that the current block is a still region and designates the click data corresponding to the image region of the current block as a static click. In contrast, when any one of the flags is "1", the static/moving determining section 203 determines that the current block is a moving region and designates the click data corresponding to the image region of the current block as a moving click. The static/moving determining section 203 outputs information representing the static click or the moving click as the static/moving determination result.
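The block test of FIGS. 39 and 40 translates almost line for line into code. The sketch below follows the 16 x 16 block and the threshold of 10 given in the text, and assumes 8-bit R, G, B frames as NumPy arrays with the click at least 8 pixels from the frame edge; the function name and array layout are assumptions.

```python
# The static/moving determination of FIGS. 39-40 for a color image.
import numpy as np

def is_static_click(current_rgb, past_rgb, click_x, click_y, threshold=10):
    y0, x0 = click_y - 8, click_x - 8          # 16 x 16 block around the click
    flags = []
    for c in range(3):                         # R, G, B
        cur = current_rgb[y0:y0 + 16, x0:x0 + 16, c].astype(int)
        past = past_rgb[y0:y0 + 16, x0:x0 + 16, c].astype(int)
        mean_abs_diff = np.abs(cur - past).mean()
        flags.append(0 if mean_abs_diff <= threshold else 1)
    # Still region (static click) only when the flags for R, G and B are all 0.
    return all(f == 0 for f in flags)
```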
Turn back to Figure 38, on step S133, click continuously determining section 204 and click continuously and determine to handle, determine that whether the performed clicking operation of the user of receiving equipment 2 is the corresponding continuous clicking operation of click time that comprises in the click data that sends with receiving equipment 2.
The following describes the processing of the continuous click determining section of on step S133, carrying out 204 shown in Figure 38.According to flow chart shown in Figure 41, on step S151, click data memory 202 reads out in the click data of storage in the click data memory 202.
Thereafter, at step S152, the continuous click determining section 204 calculates the time difference (click interval) between the click time of the current click data sent from the receiving apparatus 2 and the click time of the immediately preceding click data.
Then, at step S153, the continuous click determining section 204 determines whether this time difference is equal to or smaller than a predetermined threshold value. When the determined result at step S153 shows that the time difference is equal to or smaller than the threshold value, the determining section 204 determines that the current click data represents a continuous click; otherwise, it determines that the current click data does not represent a continuous click. Thereafter, the flow advances to step S154. At step S154, the continuous click determining section 204 outputs information representing a continuous click or a discontinuous click as the continuous click determination result.
In other words, when the determined result at step S153 shows that the current click data represents a continuous click, the user of the receiving apparatus 2 is likely performing a continuous clicking operation on one object image. This is because, when the user of the receiving apparatus 2 requests the transmitting apparatus 1 to send object image data (data of a region of interest) with high spatial resolution, he or she tends to continuously click the object image portion (region of interest) whose spatial resolution is to be improved. Thus, when the continuous click determining section 204 determines that the current clicking operation is a continuous clicking operation, it designates the current click data as a continuous click. Conversely, when it determines that the current clicking operation is not continuous (namely, the time difference between the click time of the current click and that of the last click exceeds the predetermined threshold value), it designates the current click data as a discontinuous click. The continuous click determining section 204 outputs the resulting continuous click determination.
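A minimal sketch of the time-difference test at step S153 follows; the threshold value is a hypothetical placeholder, since the text leaves the actual value unspecified.

```python
CLICK_INTERVAL_THRESHOLD = 1.0  # hypothetical threshold (seconds)

def is_continuous_click(prev_click_time, cur_click_time):
    """A click is treated as 'continuous' when it follows the previous one quickly."""
    return (cur_click_time - prev_click_time) <= CLICK_INTERVAL_THRESHOLD
```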
Returning to Figure 38, at step S133, when the determined results of the still region and moving region determining section 203 and the continuous click determining section 204 show that the current click data is a static click and a continuous click, the process determining section 205 controls the selector switches 206, 207, and 208 in the above-described manner so that, at step S135, the still object connection processing part 211 performs a still object connection process. When the determined results show that the current click data is a moving click and a continuous click, the process determining section 205 controls the selector switches so that, at step S136, the moving object connection processing part 210 performs a moving object connection process. When the determined result shows that the current click data is a discontinuous click, the process determining section 205 controls the selector switch 206 so that, at step S134, the object number assigning portion 209 performs a new object number allocation process.
In other words, when the determined result at step S133 shows that the current click data is a discontinuous click, the flow advances to step S134. At step S134, the object number assigning portion 209 assigns a new object number to the current click data. Thereafter, the flow returns to step S131.
In practice, as shown in Figure 42 (A), when the object number assigned to the last click data CL1 represented by the solid-line X mark is, for example, 0, and the current click data CL2 represented by the dotted-line X mark in Figure 42 (A) (namely, click data to which no object number has yet been assigned) is determined to be a discontinuous click, the object number assigning portion 209 assigns a new object number (1 in this example) to the current click data CL2, now represented by the solid-line X mark in Figure 42 (B).
On the other hand, when the determined result at step S133 shows that the current click data is a continuous click and a static click, then if the last click was a static click and the current click position is contained in or close to the image region corresponding to the object number assigned to the last click data, the still object connection processing part 211 determines that the current click is a click on the same object image as the last click. Thus, at step S135, the still object connection processing part 211 performs the still object connection process, assigning the same object number as the last click data to the current click data.
More specifically, as shown in the flow chart of Figure 43, at step S161, the still object connection processing part 211 determines whether the last click data was a continuous click and a static click. When the determined result at step S161 shows that the last click data was a continuous click and a static click, the flow advances to step S162; otherwise, the flow advances to step S164.
When the determined result at step S161 shows that the last click data was not a continuous click and a static click, at step S164 the still object connection processing part 211 assigns a new object number to the current click data in the same manner as described with reference to Figures 42 (A) and (B). Thereafter, the flow advances to step S137 shown in Figure 38.
Conversely, when the determined result at step S161 shows that the last click data was a continuous click and a static click, the flow advances to step S162. At step S162, the still object connection processing part 211 obtains the spatial distance between the current click position and the image region corresponding to the object number assigned to the last click data. When the current click position is contained in or close to that image region, the still object connection processing part 211 determines that the current click data is click data for the same object image as the last click. Conversely, when the current click position is neither contained in nor close to that image region, the part 211 determines that the current click data is click data for an object image different from that of the last click. When the determined result at step S162 shows that the current click data is click data for the same object image as the last click, the flow advances to step S163; otherwise, the flow advances to step S164.
When the determined result at step S162 shows that the current click data is click data for an object image different from that of the last click, the flow advances to step S164. At step S164, the still object connection processing part 211 assigns a new object number to the current click data. Thereafter, the flow advances to step S137 shown in Figure 38.
Conversely, when the determined result at step S162 shows that the current click data is click data for the same object image as the last click, at step S163 the still object connection processing part 211 performs the still object connection process, assigning the same object number as the last click data to the current click data.
In practice, as shown in Figure 42 (C), when the object number assigned to the last click data CL1 represented by the solid-line X mark is, for example, 1, and the current click data CL2 represented by the dotted-line X mark in Figure 42 (C) (namely, click data to which no object number has yet been assigned) is determined to be a continuous click and a static click, and the last click was a static click whose corresponding image region contains or lies close to the current click position, the still object connection processing part 211 assigns the same object number as the last click data (1 in this example) to the current click data CL2 represented by the solid-line X mark in Figure 42 (D).
After the still object connection processing part 211 has assigned the same object number as the last click data to the current click data, the flow advances to step S137 shown in Figure 38.
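The proximity test behind the still object connection process might look as follows; the block-based representation of the object region and the proximity margin are illustrative assumptions, not the patent's exact definition of "close to".

```python
NEAR_MARGIN = 16  # hypothetical proximity margin in pixels

def connect_static_click(click_x, click_y, region_blocks, prev_object_no,
                         next_object_no):
    """Assign an object number to a continuous static click.

    region_blocks: list of (x, y) top-left corners of the 16x16 blocks that
    make up the image region of the previous click's object.
    """
    for bx, by in region_blocks:
        # Contained in, or within the margin of, a block of the previous object.
        if (bx - NEAR_MARGIN <= click_x < bx + 16 + NEAR_MARGIN and
                by - NEAR_MARGIN <= click_y < by + 16 + NEAR_MARGIN):
            return prev_object_no       # same object as the last click
    return next_object_no               # otherwise, a new object number
```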
When the determined result at step S133 shows that the current click data is a continuous click and a moving click, the last click was a moving click, and the features of the image near the current click position are contained in or close to the features of the image region (16 x 16 pixels) corresponding to the object number assigned to the last click, the moving object connection processing part 210 determines that the current click is a click on the same object image as the last click. Thus, at step S136, the moving object connection processing part 210 performs the moving object connection process, assigning the same object number as the last click data to the current click data.
In other words, when the determined result at step S133 shows that the current click data is a continuous click and a moving click, then as shown in Figure 44, at step S171, the moving object connection processing part 210 determines whether the last click data was a continuous click and a moving click. When the determined result at step S171 shows that the last click data was a continuous click and a moving click, the flow advances to step S172; otherwise, the flow advances to step S174.
When the determined result at step S171 shows that the last click data was not a continuous click and a moving click, at step S174 the moving object connection processing part 210 assigns a new object number to the current click data in the same manner as described with reference to Figures 42 (A) and (B). Thereafter, the flow advances to step S137 shown in Figure 38.
When the determined result at step S171 shows that the last click data was a continuous click and a moving click, the flow advances to step S172. At step S172, the moving object connection processing part 210 obtains the features of the image data (16 x 16 pixels) near the current click position and the features of the image region corresponding to the object number assigned to the last click. When the features near the current click position are contained in or close to the features of the image region corresponding to the object number assigned to the last click data, the moving object connection processing part 210 determines that the current click is a click on the same object image as the last click. Conversely, when the features near the current click position are neither contained in nor close to those of that image region, the part 210 determines that the current click data is click data for an object image different from that of the last click. Here, the features of an image region are, for example, the colors (average color, representative color, or the like), the histogram, or the pattern of the local region (16 x 16 pixels) near the click position. Assigning the same object number to a plurality of moving clicks in this way amounts to tracking an object among those pieces of click data. When the determined result at step S172 shows that the current click data is click data for the same object image as the last click, the flow advances to step S173; otherwise, the flow advances to step S174.
When the determined result at step S172 shows that the current click data is click data for an object image different from that of the last click, the flow advances to step S174. At step S174, the moving object connection processing part 210 assigns a new object number to the current click data in the above-described manner. Thereafter, the flow advances to step S137 shown in Figure 38.
When the determined result at step S172 shows that the current click data is click data for the same object image as the last click, the flow advances to step S173. At step S173, the moving object connection processing part 210 assigns the same object number as the last click data to the current click data.
In practice, when the object number assigned to the last click data CL1 represented by the solid-line X mark in Figure 42 (E) is, for example, 0, and the current click data CL2 represented by the dotted-line X mark in Figure 42 (E) is determined to be a continuous click and a moving click, and the features of the image near the current click position are contained in or close to the features of the object image corresponding to the object number assigned to the last click, the moving object connection processing part 210 assigns the same object number as the last click data (0 in this example) to the current click data CL2 represented by the solid-line X mark in Figure 42 (F).
At step S173, after the moving object connection processing part 210 has assigned the same object number as the last click to the current click data, the flow advances to step S137.
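The feature comparison at step S172 could be sketched as follows, using the average color named in the text as the feature; the similarity threshold and the patch handling are assumptions added for illustration.

```python
import numpy as np

FEATURE_THRESHOLD = 20.0  # hypothetical distance threshold on average color

def local_feature(frame, x, y, size=16):
    """Average color of the size x size patch around (x, y); frame is (H, W, 3)."""
    h, w, _ = frame.shape
    px = min(max(x - size // 2, 0), w - size)
    py = min(max(y - size // 2, 0), h - size)
    return frame[py:py + size, px:px + size].astype(np.float64).mean(axis=(0, 1))

def same_moving_object(cur_feature, prev_object_feature):
    """The current click joins the previous object when the features are close."""
    return np.linalg.norm(cur_feature - prev_object_feature) <= FEATURE_THRESHOLD
```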
When the flow advances from step S135 shown in Figure 38 to step S137, the object image extracting part 213 extracts still object images, moving object images, and the remaining background image from the input image data of the past several frames stored in the image memory 201, using the click data of the past several frames, stored in the object number memory 212, to which object numbers have been assigned. An image portion with a high density of static clicks is likely to contain a still object image. Thus, the object image extracting part 213 obtains the static click densities of the past several frames of click data with assigned object numbers, finds the dominant object number of the image portion having a high static click density, forms the shape of an object from the distribution of the click data carrying that dominant object number, and extracts the image constituting the still object from the image data.
When the flow advances from step S136 to step S137, the object image extracting part 213 performs pattern matching among the frame images near the click positions of the moving click data to which the same object number has been assigned, performs motion compensation of the images according to the matching result, finds the dominant object number of the pattern-matched image region having a high click density, forms the shape of an object from the distribution of the click data carrying that dominant object number, and extracts the image constituting the moving object from the image data.
At step S137, the object image extracting part 213 treats an image portion with a low density of static clicks or moving clicks as the current background image. In other words, the object image extracting part 213 treats the image portion remaining after the still object images and moving object images have been extracted from the image data as the background image.
The processing at step S137 will now be described in detail with reference to the flow chart shown in Figure 45. First, at step S181, the object image extracting part 213 captures the click data of the past several frames with assigned object numbers and the image data corresponding thereto. Thereafter, at step S182, the object image extracting part 213 classifies the click data into static clicks and moving clicks. When the flow has advanced from step S135 shown in Figure 38 to step S137, the flow advances from step S182 to step S184 in Figure 45. Conversely, when the flow has advanced from step S136 to step S137, the flow advances from step S182 to step S183.
When the flow has advanced from step S135 shown in Figure 38 to step S137 and thus to step S184, the object image extracting part 213 obtains, for each 16 x 16 pixel block, the density of the static clicks to which object numbers have been assigned.
Thereafter, at step S185, the object image extracting part 213 determines, for each 16 x 16 pixel block bk represented by a dotted-line frame in Figure 46 (A), whether the density of the static clicks represented by the X marks is equal to or greater than a predetermined value.
In an image sent to the receiving apparatus 2, an image portion with a high static click density tends to contain a still object image. Thus, when the static click density of a block is equal to or greater than the predetermined value, the flow advances to step S186; when it is smaller than the predetermined value, the flow advances to step S190.
At step S186, for the blocks whose static click density exceeds the predetermined value, the object image extracting part 213 obtains the most dominant object number among the object numbers assigned to the click data of those blocks, as shown in Figure 46 (E). Thereafter, as shown in Figure 46 (B), the object image extracting part 213 combines the blocks corresponding to the dominant object number (BK0, BK2, BK4, and BK5) to form an object shape, and extracts the image constituting the still object from the image data. After step S186, the flow advances to step S138 shown in Figure 38.
On the other hand, when the flow has advanced from step S136 shown in Figure 38 to step S137, the flow advances from step S182 to step S183 in Figure 45.
At step S183, as shown in Figure 46 (C), the object image extracting part 213 performs pattern matching among the images of a plurality of past frames near the click positions of the click data to which the same object number as the moving clicks represented by the X marks has been assigned, and performs motion compensation of the images according to the matching result.
Thereafter, at step S187, the object image extracting part 213 obtains the density of the moving clicks in the pattern-matched image region.
Thereafter, at step S188, the object image extracting part 213 determines whether the density of the moving clicks represented by the X marks in Figure 46 (D) is equal to or greater than a predetermined value.
An image portion that has been motion-compensated and has a high moving click density tends to contain a moving object image. Thus, when the moving click density of a motion-compensated image region is equal to or greater than the predetermined value, the flow advances to step S189; when it is smaller than the predetermined value, the flow advances to step S190.
At step S189, for the image region whose moving click density is equal to or greater than the predetermined value, the object image extracting part 213 obtains the most dominant object number among the object numbers assigned to the click data. Thereafter, as shown in Figure 46 (D), the object image extracting part 213 combines the blocks corresponding to the dominant object number (BK3 and BK6) to form an object shape, and extracts the image constituting the moving object from the image data. After step S189, the flow advances to step S138 shown in Figure 38.
When the click densities determined at steps S185 and S188 are smaller than the predetermined values, the flow advances to step S190. At step S190, the object image extracting part 213 treats the image portion having a low static click density or a low moving click density as a background image region of the current image. In other words, the object image extracting part 213 treats the image portion remaining after the still object images and moving object images have been extracted from the image data as the background image. After step S190, the flow advances to step S138 shown in Figure 38.
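The density-based extraction at steps S184 to S186, together with the background fallback at step S190, might be sketched as follows; the density threshold, the data layout, and the per-block dominance rule are illustrative assumptions.

```python
from collections import Counter, defaultdict

DENSITY_THRESHOLD = 3  # hypothetical minimum number of clicks per 16x16 block

def extract_still_object_blocks(static_clicks):
    """static_clicks: list of (x, y, object_no) accumulated over past frames.

    Returns the dominant object number and the 16x16 blocks forming its shape;
    blocks below the density threshold are left to the background.
    """
    per_block = defaultdict(list)
    for x, y, obj in static_clicks:
        per_block[(x // 16, y // 16)].append(obj)
    # Keep only the blocks whose static click density is high enough.
    dense = {blk: objs for blk, objs in per_block.items()
             if len(objs) >= DENSITY_THRESHOLD}
    if not dense:
        return None, []          # everything is treated as background
    # Dominant object number over all clicks in the dense blocks.
    counts = Counter(obj for objs in dense.values() for obj in objs)
    dominant, _ = counts.most_common(1)[0]
    # The object shape is the union of dense blocks dominated by that number.
    blocks = [blk for blk, objs in dense.items()
              if Counter(objs).most_common(1)[0][0] == dominant]
    return dominant, blocks
```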
After the object image extracting part 213 has extracted the still object images, moving object images, and background image from the image data, the flow advances to step S138 shown in Figure 38. At step S138, the object image extracting part 213 determines whether the object extraction process is complete. When the determined result at step S138 is No, the flow returns to step S131. When the determined result at step S138 is Yes, the object image extracting part 213 has completed the object extraction process.
Through the above-described processing, the object extracting part 1014 of the transmitting apparatus 1 shown in Figure 29 can extract still object images, moving object images, and the background image in accordance with the click data corresponding to the clicking operations performed by the user of the receiving apparatus 2.
In the embodiment shown in Figure 28, a flat image region with little motion (namely, an image region of no particular significance) is treated as the background image, and the spatial resolution of this background image is not improved. Alternatively, the background image may also be extracted in accordance with the click data sent from the receiving apparatus 2 so that its spatial resolution can be improved.
In this case, the background image extracting part 1013 shown in Figure 29 extracts the background image corresponding to the click data sent from the receiving apparatus 2. The transmission processing section 1016 sends the background image so that its spatial resolution is improved in the same way as the spatial resolution of an object image is improved.
In this case, as shown in Figure 47, the background image flag memory 74 shown in Figure 13 is added to the combining processing section 1022 shown in Figure 31. Except for the background image flag memory 74, the structure of the combining processing section 1022 shown in Figure 47 is the same as that shown in Figure 35. As in the structure shown in Figure 13, in the structure shown in Figure 47, when the background image writing part 71 writes a background image with high spatial resolution to the background image memory 73, the background image flag stored at the address of the background image flag memory 74 corresponding to each pixel composing that background image is changed from 0 to 1. When the background image writing part 71 writes background image data to the background image memory 73, it checks the background image flag memory 74. When the background image flag is 1 (namely, the background image memory 73 already stores background image data with high spatial resolution), the background image writing part 71 does not write background image data with low spatial resolution to the background image memory 73; otherwise, whatever background image data is supplied to the background image writing part 71 is written to the background image memory 73. Thus, whenever background image data with high spatial resolution is supplied to the background image writing part 71, the portion of the background image stored with high spatial resolution in the background image memory 73 grows.
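The flag-guarded write described above can be sketched as follows; the array-based memory layout is an assumption made for illustration.

```python
import numpy as np

class BackgroundMemory:
    def __init__(self, h, w):
        self.memory = np.zeros((h, w, 3), dtype=np.uint8)   # background image memory 73
        self.flags = np.zeros((h, w), dtype=bool)           # background image flag memory 74

    def write(self, pixels, mask, high_resolution):
        """pixels: (h, w, 3) data; mask: boolean map of the pixels being supplied."""
        if high_resolution:
            write_mask = mask                   # high-resolution data is always written
            self.flags |= mask                  # and raises the flags of its pixels
        else:
            write_mask = mask & ~self.flags     # low-resolution data skips flagged pixels
        self.memory[write_mask] = pixels[write_mask]
```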
In this example, when the combining section 1077 receives click data from the click data input section 1024 shown in Figure 31, the combining section 1077 reads, from the background image memory 73 and the object memory 75, the background image data and object image data containing the coordinate position of the consideration point contained in the click data, and supplies the obtained data to the sub-window memory 79.
Next, a technique for determining (recognizing) a change of the region of interest of the user of the receiving apparatus 2 and for classifying each region of interest will be described.
Various types of images were analyzed to determine whether a change of a user's region of interest can be detected and whether each region of interest can be classified. The analysis showed the following results.
First, a person's (user's) region of interest is a meaningful region unit (for example, an object).
Second, when the user's object of interest changes, it changes in units of such meaningful regions.
Third, when the user's object of interest changes, the input time interval required to designate the new object of interest tends to become longer.
Fourth, when the user's object of interest changes, the spatial distance between the input positions designating regions of interest (for example, by clicking operations) tends to become relatively longer.
Figure 48 therefore shows an example of the structure of a transmitting apparatus 1 that obtains the input time interval and the input position distance of the click data input by the user of the receiving apparatus 2, determines, in consideration of the analysis results (1) to (4), whether the region of interest of the user of the receiving apparatus 2 has changed, and classifies the regions of interest. The transmitting apparatus 1 shown in Figure 48 is a modification of the transmitting apparatus 1 shown in Figure 28. For simplicity, in Figure 48, parts similar to those in Figure 29 are denoted by similar reference numerals and their description is omitted. In other words, except that a change determining and classifying part 240 is newly provided as a module of the pre-processing section 12, the structure of the transmitting apparatus 1 shown in Figure 48 is basically the same as that shown in Figure 29.
In the embodiment shown in Figure 48, the change determining and classifying part 240 is provided in the pre-processing section 12. Alternatively, the change determining and classifying part 240 may be provided in the object extracting part 1014 or the background image extracting part 1013 of the pre-processing section 12, or it may be provided independently of the pre-processing section 12.
In this embodiment, the position represented by the click data is the consideration point that the user is paying attention to. However, the user's consideration point need not be obtained as click data; it may instead be recognized by, for example, detecting the user's gaze direction.
Figure 49 shows an example of the structure of the change determining and classifying part 240 shown in Figure 48.
The input image storage area 231 temporarily stores the image data output by the image input section 11 and, as required, supplies the image data to the click peripheral region extracting part 233 and the still region and moving region determining section 234.
The click data acquiring part 230 temporarily stores the click data sent from the receiving apparatus 2 through the transmission path 3, and supplies the stored click data to the click peripheral region extracting part 233, the still region and moving region determining section 234, the input time interval calculating part 237, and the input position distance calculating part 238.
In this embodiment, the input image storage area 231 may be provided in common with the image memory 201 shown in Figure 37. Likewise, the click data storage area 232 may be provided in common with the click data memory 202 shown in Figure 37.
The click peripheral region extracting part 233 extracts, from the image data supplied from the input image storage area 231, the image region corresponding to the click data supplied from the click data storage area 232 (for example, a small local block around the click position; hereinafter, this image region is referred to as the click peripheral region). The data of the click peripheral region extracted by the click peripheral region extracting part 233 is sent to the click peripheral region storage area 235, where it is stored and then sent to the interest region classifying part 236.
In addition, the still region and moving region determining section 234 performs a still region and moving region determination, using frame differences in the same manner as in the embodiment shown in Figure 37, on the basis of the image data supplied from the input image storage area 231 and the click data supplied from the click data storage area 232.
The click peripheral region extraction and the still region and moving region determination are performed in the same way as the processes described with reference to Figure 38, so their detailed description is omitted. In the embodiment shown in Figure 49, the click data is determined to be a static click or a moving click and the result is output, in the same manner as in the above-described embodiment. Alternatively, the click peripheral region itself may be determined to be a still region or a moving region and that result output. In the embodiment shown in Figure 49, for simplicity, the case in which a static click or a moving click is output as the still/moving determination result will be described.
The still/moving determination result of the still region and moving region determining section 234 is sent to the input time interval calculating part 237 and the input position distance calculating part 238.
When the still/moving determination result shows that the click data is a static click, the input time interval calculating part 237 calculates the time interval between the input time of the last static click and the input time of the current click. In this case, the time interval is calculated without regard to, for example, whether a moving click occurred between the input time of the last static click and that of the current static click. The time interval data calculated by the input time interval calculating part 237 is sent to the interest change determining section 239.
Likewise, when the still/moving determination result shows that the click data is a static click, the input position distance calculating part 238 calculates the spatial distance between the input click position (coordinate position) of the last static click and the input click position (coordinate position) of the current static click. In this case, the spatial distance is calculated without regard to, for example, whether an input position of a moving click exists between the input position of the current static click and that of the last static click. The spatial distance data calculated by the input position distance calculating part 238 is sent to the interest change determining section 239.
When the still/moving determination result shows that the click data is a static click, the interest change determining section 239 determines whether the user's object of interest has changed, on the basis of the time interval calculated by the input time interval calculating part 237 and the spatial distance calculated by the input position distance calculating part 238. In other words, the interest change determining section 239 performs predetermined weighting processes on the time interval and the spatial distance, determines whether the weighted time interval exceeds a predetermined threshold value (time), and determines whether the weighted spatial distance exceeds a predetermined threshold value (distance). When the weighted time interval exceeds the predetermined threshold value and/or the weighted spatial distance exceeds the predetermined threshold value, the interest change determining section 239 determines that the user's object of interest has changed. The interest change determining section 239 sends the interest change determination result to the interest region classifying part 236.
When the determined result of the interest change determining section 239 shows that the user's object of interest has not changed, the interest region classifying part 236 determines that the click peripheral region of the current static click is contained in the same image region as the region of interest corresponding to the click peripheral region of the last (past) static click, classifies the click peripheral region of the current static click as the same region of interest as that of the last static click (for example, assigns the same classification number to the click peripheral region of the current static click), and outputs the classification result. In other words, when regions of interest are classified for each object, the same object number is assigned in the same way as in the above-described embodiment.
Conversely, when the determined result of the interest change determining section 239 shows that the user's object of interest has changed, the interest region classifying part 236 determines that the click peripheral region of the current static click is not contained in the region of interest corresponding to the click peripheral region of the last (past) static click, outputs the stored data of the click peripheral images accumulated so far, and resets the data stored in the click peripheral region storage area 235. Thereafter, the interest region classifying part 236 classifies the click peripheral region of the current static click as a region of interest different from that of the last static click (for example, assigns a different classification number to the click peripheral region of the current static click). In other words, when regions of interest are classified for each object, a new, different object number is assigned in the same way as described above.
When the still/moving determination result shows that the click data is a moving click, the input time interval calculating part 237 likewise calculates the time interval between the input time of the last moving click and the input time of the current moving click. In this case, the time interval is calculated without regard to, for example, whether a static click occurred between the input time of the current moving click and that of the last moving click. The time interval data calculated by the input time interval calculating part 237 is sent to the interest change determining section 239.
Likewise, when the still/moving determination result shows that the click data is a moving click, the input position distance calculating part 238 calculates the spatial distance between the input click position of the last moving click and the input click position of the current moving click. In this case, the spatial distance is calculated without regard to, for example, whether an input position of a static click exists between the input position of the current moving click and that of the last moving click. The spatial distance data calculated by the input position distance calculating part 238 is sent to the interest change determining section 239.
In addition, when the still/moving determination result shows that the click data is a moving click, the interest change determining section 239 determines whether the user's object of interest has changed, on the basis of the time interval calculated by the input time interval calculating part 237 and the spatial distance calculated by the input position distance calculating part 238. In other words, the interest change determining section 239 performs the predetermined weighting processes on the time interval and the spatial distance, determines whether the weighted time interval exceeds the predetermined threshold value (time), and determines whether the weighted spatial distance exceeds the predetermined threshold value (distance). When the weighted time interval exceeds the predetermined threshold value and/or the weighted spatial distance exceeds the predetermined threshold value, the interest change determining section 239 determines that the user's object of interest has changed. The interest change determination result is sent to the interest region classifying part 236.
When the determined result of the interest change determining section 239 shows that the user's object of interest has not changed, the interest region classifying part 236 determines that the click peripheral region of the current moving click is contained in the same image region as the region of interest corresponding to the click peripheral region of the last (past) moving click, classifies the click peripheral region of the current moving click as the same region of interest as that of the last moving click (namely, assigns the same classification number to the click peripheral region of the current moving click), and outputs the classification result. In other words, when regions of interest are classified for each object, the same object number is assigned.
Conversely, when the determined result of the interest change determining section 239 shows that the user's object of interest has changed, the interest region classifying part 236 determines that the click peripheral region of the current moving click is not contained in the region of interest corresponding to the click peripheral region of the last (past) moving click, outputs the stored data of the click peripheral images accumulated so far, and resets the data stored in the click peripheral region storage area 235. Thereafter, the interest region classifying part 236 classifies the click peripheral region of the current moving click as a region of interest different from that of the last moving click (for example, assigns a different classification number to the click peripheral region of the current moving click). In other words, when regions of interest are classified for each object, a new, different object number is assigned.
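The classification behavior of the interest region classifying part 236, common to static and moving clicks, might be sketched as follows; the class structure and the way the stored region set is emitted are illustrative assumptions.

```python
class InterestRegionClassifier:
    def __init__(self):
        self.current_no = 0
        self.stored_regions = []   # plays the role of the peripheral region storage 235

    def classify(self, region, interest_changed):
        """Return (classification number, emitted region set or None)."""
        if interest_changed:
            output = list(self.stored_regions)   # emit the finished region set
            self.stored_regions = [region]       # reset the storage
            self.current_no += 1                 # assign a new classification number
            return self.current_no, output
        self.stored_regions.append(region)       # same region of interest continues
        return self.current_no, None
```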
Next, the processing of the change determining and classifying part 240 shown in Figure 49 will be described with reference to the flow chart shown in Figure 50.
At step S201, the change determining and classifying part 240 receives the image data from the image input section 11 and the click data input by the user of the receiving apparatus 2.
Thereafter, at step S202, the image data supplied from the image input section 11 is stored in the input image storage area 231, and the click data obtained by the click data acquiring part 230 is stored in the click data storage area 232.
Thereafter, at step S203, the click peripheral region extracting part 233 extracts the image region corresponding to the click data (the click peripheral region) from the image read from the input image storage area 231. At step S204, the click peripheral region storage area 235 stores the data of the extracted click peripheral region.
Thereafter, at step S205, the still region and moving region determining section 234 performs the still region and moving region determination using frame differences in the above-described manner.
When the determined result at step S205 shows that the click data is a static click, the flow advances to step S206; when it shows that the click data is a moving click, the flow advances to step S212.
At step S206, the input time interval calculating part 237 calculates the time interval between the input time of the last static click and the input time of the current static click. The time interval is calculated without regard to, for example, whether a moving click occurred between the input time of the current static click and that of the last static click.
Thereafter, at step S207, the input position distance calculating part 238 calculates the spatial distance between the input click position (coordinate position) of the last static click and the input click position (coordinate position) of the current static click. The spatial distance is calculated without regard to, for example, whether an input position of a moving click exists between the input position of the current static click and that of the last static click.
At step S208, the interest change determining section 239 determines whether the user's object of interest has changed, on the basis of the time interval calculated at step S206 and the spatial distance calculated at step S207. In other words, as described above, the interest change determining section 239 performs the predetermined weighting processes on the time interval and the spatial distance, determines whether the weighted time interval exceeds the predetermined threshold value (time), and determines whether the weighted spatial distance exceeds the predetermined threshold value (distance). When the weighted time interval exceeds the predetermined threshold value and/or the weighted spatial distance exceeds the predetermined threshold value, the interest change determining section 239 determines that the user's object of interest has changed. When the determined result at step S208 shows that the object of interest has changed, the flow advances to step S209; when it shows that the object of interest has not changed, the flow advances to step S211.
When the determined result at step S208 shows that the object of interest has not changed, the flow advances to step S211. At step S211, the interest region classifying part 236 determines that the click peripheral region of the current static click is contained in the same image region as the region of interest corresponding to the click peripheral region of the last (past) static click, and classifies the click peripheral region of the current static click as the same region of interest as that of the last static click (namely, assigns the same classification number to it). In other words, when regions of interest are classified for each object, the same object number is assigned in the same way as in the above-described embodiment. After step S211, the flow advances to step S218.
Conversely, when the determined result at step S208 shows that the user's object of interest has changed, the flow advances to step S209. At step S209, the interest region classifying part 236 determines that the click peripheral region of the current static click is not contained in the region of interest corresponding to the click peripheral region of the last (past) static click, outputs the stored data of the click peripheral images accumulated so far, and resets the stored data. Thereafter, at step S210, the interest region classifying part 236 classifies the click peripheral region of the current static click as a region of interest different from that of the last static click (for example, assigns a different classification number to it). In other words, when regions of interest are classified for each object, a new, different object number is assigned in the same way as in the above-described embodiment. After step S210, the flow advances to step S218.
Conversely, when the determined result at step S205 shows that the click data is a moving click, the flow advances to step S212. At step S212, the input time interval calculating part 237 calculates the time interval between the input time of the last moving click and the input time of the current moving click. In this case, the time interval is calculated without regard to, for example, whether a static click occurred between the input time of the current moving click and that of the last moving click.
Thereafter, at step S213, the input position distance calculating part 238 calculates the spatial distance between the input click position of the last moving click and the input click position of the current moving click. In this case, the spatial distance is calculated without regard to, for example, whether an input position of a static click exists between the input position of the current moving click and that of the last moving click.
Thereafter, at step S214, the interest change determining section 239 determines whether the user's object of interest has changed, on the basis of the time interval calculated at step S212 and the spatial distance calculated at step S213. In other words, the interest change determining section 239 performs the predetermined weighting processes on the time interval and the spatial distance, determines whether the weighted time interval exceeds the predetermined threshold value (time), and determines whether the weighted spatial distance exceeds the predetermined threshold value (distance). When the weighted time interval exceeds the predetermined threshold value and/or the weighted spatial distance exceeds the predetermined threshold value, the interest change determining section 239 determines that the user's object of interest has changed. When the determined result at step S214 shows that the object of interest has changed, the flow advances to step S215; when it shows that the object of interest has not changed, the flow advances to step S217.
When the determined result at step S214 shows that the user's object of interest has not changed, the flow advances to step S217. At step S217, as described above, the interest region classifying part 236 determines that the click peripheral region of the current moving click is contained in the same image region as the region of interest corresponding to the click peripheral region of the last (past) moving click, and classifies the click peripheral region of the current moving click as the same region of interest as that of the last moving click (for example, assigns the same classification number to it). In other words, when regions of interest are classified for each object, the same object number is assigned in the same way as in the above-described embodiment. After step S217, the flow advances to step S218.
Conversely, when the determined result at step S214 shows that the user's object of interest has changed, the flow advances to step S215. At step S215, the interest region classifying part 236 determines that the click peripheral region of the current moving click is not contained in the region of interest corresponding to the click peripheral region of the last (past) moving click, outputs the stored data of the click peripheral images accumulated so far, and resets the stored data. Thereafter, at step S216, the interest region classifying part 236 classifies the click peripheral region of the current moving click as a region of interest different from that of the last moving click (for example, assigns a different classification number to it). In other words, when regions of interest are classified for each object, a new, different object number is assigned in the same way as described above. After step S216, the flow advances to step S218.
After step S210, S211, S216, or S217, the flow advances to step S218. At step S218, the change determining and classifying part 240 determines whether all processing is complete. When the determined result at step S218 is No, the flow returns to step S201. When the determined result at step S218 is Yes, the change determining and classifying part 240 completes the processing shown in Figure 50.
Next, the interest change determination at steps S208 and S214 shown in Figure 50 will be described with reference to the flow chart shown in Figure 51.
At step S221, the interest change determining section 239 obtains the time interval information. Thereafter, at step S222, the interest change determining section 239 performs the predetermined weighting process on the time interval. At step S223, the interest change determining section 239 obtains the spatial distance information. Thereafter, at step S224, the interest change determining section 239 performs the predetermined weighting process on the spatial distance. The order of steps S221 and S222 may be changed, as may the order of steps S223 and S224. The weighting of the time interval can be performed by, for example, compressing the unit of time (for example, ms / 10). The weighting of the spatial distance can be performed by, for example, compressing the pixel pitch in the horizontal and vertical directions.
Thereafter, the flow advances to step S225. At step S225, the interest change determining section 239 generates a three-dimensional vector from the weighted time interval (t) and the weighted spatial distances in the horizontal and vertical directions (the X and Y coordinates), and obtains the magnitude of this vector. The magnitude of the three-dimensional vector is obtained by calculating the Euclidean distance between the current input point and the last input point in the three-dimensional space formed by adding a time axis (t) to the X and Y coordinate axes of the input positions of the click data. After step S225, the flow advances to step S226.
At step S226, the interest change determining section 239 determines whether the magnitude of the three-dimensional vector obtained at step S225 exceeds a predetermined threshold value. When the determined result at step S226 shows that the magnitude does not exceed the predetermined threshold value, the flow advances to step S227, where the interest change determining section 239 determines that the object of interest of the user of the receiving apparatus 2 has not changed. When the magnitude exceeds the predetermined threshold value, the flow advances to step S228, where the interest change determining section 239 determines that the user's object of interest has changed.
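The determination at steps S221 to S228 might be sketched as follows; the text gives ms / 10 as one example of the time weighting, while the spatial weight and the threshold here are hypothetical values.

```python
import math

TIME_WEIGHT = 1.0 / 10.0    # e.g. compress milliseconds by a factor of 10 (from the text)
SPACE_WEIGHT = 1.0 / 4.0    # hypothetical compression of the pixel pitch
VECTOR_THRESHOLD = 50.0     # hypothetical threshold on the vector magnitude

def interest_changed(dt_ms, dx_px, dy_px):
    """True when the weighted (t, X, Y) vector between two clicks is long enough."""
    t = dt_ms * TIME_WEIGHT
    x = dx_px * SPACE_WEIGHT
    y = dy_px * SPACE_WEIGHT
    # Euclidean distance in the three-dimensional (t, X, Y) space.
    return math.sqrt(t * t + x * x + y * y) > VECTOR_THRESHOLD
```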
In the above-described manner, the change determining and classifying part 240 determines changes of the user's object of interest and classifies the regions of interest in accordance with the click data sent from the receiving apparatus 2.
Once the regions of interest of the user of the receiving apparatus 2 have been classified, each classified region of interest can be processed optimally. In other words, the amount of information allocated to each of the user's regions of interest can be varied; for example, a large amount of information can be allocated to a region in which the user is especially interested, or the data of the user's region of interest can be sent preferentially.
In addition, an image region read from the click peripheral region storage area 235 for the interest region classifying part 236 can be sent so that its spatial resolution is improved in the above-described manner.
In this case, even if the user of the receiving apparatus 2 mistakenly clicks a region that is not a region of interest, an erroneous determination can be avoided.
According to this embodiment, even if regions having a meaning as an object of interest are separated from each other in time or space, they can be classified as one object. In addition, a region that, unlike an object, is not an entity with a particular meaning can also be extracted.
In addition, the embodiment described with reference to Figures 37 to 46 and the embodiment described with reference to Figures 49 to 51 can be combined. In this case, at steps S206 and S207, a continuous click can be determined for consecutive static clicks in the above-described manner. Likewise, at steps S212 and S213, a continuous click can be determined for consecutive moving clicks.
Figure 52 shows a third example of the structure of the image transmission system shown in Figure 1. In Figure 52, parts identical to those in Figure 28 are denoted by similar reference numerals and their description is omitted. In other words, except that a charging server 4 is connected to the switching station 3-3, the structure of the image transmission system shown in Figure 52 is basically the same as that shown in Figure 28.
Although the service in which click data (or control information) is sent from the receiving apparatus 2 to the transmitting apparatus 1 and an image whose spatial resolution has been improved in accordance with the click data is provided from the transmitting apparatus 1 to the receiving apparatus 2 (hereinafter, this service is referred to as the click service) can be offered free of charge, it can also be offered as a chargeable service. When the click service is chargeable, the charging server 4 performs a charging process to collect the fee for the click service from the user.
Figure 53 shows an example of the structure of the charging server 4 shown in Figure 52.
Predetermined information is supplied from the switching station 3-3 to the communication link establishment detecting section 301. The communication link establishment detecting section 301 checks the information supplied from the switching station 3-3, detects whether a communication link has been established between terminal units, such as the transmitting apparatus 1 and the receiving apparatus 2, and supplies the detected result to the terminal unit recognizing part 302.
When the terminal unit recognizing part 302 receives information representing that a communication link has been established between terminal units such as the transmitting apparatus 1 and the receiving apparatus 2 (hereinafter, this information is referred to as communication link establishment information), the terminal unit recognizing part 302 checks the information supplied from the switching station 3-3 and thereby recognizes the terminal units joined to the communication link. In addition, the terminal unit recognizing part 302 recognizes the IDs assigned to those terminal units (hereinafter referred to as terminal unit IDs) and supplies the recognized terminal unit IDs to the click detecting section 303.
The click detecting section 303 monitors the data received through the switching station 3-3, detects click data sent from a terminal unit having one of the terminal unit IDs received from the terminal unit recognizing part 302, and supplies the detected result together with the terminal unit ID to the charge processing section 304.
In this case, for example, the receiving apparatus 2 sends click data together with its own terminal unit ID. The click detecting section 303 compares the terminal unit ID appended to the click data received through the switching station 3-3 with the terminal unit IDs supplied from the terminal unit recognizing part 302, thereby recognizing and detecting click data sent from a terminal unit joined to the communication link.
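The ID matching performed by the click detecting section 303 might be sketched as follows; the packet layout is an assumption made for illustration.

```python
def detect_click(packet, linked_terminal_ids):
    """packet: dict with 'terminal_id' and 'click_data' fields (assumed layout).

    Returns click detection information (terminal unit ID, click data) when the
    packet comes from a terminal joined to the communication link, else None.
    """
    tid = packet.get('terminal_id')
    if tid in linked_terminal_ids:
        return tid, packet['click_data']
    return None
```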
Hereinafter, the set of a click data detection result and a terminal unit ID output by the click detecting section 303 is referred to as click detection information.
When the charge processing section 304 receives click detection information from the click detecting section 303, the charge processing section 304 updates the stored contents of the charging database 305. In addition, the charge processing section 304 periodically (for example, once a month) performs a charging process in accordance with the stored contents of the charging database 305.
The charging database 305 stores the information required for the charging process.
Below, in conjunction with the processing of the charging server 4 shown in the flowchart text Figure 53 shown in Figure 54.
Communication link is set up the information whether communication link between the 301 monitoring terminal unit, test section supplied with corresponding to switching station 3-3 and is set up.When communication link was set up test section 301 and detected communication link between transmitting apparatus 1 and the receiving equipment 2 and set up, communication link was set up test section 301 and communication link is set up information is supplied to terminal unit approval part 302.
When the terminal unit recognition section 302 receives the communication link establishment information from the communication link establishment detecting section 301, then at step S301 it examines the information supplied from the switching station 3-3, recognizes the terminal unit IDs of the terminal units on the communication link (for example, transmitting apparatus 1 and receiving equipment 2), and supplies the terminal unit IDs to the click detecting section 303.
When the click detecting section 303 receives the terminal unit IDs from the terminal unit recognition section 302, it starts detecting click data carrying those terminal unit IDs.
Thereafter, the flow advances to step S302. At step S302, the charge processing section 304 determines whether click data has been detected from a terminal unit on the communication link. When the determination result at step S302 shows that no click data has yet been detected from a terminal unit on the communication link (that is, when the click detecting section 303 has not supplied click detection information to the charge processing section 304), the flow advances to step S304, skipping step S303.
On the other hand, when the determination result at step S302 shows that click data has been detected from a terminal unit on the communication link (that is, when the click detecting section 303 has supplied click detection information to the charge processing section 304), the flow advances to step S303. At step S303, the charge processing section 304 updates the stored contents of the charging database 305.
In other words, the charging database 305 stores click information such as the number of clicks and the click time, together with the communication time from the moment a terminal unit originates a call until the communication ends, in association with the terminal unit ID of that terminal unit. At step S303, the charge processing section 304 updates, in accordance with the click detection information, the click information associated with the terminal unit ID contained in that click detection information.
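Although the patent specifies no data structures, the bookkeeping described here can be pictured as a small table keyed by terminal unit ID; in the following sketch, `ChargingDatabase` and `ClickInfo` are names assumed for illustration:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ClickInfo:
    click_count: int = 0        # number of clicks detected for this terminal
    click_seconds: float = 0.0  # accumulated click time
    call_seconds: float = 0.0   # communication time from call origination to end

class ChargingDatabase:
    """Click information stored in association with each terminal unit ID."""

    def __init__(self):
        self._records = defaultdict(ClickInfo)

    def update_on_click(self, terminal_id, click_seconds):
        # Corresponds to the update performed at step S303.
        record = self._records[terminal_id]
        record.click_count += 1
        record.click_seconds += click_seconds

    def update_call_time(self, terminal_id, call_seconds):
        self._records[terminal_id].call_seconds += call_seconds

    def record_for(self, terminal_id):
        return self._records[terminal_id]
```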
After step S303, the flow advances to step S304. At step S304, the terminal unit recognition section 302 determines whether the communication link corresponding to the communication link establishment information supplied from the communication link establishment detecting section 301 has been disconnected.
In other words, the communication link establishment detecting section 301 monitors not only the establishment of communication links between terminal units but also their disconnection. When a communication link is disconnected, the communication link establishment detecting section 301 supplies information to that effect, as communication link disconnection information, to the terminal unit recognition section 302. At step S304, the terminal unit recognition section 302 determines from the communication link disconnection information whether the communication link has been disconnected.
When the determination result at step S304 shows that the communication link has not been disconnected, the flow returns to step S302, and the charging server 4 repeats the same processing.
On the other hand, when the determination result at step S304 shows that the communication link has been disconnected, the terminal unit recognition section 302 controls the click detecting section 303 to stop monitoring click data from the terminal units that were on the communication link. The charging server 4 then completes the processing.
Thereafter, the charge processing section 304 periodically refers to the charging database 305, performs the charging process, calculates the communication fee together with the click service fee, and debits (transfers) the amount from the user's bank account or the like.
As the click service fee, a unit charge per click can be specified, and the click service fee calculated in accordance with the number of clicks. Alternatively, a unit charge per hour can be specified, and the click service fee calculated in accordance with the click time. Furthermore, the click service fee can be calculated in accordance with both the number of clicks and the click time.
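All three schemes reduce to a linear combination of the number of clicks and the click time; a minimal sketch follows (the function name and the unit prices are illustrative values, not taken from the patent):

```python
def click_service_fee(click_count, click_hours, per_click=0.0, per_hour=0.0):
    """Fee under per-click charging, per-hour charging, or both combined."""
    return click_count * per_click + click_hours * per_hour

print(click_service_fee(12, 0.4, per_click=5.0))                  # per-click only: 60.0
print(click_service_fee(12, 0.4, per_hour=100.0))                 # per-hour only: 40.0
print(click_service_fee(12, 0.4, per_click=5.0, per_hour=100.0))  # combined: 100.0
```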
The above-described sequence of processing can be carried out by hardware or by software. When the sequence of processing is carried out by software, the program constituting the software is installed on a computer built into the transmitting apparatus 1 or the receiving equipment 2 as dedicated hardware, or on a general-purpose computer or the like.
Figure 55 shows an example of the structure of a computer on which the program that executes the above-described sequence of processing is installed.
The program can be prerecorded on a hard disk 405 or in a ROM 403 serving as a recording medium built into the computer.
Alternatively, the program can be temporarily or permanently stored (recorded) on a removable recording medium 411 such as a floppy disk, a CD-ROM (compact disc read-only memory), an MO (magneto-optical) disc, a DVD (digital versatile disc), a magnetic disk, or a semiconductor memory. Such a removable recording medium 411 can be provided as so-called package software.
The program can be installed on the computer from the removable recording medium 411 described above. Alternatively, the program can be transferred to the computer by radio from a download site through a digital satellite broadcasting satellite, or transferred to the computer by wire through a network such as a LAN (local area network) or the Internet. In the computer, the program transferred in this way is received by the communication section 408 and installed on the hard disk 405.
The computer contains a CPU (central processing unit) 402. An input/output interface 410 is connected to the CPU 402 through a bus 401. When the user inputs a command to the CPU 402 through the input/output interface 410 by operating an input section 407 composed of a keyboard, a mouse, a microphone, and the like, the CPU 402 executes the program stored in the ROM (read-only memory) 403 in accordance with the command. Alternatively, the CPU 402 loads into a RAM (random access memory) 404 and executes the program stored on the hard disk 405; the program transferred from a satellite or a network, received by the communication section 408, and installed on the hard disk 405; or the program read from the removable recording medium 411 mounted in a drive 409 and installed on the hard disk 405. The CPU 402 thereby performs the processing corresponding to the flowcharts shown in Figures 4, 6, 10, 14, 21 to 24, 27, 30, 32, 34, 36, 38, 39, 41, 43 to 45, 50, 51, and 54, and the processing performed by the structures shown in the block diagrams of Figures 3, 5, 7, 8, 11 to 13, 17, 25, 29, 31, 33, 35, 37, 47 to 49, and 53. As required, the CPU 402 outputs the processing results through the input/output interface 410 from an output section 406 composed of an LCD (liquid crystal display), a speaker, and the like, transmits them from the communication section 408, or records them on the hard disk 405.
In this specification, the processing steps of the program that causes the computer to perform the various processes need not necessarily be executed in time series in the order described in the flowcharts; they may also be executed in parallel or individually (for example, by parallel processing or object-based processing).
The program may be processed by a single computer, or it may be processed in a distributed manner by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed there.
In the embodiments of the present invention, the transmitting apparatus 1 performs hierarchical coding, and the temporal resolution or spatial resolution of the image displayed by the receiving equipment 2 is changed according to which layers of the hierarchically coded data are transmitted. Alternatively, the temporal resolution and spatial resolution of the image displayed by the receiving equipment 2 can be changed by changing, for example, the discrete cosine transform coefficients or the quantization step of the image to be transmitted.
The temporal resolution and spatial resolution can also be changed by changing the coding method used by the transmitting apparatus 1. In other words, when an image is displayed at the ordinary temporal resolution, the coding section 31 of the transmitting apparatus 1 can encode the contour of an object and obtain the mean value of the pixel values (colors) constituting the object as its representative value, and the receiving equipment 2 can display the object region painted with that representative value. When an image whose spatial resolution is improved at the expense of temporal resolution is displayed, hierarchical coding can be used as described above.
In the embodiments, the spatial resolution is improved at the expense of the temporal resolution. Conversely, the temporal resolution can be improved at the expense of the spatial resolution. Information representing which resolution is to be sacrificed (or improved) can be included in the control information such as the click data and transmitted from the receiving equipment 2 to the transmitting apparatus 1.
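The patent defines no concrete message format, but such control information might be pictured as a click position plus a field naming the resolution to be sacrificed; the `ControlInfo` and `Sacrifice` names below are assumptions of this sketch:

```python
from dataclasses import dataclass
from enum import Enum

class Sacrifice(Enum):
    TEMPORAL = "temporal"  # improve spatial resolution at the cost of frame rate
    SPATIAL = "spatial"    # improve frame rate at the cost of spatial detail

@dataclass
class ControlInfo:
    click_x: int
    click_y: int
    sacrifice: Sacrifice   # the resolution the user chooses to give up

# The receiving side would attach the selection to the click data it transmits.
message = ControlInfo(click_x=120, click_y=88, sacrifice=Sacrifice.TEMPORAL)
```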
In the embodiments described above, the temporal resolution and the spatial resolution are dealt with. However, according to the present invention, the resolution in the level direction (hereinafter referred to as the level resolution) can also be dealt with. In other words, by increasing or decreasing the number of bits allocated to the data, the level resolution can be improved or degraded. In this case, as the level resolution of the image changes, its gradation changes. The level resolution can be changed by changing the quantization step described above.
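The relation between the quantization step and the level resolution can be demonstrated directly (an editorial sketch, not the patent's coding scheme):

```python
import numpy as np

def requantize(pixels, step):
    """A coarser quantization step leaves fewer distinct gray levels
    (lower level resolution); a finer step restores gradation."""
    return (pixels // step) * step

image = np.arange(256, dtype=np.uint8)   # full 8-bit gradation: 256 levels
coarse = requantize(image, step=32)      # only 8 distinct levels remain
print(len(np.unique(image)), len(np.unique(coarse)))  # prints: 256 8
```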
Furthermore, in the embodiments, the spatial resolution of a partial region of the image (the priority range) is improved at the expense of the temporal resolution. Alternatively, the spatial resolution of the entire image can be improved.
The spatial resolution of a specific part of the image can also be improved at the expense of the spatial resolution of the remaining part of the image, rather than at the expense of the temporal resolution (that is, while maintaining the temporal resolution).
In addition, in the present embodiments, the image is processed by being separated into a background image and objects. The image can, however, also be processed without being separated.
Furthermore, the present invention is applicable not only to image data but also to audio data. In the case of audio data, the sampling frequency corresponds to the temporal resolution, and the number of bits allocated to the audio data corresponds to the level resolution.
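The audio correspondence can be illustrated as follows; the helper names and the 8 kHz / 4-bit figures are arbitrary choices for this sketch, not values from the patent:

```python
import numpy as np

def reduce_temporal_resolution(samples, factor):
    """Keeping every factor-th sample divides the sampling frequency by factor."""
    return samples[::factor]

def reduce_level_resolution(samples, bits):
    """Allocating fewer bits per sample coarsens the amplitude quantization."""
    step = 2.0 ** (1 - bits)  # quantization step for samples in [-1.0, 1.0)
    return np.round(samples / step) * step

t = np.linspace(0.0, 1.0, 8000, endpoint=False)
audio = np.sin(2 * np.pi * 440 * t)                 # 1 s of a 440 Hz tone at 8 kHz
audio_4khz = reduce_temporal_resolution(audio, 2)   # lower temporal resolution
audio_4bit = reduce_level_resolution(audio, 4)      # roughly 2**4 amplitude levels
```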
In addition, the processing of the change determination and classification section 240 shown in Figure 49 is also applicable to the extraction of sound feature quantities (for example, the pitch, a desired part of human speech, or a desired musical instrument).
In addition, the processing of the object extraction sections 14 and 1014 and of the change determination and classification section 240 is applicable to so-called object coding. In other words, since the object extraction sections 14 and 1014 and the change determination and classification section 240 can extract objects, the extracted objects can be processed by object coding, in which an object is encoded together with information representing its motion and information representing its contour or region, and the resulting coded data can be transmitted or recorded.
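One way to picture an object-coded unit, with the extracted object carried together with its motion and contour information, is sketched below; the `CodedObject` structure is an illustrative assumption, not a format defined by the patent:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CodedObject:
    motion: tuple        # information representing how the object moves, e.g. (dx, dy)
    contour: list        # information representing the object's outline or region
    texture: np.ndarray  # pixel values of the extracted object itself

unit = CodedObject(motion=(3, -1),
                   contour=[(0, 0), (15, 0), (15, 15), (0, 15)],
                   texture=np.zeros((16, 16), dtype=np.uint8))
```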
The receiving equipment 2 can perform processing similar to that of the object extraction section 1014 using the click data, so as to extract objects. In this case, by storing the objects extracted by the receiving equipment 2, an object database can be formed.
According to the first transmitting apparatus, first transmitting method, first recording medium, and first signal of the present invention, control information transmitted from the receiving equipment is received, and the resolution in at least two of the temporal direction, the spatial direction, and the level direction is controlled in accordance with the control information. Data whose resolution in at least two directions is controlled in accordance with the control information is transmitted to the receiving equipment. Thus, for example, the spatial resolution of the image displayed by the receiving equipment can be further improved.
According to the receiving apparatus, receiving method, second recording medium, and second signal of the present invention, control information is transmitted to the transmitting apparatus so that the resolution in at least two of the temporal direction, the spatial direction, and the level direction is controlled in accordance with the control information. Data whose resolution in at least two directions is controlled in accordance with the control information is then transmitted from the transmitting apparatus, received, and output. Thus, for example, the spatial resolution of the output image can be further improved.
According to the transmitting and receiving apparatus, transmitting and receiving method, third recording medium, and third signal of the present invention, the transmitting apparatus receives control information transmitted from the receiving equipment and controls the resolution in at least two of the temporal direction, the spatial direction, and the level direction in accordance with the control information; data whose resolution in at least two directions is so controlled is transmitted to the receiving equipment. The receiving equipment transmits the control information to the transmitting apparatus, and the data whose resolution in at least two directions is controlled in accordance with the control information is transmitted from the transmitting apparatus, received, and output. Thus, for example, the spatial resolution of the image displayed by the receiving equipment can be further improved.
According to the second transmitting apparatus, second transmitting method, fourth recording medium, and fourth signal of the present invention, control information transmitted from the receiving equipment is received, data is classified in accordance with the control information, and the data is transmitted to the receiving equipment in accordance with the classification result. Thus, for example, the image region that the user is paying attention to can be transmitted to the receiving equipment regardless of whether that region is moving or still.
Description of Reference Numerals
1 terminal unit (transmitting apparatus)
2 terminal unit (receiving equipment)
3-1, 3-2 wireless base stations
4 charging server
13 background image extraction section
14 object extraction section
22 combining processing section
31 coding section
34 data amount calculating section
41B, 41F difference calculating sections
42B, 42F hierarchical coding sections
44B, 44F local decoders
51 receiving section
62B, 62F storage sections
71 background image writing section
73 background image memory
75 object memory
77 combining section
142 CPU
146 display section
201 image memory
203 stagnant zone and moving area determining section
210 moving object connection processing section
211 stationary object connection processing section
214 object extraction result memory
231 input image storage section
234 stagnant zone and moving area determining section
236 interest region classifying section
237 input time interval calculating section
238 input position distance calculating section
239 interest change determining section
301 communication link establishment detecting section
303 click detecting section
304 charge processing section
402 CPU
408 communication section
411 removable recording medium
1013 background image extraction section
1014 object extraction section
1022 combining processing section
1024 click data input section
1025 click data transmitting section
Claims (20)
1. A transmitting apparatus for transmitting data to receiving equipment, comprising:
receiving means for receiving control information transmitted from the receiving equipment;
classifying means for classifying the data in accordance with the control information; and
transmitting means for transmitting the data to the receiving equipment in accordance with the classification result of the data.
2. The transmitting apparatus according to claim 1,
wherein said data is image data,
wherein the receiving equipment displays the image data transmitted from said transmitting apparatus,
wherein the control information contains a consideration point of the image data displayed by the receiving equipment, and
wherein said classifying means classifies the image data in accordance with a consideration region containing the consideration point of the image data.
3. The transmitting apparatus according to claim 2, further comprising:
stagnant zone and moving area determining means for determining whether the consideration point of the image data is still or moving; and
continuity determining means for determining whether the consideration point is continuous in the temporal direction and the spatial direction,
wherein said classifying means classifies the image data in accordance with the determination results of said stagnant zone and moving area determining means and said continuity determining means.
4. The transmitting apparatus according to claim 3, further comprising:
consideration point storage means for storing a consideration point contained in a consideration region that is still and continuous in the temporal and spatial directions, and a consideration point contained in a consideration region that is moving and continuous in the temporal and spatial directions; and
classification identifier adding means for obtaining a classification identifier to be added to a consideration point stored in said consideration point storage means and adding the identifier to that consideration point.
5. The transmitting apparatus according to claim 4,
wherein, in the case where the current consideration point is in a consideration region that is still and continuous in the temporal and spatial directions, and a previous consideration point stored in said consideration point storage means is contained in a consideration region that is still and continuous in the temporal and spatial directions, said classification identifier adding means obtains the classification identifier to be added to the current consideration point in accordance with the spatial positional relation between the region containing the current consideration point and the region containing the previous consideration point.
6. The transmitting apparatus according to claim 4,
wherein, in the case where the current consideration point is in a consideration region that is moving and continuous in the temporal and spatial directions, and a previous consideration point stored in said consideration point storage means is contained in a consideration region that is moving and continuous in the temporal and spatial directions, said classification identifier adding means obtains the classification identifier to be added to the current consideration point in accordance with the similarity between a predetermined feature quantity of the consideration region containing the current consideration point and a predetermined feature quantity of the consideration region containing the previous consideration point.
7. The transmitting apparatus according to claim 4,
wherein said classifying means classifies a predetermined region of the image data as an object in accordance with the density of the consideration points stored in said consideration point storage means.
8. The transmitting apparatus according to claim 7,
wherein said classifying means classifies a predetermined region of the image data as an object in accordance with the density of the consideration points that are stored in said consideration point storage means, are distributed in a still consideration region, and have been assigned the same classification identifier.
9. The transmitting apparatus according to claim 7,
wherein said classifying means classifies a predetermined region of the image data as an object in accordance with the density of the consideration points that are contained in a moving consideration region stored in said consideration point storage means, have been assigned the same classification identifier, and have been subjected to motion compensation.
10. The transmitting apparatus according to claim 3,
wherein said stagnant zone and moving area determining means determines whether the consideration region containing the current consideration point is still or moving in accordance with the difference between the consideration region containing the consideration point of the current frame and the consideration region containing the consideration point of a past frame.
11. The transmitting apparatus according to claim 3,
wherein said continuity determining means determines whether the current consideration point is continuous in the temporal and spatial directions in accordance with the time difference between the current consideration point and a past consideration point.
12. The transmitting apparatus according to claim 7,
wherein said classifying means improves the resolution of the region classified as an object.
13. The transmitting apparatus according to claim 3,
wherein said continuity determining means determines whether the current consideration point is continuous in accordance with the temporal and spatial distances between the current consideration point and a past consideration point for which the same still/moving determination result as that of the consideration region containing the current consideration point has been obtained.
14. The transmitting apparatus according to claim 13,
wherein said classifying means classifies the image data in accordance with weight values applied to the distances in the temporal direction and the spatial direction.
15. The transmitting apparatus according to claim 13, further comprising:
image data storage means for storing the image data contained in a consideration region whose consideration points are continuous in the temporal and spatial directions.
16. The transmitting apparatus according to claim 15,
wherein, when the current consideration point is not continuous in the temporal and spatial directions, the contents of said image data storage means are erased after being read out, and the image data contained in the consideration region including the current consideration point is stored in said image data storage means.
17. The transmitting apparatus according to claim 16,
wherein said classifying means improves the resolution of the image data read from said image data storage means.
18. The transmitting apparatus according to claim 1,
wherein the control information is used for a charging process.
19. The transmitting apparatus according to claim 2,
wherein the image data is a coded object.
20. A transmitting method for transmitting data to receiving equipment, comprising the steps of:
receiving control information transmitted from the receiving equipment;
classifying the data in accordance with the control information; and
transmitting the data to the receiving equipment in accordance with the classification result of the data.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP225020/99 | 1999-08-09 | ||
JP22502099 | 1999-08-09 | ||
JP127657/00 | 2000-04-24 | ||
JP214237/00 | 2000-07-14 |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB008020485A Division CN100366090C (en) | 1999-08-09 | 2000-08-09 | Transmitting device and transmitting method, receiving device and receiving method, transmitting/receiving device and transmitting/receiving method, recorded medium and signal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN1842158A CN1842158A (en) | 2006-10-04 |
CN100584003C true CN100584003C (en) | 2010-01-20 |
Family
ID=37030980
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN 200710142655 Expired - Fee Related CN101106708B (en) | 1999-08-09 | 2000-08-09 | Transmitting apparatus and method |
CN 200510084849 Expired - Fee Related CN100584003C (en) | 1999-08-09 | 2000-08-09 | Transmitting apparatus and method |
CN 200710142654 Expired - Fee Related CN101106707B (en) | 1999-08-09 | 2000-08-09 | Transmitting apparatus and method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN 200710142655 Expired - Fee Related CN101106708B (en) | 1999-08-09 | 2000-08-09 | Transmitting apparatus and method |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN 200710142654 Expired - Fee Related CN101106707B (en) | 1999-08-09 | 2000-08-09 | Transmitting apparatus and method |
Country Status (1)
Country | Link |
---|---|
CN (3) | CN101106708B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103096086A (en) * | 2013-02-06 | 2013-05-08 | 上海风格信息技术股份有限公司 | Method of sampling forward in multi-picture display to achieve system optimization |
CN113660495A (en) * | 2021-08-11 | 2021-11-16 | 易谷网络科技股份有限公司 | Real-time video stream compression method and device, electronic equipment and storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5475421A (en) * | 1992-06-03 | 1995-12-12 | Digital Equipment Corporation | Video data scaling for video teleconferencing workstations communicating by digital data network |
TW358296B (en) * | 1996-11-12 | 1999-05-11 | Matsushita Electric Ind Co Ltd | Digital picture encoding method and digital picture encoding apparatus, digital picture decoding method and digital picture decoding apparatus, and data storage medium |
US6002803A (en) * | 1997-03-11 | 1999-12-14 | Sharp Laboratories Of America, Inc. | Methods of coding the order information for multiple-layer vertices |
- 2000
- 2000-08-09 CN CN 200710142655 patent/CN101106708B/en not_active Expired - Fee Related
- 2000-08-09 CN CN 200510084849 patent/CN100584003C/en not_active Expired - Fee Related
- 2000-08-09 CN CN 200710142654 patent/CN101106707B/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CN101106707A (en) | 2008-01-16 |
CN1842158A (en) | 2006-10-04 |
CN101106707B (en) | 2010-11-03 |
CN101106708B (en) | 2010-11-03 |
CN101106708A (en) | 2008-01-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN100366090C (en) | Transmitting device and transmitting method, receiving device and receiving method, transmitting/receiving device and transmitting/receiving method, recorded medium and signal | |
KR100738241B1 (en) | Image processing apparatus | |
CN100437453C (en) | Tag information display control apparatus, information processing apparatus, display apparatus, tag information display control method and recording medium | |
EP1271957B1 (en) | Object-based digital image predictive coding and transfer method and apparatus, and decoding apparatus | |
CN101023662B (en) | Method and apparatus for motion vector processing | |
JP4552296B2 (en) | Information processing apparatus, information processing method, and recording medium | |
CN103338366B (en) | Picture coding device, method for encoding images, image decoder, image decoding method | |
CN100588227C (en) | Data processing apparatus, data processing method, and program | |
US20120287233A1 (en) | Personalizing 3dtv viewing experience | |
CN105635824A (en) | Personalized channel recommendation method and system | |
Park et al. | Mosaic: Advancing user quality of experience in 360-degree video streaming with machine learning | |
US20230124329A1 (en) | Method to Generate Additional Level of Detail When Zooming In On an Image | |
CN113096055A (en) | Training method and device for image generation model, electronic equipment and storage medium | |
CN100584003C (en) | Transmitting apparatus and method | |
CN101370131B (en) | Data processing device and method | |
CN113452996A (en) | Video coding and decoding method and device | |
JPH10336673A (en) | Edge detection method and its device in video signal coding system | |
CN1692373B (en) | Video recognition system and method | |
EP0993194A2 (en) | Method and apparatus for embedding an additional signal into an image signal | |
CN113382241A (en) | Video encoding method, video encoding device, electronic equipment and storage medium | |
Hsiao et al. | Content-aware video adaptation under low-bitrate constraint | |
CN118803348A (en) | Audio and video SDK interface for hong Mongolian system | |
Idris et al. | Detection of camera operations in compressed video sequences | |
KR20230143429A (en) | Method and system for optimizing video encoding using optimal encoding preset of video segment unit | |
Xie et al. | A Global Decoding Strategy with a Reduced-Reference Metric Designed for the Wireless Transmission of JPWL |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C17 | Cessation of patent right | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20100120; Termination date: 20130809