EP2533533A1 - Display control device, display control method, and recording medium - Google Patents

Display control device, display control method, and recording medium

Info

Publication number
EP2533533A1
EP2533533A1 (application EP12169579A)
Authority
EP
European Patent Office
Prior art keywords
image
display
display control
displayed
certain
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12169579A
Other languages
German (de)
English (en)
Inventor
Takehiko Sasaki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of EP2533533A1 publication Critical patent/EP2533533A1/fr
Withdrawn legal-status Critical Current

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/21805Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance

Definitions

  • the present technology relates to a display control device, a display control method, a program, and a recording medium.
  • the processing performance of cameras, browsing devices, and relay servers has improved, and technology to detect objects such as persons and things in an image has advanced; it has therefore become possible to extract the relevance of objects across several images and to judge whether they are identical.
  • an image from which portions not lying on the same plane as the road surface have been removed, after projection conversion appropriate for each of several monitoring camera images, is composited and displayed on a map, and the correlation of the monitoring camera images is systematically adjusted on the map; this enables a road condition in a monitoring area to be monitored on the map as an image (for example, see Japanese Patent Laid-Open No. 2005-286685 ).
  • terminals having various display functions and input systems, such as mobile terminals and tablets, have come to be used as devices to browse an image, in addition to a conventional PC.
  • it is desirable that display transition be completed as quickly as possible.
  • the present technology is disclosed in consideration of such a situation, and makes it possible to complete display transition as quickly as possible and improve user operability when an image is displayed on a screen by switching several viewpoints.
  • Embodiments of the invention relate to a display control device, a display control method, a program, and a recording medium that can complete display transition as quickly as possible and may improve user operability when an image is displayed on a screen by switching several viewpoints.
  • An aspect of the present technology is a display control device including: an image acquisition unit obtaining each image taken by several cameras; an object detection unit detecting a particular object displayed in a certain image out of the obtained images; and a display control unit performing control so as to set a mode to display an object in an image different from the certain image where the particular object is detected using the object in the certain image as a reference and to display the object in the different image on the same screen when the same object is detected in the different image in the obtained several images.
  • the display control unit can set a location to display the object in the different image and a shape of the object in the different image on the screen, based on a location and the imaging direction of the camera that takes the object in the certain image.
  • the display control unit can further set a central axis of the object in the certain image and the object in the different image, and turn around the object in the certain image and the object in the different image with the central axis being the center to display the object in the certain image and the object in the different image on the screen based on an operation by a user.
  • the display control unit can display a certain detection frame around the object when the same object is detected in the image different from the certain image where the particular object is detected.
  • the display control unit includes, as display control modes, a normal mode in which a predetermined image out of the images taken by the several cameras is displayed as the certain image and a lock on mode in which the object in the certain image and the object in the different image are displayed on the same screen when the object in the certain image is selected by the user in a case where the same object is detected in the different image, and can enlarge an image of the object to be displayed in the lock on mode.
  • the object detection unit can extract a feature amount of the detected object and obtain information showing an attribute of the object by searching a certain database based on the feature amount, and the display control unit can perform control so as to display the attribute together with the object.
  • An aspect of the present technology is a display control method including: obtaining each image taken by several cameras by an image acquisition unit; detecting a particular object displayed in a certain image out of the obtained images by an object detection unit; and performing control so as to set a mode to display an object in an image different from the certain image where the particular object is detected using the object in the certain image as a reference and to display the object in the different image on the same screen by a display control unit when the same object is detected in the different image in the obtained several images.
  • An aspect of the present technology is a program causing a computer to function as a display control device that includes: an image acquisition unit obtaining each image taken by several cameras; an object detection unit detecting a particular object displayed in a certain image out of the obtained images; and a display control unit performing control so as to set a mode to display an object in an image different from the certain image where the particular object is detected using the object in the certain image as a reference and to display the object in the different image on the same screen when the same object is detected in the different image in the obtained several images.
  • each image taken by several cameras is obtained, a particular object displayed in a certain image out of the obtained images is detected, and control is performed so as to set a mode to display an object in an image different from the certain image where the particular object is detected using the object in the certain image as a reference and to display the object in the different image on the same screen when the same object is detected in the different image in the obtained several images.
  • FIG. 1 is a diagram explaining a system for imaging an object according to an embodiment of the present technology.
  • a person 300 and a thing 330 are imaged as objects by several cameras.
  • a camera 110, a camera 120, a camera 130, and a camera 140 are provided in this example.
  • the objects are located within the imaging areas of the camera 110, the camera 120, and the camera 130, and outside the imaging area of the camera 140. Therefore, the objects are imaged by the three cameras at the same time.
  • FIG. 2 is a block diagram illustrating a configuration example according to one embodiment of a monitoring system to which the present technology is applied.
  • This monitoring system 100 includes a camera system 101, a network 200, a browsing terminal 210, a browsing terminal 220, a browsing terminal 230, and a browsing terminal 240.
  • the camera system 101 includes, for example, the four cameras including the camera 110 to the camera 140, as described above with reference to FIG. 1 , and an object existing within an imaging area of each camera is to be imaged.
  • Data of an image taken by each camera constituting the camera system is transmitted to the network 200.
  • the data of the image of each camera is transmitted with the data being associated with, for example, information on installation location of each camera (for example, coordinate location) and information on the imaging direction of each camera (for example, vector representing an optical axis that is the center of the imaging area of each camera).
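As a rough sketch of how each camera's image data could be associated with its installation location and imaging-direction vector for transmission (the `CameraFrame` class, field names, and packet layout are illustrative assumptions, not the patent's actual format):

```python
from dataclasses import dataclass
import json

@dataclass
class CameraFrame:
    camera_id: str
    location: tuple   # installation location as (x, y) coordinates
    direction: tuple  # unit vector along the optical axis (center of the imaging area)
    image_data: bytes # encoded image payload (e.g. JPEG)

    def to_packet(self) -> bytes:
        # Transmit the metadata as a length-prefixed JSON header,
        # followed by the raw image payload.
        header = json.dumps({
            "camera_id": self.camera_id,
            "location": self.location,
            "direction": self.direction,
        }).encode()
        return len(header).to_bytes(4, "big") + header + self.image_data

frame = CameraFrame("camera_110", (0.0, 0.0), (1.0, 0.0), b"\xff\xd8...")
packet = frame.to_packet()
```

A receiver would read the 4-byte length, parse the JSON header, and treat the remainder as the image; the location and direction fields are what the browsing terminal later uses to place and angle the object images.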
  • the network 200 is built on, for example, the Internet, an intranet, or a mobile telephone network.
  • Each of the browsing terminals 210 to 240 is configured so as to be connectable to the network 200, and can obtain the data of the image transmitted from the camera system 101 to display an image.
  • the browsing terminal 210 is configured as a personal computer, for example.
  • the browsing terminal 220 is configured as a mobile terminal such as a mobile telephone and a smartphone, for example.
  • the browsing terminal 230 is configured as a tablet terminal such as a PDA, for example.
  • Each of the browsing terminals 210 to 230 is configured to be integrated with a display unit such as a display.
  • the browsing terminal 240 is configured as a digital apparatus having a microcomputer and the like for example, and causes an image to be displayed on a display 241, which is an external apparatus connected to the browsing terminal 240.
  • Each of the browsing terminals 210 to 240 can, as described later, display an image taken by the camera system 101 in a normal mode as well as in a lock on mode. Detail of the normal mode and the lock on mode will be described later.
  • the monitoring system 100 may be configured as shown in FIG. 3 , for example.
  • the monitoring system 100 includes the camera system 101, the network 200, the browsing terminal 210, the browsing terminal 220, the browsing terminal 230, the browsing terminal 240, a dedicated server 260, and an external server 270.
  • the dedicated server 260 is configured as a computer and the like possessed by a corporation operating the monitoring system 100, for example, and accumulates the data transmitted from the camera system 101 in a storage 261, for example.
  • the dedicated server 260 transmits the data accumulated in the storage 261 to the browsing terminals 210 to 240 that have accessed via the network 200.
  • the external server 270 is configured as a computer and the like operated by an internet service provider or the like, for example, and is a server lent to the corporation operating the monitoring system 100. As with the dedicated server 260, the external server 270 accumulates the data transmitted from the camera system 101 in a storage 271, for example. The external server 270 transmits the data accumulated in the storage 271 to the browsing terminals 210 to 240 that have accessed via the network 200.
  • the browsing terminals 210 to 240 do not obtain the data directly from the camera system 101, but obtain the data indirectly via the dedicated server 260 or the external server 270.
  • the monitoring system 100 may be configured in this manner.
  • FIG. 4 is a block diagram illustrating a detailed configuration example of the camera 110 shown in FIG. 1 .
  • the camera 110 includes an imaging unit 112, an encoder 113, a CPU 114, a memory 115, a storage 116, and a network interface 117.
  • the imaging unit 112 includes a photoelectric conversion element such as a CCD (Charge Coupled Devices) imaging sensor and a CMOS (Complementary Metal Oxide Semiconductor) imaging sensor, and outputs an image signal.
  • An image signal, which is an electrical signal corresponding to the amount of the received light, is output.
  • This image signal is subjected to A/D conversion, in other words, is sampled to be quantized (digitized), and is subjected to a certain image processing to be output to the encoder 113.
  • the encoder 113 supplies image data obtained by encoding the image signal output from the imaging unit 112 in accordance with certain format such as MPEG (Moving Picture Experts Group) format and JPEG (Joint Photographic Experts Group) format to the CPU (Central Processing Unit) 114.
  • the CPU 114 executes software such as a program using the memory 115, and processes the data output from the encoder 113. For example, a processing such as packet generation to transmit image data in association with information on installation location of a camera (for example, coordinate location) and information on the imaging direction of each camera (for example, vector representing an optical axis that is the center of the imaging area of each camera) is performed.
  • the CPU 114 holds data such as image data in the storage 116 including an HDD and the like as necessary.
  • the network interface 117 includes a LAN card and the like, for example, and controls transmission and reception of data with the network 200.
  • the camera 110 is configured in this manner.
  • Although the configuration shown in FIG. 4 is explained as a configuration example of the camera 110, the same configuration applies to the cameras 120 to 140.
  • FIG. 5 is a block diagram illustrating a detailed configuration example of the browsing terminal 210 shown in FIG. 2 or FIG. 3 .
  • a CPU 401 executes various types of processings in accordance with a program stored in a ROM (Read Only Memory) 402 or a program loaded from a storage unit 408 to a RAM (Random Access Memory) 403. Data necessary for the CPU 401 to execute various types of processings and the like are stored in the RAM 403 as appropriate.
  • the CPU 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404.
  • An input-output interface 405 is also connected to the bus 404.
  • the communication unit 409 performs a communication processing via the network 200.
  • a drive 410 is connected to the input-output interface 405 as necessary. Also, a removable medium 411 such as a magnetic disc, an optical disk, a magneto optical disk, and a semiconductor memory is connected as appropriate, and a computer program read therefrom is installed to the storage unit 408 as necessary.
  • a program constituting the software is installed from a network such as the Internet, or from a recording medium including the removable medium 411 and the like.
  • Although FIG. 5 is explained as a configuration example of the browsing terminal 210, the configuration shown in FIG. 5 also applies to the browsing terminals 220 to 240.
  • The actual configuration of the input portion 406 and the output portion 407 varies depending on the model.
  • the storage unit 408 is provided as necessary, for example.
  • FIG. 6 is a block diagram illustrating a functional configuration example of software such as a program executed in the CPU 401 shown in FIG. 5 .
  • An image acquisition unit 451 shown in FIG. 6 controls acquisition of data of an image taken by each camera transmitted from the camera system 101 via the network 200.
  • An object detection unit 452 analyzes the image obtained by the image acquisition unit 451 and controls detection of an object displayed in the image.
  • the object detection unit 452 determines whether a person is displayed in the image of the data obtained by the image acquisition unit 451, for example. If it is determined that a person is displayed, the object detection unit 452 detects an area of the image where the person is displayed. Alternatively, the object detection unit 452 determines whether a particular thing is displayed in the image of the data obtained by the image acquisition unit 451, for example. If it is determined that a particular thing is displayed, the object detection unit 452 detects an area of the image where the thing is displayed.
  • the object detection unit 452 determines whether the same person is displayed in an image taken by several cameras of the camera system 101.
  • the object detection unit 452 extracts information such as the ratio of the head to the whole body of the person, eye color, and hair color as a feature amount. Then, the object detection unit 452 determines whether the same person is displayed in the images taken by the several cameras by comparing the feature amounts of the person in those images.
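As an illustrative sketch of the same-person judgment based on such feature amounts (the function names, the attribute dictionary, and the tolerance value are assumptions for the sketch, not the patent's actual algorithm):

```python
def extract_features(person_region):
    # Hypothetical stand-in for real image analysis: in this sketch the
    # "region" is already a dict of measured attributes.
    return (person_region["head_to_body_ratio"],
            person_region["eye_color"],
            person_region["hair_color"])

def same_person(features_a, features_b, ratio_tolerance=0.05):
    ratio_a, eye_a, hair_a = features_a
    ratio_b, eye_b, hair_b = features_b
    # Categorical attributes must match exactly; the body ratio may differ
    # slightly between viewpoints, so allow a small tolerance.
    return (eye_a == eye_b and hair_a == hair_b
            and abs(ratio_a - ratio_b) <= ratio_tolerance)

cam110 = extract_features({"head_to_body_ratio": 0.13,
                           "eye_color": "brown", "hair_color": "black"})
cam120 = extract_features({"head_to_body_ratio": 0.15,
                           "eye_color": "brown", "hair_color": "black"})
print(same_person(cam110, cam120))  # → True (ratios within tolerance)
```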
  • the object detection unit 452 controls an access to the dedicated server 260 or the external server 270 via the network 200 as necessary, for example.
  • the object detection unit 452 refers to a person database stored in the storage 261, the storage 271, and the like for example, and judges identicalness of the person stored in the person database and the person detected.
  • In the person database, feature amounts of several persons are stored; for example, information such as the ratio of the head to the whole body, eye color, and hair color is stored as a feature amount.
  • the object detection unit 452 judges identicalness of the person stored in the person database and the person detected by comparing the feature amount of the person detected and the feature amount of each person stored in the person database, for example.
  • Attributes of the person, for example, information such as age, sex, occupation, and criminal record, can also be stored in the person database.
  • The attributes of the displayed person can also be displayed on each browsing terminal, for example.
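The database lookup that judges identicalness and returns the person's attributes could be sketched as follows (the in-memory `PERSON_DB`, its entries, and the tolerance are hypothetical; a real system would query the storage of the dedicated server 260 or the external server 270):

```python
# Hypothetical person database keyed by feature amount.
PERSON_DB = [
    {"features": (0.13, "brown", "black"),
     "attributes": {"age": 34, "sex": "male",
                    "occupation": "unknown", "criminal_record": True}},
    {"features": (0.18, "blue", "blond"),
     "attributes": {"age": 27, "sex": "female",
                    "occupation": "clerk", "criminal_record": False}},
]

def lookup_attributes(features, ratio_tolerance=0.05):
    ratio, eye, hair = features
    for entry in PERSON_DB:
        db_ratio, db_eye, db_hair = entry["features"]
        if (eye == db_eye and hair == db_hair
                and abs(ratio - db_ratio) <= ratio_tolerance):
            return entry["attributes"]  # high identicalness: return attributes
    return None                         # no sufficiently similar person found

print(lookup_attributes((0.14, "brown", "black")))
```

When a match is found, the returned attributes are what the display control unit would render alongside the object image.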
  • a display mode control unit 453 determines which one of the normal mode or the lock on display mode is to be used to display an image based on an operation by a user that is input via the input portion 406, for example, and outputs information specifying the mode to be displayed to a display image generation unit 454.
  • the display image generation unit 454 generates display data for the normal mode or the lock on display mode and an image signal corresponding to the display data to supply them to the display of the output portion 407.
  • FIG. 7 is a diagram illustrating an example of an image displayed on a display 221 of the browsing terminal 220.
  • the browsing terminal 220 is configured as a mobile terminal such as a mobile telephone and a smartphone, for example.
  • FIG. 7 shows an example of a display screen in the normal mode.
  • An image (object image) 301 of the person 300 and an image (object image) 331 of the thing 330 taken by the camera 110 are displayed.
  • an object detection frame 321 is displayed around the object image 301.
  • the object detection frame 321 is displayed since an image of a person is detected by the object detection unit 452 in an image taken by the camera 110, and an image of the same person is also detected in an image taken by a different camera.
  • the detection frame 321 is displayed if the same person is detected in an image taken by the camera 120 and an image taken by the camera 130.
  • the detection frame 321 is displayed as a rectangular dotted line.
  • the display 221 is configured as a capacitance-system touch screen, for example, and detects changes in capacitance to sense the approach of a user's finger and the like. For example, when a user brings a finger 310 closer to the display 221, a change in capacitance at a certain portion of the panel is detected, and a signal representing that portion is output.
  • the display mode control unit 453 switches the display mode from the normal mode to the lock on mode.
  • FIG. 8 is a diagram explaining the transition when an image displayed on the display 221 is switched to display in the lock on mode. As shown in FIG. 8 , selection of the object image 301 first moves it to the center of the screen, and then it is enlarged for display. After that, the image on the display 221 becomes as shown in FIG. 9 .
  • When the object image 301 is enlarged to be displayed, it may be subjected to pixel supplementation (upsampling) or the like to be converted to a high-resolution image.
  • FIG. 9 is a diagram illustrating a display example on the display 221 in the lock on mode.
  • an object image taken by a different camera is displayed in addition to the object image 301.
  • an object image 302 that is an image of the person 300 taken by the camera 120 and an object image 303 that is an image of the person 300 taken by the camera 130 are displayed together with the object image 301.
  • the object image 302 is displayed together with an object detection frame 322, and the object image 303 is displayed together with an object detection frame 323.
  • the object image 302 and the object image 303 are displayed relative to positional relationship of the cameras 110 to 130, with display location of the object image 301 being a reference.
  • data transmitted from each camera is correlated with information on installation location of cameras (for example, coordinate location) and information on the imaging direction of each camera (for example, vector representing an optical axis that is the center of the imaging area of each camera).
  • the display image generation unit 454 sets display location and angle of the object image 302 and the object image 303 in the case where the display location of the object image 301 is used as a reference.
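As a rough illustration of how the display angle of a secondary object image could be derived from the cameras' imaging-direction vectors (the function and the example camera directions are assumptions for the sketch, not taken from the patent):

```python
import math

def axis_angle(reference_dir, other_dir):
    # Signed angle (degrees) between two cameras' optical-axis vectors in
    # the ground plane; it can be used to decide where the secondary object
    # image is placed around the reference image and how much its detection
    # frame is sheared (rectangle vs. parallelogram).
    rx, ry = reference_dir
    ox, oy = other_dir
    return math.degrees(math.atan2(rx * oy - ry * ox, rx * ox + ry * oy))

# Hypothetical installation data: camera 110 looks along +x,
# camera 120 along +y, camera 130 along -x.
ref = (1.0, 0.0)
print(axis_angle(ref, (0.0, 1.0)))   # → 90.0
print(axis_angle(ref, (-1.0, 0.0)))  # → 180.0
```

Using `atan2` on the cross and dot products gives a signed angle, so viewpoints to the left and right of the reference camera can be distinguished.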
  • a user can recognize the direction in which each of the object images 301 to 303 is taken by the object detection frames 321 to 323. That is, although the object detection frame 321 corresponding to the object image 301 that is a reference is shown as a rectangle on the screen, the object detection frame 322 and the object detection frame 323 are shown as parallelograms on the screen.
  • the object detection frame may not be displayed.
  • information regarding attributes of a person obtained based on the person database may be displayed.
  • a central axis 340 is displayed in the center of the object image 301, the object image 302, and the object image 303.
  • the central axis 340 is also set by the display image generation unit 454 based on information on installation location of cameras (for example, coordinate location) and information on the imaging direction of each camera (for example, vector representing an optical axis that is the center of the imaging area of each camera). That is, since the image of the person 300 is taken by the three cameras from different directions, it is possible to estimate depth (thickness) of the person 300. For example, based on the depth of the person 300 thus estimated, the person 300 is assumed to be almost cylindrical and a central axis of the cylinder is set as the central axis 340.
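One simple way to place such a central axis, sketched here under assumed 2-D geometry (the function and the example camera layout are illustrative, not the patent's actual computation), is to intersect the cameras' optical axes on the ground plane and run a vertical axis through that point:

```python
def line_intersection(p1, d1, p2, d2):
    # Intersection of two 2-D lines p + t*d (the cameras' optical axes on
    # the ground plane); a vertical axis through this point can serve as
    # the central axis 340 of the roughly cylindrical object.
    (x1, y1), (dx1, dy1) = p1, d1
    (x2, y2), (dx2, dy2) = p2, d2
    det = dx1 * (-dy2) - dy1 * (-dx2)  # determinant of [d1 | -d2]
    if abs(det) < 1e-9:
        return None                    # parallel axes: no unique intersection
    t1 = ((x2 - x1) * (-dy2) - (y2 - y1) * (-dx2)) / det
    return (x1 + t1 * dx1, y1 + t1 * dy1)

# Hypothetical layout: camera 110 at (0, 0) looking along +x,
# camera 120 at (5, -5) looking along +y; the axes cross at (5, 0).
print(line_intersection((0.0, 0.0), (1.0, 0.0), (5.0, -5.0), (0.0, 1.0)))
```

With three or more cameras, a least-squares point closest to all axes would be the natural generalization of this two-line case.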
  • the object image of the person 300 is turned around. For example, as shown in FIG. 10 , the object image displayed on the display 221 is turned around.
  • the object image is turned around counterclockwise with the central axis 340 as the center. Therefore, the object image 303 is displayed at the front, though the object image 301 is displayed at the front (as a reference) in FIG. 9 .
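The turn-around interaction can be sketched as selecting whichever camera's viewpoint lies closest to the user's rotation angle around the central axis (the function and the per-camera angles are assumptions for the sketch):

```python
def select_viewpoint(turn_degrees, camera_angles):
    # camera_angles: angle of each camera around the central axis, with the
    # reference camera at 0 degrees (hypothetical calibration data).
    # A counterclockwise turn of the on-screen object by `turn_degrees`
    # brings the camera whose angle is closest to that turn to the front.
    target = turn_degrees % 360

    def angular_distance(angle):
        d = abs(angle - target) % 360
        return min(d, 360 - d)

    return min(camera_angles, key=lambda cam: angular_distance(camera_angles[cam]))

angles = {"camera_110": 0, "camera_120": 120, "camera_130": 240}
print(select_viewpoint(0, angles))    # → camera_110
print(select_viewpoint(250, angles))  # → camera_130
```

The wrap-around in `angular_distance` keeps the comparison correct across the 0/360-degree boundary, so turning slightly past a full rotation still selects the reference camera.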
  • the person 300 displayed as the object image 303 has a gun and is a dangerous person.
  • the object seen from a different direction can be recognized by turning around the object image displayed in the lock on mode. Therefore, for example, it is possible to easily check whether a person in an image is a dangerous person (whether the person is armed or not).
  • the display mode control unit 453 switches the display mode from the lock on mode to the normal mode.
  • FIG. 11 is a diagram explaining transition when an image displayed on the display 221 is switched to be displayed in the normal mode.
  • selection of the object image 303 moves the object image 303 to the display location in the image taken by the camera 130 (in this case, upper left of the screen).
  • the object image 303 that has been enlarged to be displayed in the lock on mode is reduced.
  • the image on the display 221 becomes as shown in FIG. 12 .
  • FIG. 12 is a diagram illustrating another example of an image displayed on the display 221 of the browsing terminal 220.
  • FIG. 12 is an example of a display screen in the normal mode; the image (object image) 303 of the person 300 and an image (object image) 333 of the thing 330 taken by the camera 130 are displayed. Also, as shown in FIG. 12 , the object detection frame 323 is displayed around the object image 303.
  • In the related art, display on the screen is switched as shown in FIG. 13 .
  • a viewpoint 1 image, a viewpoint 2 image, and a viewpoint 3 image are obtained as images taken by three different cameras.
  • Suppose the user has the viewpoint 1 image displayed on the display of the browsing terminal and wants to switch to the viewpoint 3 image.
  • First, the viewpoint 1 image is displayed.
  • the user selects an icon which says "viewpoint list" in the lower right part of the screen to switch the display on the display of the browsing terminal to the list, as shown in FIG. 13 . Accordingly, each of the viewpoint 1 image, the viewpoint 2 image, and the viewpoint 3 image are reduced to be displayed on the display of the browsing terminal.
  • the user selects the viewpoint 3 image which the user wants to browse on the screen showing the list (for example, selects the image by bringing a finger closer to the image). Accordingly, the viewpoint 3 image is enlarged to be displayed on the display of the browsing terminal.
  • With the present technology, the object image is displayed in the lock on mode; therefore, it is not necessary to make a transition to the screen showing the list as in the related art.
  • In the related art, viewpoint images have been switched through the list each time (for example, the viewpoint 1 image, then the list, then the viewpoint 3 image); therefore, it has taken some time to complete display transition.
  • the viewpoint image (object image) can be switched only by operating so as to turn around the object image displayed in the lock on mode. Therefore, it is possible to complete display transition quickly, and user operability also improves.
  • Step S21 the image acquisition unit 451 obtains data of the images taken by each camera transmitted from the camera system 101.
  • Step S22 the display image generation unit 454 displays any one of the images obtained in the processing of Step S21 in the normal mode. At that time, which image taken by the corresponding camera is to be displayed is determined by the initial setting, for example. Accordingly, for example, the image as described above with reference to FIG. 7 is displayed on the display 221 of the browsing terminal 220.
  • Step 23 the object detection unit 452 analyses the image obtained by the image acquisition unit 451. Accordingly, whether a certain object (for example, person) is displayed in the image or not is analyzed, and the feature amount and the like of the object are obtained.
  • a certain object for example, person
  • Step S24 the object detection unit 452 determines whether the certain object (for example, person) is detected as a result of the processing of Step S23. When it is determined in Step S24 that the certain object is detected, the processing proceeds to Step S25.
  • The person database stored in the storage 261, the storage 271, and the like may be referred to in order to judge the identicalness of a person stored in the person database and the detected person, for example. When high identicalness is judged, the attributes of the person are obtained.
  • In Step S25, the object detection unit 452 determines whether the same person is detected in an image (from a different viewpoint) taken by a different camera. At that time, the object detection unit 452 compares feature amounts such as the ratio of the head to the whole body, eye color, and hair color to determine whether the same person is displayed in the images taken by several cameras. When it is determined in Step S25 that the same person is detected in an image from a different viewpoint, the processing proceeds to Step S26.
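The same-person judgment in Step S25 compares feature amounts such as the head-to-body ratio, eye color, and hair color. The description does not fix a concrete matching rule; the following Python sketch assumes a simple cosine-similarity comparison over a hand-packed feature vector (the functions, the packing, and the threshold are all illustrative, not taken from the patent):

```python
import math

def feature_vector(head_to_body_ratio, eye_color, hair_color):
    # Pack the feature amounts named in the description into one flat
    # vector; colors are assumed to be (R, G, B) tuples in [0, 1].
    return (head_to_body_ratio,) + tuple(eye_color) + tuple(hair_color)

def similarity(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(a, b, threshold=0.98):
    # Judge "high identicalness" when the similarity clears a threshold.
    return similarity(a, b) >= threshold
```

Under this assumption, two detections of the same person from different cameras yield nearly parallel vectors and pass the threshold, while detections of different people do not.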
  • In Step S26, a detection frame is displayed around the object determined to have been detected in the processing of Step S24. Accordingly, the object detection frame 321 is displayed around the object image 301 as shown in FIG. 7, for example.
  • In Step S27, the display mode control unit 453 determines whether the object image around which the detection frame is displayed has been selected, for example. When it is determined in Step S27 that the object image has been selected, the processing proceeds to Step S28.
  • In Step S28, the display image generation unit 454 executes the lock on mode display processing described later with reference to the flow chart shown in FIG. 15. Accordingly, an image is displayed in the lock on mode.
  • When it is determined in Step S24 that the certain object is not detected, the processing of Steps S25 to S28 is skipped.
  • When it is determined in Step S25 that the same person is not detected in an image from a different viewpoint, the processing of Steps S26 to S28 is skipped.
  • When it is determined in Step S27 that the object image around which the detection frame is displayed is not selected, the processing of Step S28 is skipped.
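The branching in Steps S21 to S28, including the three skip paths above, can be sketched as a single function. Here `detect`, `match_across_views`, and `selected` are hypothetical stand-ins for the object detection unit 452 and the user's selection; they are not names from the patent:

```python
def main_display_flow(images, detect, match_across_views, selected):
    """Sketch of Steps S21-S28: display in the normal mode, detect the
    certain object, confirm the same person in another viewpoint, show
    the detection frame, and enter the lock on mode only when the framed
    object image is selected."""
    state = {"mode": "normal", "frame": False}
    obj = detect(images[0])                       # Steps S22-S24
    if obj is None:
        return state                              # S25-S28 skipped
    if not match_across_views(obj, images[1:]):   # Step S25
        return state                              # S26-S28 skipped
    state["frame"] = True                         # Step S26: detection frame
    if selected(obj):                             # Step S27
        state["mode"] = "lock-on"                 # Step S28
    return state
```

For instance, with two viewpoint images showing the same person and a selection callback that returns true, the sketch ends in the lock on mode with the frame shown; with no detection it stays in the normal mode.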
  • In Step S41, the display image generation unit 454 moves the object image determined to have been selected in the processing of Step S27 to the center of the screen and enlarges it for display. At that time, an image as shown in FIG. 8 is displayed, for example.
  • In Step S42, the display image generation unit 454 sets the display location and angle of each object image.
  • The display location and angle of each object image are set based on, for example, information on the installation locations of the cameras (for example, coordinate locations) and information on the imaging direction of each camera (for example, a vector representing the optical axis at the center of the imaging area of each camera), as described above.
  • In Step S43, the display image generation unit 454 sets a central axis.
  • The person 300 is assumed to be almost cylindrical, and the central axis of the cylinder is set as the central axis 340, as described above with reference to FIG. 9.
  • In Step S44, each object image is displayed based on the settings made in the processing of Step S42 and Step S43. That is, the image is displayed in the lock on mode as described above with reference to FIG. 9, for example.
  • In Step S45, the display image generation unit 454 determines whether the screen has been swiped, that is, whether an operation to turn the object images displayed in the lock on mode has been carried out.
  • When it is determined in Step S45 that the screen has been swiped, the processing proceeds to Step S46.
  • In Step S46, the display image generation unit 454 turns the object images displayed in the lock on mode and displays them.
  • The object images are turned around the central axis 340, as described above with reference to FIG. 10.
  • As a result, the object image 303 is displayed at the front, whereas the object image 301 was displayed at the front (as a reference) in FIG. 9.
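One way to decide which object image ends up at the front after the turn in Step S46 is to assign each camera an angle around the central axis 340 and pick the camera whose angle is closest to the accumulated swipe rotation. This selection rule and the angle assignment are illustrative assumptions, not specified in the patent:

```python
import math

def front_view(camera_angles, rotation):
    """Return the index of the viewpoint whose image is displayed at the
    front after turning by `rotation` radians around the central axis.
    `camera_angles` are the angles of the cameras around that axis."""
    def angular_distance(a, b):
        # Shortest distance between two angles on the circle.
        d = (a - b) % (2 * math.pi)
        return min(d, 2 * math.pi - d)
    return min(range(len(camera_angles)),
               key=lambda i: angular_distance(camera_angles[i], rotation))
```

With three cameras evenly spaced at 0, 2π/3, and 4π/3, no rotation keeps the first viewpoint (object image 301) at the front, and turning to 4π/3 brings the third viewpoint (object image 303) to the front, matching the transition from FIG. 9 to FIG. 10.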
  • When it is determined in Step S45 that the screen has not been swiped, or after the processing of Step S46, the processing proceeds to Step S47.
  • In Step S47, the display image generation unit 454 determines whether the object image around which the detection frame is displayed has been selected. When it is determined in Step S47 that it has been selected, the processing proceeds to Step S48.
  • In Step S48, the display image generation unit 454 reduces the object image. At that time, an image as shown in FIG. 11 is displayed, for example.
  • In Step S49, the display image generation unit 454 displays the image in the normal mode, as shown in FIG. 12, for example.
  • When it is determined in Step S47 that the object image around which the detection frame is displayed is not selected, the processing of Steps S48 and S49 is skipped.
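The display angle that Step S42 derives from the camera installation locations can be sketched as follows, assuming 2-D installation coordinates and measuring each camera's angle around the central axis through the object set in Step S43 (the function name and the planar simplification are assumptions for illustration):

```python
import math

def display_angles(object_xy, camera_xys):
    """Angle of each camera around the vertical central axis through the
    object (Steps S42-S43). With the object assumed roughly cylindrical,
    the angle of each installation location relative to the object fixes
    where its viewpoint image is placed on the circle of object images."""
    ox, oy = object_xy
    return [math.atan2(cy - oy, cx - ox) % (2 * math.pi)
            for cx, cy in camera_xys]
```

For example, cameras installed east, north, and west of the object map to angles 0, π/2, and π, so their viewpoint images are laid out a quarter turn apart around the central axis.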
  • The case in which the embodiment of the present technology is applied to the monitoring system 100 and a dangerous person is monitored, mainly for crime prevention and security purposes, is explained above.
  • However, the embodiment of the present technology may also be applied to an imaging system intended for television broadcasting of events such as sports competitions and concerts.
  • For example, a remarkable player in a football match can be displayed in the lock on mode so that the player's performance can be watched from various angles.
  • A program constituting the software is installed from a network such as the Internet or from a recording medium such as the removable medium 411 shown in FIG. 5.
  • The recording medium includes not only the removable medium 411, which includes a magnetic disk (including a floppy disk (registered trademark)), an optical disk (including a CD-ROM (Compact Disk-Read Only Memory) and a DVD (Digital Versatile Disk)), a magneto-optical disk (including an MD (Mini-Disk) (registered trademark)), a semiconductor memory, or the like on which the program is recorded and which is distributed to deliver the program to a user, but also the ROM 402 on which the program is recorded, a hard disk included in the storage unit 408, and the like.
  • The present technology can also include the following configurations.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
EP12169579A 2011-06-08 2012-05-25 Dispositif de contrôle d'affichage, procédé de contrôle d'affichage et support d'enregistrement Withdrawn EP2533533A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011128100A JP2012257021A (ja) 2011-06-08 2011-06-08 表示制御装置および方法、プログラム、並びに記録媒体

Publications (1)

Publication Number Publication Date
EP2533533A1 true EP2533533A1 (fr) 2012-12-12

Family

ID=46578816

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12169579A Withdrawn EP2533533A1 (fr) 2011-06-08 2012-05-25 Dispositif de contrôle dýaffichage, procédé de contrôle dýaffichage et support dýenregistrement

Country Status (4)

Country Link
US (1) US20120313897A1 (fr)
EP (1) EP2533533A1 (fr)
JP (1) JP2012257021A (fr)
CN (1) CN102819413A (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6299492B2 (ja) * 2014-07-03 2018-03-28 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
KR101751347B1 (ko) 2015-06-16 2017-07-11 엘지전자 주식회사 이동 단말기 및 이의 제어방법
JP6566799B2 (ja) * 2015-09-07 2019-08-28 キヤノン株式会社 提供装置、提供方法、及びプログラム
KR20180020043A (ko) * 2016-08-17 2018-02-27 삼성전자주식회사 다시점 영상 제어 방법 및 이를 지원하는 전자 장치
JP6262890B1 (ja) * 2017-01-13 2018-01-17 株式会社日本エスシーマネージメント 視聴装置、水中空間視聴システム及び水中空間視聴方法
JP6482580B2 (ja) 2017-02-10 2019-03-13 キヤノン株式会社 情報処理装置、情報処理方法、およびプログラム
JP7030452B2 (ja) * 2017-08-30 2022-03-07 キヤノン株式会社 情報処理装置、情報処理装置の制御方法、情報処理システム及びプログラム
JP7125499B2 (ja) * 2018-09-21 2022-08-24 富士フイルム株式会社 画像処理装置及び画像処理方法

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1996031047A2 (fr) * 1995-03-31 1996-10-03 The Regents Of The University Of California Video d'immersion
US20030058111A1 (en) * 2001-09-27 2003-03-27 Koninklijke Philips Electronics N.V. Computer vision based elderly care monitoring system
US20040032495A1 (en) * 2000-10-26 2004-02-19 Ortiz Luis M. Providing multiple synchronized camera views for broadcast from a live venue activity to remote viewers
WO2004040896A2 (fr) * 2002-10-30 2004-05-13 Nds Limited Systeme de diffusion interactif
US20040263626A1 (en) * 2003-04-11 2004-12-30 Piccionelli Gregory A. On-line video production with selectable camera angles
JP2005286685A (ja) 2004-03-30 2005-10-13 Hitachi Ltd 画像生成装置、システムまたは画像合成方法。
US20070103558A1 (en) * 2005-11-04 2007-05-10 Microsoft Corporation Multi-view video delivery
US20080007720A1 (en) * 2005-12-16 2008-01-10 Anurag Mittal Generalized multi-sensor planning and systems
US20100188503A1 (en) * 2009-01-28 2010-07-29 Apple Inc. Generating a three-dimensional model using a portable electronic device recording

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040162154A1 (en) * 2003-02-14 2004-08-19 Dejohn David Kinetic motion analyzer

Also Published As

Publication number Publication date
US20120313897A1 (en) 2012-12-13
JP2012257021A (ja) 2012-12-27
CN102819413A (zh) 2012-12-12

Similar Documents

Publication Publication Date Title
EP2533533A1 (fr) Dispositif de contrôle d'affichage, procédé de contrôle d'affichage et support d'enregistrement
US11696021B2 (en) Video recording device and camera function control program
US10198867B2 (en) Display control device, display control method, and program
US9788065B2 (en) Methods and devices for providing a video
KR101800617B1 (ko) 디스플레이 장치 및 이의 화상 통화 방법
CN110086905B (zh) 一种录像方法及电子设备
JP5570176B2 (ja) 画像処理システム及び情報処理方法
EP2860968B1 (fr) Dispositif de traitement de données, procédé pour le traitement de données, et programme
US20160180599A1 (en) Client terminal, server, and medium for providing a view from an indicated position
EP2999217A1 (fr) Appareil de traitement d'image, procédé de traitement d'image, système de traitement d'image et programme
CN110506415B (zh) 一种录像方法及电子设备
US20070222858A1 (en) Monitoring system, monitoring method and program therefor
EP4284009A1 (fr) Procédé d'acquisition d'image et dispositif électronique
JP2011259384A (ja) 撮像装置、表示装置、プログラム及び記録媒体
JP6064995B2 (ja) 情報処理装置、情報処理方法およびプログラム
KR101350068B1 (ko) 관심 영역 이미지를 출력하기 위한 전자 장치
CN104243807A (zh) 图像处理装置及图像处理方法
CN104871532A (zh) 拍摄装置及其动作控制方法
JP2012095238A (ja) 画像データ取得装置および画像データ取得装置の制御プログラム
JP2008305156A (ja) 顔画像抽出蓄積システム、及び顔画像抽出蓄積方法

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120525

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20150108