US20130329016A1 - Apparatus and method for generating a three-dimensional image using a collaborative photography group - Google Patents
- Publication number
- US20130329016A1 (U.S. application Ser. No. 13/964,702)
- Authority
- US
- United States
- Prior art keywords
- cameras
- photographing
- camera
- collaborative
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H04N13/0203—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/44—Browsing; Visualisation therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/296—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Definitions
- the following description relates to a technology enabling a plurality of cameras to collaboratively photograph an object to obtain a three-dimensional (3D) image of the object.
- a camera installed in a mobile device such as a mobile phone.
- the 3D object may not be accurately represented in a 3D space because the view of the camera may be limited.
- Examples of using a plurality of cameras to photograph a single subject may exist. For example, each member of a fan club of a singer may take a photo in the singer's concert using each member's camera, or students visiting a zoo may each take a photo of a particular animal using each camera.
- with this method, it is difficult to obtain images of the same object at the same time, and therefore, it may be difficult to accurately represent an object in 3D.
- a method of a target camera connected to a communication network, the method comprising synchronizing the time of one or more neighboring cameras and the target camera that are each included in a previously formed collaborative photography group, in order to synchronize photographing times of the one or more neighboring cameras and the target camera, collecting direction angle information and location information of the one or more neighboring cameras, collecting image information of the one or more neighboring cameras, which includes image information transmitted from the one or more neighboring cameras, and obtained when the one or more neighboring cameras photograph an object at the synchronized photographing time, processing the image information based on the direction angle information and the location information of the one or more neighboring cameras, and generating a three-dimensional (3D) image of the object based on the processed image information.
- the synchronizing may include synchronizing the time of the target camera and an access point of the collaborative photography group, in order to synchronize the time of the one or more neighboring cameras and the target camera.
- the method may further include photographing the object at the synchronized photographing time to generate image information of the target camera.
- the processing the image information may further include processing the image information of the target camera based on direction angle information and location information of the target camera.
- the collecting the direction angle information and the location information of the one or more neighboring cameras may include collecting the direction angle information and the location information of the one or more neighboring cameras using an ad hoc network.
- the collecting of the direction angle information and the location information of the one or more neighboring cameras may include estimating the location information of the one or more neighboring cameras based on a signal received from the one or more neighboring cameras, respectively.
- the collecting of the direction angle information and the location information of the one or more neighboring cameras may include collecting the direction angle of the one or more neighboring cameras using an accelerometer and a geomagnetic sensor that are installed in each of the one or more neighboring cameras.
- the collected image information may be network-coded.
- the access point may be any one of a plurality of cameras included in the collaborative photography group, or a base station of a cellular system.
- the method may further include receiving a multicast group identifier (ID) of the collaborative photography group to enable a multicast scheme to be used to collect at least one of the location information, the direction angle information, and the image information of the one or more neighboring cameras.
- a method of a target camera connected to a communication network, the method including synchronizing the time of one or more neighboring cameras and the target camera that are each included in a previously formed collaborative photography group, in order to synchronize photographing times of the one or more neighboring cameras and the target camera, collecting direction angle information and location information of the target camera, photographing a subject at the synchronized photographing time and generating image information of the target camera, and uploading the image information, the location information, and the direction angle information of the target camera to a predetermined server.
- the method may further include receiving a three-dimensional (3D) image of the subject from the predetermined server.
- a computer-readable storage medium having stored therein program instructions to cause a processor to implement a method of a target camera connected to a communication network, the method including synchronizing the time of one or more neighboring cameras and the target camera that are each included in a previously formed collaborative photography group, in order to synchronize photographing times of the one or more neighboring cameras and the target camera, collecting direction angle information and location information of the one or more neighboring cameras, collecting image information of the one or more neighboring cameras, which includes image information transmitted from the one or more neighboring cameras and obtained when the one or more neighboring cameras photograph an object at the synchronized photographing time, processing the image information based on the direction angle information and the location information of the one or more neighboring cameras, and generating a 3D image of the object.
- a target camera connected to a communication network, the target camera including a time synchronization unit to synchronize the time of one or more neighboring cameras and the target camera that are each included in a previously formed collaborative photography group, in order to synchronize photographing times of the one or more neighboring cameras and the target camera, a location/direction angle information collection unit to collect direction angle information and location information of the one or more neighboring cameras, an image information collection unit to collect image information of the one or more neighboring cameras, which includes image information transmitted from the one or more neighboring cameras and obtained when the one or more neighboring cameras photograph an object at the synchronized photographing time, an image generation unit to process the image information based on the direction angle information and the location information of the one or more neighboring cameras, and to generate a 3D image of the object.
- the target camera may further include a photography unit to photograph the object at the synchronized photographing time to generate image information of the target camera.
- FIG. 1 is a diagram illustrating a plurality of cameras photographing an object.
- FIG. 2 is a flowchart illustrating an example of a method performed by one or more cameras from a plurality of cameras.
- FIG. 3 is a flowchart illustrating an example of a method of a mobile telecommunications operator, a plurality of cameras, and a server.
- FIG. 4 is a flowchart illustrating an example of a plurality of cameras and a server.
- FIG. 5 is a flowchart illustrating an example of a method for synchronizing a plurality of cameras.
- FIG. 6 is a flowchart illustrating an example of a method for measuring absolute/relative locations and measuring direction angle.
- FIG. 7 is a flowchart illustrating an example of a method of simultaneously photographing an object.
- FIG. 8 is a block diagram illustrating an example of a camera.
- object may refer to a person, place, or thing, for example, a person, an animal, a setting, a physical object, a live object, or any other three-dimensional (3D) object that may be photographed.
- FIG. 1 illustrates an example of a plurality of cameras photographing an object.
- a plurality of cameras may exist with respect to a single object.
- the object is a moving vehicle.
- four users who are members of a predetermined group may photograph the object from different locations.
- a communication module may be installed in each of the cameras, or in a mobile terminal such as a cellular phone, a notebook computer, and the like.
- the plurality of cameras may photograph the subject from different locations and angles and thus, a three-dimensional (3D) image of the object may be generated. However, because the subject moves, or the background changes, the plurality of cameras should photograph the subject at a same photographing time to accurately represent the 3D image.
- times of the plurality of cameras should be synchronized to enable the plurality of cameras to have the same photographing time. That is, the cameras may be synchronized to photograph an object at the same time.
- the plurality of cameras may configure a communication network between the cameras, for example, an ad hoc network, a cellular communication network, and the like.
- the plurality of cameras may have the same photographing time.
- the plurality of cameras may share location information of the other cameras and direction angle information of other cameras with respect to an object.
- FIG. 2 illustrates an example of a method performed by a plurality of cameras.
- the plurality of cameras form a collaborative photography group.
- cameras of users who are members of a predetermined club, or cameras of users who go to a zoo, may form a collaborative photography group to photograph an object in collaboration with each other. It should be understood that these are mentioned merely for purposes of example.
- the collaborative photography group may be formed in various ways.
- the plurality of cameras may form the collaborative photography group by sharing an authentication key in advance or by being registered on an online website.
- the plurality of cameras synchronize times to enable an object to be photographed at the same photographing time. Therefore, each of the cameras may photograph the object at the same photographing time and 3D modeling may be performed with respect to the object. A variety of time synchronization methods are further described with reference to FIG. 5 .
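As one concrete illustration of the time synchronization step (a generic sketch, not the patent's specific mechanism), an NTP-style two-way timestamp exchange lets a camera estimate its clock offset relative to an access point; the function names and timestamp values below are hypothetical.

```python
# Hedged sketch: NTP-style clock-offset estimation between a camera and an
# access point (AP). Assumes the four timestamps t1..t4 are exchanged over
# the group's network; names and values are illustrative only.

def clock_offset(t1, t2, t3, t4):
    """Estimate the AP clock minus the camera clock.

    t1: camera sends request   (camera clock)
    t2: AP receives request    (AP clock)
    t3: AP sends reply         (AP clock)
    t4: camera receives reply  (camera clock)
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0

def round_trip_delay(t1, t2, t3, t4):
    """Network round-trip time, excluding the AP's processing time."""
    return (t4 - t1) - (t3 - t2)

# Example: the AP clock runs 5 time units ahead, one-way delay is 1 unit.
t1, t2, t3, t4 = 100.0, 106.0, 106.5, 102.5
print(clock_offset(t1, t2, t3, t4))      # 5.0 -> add to the camera clock
print(round_trip_delay(t1, t2, t3, t4))  # 2.0
```

A camera that applies the estimated offset to its local clock can then fire its shutter at an agreed group time.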
- each of the cameras measures an absolute/relative location of itself or an absolute/relative location of another camera.
- the absolute location may be measured by a Global Positioning System (GPS), and the relative location may be measured by analyzing signals received from other cameras.
- each of the four cameras may measure an absolute location of itself and an absolute location of the other three cameras using a GPS. Also, each of the cameras may sequentially transmit a well-known preamble.
- a predetermined camera may transmit the preamble at a predetermined point in time, and other cameras may estimate a distance between the predetermined camera and the other cameras based on, for example, the signal intensity of the received preamble, and the like.
- relative locations of the predetermined camera and the other cameras may be estimated.
- the distance and relative locations of the cameras may be estimated based on a Direction of Arrival (DOA) of the received preamble.
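The signal-intensity ranging described above can be made concrete with the log-distance path-loss model, followed by trilateration against the measured distances; the calibration constants below are assumptions for illustration, not values from the patent.

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=2.0):
    """Invert the log-distance path-loss model
    RSSI = rssi_at_1m - 10 * n * log10(d) to obtain a distance in meters.
    rssi_at_1m and the exponent n are device-specific calibration assumptions."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(anchors, dists):
    """Estimate a 2D position from three anchor cameras and their distances,
    by subtracting the first circle equation to obtain a linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1 ** 2 - r2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    b2 = r1 ** 2 - r3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a11 * a22 - a12 * a21  # zero if the anchors are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# A preamble heard at -60 dBm is roughly 10 m away under these assumptions.
print(rssi_to_distance(-60.0))
# Three cameras at known points locate a fourth camera near (1, 1).
print(trilaterate([(0, 0), (4, 0), (0, 4)], [2 ** 0.5, 10 ** 0.5, 10 ** 0.5]))
```

In practice the noisy per-pair estimates would be fused, but the linearized solve above is the basic geometric step.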
- each of the cameras measures a direction angle of each of the cameras, with respect to the object.
- location information and direction angle information of each camera may be collected. Because images photographed by the cameras may vary depending on the direction in which each of the cameras faces and the location of each camera, the location information and the direction angle information may be used to perform 3D modeling.
- the direction angle information may be obtained by an accelerometer and/or a geomagnetic sensor installed in each of the cameras.
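As a sketch of how accelerometer and geomagnetic-sensor readings combine into a direction angle, a tilt-compensated compass heading can be computed as follows; the body-axis conventions here are assumptions of this sketch, not something the patent specifies.

```python
import math

def direction_angle(accel, mag):
    """Tilt-compensated compass heading in degrees (0 = magnetic north).

    accel: gravity direction in body coordinates (x forward, y right, z down)
    mag:   magnetic field in the same body coordinates
    These axis conventions are an assumption of this sketch.
    """
    ax, ay, az = accel
    mx, my, mz = mag
    # Pitch and roll from the gravity vector.
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    # Rotate the magnetic vector back into the horizontal plane.
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch)
            + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-my_h, mx_h)) % 360.0

# Level camera pointing at magnetic north (field dips into the ground):
print(direction_angle((0.0, 0.0, 9.8), (30.0, 0.0, 40.0)))   # ~0.0 (north)
# Level camera pointing east: the northward field appears along -y.
print(direction_angle((0.0, 0.0, 9.8), (0.0, -30.0, 40.0)))  # ~90.0 (east)
```

On a real handset this computation is typically provided by the platform sensor API rather than written by hand.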
- each of the cameras simultaneously photographs the object at a requested photographing time.
- image information about an image photographed by each of the cameras, location information, and the direction angle information of each of the cameras may be shared with all the cameras of the group.
- each of the cameras may broadcast the image information, location information, and direction angle information to the other cameras in the group.
- each of the cameras may share the image information, location information, and direction angle information.
- the above-described information may be transmitted via a communication network. Also, the information may be network-coded and transmitted/received by the cameras.
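The network coding mentioned above can be illustrated with the simplest scheme, an XOR combination of two equal-length packets; this is a generic sketch, not the specific coding the patent uses.

```python
def xor_encode(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length packets into one coded packet."""
    assert len(a) == len(b), "packets must be padded to equal length"
    return bytes(x ^ y for x, y in zip(a, b))

# A relay holding packets from cameras A and B broadcasts one coded packet;
# any member that already has one of the two packets recovers the other.
pkt_a = b"view-from-camera-A"
pkt_b = b"view-from-camera-B"
coded = xor_encode(pkt_a, pkt_b)
assert xor_encode(coded, pkt_a) == pkt_b  # decode using A's packet
assert xor_encode(coded, pkt_b) == pkt_a  # decode using B's packet
```

In the classic two-source relay scenario this halves the number of broadcast transmissions needed to give every member both packets.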
- each of the cameras performs 3D rendering to generate a 3D image after sharing the image information, location information, and direction angle information. That is, image information of the cameras may be photographed at various locations and views (direction angles) and shared. Thus, each of the cameras may generate the 3D image by processing the image information.
- the flowchart of FIG. 2 is associated with an example where the cameras perform 3D rendering.
- the 3D rendering may be performed by a server, such as a server of a mobile telecommunications provider or the 3D rendering may be performed by the camera itself.
- the cameras may or may not configure a communication network between the cameras.
- the 3D rendering may be performed using only the captured images of the camera. That is, the 3D rendering may be performed without using at least one of the location information or the camera angle information. Also, not all cameras may have location information and/or camera angle information.
- FIG. 3 illustrates an example of a method of a mobile telecommunications operator, a plurality of cameras, and a server.
- the mobile telecommunications operator forms a collaborative photography group including a plurality of cameras.
- a user may previously subscribe to a service that is provided by the mobile telecommunications operator, through a webpage of the mobile telecommunications operator, and the like.
- the cameras may be included in a collaborative photography group by the mobile telecommunications operator.
- the cameras synchronize times using a network of a cellular communication system for synchronization of photographing times.
- the cameras may not be able to measure relative locations of the other cameras.
- the cameras may measure absolute locations, for example, using a GPS.
- the cameras measure a direction angle of the object using an accelerometer installed in each of the cameras.
- the cameras simultaneously photograph the object at the synchronized photographing time.
- each of the cameras uploads the image information to a predetermined server. Also, location information and direction angle information of each of the cameras may be uploaded to the server.
- the server may be operated by the mobile telecommunications operator, and connected to the Internet or the network of the cellular communication system.
- in operation 370, the server generates a 3D image by processing (3D rendering) the image information of the cameras. That is, the server may generate the 3D image by processing the image information of all the cameras based on the location information and the direction angle information of the cameras.
- the server may upload the generated 3D image to the cameras. Accordingly, the cameras may obtain the 3D image without performing 3D rendering.
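To make the server-side 3D processing concrete, one elementary building block (a sketch, not the patent's actual algorithm) is triangulating an object point from two cameras' positions and viewing directions, which is exactly what the collected location and direction angle information enables.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate(c1, d1, c2, d2):
    """Midpoint of the closest approach of two viewing rays.

    c1, c2: camera centers (from the collected location information)
    d1, d2: viewing direction vectors (from the direction angle information)
    """
    w0 = [p - q for p, q in zip(c1, c2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero only for parallel viewing rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = [p + s * u for p, u in zip(c1, d1)]
    p2 = [p + t * u for p, u in zip(c2, d2)]
    return [(u + v) / 2.0 for u, v in zip(p1, p2)]

# Two cameras on the ground plane looking inward locate a point at (1, 1, 0).
print(triangulate([0, 0, 0], [1, 1, 0], [2, 0, 0], [-1, 1, 0]))
```

Repeating this over many matched image features across all group members' views yields the point cloud from which a 3D image can be rendered.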
- FIG. 4 illustrates an example method of a plurality of cameras and a server.
- the cameras configure an ad hoc network between the cameras and 3D rendering is performed by a server.
- the plurality of cameras form a collaborative photography group.
- the plurality of cameras synchronize times through an ad hoc network, for example, a Wireless Local Area Network (WLAN), Bluetooth, and the like.
- the plurality of cameras measure relative or absolute locations of the cameras.
- the plurality of cameras measure a direction angle of the object using an accelerometer.
- the cameras simultaneously photograph the object at the synchronized photographing time.
- the cameras upload obtained image information, location information, and direction angle information to the server.
- the server generates a 3D image by processing (3D rendering) the image information of the cameras.
- the server uploads the 3D image to the plurality of cameras.
- FIG. 5 illustrates an example of two methods for synchronizing photographing time.
- the methods may be used in the example methods illustrated in FIG. 2 , FIG. 3 , and FIG. 4 .
- operations 220 , 320 , and 420 may be performed by either of the methods illustrated in FIG. 5 .
- the method for synchronizing photographing time includes two examples, a method 1 and a method 2, as described below. Any one of the method 1 and the method 2 may be selectively used.
- method 1 may be selected.
- cameras may store and share an authentication key in advance.
- any one of the cameras may be set as an access point (AP). For example, a camera having the greatest coverage may be set as the AP.
- the cameras may be associated with and authenticated by the AP. For the association, the cameras are synchronized with the AP.
- time synchronization of the cameras may be performed through the association.
- a network such as a Basic Service Set (BSS) may be formed.
- the AP may generate a multicast group identifier (ID) and share the multicast group ID with the cameras to enable the cameras, included in the collaborative photography group, to use a multicast scheme.
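A minimal sketch of how a shared multicast group ID can map onto a standard IP multicast group: the group address and port below are hypothetical, and the socket options shown are the standard POSIX/Python ones rather than anything the patent specifies.

```python
import socket
import struct

# Hypothetical multicast address/port standing in for the shared group ID.
GROUP_ADDR, GROUP_PORT = "239.1.2.3", 5007

def make_membership_request(group_addr: str) -> bytes:
    """Pack the ip_mreq structure passed with IP_ADD_MEMBERSHIP
    (group address + INADDR_ANY for the local interface)."""
    return struct.pack("4s4s", socket.inet_aton(group_addr),
                       socket.inet_aton("0.0.0.0"))

def open_group_receiver(group_addr: str = GROUP_ADDR, port: int = GROUP_PORT):
    """Join the group so that location, direction angle, and image packets
    sent by any member reach every other member with a single transmission."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    make_membership_request(group_addr))
    return sock
```

A sender simply transmits its datagrams to `(GROUP_ADDR, GROUP_PORT)`; no per-camera unicast copies are needed.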
- method 2 may be selected.
- a plurality of cameras may subscribe to a predetermined service that is provided by a mobile telecommunications operator, to form a collaborative photography group.
- the cameras may be associated with a base station for authentication.
- time synchronization may be performed with respect to the cameras.
- the base station of a cellular communication system may transmit a single multicast group ID to each of the cameras.
- photographing times of the cameras may be synchronized using either method 1 or method 2.
- FIG. 6 illustrates an example of two methods for measuring absolute/relative locations and measuring direction angle.
- the methods may be used in the example methods illustrated in FIG. 2 , FIG. 3 , and FIG. 4 .
- operations 230 and 240 , 330 and 340 , and 430 and 440 may be performed by either of the methods illustrated in FIG. 6 .
- FIG. 6 illustrates two example measurement methods, method 1 and method 2.
- the example method 1 uses an ad hoc network and the example method 2 uses a GPS.
- method 1 may be selected in operation 610 .
- an index n of the cameras may be set as “1”.
- an n-th camera, hereinafter referred to as camera n, may transmit a preamble.
- all cameras may sequentially transmit preambles.
- other cameras may receive the preamble that is transmitted by the camera n, and measure the received signal. For example, an intensity of the received signal may be measured.
- the cameras may measure relative locations based on the received signal in operation 625 . As described above, a DOA of the preamble that is received may be used to estimate the relative locations of the cameras.
- each of the cameras may measure a direction angle of the object using an accelerometer.
- each of the cameras may measure absolute locations using a GPS in operation 631 .
- each of the cameras may measure the direction angle of the object using the accelerometer.
- FIG. 7 illustrates an example of two methods for simultaneously photographing an object.
- the methods may be used in the example methods illustrated in FIG. 2 , FIG. 3 , and FIG. 4 .
- operations 250 , 350 , and 450 may be performed by either of the methods illustrated in FIG. 7 .
- a plurality of cameras may simultaneously photograph in various ways. According to various embodiments, the two methods illustrated in FIG. 7 , method 1 and method 2, may be used. However, these methods are merely for purposes of example, and it should be understood that the method of simultaneously photographing may vary.
- method 1 may be used in operation 710 .
- a terminal, that is, a camera n from among a plurality of cameras, may have its shutter button pushed.
- a photographing request of the camera n may be transmitted to an AP.
- the AP may forward the photographing request to each of the cameras included in the plurality of cameras.
- the AP may be any one of the plurality of cameras.
- each of the cameras, including the camera n, may simultaneously photograph an object when a photographing request is received or at a time corresponding to the photographing request.
- a camera n from among the plurality of cameras may have a shutter button pushed in operation 731 .
- a photographing request of the camera n may be transmitted to a base station of a cellular communication system.
- the base station may forward the photographing request to all the cameras, for example, using a Short Message Service (SMS) scheme, a multicast scheme, and the like.
- each of the cameras, including the camera n, may simultaneously photograph an object when the photographing request is received or at a time corresponding to the photographing request.
- FIG. 8 illustrates an example of a camera.
- camera 800 may include a photographing time synchronization unit 810 , a photography unit 820 , a location/direction angle information collection unit 830 , an image information collection unit 840 , and an image generation unit 850 .
- the photographing time synchronization unit 810 synchronizes the internal times of the cameras in order to synchronize photographing times of one or more neighboring cameras and the camera 800 .
- the one or more neighboring cameras and the camera 800 may be included in a previously formed collaborative photography group.
- the photography unit 820 and the one or more neighboring cameras may simultaneously photograph an object at a synchronized photographing time.
- the location/direction angle information collection unit 830 may collect direction angle information and location information of the one or more neighboring cameras and/or direction angle information and location information of the camera 800 .
- the image information collection unit 840 may collect image information of the one or more neighboring cameras.
- the image generation unit 850 may process the image information of the one or more neighboring cameras and the camera 800 , based on the direction angle information and the location information of the one or more neighboring cameras and the camera 800 . For example, the image generation unit 850 may perform 3D rendering based on locations and direction angles of the cameras, and may generate a 3D image of the object.
- 3D rendering may be performed based on the locations and direction angles of the cameras and a 3D video and/or a 3D image may be generated.
- the terminal device described herein may refer to mobile devices such as a cellular phone, a personal digital assistant (PDA), a digital camera, a portable game console, an MP3 player, a portable/personal multimedia player (PMP), a handheld e-book, a portable lap-top personal computer (PC), a global positioning system (GPS) navigation device, and devices such as a desktop PC, a high definition television (HDTV), an optical disc player, a set-top box, and the like, capable of wireless communication or network communication consistent with that disclosed herein.
- a computing system or a computer may include a microprocessor that is electrically connected with a bus, a user interface, and a memory controller. It may further include a flash memory device. The flash memory device may store N-bit data via the memory controller. The N-bit data is processed or will be processed by the microprocessor and N may be 1 or an integer greater than 1. Where the computing system or computer is a mobile apparatus, a battery may be additionally provided to supply operation voltage of the computing system or computer.
- the computing system or computer may further include an application chipset, a camera image processor (CIS), a mobile Dynamic Random Access Memory (DRAM), and the like.
- the memory controller and the flash memory device may constitute a solid state drive/disk (SSD) that uses a non-volatile memory to store data.
- the methods described above may be recorded, stored, or fixed in one or more computer-readable storage media that includes program instructions to be implemented by a computer to cause a processor to execute or perform the program instructions.
- the media may also include, alone or in combination with the program instructions, data files, data structures, and the like.
- Examples of computer-readable storage media may include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVDs; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like.
- Examples of program instructions include machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
- the described hardware devices may be configured to act as one or more software modules in order to perform the operations and methods described above, or vice versa.
- a computer-readable storage medium may be distributed among computer systems connected through a network and computer-readable codes or program instructions may be stored and executed in a decentralized manner.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Studio Devices (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
A communication network comprising a collaborative photography group including a plurality of cameras having synchronized photographing times is provided. The plurality of cameras may share location information, direction angle information, and image information generated by photographing an object, and generate a three-dimensional (3D) image of the object.
Description
- This application is a Continuation application of U.S. patent application Ser. No. 12/896,499, filed Dec. 1, 2010, which claims the benefit under 35 U.S.C. §119(a) of Korean Patent Application No. 10-2009-0107389, filed on Nov. 9, 2009, in the Korean Intellectual Property Office, the entire disclosures of which are incorporated herein by reference for all purposes.
- 1. Field
- The following description relates to a technology enabling a plurality of cameras to collaboratively photograph an object to obtain a three-dimensional (3D) image of the object.
- 2. Description of the Related Art
- Currently, users are capable of taking video or still photographs using a camera installed in a mobile device such as a mobile phone.
- When a single user photographs a three-dimensional (3D) object using a single camera, the 3D object may not be accurately represented in a 3D space because the view of the camera may be limited.
- Examples of using a plurality of cameras to photograph a single subject may exist. For example, each member of a fan club of a singer may take a photo in the singer's concert using each member's camera, or students visiting a zoo may each take a photo of a particular animal using each camera. However, with this method, it is difficult to obtain images of the same object at the same time, and therefore, it may be difficult to accurately represent an object in 3D.
- In one general aspect, provided is a method of a target camera connected to a communication network, the method comprising synchronizing the time of one or more neighboring cameras and the target camera that are each included in a previously formed collaborative photography group, in order to synchronize photographing times of the one or more neighboring cameras and the target camera, collecting direction angle information and location information of the one or more neighboring cameras, collecting image information of the one or more neighboring cameras, which includes image information transmitted from the one or more neighboring cameras, and obtained when the one or more neighboring cameras photograph an object at the synchronized photographing time, processing the image information based on the direction angle information and the location information of the one or more neighboring cameras, and generating a three-dimensional (3D) image of the object based on the processed image information.
- The synchronizing may include synchronizing the time of the target camera and an access point of the collaborative photography group, in order to synchronize the time of the one or more neighboring cameras and the target camera.
- The method may further include photographing the object at the synchronized photographing time to generate image information of the target camera.
- The processing the image information may further include processing the image information of the target camera based on direction angle information and location information of the target camera.
- The collecting the direction angle information and the location information of the one or more neighboring cameras may include collecting the direction angle information and the location information of the one or more neighboring cameras using an ad hoc network.
- The collecting of the direction angle information and the location information of the one or more neighboring cameras may include estimating the location information of the one or more neighboring cameras based on a signal received from the one or more neighboring cameras, respectively.
- The collecting of the direction angle information and the location information of the one or more neighboring cameras may include collecting the direction angle of the one or more neighboring cameras using an accelerometer and a geomagnetic sensor that are installed in each of the one or more neighboring cameras.
- The collected image information may be network-coded.
- The access point may be any one of a plurality of cameras included in the collaborative photography group, or a base station of a cellular system.
- The method may further include receiving a multicast group identifier (ID) of the collaborative photography group to enable a multicast scheme to be used to collect at least one of the location information, the direction angle information, and the image information of the one or more neighboring cameras.
- In another aspect, there is provided a method of a target camera connected to a communication network, the method including synchronizing the time of one or more neighboring cameras and the target camera that are each included in a previously formed collaborative photography group, in order to synchronize photographing times of the one or more neighboring cameras and the target camera, collecting direction angle information and location information of the target camera, photographing a subject at the synchronized photographing time and generating image information of the target camera, and uploading the image information, the location information, and the direction angle information of the target camera to a predetermined server.
- The method may further include receiving a three-dimensional (3D) image of the subject from the predetermined server.
- In another aspect, there is provided a computer-readable storage medium having stored therein program instructions to cause a processor to implement a method of a target camera connected to a communication network, the method including synchronizing the time of one or more neighboring cameras and the target camera that are each included in a previously formed collaborative photography group, in order to synchronize photographing times of the one or more neighboring cameras and the target camera, collecting direction angle information and location information of the one or more neighboring cameras, collecting image information of the one or more neighboring cameras, which includes image information transmitted from the one or more neighboring cameras and obtained when the one or more neighboring cameras photograph an object at the synchronized photographing time, processing the image information based on the direction angle information and the location information of the one or more neighboring cameras, and generating a 3D image of the object.
- In another aspect, there is provided a target camera connected to a communication network, the target camera including a time synchronization unit to synchronize the time of one or more neighboring cameras and the target camera that are each included in a previously formed collaborative photography group, in order to synchronize photographing times of the one or more neighboring cameras and the target camera, a location/direction angle information collection unit to collect direction angle information and location information of the one or more neighboring cameras, an image information collection unit to collect image information of the one or more neighboring cameras, which includes image information transmitted from the one or more neighboring cameras and obtained when the one or more neighboring cameras photograph an object at the synchronized photographing time, an image generation unit to process the image information based on the direction angle information and the location information of the one or more neighboring cameras, and to generate a 3D image of the object.
- The target camera may further include a photography unit to photograph the object at the synchronized photographing time to generate image information of the target camera.
- Other features and aspects may be apparent from the following description, the drawings, and the claims.
-
FIG. 1 is a diagram illustrating a plurality of cameras photographing an object. -
FIG. 2 is a flowchart illustrating an example of a method performed by one or more cameras from a plurality of cameras. -
FIG. 3 is a flowchart illustrating an example of a method of a mobile telecommunications operator, a plurality of cameras, and a server. -
FIG. 4 is a flowchart illustrating an example of a plurality of cameras and a server. -
FIG. 5 is a flowchart illustrating an example of a method for synchronizing a plurality of cameras. -
FIG. 6 is a flowchart illustrating an example of a method for measuring absolute/relative locations and measuring direction angle. -
FIG. 7 is a flowchart illustrating an example of a method of simultaneously photographing an object. -
FIG. 8 is a block diagram illustrating an example of a camera. - Throughout the drawings and the description, unless otherwise described, the same drawing reference numerals should be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.
- The following description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. Accordingly, various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein may be suggested to those of ordinary skill in the art. Also, descriptions of well-known functions and constructions may be omitted for increased clarity and conciseness.
- As described herein, the term object may refer to a person, place, or thing, for example, a person, an animal, a setting, a physical object, a live object, or any other three-dimensional (3D) object that may be photographed.
-
FIG. 1 illustrates an example of a plurality of cameras photographing an object. - Referring to
FIG. 1 , a plurality of cameras (camera 1 through camera 4) may exist with respect to a single object. In this example, the object is a moving vehicle. For example, four users who are members of a predetermined group may photograph the object from different locations. A communication module may be installed in each of the cameras, or each camera may be part of a mobile terminal such as a cellular phone, a notebook computer, and the like. - The plurality of cameras may photograph the object from different locations and angles and thus, a three-dimensional (3D) image of the object may be generated. However, because the object moves, or the background changes, the plurality of cameras should photograph the object at the same photographing time to accurately represent the 3D image.
- Accordingly, times of the plurality of cameras should be synchronized to enable the plurality of cameras to have the same photographing time. That is, the cameras may be synchronized to photograph an object at the same time. According to various embodiments, the plurality of cameras may configure a communication network between the cameras, for example, an ad hoc network, a cellular communication network, and the like. Thus, the plurality of cameras may have the same photographing time. Also, the plurality of cameras may share location information of the other cameras and direction angle information of the other cameras with respect to an object.
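As one illustration of how such clock synchronization could work, the sketch below estimates a camera's clock offset from a two-way timestamp exchange with an access point, in the style of SNTP. The exchange, the function name, and the symmetric-delay assumption are illustrative; the patent does not mandate a specific synchronization algorithm.

```python
# Illustrative SNTP-style offset estimation between a camera and an access
# point (AP). This is only a sketch of one possible synchronization scheme.

def clock_offset(t1, t2, t3, t4):
    """Estimate the camera's clock offset relative to the AP.

    t1: camera clock when the request leaves the camera
    t2: AP clock when the request arrives
    t3: AP clock when the reply leaves the AP
    t4: camera clock when the reply arrives
    Assumes a roughly symmetric network delay in both directions.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0

# Example: the camera's clock runs 5 time units behind the AP, with a
# one-way delay of 3 units in each direction.
offset = clock_offset(t1=100, t2=108, t3=110, t4=108)
# AP time = camera time + offset, so the camera adds `offset` to its clock
# before comparing photographing times.
print(offset)  # → 5.0
```

Once every camera knows its offset to the same AP, a photographing time announced in AP time maps to a consistent instant on every camera's clock.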
-
FIG. 2 illustrates an example of a method performed by a plurality of cameras. - Referring to
FIG. 2 , in operation 210, the plurality of cameras form a collaborative photography group. For example, cameras of users who are members of a predetermined club, or cameras of users who visit a zoo, may form a collaborative photography group to photograph an object in collaboration with each other. These examples are mentioned merely for purposes of illustration. - The collaborative photography group may be formed in various ways. For example, the plurality of cameras may form the collaborative photography group by sharing an authentication key in advance or by being registered on an online website.
- In
operation 220, the plurality of cameras synchronize times to enable an object to be photographed at the same photographing time. Therefore, each of the cameras may photograph the object at the same photographing time and 3D modeling may be performed with respect to the object. A variety of time synchronization methods are further described with reference to FIG. 5. - In
operation 230, each of the cameras measures an absolute/relative location of itself or an absolute/relative location of another camera. For example, the absolute location may be measured by a Global Positioning System (GPS), and the relative location may be measured by analyzing signals received from the other cameras. - For example, where there are four cameras, each of the four cameras may measure its own absolute location and the absolute locations of the other three cameras using a GPS. Also, each of the cameras may sequentially transmit a well-known preamble. A predetermined camera may transmit the preamble at a predetermined point in time, and the other cameras may estimate their distances from the predetermined camera based on, for example, the signal intensity of the received preamble, and the like.
- Accordingly, relative locations of the predetermined camera and the other cameras may be estimated. In some embodiments, the distance and relative locations of the cameras may be estimated based on a Direction of Arrival (DOA) of the received preamble. When at least two cameras simultaneously transmit a preamble, distances and relative locations of the cameras may be estimated based on a Difference of Time-Of-Arrival (DTOA).
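The signal-strength approach above can be sketched as follows: a log-distance path-loss model converts a received preamble's strength into a distance, and three such distances to cameras at known positions fix a fourth camera's 2-D position (trilateration). The model parameters (transmit power, path-loss exponent) and the exact solver are illustrative assumptions, not taken from the patent.

```python
import math

def distance_from_rssi(rssi_dbm, tx_power_dbm=0.0, n=2.0, d0=1.0):
    # Log-distance path-loss model: RSSI = tx_power - 10*n*log10(d/d0).
    return d0 * 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

def trilaterate(anchors, dists):
    # Linearized 2-D position from three anchor positions and the
    # distances to them (exact solution of the linearized system).
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# A preamble sent at 0 dBm and received at about -13.98 dBm corresponds
# to roughly 5 m under this model:
d = distance_from_rssi(-20 * math.log10(5))  # ≈ 5.0

# Three cameras at known positions, distances to a fourth camera at (4, 3):
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [5.0, math.hypot(6, 3), math.hypot(4, 7)]
print(trilaterate(anchors, dists))  # ≈ (4.0, 3.0)
```

In practice RSSI-based distances are noisy, which is one reason the text also mentions DOA- and DTOA-based estimation as alternatives.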
- In
operation 240, each of the cameras measures its direction angle with respect to the object. - To perform 3D modeling, location information and direction angle information of each camera may be collected. Because the images photographed by the cameras may vary depending on the direction in which each camera faces and on the location of each camera, the location information and the direction angle information may be used to perform 3D modeling. For example, the direction angle information may be obtained by an accelerometer and/or a geomagnetic sensor installed in each of the cameras.
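As a sketch of how a direction angle could be derived from those two sensors, the function below computes a tilt-compensated compass heading: the accelerometer gives the gravity direction (roll and pitch), which is used to project the geomagnetic reading into the horizontal plane. The axis conventions and formula are common conventions assumed for illustration, not specified by the patent.

```python
import math

def heading_degrees(accel, mag):
    """Tilt-compensated heading from accelerometer (ax, ay, az) and
    geomagnetic sensor (mx, my, mz) readings. Illustrative sketch."""
    ax, ay, az = accel
    mx, my, mz = mag
    # Roll and pitch from the gravity direction reported by the accelerometer.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # Rotate the magnetic vector into the horizontal plane.
    mxh = mx * math.cos(pitch) + mz * math.sin(pitch)
    myh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-myh, mxh)) % 360.0

# Device held flat (gravity on z), magnetic field pointing along +x:
print(heading_degrees((0, 0, 9.81), (30, 0, -20)))  # → 0.0
```

Combined with a camera's position, such a heading gives the viewing ray along which its image was taken, which is exactly the information the 3D modeling step consumes.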
- In
operation 250, when photographing is requested by any one of the cameras, or requested by a predetermined server, each of the cameras simultaneously photographs the object at a requested photographing time. - In
operation 260, image information about an image photographed by each of the cameras, location information, and direction angle information of each of the cameras may be shared with all the cameras of the group. - For example, each of the cameras may broadcast its image information, location information, and direction angle information to each of the other cameras in the group. Thus, each of the cameras may share the image information, location information, and direction angle information. The above-described information may be transmitted via a communication network. Also, the information may be network-coded and transmitted/received by the cameras.
- In
operation 270, each of the cameras performs 3D rendering to generate a 3D image after sharing the image information, location information, and direction angle information. That is, the image information of the cameras may be captured at various locations and views (direction angles) and shared. Thus, each of the cameras may generate the 3D image by processing the image information. - The flowchart of
FIG. 2 is associated with an example where the cameras perform 3D rendering. However, in various embodiments, the 3D rendering may be performed by a server, such as a server of a mobile telecommunications provider, or by the camera itself. Also, the cameras may or may not configure a communication network between the cameras. - However, it should be understood that the 3D rendering may be performed using only the captured images of the cameras. That is, the 3D rendering may be performed without using at least one of the location information or the camera angle information. Also, not all cameras may have location information and/or camera angle information.
-
FIG. 3 illustrates an example of a method of a mobile telecommunications operator, a plurality of cameras, and a server. - Referring to
FIG. 3 , in operation 310, the mobile telecommunications operator forms a collaborative photography group including a plurality of cameras. For example, a user may subscribe in advance to a service that is provided by the mobile telecommunications operator, for example, through a webpage of the mobile telecommunications operator. Thus, when the plurality of cameras are located in a predetermined area, the cameras may be included in a collaborative photography group by the mobile telecommunications operator. - In
operation 320, the cameras synchronize times using a network of a cellular communication system for synchronization of photographing times. - In
operation 330, the cameras do not configure an ad hoc network between the cameras, and accordingly may not be able to measure relative locations of the other cameras. Thus, the cameras may measure absolute locations, for example, using a GPS. - In
operation 340, the cameras measure a direction angle of the object using an accelerometer installed in each of the cameras. - In
operation 350, the cameras simultaneously photograph the object at the synchronized photographing time. - In
operation 360, when image information is obtained by simultaneously photographing the object, each of the cameras uploads the image information to a predetermined server. Also, location information and direction angle information of each of the cameras may be uploaded to the server. For example, the server may be operated by the mobile telecommunications operator, and connected to the Internet or the network of the cellular communication system. - In
operation 370, the server generates a 3D image by processing (3D rendering) the image information of the cameras. That is, the server may generate the 3D image by processing the image information of all the cameras based on the location information and the direction angle information of the cameras. - In
operation 380, the server may transmit the generated 3D image to the cameras. Accordingly, the cameras may obtain the 3D image without performing 3D rendering. -
FIG. 4 illustrates an example method of a plurality of cameras and a server. In this example, the cameras configure an ad hoc network between the cameras and 3D rendering is performed by a server. - Referring to
FIG. 4 , inoperation 410, the plurality of cameras form a collaborative photography group. - In
operation 420, the plurality of cameras synchronize times through an ad hoc network, for example, a Wireless Local Area Network (WLAN), Bluetooth, and the like. In operation 430, the plurality of cameras measure relative or absolute locations of the cameras. In operation 440, the plurality of cameras measure a direction angle of the object using an accelerometer. - In
operation 450, the cameras simultaneously photograph the object at the synchronized photographing time. - In
operation 460, the cameras upload obtained image information, location information, and direction angle information to the server. In operation 470, the server generates a 3D image by processing (3D rendering) the image information of the cameras. In operation 480, the server uploads the 3D image to the plurality of cameras. -
FIG. 5 illustrates two example methods for synchronizing photographing time. The methods may be used in the example methods illustrated in FIG. 2, FIG. 3, and FIG. 4; for example, the time synchronization operations of those methods may be performed as illustrated in FIG. 5. - Referring to
FIG. 5 , the method for synchronizing photographing time includes two examples, method 1 and method 2, as described below. Either method 1 or method 2 may be selectively used. - In
operation 510, method 1 may be selected. In operation 521, cameras may store and share an authentication key in advance. In operation 522, when the cameras having the authentication key enter a predetermined area, any one of the cameras may be set as an access point (AP). For example, a camera having the greatest coverage may be set as the AP. After the AP is set, the cameras may be associated with and authenticated by the AP. For the association, the cameras are synchronized with the AP. - Accordingly, in
operation 523, time synchronization of the cameras may be performed through the association. When the authentication is completed, in operation 524, a network such as a Basic Service Set (BSS) may be formed. In operation 525, the AP may generate a multicast group identifier (ID) and share the multicast group ID with the cameras to enable the cameras, included in the collaborative photography group, to use a multicast scheme. - Alternatively, in
operation 510, method 2 may be selected. In operation 531, a plurality of cameras may subscribe to a predetermined service that is provided by a mobile telecommunications operator, to form a collaborative photography group. In operation 532, when the cameras subscribed to the service enter a predetermined area, the cameras may be associated with a base station for authentication. In operation 533, time synchronization may be performed with respect to the cameras. In operation 534, the base station of a cellular communication system may transmit a single multicast group ID to each of the cameras. - Accordingly, photographing times of the cameras may be synchronized using either
method 1 or method 2. -
FIG. 6 illustrates an example of two methods for measuring absolute/relative locations and measuring direction angle. The methods may be used in the example methods illustrated in FIG. 2, FIG. 3, and FIG. 4; for example, the location and direction angle measurement operations of those methods may be performed as illustrated in FIG. 6. - Absolute or relative locations of the cameras may be measured using various methods.
FIG. 6 illustrates two example measurement methods, method 1 and method 2. The example method 1 uses an ad hoc network and the example method 2 uses a GPS. - Referring to
FIG. 6 , method 1 may be selected in operation 610. In operation 621, an index n of the cameras may be set as "1". - In
operation 622, an nth camera, hereinafter referred to as camera n, may transmit a preamble. In this example, all cameras may sequentially transmit preambles. - In
operation 623, other cameras may receive the preamble that is transmitted by the camera n, and measure the received signal. For example, an intensity of the received signal may be measured. - In
operation 624, it may be determined whether all the cameras have transmitted the preamble. When not all N cameras have transmitted the preamble, the next camera, for example camera n+1, may transmit the preamble. Conversely, when all N cameras have transmitted the preamble, the cameras may measure relative locations based on the received signals in operation 625. As described above, a DOA of the preamble that is received may be used to estimate the relative locations of the cameras. - In
operation 626, each of the cameras may measure a direction angle of the object using an accelerometer. - Alternatively, when
method 2 is selected in operation 610, each of the cameras may measure absolute locations using a GPS in operation 631. In operation 632, each of the cameras may measure the direction angle of the object using the accelerometer. -
FIG. 7 illustrates an example of two methods for simultaneously photographing an object. The methods may be used in the example methods illustrated in FIG. 2, FIG. 3, and FIG. 4; for example, the simultaneous photographing operations of those methods may be performed as illustrated in FIG. 7. - A plurality of cameras may simultaneously photograph in various ways. According to various embodiments, the two methods illustrated in
FIG. 7 , method 1 and method 2, may be used. However, these methods are merely for purposes of example, and it should be understood that the method of simultaneously photographing may vary. - Referring to
FIG. 7 , method 1 may be used in operation 710. In operation 721, a terminal, that is, a camera n from among a plurality of cameras, may have a shutter button pushed. In operation 722, a photographing request of the camera n may be transmitted to an AP. In operation 723, the AP may forward the photographing request to each of the cameras included in the plurality of cameras. For example, the AP may be any one of the plurality of cameras. In operation 724, each of the cameras, including the camera n, may simultaneously photograph an object when the photographing request is received or at a time corresponding to the photographing request. - Alternatively, when
method 2 is selected in operation 710, a camera n from among the plurality of cameras may have a shutter button pushed in operation 731. In operation 732, a photographing request of the camera n may be transmitted to a base station of a cellular communication system. In operation 733, the base station may forward the photographing request to all the cameras, for example, using a Short Message Service (SMS) scheme, a multicast scheme, and the like. In operation 734, each of the cameras, including the camera n, may simultaneously photograph an object when the photographing request is received or at a time corresponding to the photographing request. -
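The request fan-out common to both methods can be sketched as follows: one camera's shutter press reaches a forwarding node (AP or base station), which relays the request to every group member, and each member captures at the same scheduled instant on the already-synchronized clock. The class, names, and fixed lead time are illustrative assumptions.

```python
# Hypothetical sketch of the photographing-request fan-out. A small lead
# time is added so the request can reach every camera before the shared
# capture instant.

CAPTURE_LEAD_TIME = 0.5  # seconds; illustrative value

class CollaborativeGroup:
    def __init__(self, cameras):
        self.cameras = list(cameras)

    def shutter_pressed(self, requester, now):
        """Forward the request; return (camera, capture_time) pairs."""
        capture_at = now + CAPTURE_LEAD_TIME  # one shared capture instant
        # The requester photographs too, at the same instant as the others.
        return [(cam, capture_at) for cam in self.cameras]

group = CollaborativeGroup(["cam1", "cam2", "cam3", "cam4"])
schedule = group.shutter_pressed(requester="cam2", now=100.0)
# Every camera fires at the same synchronized time:
assert len({t for _, t in schedule}) == 1
```

Scheduling a shared future instant, rather than capturing on receipt, is what makes variable network delivery delays tolerable once clocks are synchronized.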
FIG. 8 illustrates an example of a camera. - Referring to
FIG. 8 , camera 800 may include a photographing time synchronization unit 810, a photography unit 820, a location/direction angle information collection unit 830, an image information collection unit 840, and an image generation unit 850. - The
time synchronization unit 810 synchronizes the internal time of cameras for synchronizing photographing times of one or more neighboring cameras and the camera 800. In this example, the one or more neighboring cameras and the camera 800 may be included in a previously formed collaborative photography group. - The
photography unit 820 and the one or more neighboring cameras may simultaneously photograph an object at a synchronized photographing time. - The location/direction angle
information collection unit 830 may collect direction angle information and location information of the one or more neighboring cameras and/or direction angle information and location information of the camera 800. - The image
information collection unit 840 may collect image information of the one or more neighboring cameras. - The
image generation unit 850 may process the image information of the one or more neighboring cameras and the camera 800, based on the direction angle information and the location information of the one or more neighboring cameras and the camera 800. For example, the image generation unit 850 may perform 3D rendering based on locations and direction angles of the cameras, and may generate a 3D image of the object. - Accordingly, 3D rendering may be performed based on the locations and direction angles of the cameras and a 3D video and/or a 3D image may be generated.
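A toy example of why the collected locations and direction angles matter geometrically: given two cameras' 2-D positions and their bearings to the same object point, the point can be triangulated by intersecting the two viewing rays. Real 3D rendering from multiple views is far more involved; this only illustrates the role of the collected information, with illustrative names throughout.

```python
import math

def intersect_rays(p1, theta1, p2, theta2):
    """Intersect two 2-D rays given start points and bearings (radians)."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t*d1 = p2 + s*d2 for t using 2-D cross products.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two cameras at (0, 0) and (10, 0) both sighting a point at (5, 5):
pt = intersect_rays((0, 0), math.atan2(5, 5), (10, 0), math.atan2(5, -5))
print(pt)  # ≈ (5.0, 5.0)
```

With more than two cameras, each additional ray over-determines the point and can be used to average out errors in the measured locations and angles.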
- As a non-exhaustive illustration only, the terminal device described herein may refer to mobile devices such as a cellular phone, a personal digital assistant (PDA), a digital camera, a portable game console, an MP3 player, a portable/personal multimedia player (PMP), a handheld e-book, a portable laptop personal computer (PC), a global positioning system (GPS) navigation device, and devices such as a desktop PC, a high definition television (HDTV), an optical disc player, a set-top box, and the like, capable of wireless communication or network communication consistent with that disclosed herein.
- A computing system or a computer may include a microprocessor that is electrically connected with a bus, a user interface, and a memory controller. It may further include a flash memory device. The flash memory device may store N-bit data via the memory controller. The N-bit data is processed or will be processed by the microprocessor and N may be 1 or an integer greater than 1. Where the computing system or computer is a mobile apparatus, a battery may be additionally provided to supply operation voltage of the computing system or computer.
- It should be apparent to those of ordinary skill in the art that the computing system or computer may further include an application chipset, a camera image processor (CIS), a mobile Dynamic Random Access Memory (DRAM), and the like. The memory controller and the flash memory device may constitute a solid state drive/disk (SSD) that uses a non-volatile memory to store data.
- The methods described above may be recorded, stored, or fixed in one or more computer-readable storage media that includes program instructions to be implemented by a computer to cause a processor to execute or perform the program instructions. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. Examples of computer-readable storage media may include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVDs; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. Examples of program instructions include machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations and methods described above, or vice versa. In addition, a computer-readable storage medium may be distributed among computer systems connected through a network and computer-readable codes or program instructions may be stored and executed in a decentralized manner.
- A number of examples have been described above. Nevertheless, it should be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.
Claims (17)
1. A collaborative operation method of a camera, the method comprising:
forming a collaborative photography group through a communication network with at least one another camera;
receiving a photographing request through the communication network in response to a pushing of a shutter button of a single camera included in the collaborative photography group; and
photographing an object in response to the received photographing request.
2. The method of claim 1 , wherein the forming comprises forming the collaborative photography group by sharing an authentication key with the at least one another camera.
3. The method of claim 1 , wherein the communication network comprises a network of a cellular communication system, an ad hoc network, and a wireless local area network (WLAN).
4. The method of claim 1 , wherein the photographing comprises photographing the object simultaneously when the photographing request is received or at a time corresponding to the photographing request.
5. The method of claim 1 , wherein the photographing comprises photographing the object at a photographing time synchronized with cameras included in the collaborative photography group.
6. The method of claim 1 , further comprising:
synchronizing times of cameras included in the collaborative photography group to enable the cameras to photograph the object at the same time.
7. The method of claim 1 , further comprising:
collecting image information generated by cameras included in the collaborative photography group, and processing the image information.
8. A non-transitory computer-readable medium comprising a program for instructing a computer to perform the method of claim 1 .
9. A collaborative operation method using a plurality of independent cameras having a communication function, the method comprising:
synchronizing each of the plurality of cameras through the communication function; and
photographing the same object using the synchronized plurality of cameras.
10. The method of claim 9 , wherein the photographing comprises simultaneously photographing the same object using the plurality of cameras.
11. The method of claim 9 , further comprising:
transmitting images of the object photographed by the plurality of cameras, through the communication function.
12. The method of claim 11 , further comprising:
transmitting location information of the plurality of cameras and direction angle information of the plurality of cameras with respect to the object.
13. A method of obtaining a collaborative photography image using a plurality of independent cameras having a communication function, the method comprising:
synchronizing each of the plurality of cameras through the communication function; and
controlling the synchronized plurality of cameras to photograph the same object.
14. The method of claim 13 , wherein the controlling comprises controlling the plurality of cameras to simultaneously photograph the same object.
15. The method of claim 13 , further comprising:
receiving images of the object photographed by the plurality of cameras, through the communication function.
16. The method of claim 15 , further comprising:
generating a three-dimensional (3D) image using the received images.
17. The method of claim 13 , further comprising:
receiving location information of the plurality of cameras and direction angle information of the plurality of cameras with respect to the object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/964,702 US20130329016A1 (en) | 2009-11-09 | 2013-08-12 | Apparatus and method for generating a three-dimensional image using a collaborative photography group |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2009-0107389 | 2009-11-09 | ||
KR1020090107389A KR101594048B1 (en) | 2009-11-09 | 2009-11-09 | 3 device and method for generating 3 dimensional image using cooperation between cameras |
US12/896,499 US8810632B2 (en) | 2009-11-09 | 2010-10-01 | Apparatus and method for generating a three-dimensional image using a collaborative photography group |
US13/964,702 US20130329016A1 (en) | 2009-11-09 | 2013-08-12 | Apparatus and method for generating a three-dimensional image using a collaborative photography group |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/896,499 Continuation US8810632B2 (en) | 2009-11-09 | 2010-10-01 | Apparatus and method for generating a three-dimensional image using a collaborative photography group |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130329016A1 true US20130329016A1 (en) | 2013-12-12 |
Family
ID=43973885
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/896,499 Active 2031-11-08 US8810632B2 (en) | 2009-11-09 | 2010-10-01 | Apparatus and method for generating a three-dimensional image using a collaborative photography group |
US13/964,702 Abandoned US20130329016A1 (en) | 2009-11-09 | 2013-08-12 | Apparatus and method for generating a three-dimensional image using a collaborative photography group |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/896,499 Active 2031-11-08 US8810632B2 (en) | 2009-11-09 | 2010-10-01 | Apparatus and method for generating a three-dimensional image using a collaborative photography group |
Country Status (2)
Country | Link |
---|---|
US (2) | US8810632B2 (en) |
KR (1) | KR101594048B1 (en) |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9479768B2 (en) * | 2009-06-09 | 2016-10-25 | Bartholomew Garibaldi Yukich | Systems and methods for creating three-dimensional image media |
US8878773B1 (en) | 2010-05-24 | 2014-11-04 | Amazon Technologies, Inc. | Determining relative motion as input |
JP5709545B2 (en) * | 2011-01-18 | 2015-04-30 | キヤノン株式会社 | Imaging device |
US9210393B2 (en) * | 2011-05-26 | 2015-12-08 | Empire Technology Development Llc | Multimedia object correlation using group label |
US20130006953A1 (en) * | 2011-06-29 | 2013-01-03 | Microsoft Corporation | Spatially organized image collections on mobile devices |
KR101315218B1 (en) * | 2011-08-02 | 2013-10-08 | 엘지전자 주식회사 | Terminal and method for outputting signal information of a signal light in the terminal |
US10088924B1 (en) | 2011-08-04 | 2018-10-02 | Amazon Technologies, Inc. | Overcoming motion effects in gesture recognition |
US8683054B1 (en) * | 2011-08-23 | 2014-03-25 | Amazon Technologies, Inc. | Collaboration of device resources |
US9223415B1 (en) | 2012-01-17 | 2015-12-29 | Amazon Technologies, Inc. | Managing resource usage for task performance |
US9052208B2 (en) | 2012-03-22 | 2015-06-09 | Nokia Technologies Oy | Method and apparatus for sensing based on route bias |
JP5743221B2 (en) * | 2012-06-29 | 2015-07-01 | カシオ計算機株式会社 | Wireless synchronization system, wireless device, sensor device, wireless synchronization method, and program |
US9898829B2 (en) | 2012-09-18 | 2018-02-20 | Hanwha Techwin Co., Ltd. | Monitoring apparatus and system using 3D information of images and monitoring method using the same |
KR102046043B1 (en) | 2013-08-21 | 2019-11-18 | 한화테크윈 주식회사 | Monitoring apparatus and system using 3 dimensional information of images, and method thereof |
EP2731336B1 (en) * | 2012-11-12 | 2017-10-18 | Samsung Electronics Co., Ltd | Method and apparatus for generating 3D images using plurality of mobile devices |
US9264598B1 (en) * | 2012-12-12 | 2016-02-16 | Amazon Technologies, Inc. | Collaborative image capturing |
KR101381908B1 (en) * | 2013-01-03 | 2014-04-04 | 한국에이치디방송 주식회사 | A virtual rig system for filming 3d images and the control method using the same |
JP5867432B2 (en) * | 2013-03-22 | 2016-02-24 | ソニー株式会社 | Information processing apparatus, recording medium, and information processing system |
KR101457888B1 (en) * | 2013-05-24 | 2014-11-04 | 주식회사 이에스엠연구소 | 3D image generation method using Reference point |
KR20140141383A (en) * | 2013-05-31 | 2014-12-10 | 삼성전자주식회사 | apparatus for collaboration photographing and method for controlling thereof |
KR102072509B1 (en) * | 2013-06-03 | 2020-02-04 | 삼성전자주식회사 | Group recording method, machine-readable storage medium and electronic device |
GB2528058A (en) * | 2014-07-08 | 2016-01-13 | Ibm | Peer to peer camera communication |
GB2528060B (en) | 2014-07-08 | 2016-08-03 | Ibm | Peer to peer audio video device communication |
GB2528059A (en) | 2014-07-08 | 2016-01-13 | Ibm | Peer to peer camera lighting communication |
CN105791751B (en) * | 2014-12-26 | 2019-05-24 | 浙江大华技术股份有限公司 | Image privacy masking method based on a dome camera, and dome camera |
EP3113485A1 (en) * | 2015-07-03 | 2017-01-04 | LG Electronics Inc. | Mobile terminal and method for controlling the same |
US11095869B2 (en) | 2015-09-22 | 2021-08-17 | Fyusion, Inc. | System and method for generating combined embedded multi-view interactive digital media representations |
US11006095B2 (en) | 2015-07-15 | 2021-05-11 | Fyusion, Inc. | Drone based capture of a multi-view interactive digital media |
US10222932B2 (en) | 2015-07-15 | 2019-03-05 | Fyusion, Inc. | Virtual reality environment based manipulation of multilayered multi-view interactive digital media representations |
US10242474B2 (en) | 2015-07-15 | 2019-03-26 | Fyusion, Inc. | Artificially rendering images using viewpoint interpolation and extrapolation |
US10147211B2 (en) | 2015-07-15 | 2018-12-04 | Fyusion, Inc. | Artificially rendering images using viewpoint interpolation and extrapolation |
KR101729164B1 (en) * | 2015-09-03 | 2017-04-24 | 주식회사 쓰리디지뷰아시아 | Multi camera system image calibration method using multi sphere apparatus |
KR101729165B1 (en) | 2015-09-03 | 2017-04-21 | 주식회사 쓰리디지뷰아시아 | Error correcting unit for time slice image |
US11783864B2 (en) | 2015-09-22 | 2023-10-10 | Fyusion, Inc. | Integration of audio into a multi-view interactive digital media representation |
KR102314611B1 (en) | 2015-09-23 | 2021-10-18 | 삼성전자주식회사 | Bidirectional Synchronizing Camera, Camera System including the Same and Method there-of |
US10979673B2 (en) * | 2015-11-16 | 2021-04-13 | Deep North, Inc. | Inventory management and monitoring |
US10220172B2 (en) | 2015-11-25 | 2019-03-05 | Resmed Limited | Methods and systems for providing interface components for respiratory therapy |
GB2550854B (en) | 2016-05-25 | 2019-06-26 | Ge Aviat Systems Ltd | Aircraft time synchronization system |
US11202017B2 (en) | 2016-10-06 | 2021-12-14 | Fyusion, Inc. | Live style transfer on a mobile device |
US10437879B2 (en) | 2017-01-18 | 2019-10-08 | Fyusion, Inc. | Visual search using multi-view interactive digital media representations |
US10523918B2 (en) | 2017-03-24 | 2019-12-31 | Samsung Electronics Co., Ltd. | System and method for depth map |
US10313651B2 (en) * | 2017-05-22 | 2019-06-04 | Fyusion, Inc. | Snapshots at predefined intervals or angles |
US11069147B2 (en) | 2017-06-26 | 2021-07-20 | Fyusion, Inc. | Modification of multi-view interactive digital media representation |
CN107517360B (en) * | 2017-08-01 | 2020-04-14 | 深圳英飞拓科技股份有限公司 | Image area shielding method and device |
KR102454920B1 (en) | 2018-03-29 | 2022-10-14 | 한화테크윈 주식회사 | Surveillance system and operation method thereof |
US10592747B2 (en) | 2018-04-26 | 2020-03-17 | Fyusion, Inc. | Method and apparatus for 3-D auto tagging |
GB2584282B (en) * | 2019-05-24 | 2021-08-25 | Sony Interactive Entertainment Inc | Image acquisition system and method |
KR20210007697A (en) | 2019-07-12 | 2021-01-20 | 삼성전자주식회사 | Image sensor and electronic device comprising the image sensor |
KR102551857B1 (en) * | 2020-11-30 | 2023-07-13 | (주)오버다임케이 | 3-dimensional image generation apparatus |
WO2022160328A1 (en) * | 2021-02-01 | 2022-08-04 | Huawei Technologies Co., Ltd. | A system for wireless synchronized capturing, and a smart device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020036649A1 (en) * | 2000-09-28 | 2002-03-28 | Ju-Wan Kim | Apparatus and method for furnishing augmented-reality graphic using panoramic image with supporting multiuser |
US20040183915A1 (en) * | 2002-08-28 | 2004-09-23 | Yukita Gotohda | Method, device, and program for controlling imaging device |
US7106361B2 (en) * | 2001-02-12 | 2006-09-12 | Carnegie Mellon University | System and method for manipulating the point of interest in a sequence of images |
US8458462B1 (en) * | 2008-08-14 | 2013-06-04 | Juniper Networks, Inc. | Verifying integrity of network devices for secure multicast communications |
Family Cites Families (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6055012A (en) * | 1995-12-29 | 2000-04-25 | Lucent Technologies Inc. | Digital multi-view video compression with complexity and compatibility constraints |
JP2002027495A (en) | 2000-07-03 | 2002-01-25 | Sony Corp | Three-dimensional image generating system, three- dimensional image generating method and three- dimensional information service system, and program providing medium |
US6959120B1 (en) * | 2000-10-27 | 2005-10-25 | Microsoft Corporation | Rebinning methods and arrangements for use in compressing image-based rendering (IBR) data |
JP2002209208A (en) * | 2001-01-11 | 2002-07-26 | Mixed Reality Systems Laboratory Inc | Image processing unit and its method, and storage medium |
US20030076413A1 (en) * | 2001-10-23 | 2003-04-24 | Takeo Kanade | System and method for obtaining video of multiple moving fixation points within a dynamic scene |
US7046292B2 (en) * | 2002-01-16 | 2006-05-16 | Hewlett-Packard Development Company, L.P. | System for near-simultaneous capture of multiple camera images |
CN100523715C (en) * | 2002-12-27 | 2009-08-05 | 有泽博 | Multi-view-point video capturing system |
US20040162154A1 (en) * | 2003-02-14 | 2004-08-19 | Dejohn David | Kinetic motion analyzer |
US20050185711A1 (en) * | 2004-02-20 | 2005-08-25 | Hanspeter Pfister | 3D television system and method |
KR20060006304A (en) | 2004-07-15 | 2006-01-19 | 주식회사 팬택 | Master/slave wireless telecommunication terminal and its method for providing a function of photographing a panorama |
US7512261B2 (en) * | 2004-07-27 | 2009-03-31 | Microsoft Corp. | System and method for calibrating multiple cameras without employing a pattern by inter-image homography |
US20060023782A1 (en) * | 2004-07-27 | 2006-02-02 | Microsoft Corporation | System and method for off-line multi-view video compression |
US20060023787A1 (en) * | 2004-07-27 | 2006-02-02 | Microsoft Corporation | System and method for on-line multi-view video compression |
US7778770B2 (en) * | 2005-03-31 | 2010-08-17 | Honda Motor Co., Ltd. | Communication system between vehicles |
KR100629442B1 (en) | 2005-06-30 | 2006-09-27 | 주식회사 팬택 | Method and apparatus for photographing wide image by using plural camera-phone |
KR20070058263A (en) | 2005-12-03 | 2007-06-08 | 엘지전자 주식회사 | Method for photographing panorama image using plural cameras |
JP4570159B2 (en) * | 2006-01-06 | 2010-10-27 | Kddi株式会社 | Multi-view video encoding method, apparatus, and program |
JP4463215B2 (en) * | 2006-01-30 | 2010-05-19 | 日本電気株式会社 | Three-dimensional processing apparatus and three-dimensional information terminal |
EP1821116B1 (en) * | 2006-02-15 | 2013-08-14 | Sony Deutschland Gmbh | Relative 3D positioning in an ad-hoc network based on distances |
CN101518090B (en) * | 2006-09-20 | 2011-11-16 | 日本电信电话株式会社 | Image encoding method, decoding method, image encoding device and image decoding device |
JP4800163B2 (en) * | 2006-09-29 | 2011-10-26 | 株式会社トプコン | Position measuring apparatus and method |
EP2129999B1 (en) * | 2007-03-23 | 2019-09-04 | QUALCOMM Incorporated | Multi-sensor data collection and/or processing |
KR101396008B1 (en) | 2007-03-26 | 2014-05-16 | 삼성전자주식회사 | Method and apparatus for acquiring multiview video image |
US7873182B2 (en) * | 2007-08-08 | 2011-01-18 | Brijot Imaging Systems, Inc. | Multiple camera imaging method and system for detecting concealed objects |
US20090091798A1 (en) * | 2007-10-05 | 2009-04-09 | Lawther Joel S | Apparel as event marker |
CN101453662B (en) * | 2007-12-03 | 2012-04-04 | 华为技术有限公司 | Stereo video communication terminal, system and method |
DE102008007199A1 (en) * | 2008-02-01 | 2009-08-06 | Robert Bosch Gmbh | Masking module for a video surveillance system, method for masking selected objects and computer program |
EP2314072B1 (en) * | 2008-07-16 | 2014-08-27 | SISVEL International S.A. | Track and track-subset grouping for multi view video decoding. |
US8106924B2 (en) * | 2008-07-31 | 2012-01-31 | Stmicroelectronics S.R.L. | Method and system for video rendering, computer program product therefor |
EP2214137B1 (en) * | 2009-01-29 | 2024-04-03 | Vestel Elektronik Sanayi ve Ticaret A.S. | A method and apparatus for frame interpolation |
US9648346B2 (en) * | 2009-06-25 | 2017-05-09 | Microsoft Technology Licensing, Llc | Multi-view video compression and streaming based on viewpoints of remote viewer |
US8254755B2 (en) * | 2009-08-27 | 2012-08-28 | Seiko Epson Corporation | Method and apparatus for displaying 3D multi-viewpoint camera video over a network |
US9083956B2 (en) * | 2009-09-28 | 2015-07-14 | Samsung Electronics Co., Ltd. | System and method for creating 3D video |
US11711592B2 (en) * | 2010-04-06 | 2023-07-25 | Comcast Cable Communications, Llc | Distribution of multiple signals of video content independently over a network |
KR20110116525A (en) * | 2010-04-19 | 2011-10-26 | 엘지전자 주식회사 | Image display device and operating method for the same |
US8675049B2 (en) * | 2011-06-09 | 2014-03-18 | Microsoft Corporation | Navigation model to render centered objects using images |
- 2009-11-09 KR KR1020090107389A patent/KR101594048B1/en active IP Right Grant
- 2010-10-01 US US12/896,499 patent/US8810632B2/en active Active
- 2013-08-12 US US13/964,702 patent/US20130329016A1/en not_active Abandoned
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9877292B2 (en) | 2014-11-20 | 2018-01-23 | Qualcomm Incorporated | Collaborative data capturing apparatuses and methods |
CN108702497A (en) * | 2016-02-02 | 2018-10-23 | 三星电子株式会社 | Three-dimensional camera for shooting the image for providing virtual reality |
US10750156B2 (en) | 2016-02-02 | 2020-08-18 | Samsung Electronics Co., Ltd. | Three-dimensional camera for capturing image to provide virtual reality |
CN108076387A (en) * | 2016-12-29 | 2018-05-25 | 北京市商汤科技开发有限公司 | Business object method for pushing and device, electronic equipment |
CN111324131A (en) * | 2020-03-31 | 2020-06-23 | 中通服创立信息科技有限责任公司 | Following monitoring method of track type inspection robot based on human body radar |
Also Published As
Publication number | Publication date |
---|---|
US8810632B2 (en) | 2014-08-19 |
US20110109726A1 (en) | 2011-05-12 |
KR20110050843A (en) | 2011-05-17 |
KR101594048B1 (en) | 2016-02-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8810632B2 (en) | Apparatus and method for generating a three-dimensional image using a collaborative photography group | |
TWI619397B (en) | Methods and systems for a ranging protocol | |
JP5965708B2 (en) | Wireless communication device, memory device, wireless communication system, wireless communication method, and program | |
US20180077234A1 (en) | Collaborative media capture and sharing system | |
US9277101B2 (en) | Method and system for generating interpolations of captured video content | |
US9341483B2 (en) | Methods and apparatus for position estimation | |
KR20130092522A (en) | Device and method for cooperation between cameras | |
JP2014112302A (en) | Prescribed area management system, communication method, and program | |
US9113068B1 (en) | Facilitating coordinated media and/or information capturing and aggregation | |
US20150139601A1 (en) | Method, apparatus, and computer program product for automatic remix and summary creation using crowd-sourced intelligence | |
WO2010127308A2 (en) | Automatic content tagging, such as tagging digital images via a wireless cellular network using metadata and facial recognition | |
JP6073378B2 (en) | POSITIONING METHOD, POSITIONING DEVICE, AND COMPUTER PROGRAM | |
CN109804346B (en) | Radio apparatus and radio system | |
JP5067477B2 (en) | Imaging parameter acquisition apparatus, imaging parameter acquisition method, and program | |
US20140354779A1 (en) | Electronic device for collaboration photographing and method of controlling the same | |
US11736802B2 (en) | Communication management apparatus, image communication system, communication management method, and recording medium | |
US9052866B2 (en) | Method, apparatus and computer-readable medium for image registration and display | |
WO2014183533A1 (en) | Image processing method, user terminal, and image processing terminal and system | |
US20160100110A1 (en) | Apparatus, Method And Computer Program Product For Scene Synthesis | |
JP5945966B2 (en) | Portable terminal device, portable terminal program, server, and image acquisition system | |
JP2016194784A (en) | Image management system, communication terminal, communication system, image management method, and program | |
JP2014120878A (en) | Predetermined-zone management system, predetermined-zone management method, and program | |
KR100853379B1 (en) | Method for transforming based position image file and service server thereof | |
KR20180041430A (en) | Mobile terminal and operating method thereof | |
WO2020062919A1 (en) | Data processing method, mec server and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |