WO2020032371A1 - Image sharing method and device - Google Patents
- Publication number
- WO2020032371A1 (PCT/KR2019/006916)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- user device
- view
- angle
- target user
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
- H04N21/2381—Adapting the multiplex stream to a specific network, e.g. an Internet Protocol [IP] network
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/643—Communication protocols
- H04N21/6437—Real-time Transport Protocol [RTP]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
Definitions
- the present invention relates to an image sharing method and apparatus. More particularly, the present invention relates to a method for sharing an image between user devices by using a relay device.
- An omnidirectional imaging system refers to an imaging system capable of recording image information in all directions (360 degrees) based on a specific viewpoint. Because it can obtain a much wider field-of-view image than a conventional imaging system, its applications are becoming increasingly widespread, not only in research fields such as computer vision and mobile robotics but also in practical areas such as surveillance systems, virtual reality systems, pan-tilt-zoom cameras, and video conferencing.
- an omnidirectional image may be generated by stitching together images obtained by rotating one camera about an optical axis that satisfies a single viewpoint.
- a method of arranging a plurality of cameras in an annular structure and combining images obtained from each camera may be used.
- a user may generate an omnidirectional image using various omnidirectional image processing apparatuses (or omnidirectional image processing cameras or 360 degree cameras).
- the omnidirectional image processing apparatus may be utilized in various areas. For example, it can be used in areas requiring omnidirectional video surveillance, such as security, or it can be used to record the places visited by travelers.
- the omnidirectional image photographed based on the omnidirectional image processing apparatus may be edited and used as an image for sale of a product.
- the omnidirectional image processing apparatus may be used for various purposes.
- An object of the present invention is to solve all the above-mentioned problems.
- an object of the present invention is to transmit an image (for example, an omnidirectional image) to a target user device by using a relay device such as a user device.
- an object of the present invention is to allow a user of a target user device located at a far distance to receive an image captured by the image processing device in real time to share a situation at a location of the image processing device at a long distance.
- an object of the present invention is to prevent network load by minimizing unnecessary image data when sharing an image, and to prevent image disconnection.
- an image sharing method includes connecting, by a target user device, to a relay user device based on a first communication network, and receiving, by the target user device, image data captured and generated by an image processing apparatus in real time based on the first communication network, wherein the image processing apparatus and the relay user device may be connected based on a second communication network.
- a target user device for performing image sharing includes a communication unit implemented to receive image data and a processor operatively connected to the communication unit, wherein the processor may be configured to connect to a relay user device based on a first communication network and to receive image data generated by an image processing apparatus in real time based on the first communication network, and wherein the image processing apparatus and the relay user device may be connected based on a second communication network.
- an image (for example, an omnidirectional image) may be transmitted to the target user device by using a relay device such as a user device.
- a user of a remotely located target user device may receive an image captured by the image processing device in real time so that the situation at the location of the image processing device may be shared remotely.
- network load can be reduced by minimizing unnecessary image data when sharing images, and image breakage can be prevented.
- FIG. 1 is a conceptual diagram illustrating a video sharing method according to an embodiment of the present invention.
- FIG. 2 is a conceptual diagram illustrating a method of transmitting an image based on a user device according to an embodiment of the present invention.
- FIG. 3 is a conceptual diagram illustrating a method of transmitting an image based on a user device according to an embodiment of the present invention.
- FIG. 4 is a conceptual diagram illustrating a method of transmitting an image through a user device according to an embodiment of the present invention.
- FIG. 5 is a conceptual diagram illustrating a method of transmitting an image through a user device according to an embodiment of the present invention.
- FIG. 6 is a conceptual diagram illustrating a method of checking an image through a user device located at a far distance according to an embodiment of the present invention.
- FIG. 7 is a conceptual diagram illustrating a method for receiving an image of a specific field of view selected by a target user according to an embodiment of the present invention.
- FIG. 8 is a conceptual diagram illustrating a method of checking an image in a target user device according to an embodiment of the present invention.
- hereinafter, for convenience of description, it is assumed that the image processing apparatus is an omnidirectional image processing apparatus and that an image captured by the image processing apparatus is an omnidirectional image.
- however, the image processing apparatus may include not only an omnidirectional image processing apparatus but also various other types of image processing apparatuses, and the image may include an image of various angles of view.
- FIG. 1 is a conceptual diagram illustrating a video sharing method according to an embodiment of the present invention.
- in FIG. 1, a method for transmitting an image (e.g., an omnidirectional image) captured by an image processing apparatus (e.g., an omnidirectional image processing apparatus) to a target user device is disclosed.
- the target user device may be a device that is not directly connected to the image processing device and receives an image captured by the image processing device.
- a relay user may wear the image processing apparatus 100 and carry the relay user apparatus 110.
- the image processing apparatus is a wearable image processing apparatus that may be worn by a relay user in the form of a neckband.
- the image processing apparatus may include not only an image processing apparatus for capturing a 360 degree image, but also an image processing apparatus for capturing an image having a lower angle of view.
- for convenience of description, it is assumed that the relay user wears the image processing apparatus 100 and carries the relay user apparatus 110; however, the user wearing the image processing apparatus 100 and the user carrying the relay user apparatus 110 may be different.
- the target user may carry the target user device 120 and receive an image captured by the image processing device 100.
- Omni-directional image data generated by the image processing apparatus 100 may be transmitted to the target user device 120 through the relay user device 110.
- the target user may check the omnidirectional image of the current situation of the relay user at a distance through the target user device 120.
- FIG. 2 is a conceptual diagram illustrating a method of transmitting an image based on a user device according to an embodiment of the present invention.
- an image checking method based on a target user device is disclosed.
- the communication procedure disclosed in FIG. 2 is one example, and various other methods may be utilized to identify an image through the target user device.
- the image processing apparatus may include an omnidirectional image processing apparatus, and the image may include an omnidirectional image.
- the image processing apparatus 200 may operate.
- a first communication connection may be performed between the image processing apparatus 200 and the relay user device 210.
- the relay user device 210 may act as a router based on a function such as a hotspot.
- the image processing apparatus 200 may be connected to the relay user device 210 serving as a router.
- the target user device 220 may be connected to the image processing device 200 from the outside based on port forwarding.
- Port forwarding (or port mapping) is the process of redirecting a request addressed to one combination of an Internet Protocol (IP) address and port number to another combination as a packet crosses a network gateway, such as a router or firewall, in a computer network; it is a network address translation (NAT) technique.
- the relay user device 210 may set IP:PhoneA and port number:PhoneA for the image processing apparatus 200, and the image processing apparatus 200 and the target user device 220 may be connected through this port forwarding technology.
- the target user device 220 may receive image data transmitted through the image processing apparatus 200 by inputting the IP and port number set by the relay user device. Alternatively, the information about the IP and port number may be automatically transferred between the relay user device 210 and the target user device 220 through a separate application, so that the target user device can receive the data generated by the image processing apparatus without additional input.
- the target user device 220 may check an image captured by the image processing apparatus 200 transmitted based on a video transmission protocol such as a real-time streaming protocol (RTSP) scheme.
- Port forwarding and real-time streaming protocol (RTSP) schemes are examples, and sharing of images may be performed in various ways.
- that is, the target user device may connect to the relay user device based on the first communication network and receive, in real time, image data captured and generated by the image processing apparatus based on the first communication network, while the image processing apparatus and the relay user device may be connected based on the second communication network.
- the first communication network may be generated based on port forwarding using the IP and port information set by the relay user device for the target user device, and the second communication network may be generated based on a hotspot function of the relay user device.
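Put together, the first communication network amounts to the target user device opening a stream address built from the relay-assigned IP and port. The sketch below is a minimal illustration under the assumption of an RTSP server on the image processing apparatus; the `rtsp_url` helper and the `live` stream path are hypothetical names, not taken from this document:

```python
def rtsp_url(ip: str, port: int, stream_path: str = "live") -> str:
    """Build the RTSP address a target user device would open.

    The relay user device (acting as a hotspot/router) forwards this
    external IP:port combination to the image processing apparatus's
    internal address via port forwarding.
    """
    return f"rtsp://{ip}:{port}/{stream_path}"

# Entered manually by the target user, or delivered automatically
# through a separate application as described above.
url = rtsp_url("203.0.113.7", 8554)
```

A player on the target user device would then open `url` to receive the captured image data in real time.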
- FIG. 3 is a conceptual diagram illustrating a method of transmitting an image based on a user device according to an embodiment of the present invention.
- In FIG. 3, a method of performing communication between an image processing apparatus and a user device through at least one access point (AP) is disclosed.
- the image processing apparatus 300 may start an operation and capture an image (eg, omnidirectional image).
- AP1 310 and AP2 320 may be connected in a wireless distribution system (WDS) manner.
- the image processing apparatus 300 may be connected to the AP1 310, and the user device 330 may be connected to the AP2 320.
- the user device 330 may input an internal IP and a port number of the image processing device 300 in the application.
- the user device 330 may check the image captured by the image processing apparatus in the RTSP method.
- the internal IP and port number of the image processing apparatus 300 may be automatically input based on the application without additional input on the user device.
- FIG. 4 is a conceptual diagram illustrating a method of transmitting an image through a user device according to an embodiment of the present invention.
- In FIG. 4, a method for reducing the size of transmitted image data among the image transmission methods based on a user device is disclosed.
- the image processing apparatus is an omnidirectional image processing apparatus and the image is an omnidirectional image (360 degree image).
- a method is disclosed for reducing the amount of image data transmitted in real time by providing the target user device with only an image corresponding to a part of the angle of view of the omnidirectional image transmitted to the relay user device.
- the entire image data corresponding to the entire angle of view image (eg, 360 degree image) captured by the image processing apparatus may be transmitted to the relay user apparatus directly connected to the image processing apparatus.
- the image checked by the target user device may be an image of a partial angle of view that the target user is interested in, not the entire angle-of-view image. Therefore, the relay user device may transmit not the entire angle-of-view image data but only partial image data 450 for the partial field of view in which the target user is interested. In this way, the burden on the communication network is reduced, and unnecessary frame drops due to the data load can be prevented.
- the target user device may transmit the image information 400 of the output image currently output by the target user device to the relay user device through the application.
- the image information 400 may include information about an output image angle of view output by the target user device.
- the target user device may transmit the image information 400 to the relay user device.
- the relay user device may receive the image information 400 and determine the partial image data 450 to be transmitted to the target user device based on the image information.
- the relay user device may transmit, to the target user device, image data covering up to the reference image area 420 determined based on the output image angle-of-view information included in the image information 400 and the peripheral image area 440 of the reference image area 420.
- for example, the reference image area 420 may be an image area of an angle of view of -30 degrees to +30 degrees indicated by the target user device, and the peripheral image area 440 may be a peripheral image area (e.g., an image area of -60 degrees to -30 degrees and +30 degrees to +60 degrees) extended from the -30 degree to +30 degree area based on a threshold.
- a threshold percentage for determining the peripheral image area may change according to the communication state and the frequency of the user's screen movement. As the communication state is relatively good, the threshold percentage may be relatively increased; as the communication state is relatively poor, the threshold percentage may be relatively decreased. Likewise, as the screen movement frequency of the user of the target user device is relatively high, the threshold percentage may be relatively increased, and as the screen movement frequency is relatively low, the threshold percentage may be relatively decreased.
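The reference/peripheral split described above can be sketched as follows; the 50% figure and the function name `peripheral_area` are assumptions chosen only to reproduce the -60/+60 degree example:

```python
def peripheral_area(ref_start: float, ref_end: float, threshold_pct: float):
    """Extend a reference angle-of-view interval by threshold_pct of its
    width on each side, returning the two peripheral intervals.

    A -30..+30 degree reference area with a 50% threshold yields the
    -60..-30 and +30..+60 degree peripheral areas of the example above.
    """
    width = ref_end - ref_start
    margin = width * threshold_pct / 100.0
    return (ref_start - margin, ref_start), (ref_end, ref_end + margin)
```

The relay user device would transmit the reference interval plus both peripheral intervals, and nothing else, to the target user device.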
- in other words, the original image data captured and generated by the image processing apparatus may be image data of a first angle of view, the transmitted image data may be image data of a second angle of view, the first angle of view may be greater than the second angle of view, and the second angle of view may be determined by the target user device.
- the target user device generates the image information, the image information includes output image angle-of-view information indicating the second angle of view, the output image angle-of-view information includes information on the output image angle of view to be output from the target user device, and the output image angle of view may be determined based on the direction of the image processing apparatus or an absolute direction.
- FIG. 5 is a conceptual diagram illustrating a method of transmitting an image through a user device according to an embodiment of the present invention.
- In FIG. 5, a method for reducing the size of transmitted image data among the image transmission methods based on a user device is disclosed.
- the image processing apparatus is an omnidirectional image processing apparatus and the image is an omnidirectional image (360 degree image).
- the image information 500 for reducing the size of the image data may include output image direction information, output image horizontal angle of view information, and output image vertical angle of view information.
- the output image direction information is information on the imaging direction of the image processing apparatus and may be determined based on at least one camera among a plurality of cameras implemented in the image processing apparatus. For example, when a user wears an image processing apparatus in the form of a neckband, a specific direction (e.g., the front direction of the user) captured by a specific camera included in the image processing apparatus may be the output image direction.
- the output image horizontal angle of view information may indicate an angle of view in the horizontal direction based on the direction indicated by the output image direction information.
- the horizontal direction may be a longitude direction.
- the output image horizontal angle of view information may include information about the horizontal angle of view selected by the target user.
- the output image vertical angle of view information may indicate the angle of view of the vertical direction based on the direction indicated by the output image direction information.
- the vertical direction may be a latitude direction.
- the output image vertical angle of view information may include information about the vertical angle of view selected by the target user.
- the relay user device may receive the image information 500 including the output image direction information, the output image horizontal angle-of-view information, and the output image vertical angle-of-view information, and determine, based on the image information 500, the partial image data 550 to be transmitted to the target user device.
- the relay user device may transmit, to the target user device, the reference image area 520 determined based on the output image direction information, the output image horizontal angle-of-view information, and the output image vertical angle-of-view information included in the image information 500, together with the peripheral image area 540 of the reference image area 520.
- the reference image area 520 may be an area corresponding to a first direction, an output image horizontal angle of view (-30 degrees to +30 degrees), and an output image vertical angle of view (-30 degrees to +30 degrees) indicated by the target user device.
- the peripheral image area 540 may be an area in which, based on the first direction, the output image horizontal angle of view (-30 degrees to +30 degrees) is extended by a first threshold percentage and the output image vertical angle of view (-30 degrees to +30 degrees) is extended by a second threshold percentage.
- when the direction that the target user wants to see or the angle of view changes, information about the changed direction and the changed angle of view may be transmitted from the target user device to the relay user device, and the reference image area 520 and the peripheral image area 540 may be changed accordingly.
- the first threshold percentage and/or the second threshold percentage for determining the peripheral image area 540 may be changed according to the communication state and the frequency of the user's screen movement. A relatively good communication state may relatively increase the first and/or second threshold percentages, and a relatively poor communication state may relatively decrease them. Likewise, the first threshold percentage and/or the second threshold percentage may be relatively increased as the screen movement frequency of the user of the target user device is relatively high, and relatively decreased as the screen movement frequency is relatively low.
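The document gives only the direction of the adjustment (a better link or more frequent screen movement means a larger threshold percentage), not a formula, so the scaling below is an illustrative assumption:

```python
def adaptive_threshold_pct(base_pct: float, link_quality: float,
                           screen_move_freq: float) -> float:
    """Scale a base threshold percentage by communication state and
    screen-movement frequency.

    link_quality in [0, 1] and screen_move_freq in moves per second
    are hypothetical measures not defined in the source. Both factors
    grow monotonically, so a better link or a more restless viewer
    yields a wider peripheral area.
    """
    scale = (0.5 + link_quality) * (0.5 + min(screen_move_freq, 1.0))
    return base_pct * scale
```

Any monotonic rule would serve; the point is only that the peripheral margin grows when the network can afford it or when the viewer is likely to look outside the current reference area.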
- FIG. 6 is a conceptual diagram illustrating a method of checking an image through a user device located at a far distance according to an embodiment of the present invention.
- In FIG. 6, a method of automatically adjusting a screen of a target user device for the viewing convenience of a remotely located user is disclosed.
- an image captured by the image processing apparatus may be delivered to the target user apparatus through the relay user apparatus or the AP as described above.
- the target user device may provide an image to the target user based on a reference point set by the target user.
- an image may be provided to the target user device based on the reference point 600 set by the target user.
- the reference point 600 may be set such that an image of a specific direction, a specific horizontal angle of view, and a specific vertical angle of view is positioned in the middle of the display of the target user device. In this case, even when the position of the image processing apparatus changes or is shaken, an image maintaining a specific horizontal / vertical angle of view and a specific direction may be provided on the target user device.
- the target user may set a specific direction and a specific horizontal / vertical angle of view in the omnidirectional image reproduced on the target user device.
- for example, the target user may set a direction and a horizontal/vertical angle of view in the omnidirectional image by using a touch screen on the target user device, and through a specific touch on the user interface, output image direction information 620 and output image angle-of-view information 640 may be generated as screen setting information.
- the target user device may output an image indicated by the screen setting information of the omnidirectional image on the target user device. For example, the user sets a horizontal angle of view of -60 degrees to +60 degrees, a vertical angle of view of -45 degrees to +45 degrees based on the reference point 600, and sets an x direction based on the image processing device as a selection direction.
- a screen determined based on the screen setting information among the omnidirectional images may be output. That is, only a specific region of the omnidirectional image may be output on the target user device, and through this method, the target user device may continuously receive a screen of a specific field of view regardless of the real-time movement of the image processing device.
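Assuming the omnidirectional frame is stored equirectangularly (longitude -180..+180 degrees across the width, latitude +90..-90 degrees down the height, a common convention not stated in this document), the screen setting information maps to a fixed pixel rectangle, which is what keeps the output stable regardless of the real-time movement of the image processing apparatus:

```python
def viewport_rect(frame_w: int, frame_h: int, yaw_deg: float,
                  pitch_deg: float, h_fov_deg: float, v_fov_deg: float):
    """Pixel rectangle (x0, y0, x1, y1) of a viewport in an
    equirectangular frame. Simplified sketch: wrap-around at the
    +/-180 degree seam is not handled."""
    def x_of(lon: float) -> float:
        return (lon + 180.0) / 360.0 * frame_w

    def y_of(lat: float) -> float:
        return (90.0 - lat) / 180.0 * frame_h

    x0 = x_of(yaw_deg - h_fov_deg / 2)
    x1 = x_of(yaw_deg + h_fov_deg / 2)
    y0 = y_of(pitch_deg + v_fov_deg / 2)
    y1 = y_of(pitch_deg - v_fov_deg / 2)
    return (round(x0), round(y0), round(x1), round(y1))

# The -60..+60 / -45..+45 degree example above, on a 3840x1920 frame.
rect = viewport_rect(3840, 1920, 0.0, 0.0, 120.0, 90.0)
```

Cropping the omnidirectional frame to `rect` each frame yields the specific-region output described above.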
- FIG. 7 is a conceptual diagram illustrating a method for receiving an image of a specific field of view selected by a target user according to an embodiment of the present invention.
- In FIG. 7, a method is disclosed for extracting direction and angle-of-view information for the screen of a specific field of view when the target user selects that screen to be tracked. This may be applied to the method of determining the output image direction information, the output image horizontal angle-of-view information, and the output image vertical angle-of-view information described above with reference to FIGS. 4 and 5.
- the output image direction information may be determined by matching the omnidirectional image and the selected screen (step S710); that is, direction information may be determined based on the selection screen.
- the omnidirectional image may be shaped like a sphere, and the output image direction in the omnidirectional image may be set in various ways.
- the output image direction may be generated based on the imaging directions of the plurality of cameras implemented in the image processing apparatus. Assuming that the image processing apparatus is at the center of the sphere, direction information over 360 degrees may be determined based on the direction of the imaging line of the first camera of the image processing apparatus, and the output image direction information may be determined based on which of the 360 degrees the direction of the selection screen selected by the user indicates.
- the imaging line may be a virtual line extending in the direction indicated by the center of the camera lens.
- the output image direction may be a direction relatively determined based on a specific direction (for example, north) rather than the direction of the image processing apparatus.
- the output image horizontal angle of view information / output image vertical angle of view information may be determined by matching the omnidirectional image with the selection screen (step S720).
- the output image horizontal angle of view information and the output image vertical angle of view information may be determined as the horizontal angle of view and the vertical angle of view based on the output image direction.
- the horizontal angle of view may be in the longitude direction on the sphere, and the vertical angle of view may be in the latitude direction on the sphere.
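Under the same hypothetical equirectangular mapping (longitude over the width, latitude over the height), the matching of steps S710 and S720 can be sketched as the inverse operation: recovering a direction and horizontal/vertical angles of view from the rectangle the target user selects:

```python
def selection_to_view(frame_w: int, frame_h: int,
                      x0: int, y0: int, x1: int, y1: int):
    """From a selected pixel rectangle on an equirectangular frame,
    recover ((lon, lat) of the selection centre, horizontal angle of
    view, vertical angle of view). The pixel-to-angle mapping is an
    assumption, not specified in this document."""
    lon0 = x0 / frame_w * 360.0 - 180.0
    lon1 = x1 / frame_w * 360.0 - 180.0
    lat0 = 90.0 - y0 / frame_h * 180.0
    lat1 = 90.0 - y1 / frame_h * 180.0
    direction = ((lon0 + lon1) / 2, (lat0 + lat1) / 2)
    return direction, lon1 - lon0, lat0 - lat1
```

The recovered direction and angles of view can then be sent to the relay user device as the output image direction and horizontal/vertical angle-of-view information, so a similar range of view keeps being provided despite the movement of the image capturing apparatus.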
- the target user may be provided with a similar range of view despite the movement of the image capturing apparatus.
- the image direction selected by the user may be set based on an absolute position (for example, north), or may be set to a relative position (front of the user, first camera image pickup line, etc.).
- when an absolute position is set, an image for the specific direction may be continuously provided regardless of the direction change of the user wearing the image processing apparatus.
- when a relative position is selected as the image direction, an image reflecting the change of direction of the user wearing the image processing apparatus may be continuously provided.
- in addition, the image output from the target user device may be changed in consideration of the change of direction of the target user device itself. For example, when the target user device rotates 30 degrees to the left based on a sensor of the target user device, the angle of view of the screen output from the target user device may be adjusted by -30 degrees, so that the screen output from the user device is determined to further reflect the movement of the target user device.
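A minimal sketch of this compensation follows; the sign convention (positive rotation to the left) is an assumption:

```python
def compensated_yaw(view_yaw_deg: float, device_rotation_deg: float) -> float:
    """Shift the output angle of view opposite to the target user
    device's own rotation (e.g. 30 degrees left -> adjust by -30
    degrees), normalised to the -180..+180 degree range."""
    yaw = view_yaw_deg - device_rotation_deg
    return (yaw + 180.0) % 360.0 - 180.0
```

Applying this per sensor update makes the output screen track the target user device's physical movement on top of the selected viewing direction.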
- the target user device may also provide a function of rotating the captured image.
- after capturing the omnidirectional image provided in real time, the omnidirectional image may be rotated on a touch basis, and various angles of view and times that the target user did not check may be provided according to the rotation.
- omnidirectional image data related to the entire captured image may be transmitted to the target user device after the capture.
- since the omnidirectional image is provided in real time, if the target user resets the capture function, the omnidirectional image from the time after the reset may be provided to the target user again.
- FIG. 8 is a conceptual diagram illustrating a method of checking an image in a target user device according to an embodiment of the present invention.
- In FIG. 8, a method of recommending a region of interest to a target user device is disclosed.
- the image processing apparatus is an omnidirectional image processing apparatus and the image is an omnidirectional image.
- a recommendation screen may be provided to the target user device so that the target user may receive images of various angles of view. For example, when the target user is currently provided with a screen having an angle of view of -30 degrees to +30 degrees as the selection screen, a screen for another angle of view of the omnidirectional image may be provided as the recommendation screen.
- the relay user apparatus or the image processing apparatus may determine an angle of view in which a lot of change occurs and provide a screen for the angle of view to the target user as a recommendation screen.
- an optimal view to be provided to the target user may be found and provided to the target user.
- the default screen 850 may be provided to the target user with the same view as that of the user wearing the omnidirectional image processing apparatus.
- the field of view of the wearer of the omnidirectional image processing apparatus may change according to the wearer's operation, and a more natural default screen 850 may be provided to the target user through learning about this change.
- the current state 800 of the user wearing the omnidirectional image processing apparatus may be determined.
- it may be determined whether the user who wears the omnidirectional image processing apparatus (hereinafter, the wearer of the omnidirectional image processing apparatus) is standing, sitting, moving, or climbing stairs. Determination of the operation (or current state 800) of the wearer of the omnidirectional image processing apparatus may be performed based on a sensor implemented in the omnidirectional image processing apparatus. Alternatively, the determination of the operation (or current state 800) of the wearer of the omnidirectional image processing apparatus may be performed based on image changes in the omnidirectional image processing apparatus.
- Learning about the wearer of the omnidirectional image processing apparatus may be performed, and a more accurate operation (or current state 800) of the current wearer may be determined based on the learning result.
- the default screen 850 to be provided to the current target user may be determined based on the current state 800 of the wearer of the omnidirectional image processing apparatus.
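The mapping from the wearer's determined state to the default screen can be sketched as follows (the state labels follow the standing/sitting/moving/stair-climbing states described above, but the specific view windows are assumptions for illustration):

```python
# Map the wearer's determined current state to a default view window,
# assuming a moving wearer looks ahead more narrowly while a
# stationary wearer's likely gaze region is wider.
DEFAULT_VIEW_BY_STATE = {
    "standing": (-60, 60),        # matches the wearer's forward field of view
    "sitting": (-60, 60),
    "moving": (-45, 45),          # narrower window centered on walking direction
    "climbing_stairs": (-45, 45),
}

def default_screen(state: str) -> tuple[int, int]:
    """Return the (start, end) angle window used as the default screen 850."""
    # Fall back to the forward view when the state cannot be determined.
    return DEFAULT_VIEW_BY_STATE.get(state, (-60, 60))
```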
- for example, when the imaging line of the first camera is set based on the front of the wearer of the omnidirectional image processing apparatus, a screen in the -60 to +60 degree direction may correspond to the current field of view of the wearer of the omnidirectional image processing apparatus.
- by reflecting the current state 800 of the wearer of the omnidirectional image processing apparatus, a screen whose field of view is the same as or similar to that of the wearer may be determined.
- the corresponding screen may be provided as the default screen 850 to the target user device.
- the determination of the default screen 850 may be performed by at least one of the omnidirectional image processing apparatus, the relay user apparatus, and the target user apparatus.
- the target user device may determine the current state 800 of the wearer of the omnidirectional image processing apparatus based on motion sensing information transmitted by the omnidirectional image processing apparatus and on screen changes of the omnidirectional image processing apparatus, and may provide the default screen 850 accordingly.
- Embodiments according to the present invention described above can be implemented in the form of program instructions that can be executed by various computer components, and recorded in a computer-readable recording medium.
- the computer-readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
- the program instructions recorded on the computer readable recording medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the computer software field.
- Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape, optical recording media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, flash memory, and the like.
- Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
- Hardware devices may be configured to operate as one or more software modules to perform the processing according to the present invention, and vice versa.
Abstract
Description
Claims (8)
- 1. An image sharing method comprising: connecting, by a target user device, to a relay user device based on a first communication network; and receiving, by the target user device, in real time, image data captured and generated by an image processing apparatus based on the first communication network, wherein the image processing apparatus and the relay user device are connected based on a second communication network.
- 2. The method of claim 1, wherein the first communication network is generated based on IP (internet protocol) information and port information of the target user device set by the relay user device based on port forwarding, and the second communication network is generated based on a hotspot function of the relay user device.
- 3. The method of claim 1, wherein original image data captured and generated by the image processing apparatus is image data of a first angle of view, the image data is image data of a second angle of view, the first angle of view is larger than the second angle of view, and the second angle of view is determined by the target user device.
- 4. The method of claim 3, wherein the target user device generates image information, the image information includes output image angle-of-view information indicating the second angle of view, the output image angle-of-view information includes information on an output image angle of view to be output from the target user device, and the output image angle of view is determined based on a direction of the image processing apparatus or an absolute direction.
- 5. A target user device for performing image sharing, the target user device comprising: a communication unit implemented to receive image data; and a processor operatively connected with the communication unit, wherein the processor is implemented to connect to a relay user device based on a first communication network and to receive, in real time, image data captured and generated by an image processing apparatus based on the first communication network, and wherein the image processing apparatus and the relay user device are connected based on a second communication network.
- 6. The target user device of claim 5, wherein the first communication network is generated based on IP (internet protocol) information and port information of the target user device set by the relay user device based on port forwarding, and the second communication network is generated based on a hotspot function of the relay user device.
- 7. The target user device of claim 5, wherein original image data captured and generated by the image processing apparatus is image data of a first angle of view, the image data is image data of a second angle of view, the first angle of view is larger than the second angle of view, and the second angle of view is determined by the target user device.
- 8. The target user device of claim 7, wherein the target user device generates image information, the image information includes output image angle-of-view information indicating the second angle of view, the output image angle-of-view information includes information on an output image angle of view to be output from the target user device, and the output image angle of view is determined based on a direction of the image processing apparatus or an absolute direction.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/250,109 US20210235164A1 (en) | 2018-08-09 | 2019-06-09 | Image sharing method and device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2018-0092938 | 2018-08-09 | ||
KR20180092938 | 2018-08-09 | ||
KR10-2018-0109022 | 2018-09-12 | ||
KR1020180109022A KR102101382B1 (en) | 2018-08-09 | 2018-09-12 | Method and apparatus for sharing image |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020032371A1 (en) | 2020-02-13 |
Family
ID=69413674
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/006916 WO2020032371A1 (en) | 2018-08-09 | 2019-06-09 | Image sharing method and device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020032371A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150039352A (en) * | 2013-10-02 | 2015-04-10 | 엘지전자 주식회사 | Electronic device and control method thereof |
KR20160122702A (en) * | 2014-02-17 | 2016-10-24 | 소니 주식회사 | Information processing device, information processing method and program |
KR20170056346A (en) * | 2015-11-13 | 2017-05-23 | 아바드(주) | Wearalbe Carmear |
KR20170084636A (en) * | 2016-01-12 | 2017-07-20 | 쿨클라우드(주) | Network system for internet of things |
KR101843335B1 (en) * | 2017-03-31 | 2018-03-29 | 링크플로우 주식회사 | Method for transaction based on image and apparatus for performing the method |
- 2019-06-09: WO PCT/KR2019/006916 patent/WO2020032371A1/en active Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013115541A1 (en) | Terminal, image communication control server, and system and method for image communication using same | |
WO2020122488A1 (en) | Camera-based mixed reality glass apparatus, and mixed reality display method | |
WO2016186458A1 (en) | Image information collecting system and method for collecting image information on moving object | |
WO2011139070A2 (en) | Method and apparatus for recognizing location of user | |
WO2015050288A1 (en) | Social augmented reality service system and social augmented reality service method | |
WO2019194529A1 (en) | Method and device for transmitting information on three-dimensional content including multiple view points | |
WO2018030567A1 (en) | Hmd and control method therefor | |
WO2015160052A1 (en) | Method for correcting image from wide-angle lens and device therefor | |
WO2017119575A1 (en) | Image photographing device and image photographing method | |
WO2018164316A1 (en) | Omnidirectional image capturing method and device for performing method | |
WO2020054898A1 (en) | Image sharing method and device | |
WO2018092926A1 (en) | Internet of things-based outdoor self-photography support camera system | |
WO2020032371A1 (en) | Image sharing method and device | |
WO2018101533A1 (en) | Image processing device and method | |
WO2019004531A1 (en) | User signal processing method and device for performing method | |
WO2018092929A1 (en) | Internet of things-based indoor selfie-supporting camera system | |
WO2018164317A1 (en) | Method for generating direction information of omnidirectional image and device for performing method | |
WO2019098729A1 (en) | Vehicle monitoring method and device | |
JP2001204015A (en) | Surrounding camera system, method for generating surrounding image based on image picked up by surrounding camera, jointing unit for image picked up by adjacent camera and its method, and distance measurement device using the adjacent camera and its method | |
WO2019083068A1 (en) | Three-dimensional information acquisition system using pitching practice, and method for calculating camera parameters | |
WO2017179912A1 (en) | Apparatus and method for three-dimensional information augmented video see-through display, and rectification apparatus | |
WO2021101148A1 (en) | Electronic device including tilt ois and method for capturing image and processing captured image | |
WO2018117325A1 (en) | Method for linking integrated management system and video security system | |
WO2014035053A1 (en) | Camera system using super wide angle camera | |
WO2024005356A1 (en) | Electronic device for displaying image and method for operating same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19848738 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19848738 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.10.2021) |
|