WO2020032371A1 - Image sharing method and device - Google Patents

Image sharing method and device

Info

Publication number
WO2020032371A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
user device
view
angle
target user
Prior art date
Application number
PCT/KR2019/006916
Other languages
English (en)
Korean (ko)
Inventor
김용국
조성래
김용진
김준세
Original Assignee
링크플로우 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020180109022A (KR102101382B1)
Application filed by 링크플로우 주식회사
Priority to US 17/250,109 (published as US20210235164A1)
Publication of WO2020032371A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21: Server components or server architectures
    • H04N 21/218: Source of audio or video content, e.g. local disk arrays
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/236: Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/238: Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N 21/2381: Adapting the multiplex stream to a specific network, e.g. an Internet Protocol [IP] network
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/60: Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N 21/63: Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N 21/643: Communication protocols
    • H04N 21/6437: Real-time Transport Protocol [RTP]
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/81: Monomedia components thereof

Definitions

  • the present invention relates to an image sharing method and apparatus. More particularly, the present invention relates to a method for sharing an image between user devices by using a relay device.
  • An omnidirectional imaging system refers to an imaging system capable of recording image information in all directions (360 degrees) from a specific viewpoint. Because it can obtain a much wider field-of-view image than a conventional imaging system, its applications are becoming increasingly widespread not only in research fields such as computer vision and mobile robotics but also in practical areas such as surveillance systems, virtual reality systems, pan-tilt-zoom (PTZ) cameras, and video conferencing.
  • For example, an omnidirectional image may be generated by stitching together images obtained by rotating a single camera about an optical axis that satisfies a single viewpoint.
  • Alternatively, a method of arranging a plurality of cameras in an annular structure and combining the images obtained from each camera may be used.
  • a user may generate an omnidirectional image using various omnidirectional image processing apparatuses (or omnidirectional image processing cameras or 360 degree cameras).
  • The omnidirectional image processing apparatus may be utilized in various areas. For example, it may be used in areas requiring omnidirectional video surveillance, such as security, or it may be used to record the places visited by travelers.
  • the omnidirectional image photographed based on the omnidirectional image processing apparatus may be edited and used as an image for sale of a product.
  • the omnidirectional image processing apparatus may be used for various purposes.
  • An object of the present invention is to solve all the above-mentioned problems.
  • an object of the present invention is to transmit an image (for example, an omnidirectional image) to a target user device by using a relay device such as a user device.
  • An object of the present invention is to allow a user of a remotely located target user device to receive, in real time, an image captured by the image processing apparatus and thereby share the situation at the location of the image processing apparatus from a distance.
  • An object of the present invention is to reduce network load by minimizing unnecessary image data when sharing an image, and to prevent image interruption.
  • An image sharing method includes connecting, by a target user device, to a relay user device based on a first communication network, and receiving, by the target user device in real time based on the first communication network, image data generated by image capturing of an image processing apparatus, wherein the image processing apparatus and the relay user device may be connected based on a second communication network.
  • A target user device for performing image sharing includes a communication unit implemented to receive image data and a processor operatively connected to the communication unit, wherein the processor may be configured to connect to a relay user device based on a first communication network and to receive, in real time based on the first communication network, image data generated by an image processing apparatus, and wherein the image processing apparatus and the relay user device may be connected based on a second communication network.
  • an image (for example, an omnidirectional image) may be transmitted to the target user device by using a relay device such as a user device.
  • a user of a remotely located target user device may receive an image captured by the image processing device in real time so that the situation at the location of the image processing device may be shared remotely.
  • Network load can be reduced by minimizing unnecessary image data when sharing images, and image interruption can be prevented.
  • FIG. 1 is a conceptual diagram illustrating a video sharing method according to an embodiment of the present invention.
  • FIG. 2 is a conceptual diagram illustrating a method of transmitting an image based on a user device according to an embodiment of the present invention.
  • FIG. 3 is a conceptual diagram illustrating a method of transmitting an image based on a user device according to an embodiment of the present invention.
  • FIG. 4 is a conceptual diagram illustrating a method of transmitting an image through a user device according to an embodiment of the present invention.
  • FIG. 5 is a conceptual diagram illustrating a method of transmitting an image through a user device according to an embodiment of the present invention.
  • FIG. 6 is a conceptual diagram illustrating a method of checking an image through a user device located at a far distance according to an embodiment of the present invention.
  • FIG. 7 is a conceptual diagram illustrating a method for receiving an image of a specific field of view selected by a target user according to an embodiment of the present invention.
  • FIG. 8 is a conceptual diagram illustrating a method of checking an image in a target user device according to an embodiment of the present invention.
  • Hereinafter, for convenience of description, it is assumed that the image processing apparatus is an omnidirectional image processing apparatus and that the image captured by the image processing apparatus is an omnidirectional image.
  • However, the image processing apparatus may include various types of image processing apparatuses in addition to the omnidirectional image processing apparatus, and the captured image may include images of various angles of view in addition to the omnidirectional image.
  • FIG. 1 is a conceptual diagram illustrating a video sharing method according to an embodiment of the present invention.
  • Referring to FIG. 1, a method is disclosed for transmitting an image (e.g., an omnidirectional image) captured by an image processing apparatus (e.g., an omnidirectional image processing apparatus) to a target user device through a relay user device.
  • the target user device may be a device that is not directly connected to the image processing device and receives an image captured by the image processing device.
  • a relay user may wear the image processing apparatus 100 and carry the relay user apparatus 110.
  • the image processing apparatus is a wearable image processing apparatus that may be worn by a relay user in the form of a neckband.
  • the image processing apparatus may include not only an image processing apparatus for capturing a 360 degree image, but also an image processing apparatus for capturing an image having a lower angle of view.
  • For convenience of description, it is assumed that the relay user both wears the image processing apparatus 100 and carries the relay user device 110; however, the user wearing the image processing apparatus 100 and the user carrying the relay user device 110 may be different.
  • The target user may carry the target user device 120 and receive an image captured by the image processing apparatus 100.
  • Omni-directional image data generated by the image processing apparatus 100 may be transmitted to the target user device 120 through the relay user device 110.
  • the target user may check the omnidirectional image of the current situation of the relay user at a distance through the target user device 120.
  • FIG. 2 is a conceptual diagram illustrating a method of transmitting an image based on a user device according to an embodiment of the present invention.
  • Referring to FIG. 2, an image viewing method based on a target user device is disclosed.
  • The communication procedure disclosed in FIG. 2 is one example, and various other methods may be utilized to view an image through the target user device.
  • the image processing apparatus may include an omnidirectional image processing apparatus, and the image may include an omnidirectional image.
  • First, the image processing apparatus 200 may start to operate.
  • a first communication connection may be performed between the image processing apparatus 200 and the relay user device 210.
  • the relay user device 210 may act as a router based on a function such as a hot spot.
  • The image processing apparatus 200 may be connected to the relay user device 210 serving as a router.
  • the target user device 220 may be connected to the image processing device 200 from the outside based on port forwarding.
  • Port forwarding, or port mapping, is the process of forwarding a request addressed to one combination of an Internet Protocol (IP) address and port number to another combination as a packet crosses a network gateway, such as a router or firewall, in a computer network; it is a network address translation (NAT) technique.
  • For example, the relay user device 210 may set an IP address (IP: PhoneA) and a port number (Port: PhoneA) for the image processing apparatus 200, and the image processing apparatus 200 and the target user device 220 may be connected through this port forwarding.
  • The target user device 220 may receive the image data transmitted through the image processing apparatus 200 by entering the IP address and port number set by the relay user device. Alternatively, the IP address and port number may be transferred automatically between the relay user device 210 and the target user device 220 through a separate application so that the target user device can receive the data generated by the image processing apparatus without additional input.
  • The target user device 220 may then view the image captured by the image processing apparatus 200, transmitted based on a video transmission protocol such as the real-time streaming protocol (RTSP).
  • Port forwarding and RTSP are only examples, and image sharing may be performed in various other ways.
  • That is, the image sharing method may include connecting, by the target user device, to the relay user device based on a first communication network, and receiving, by the target user device in real time based on the first communication network, image data generated by image capturing of the image processing apparatus, wherein the image processing apparatus and the relay user device may be connected based on a second communication network.
  • The first communication network may be established based on the IP address and port information set by the relay user device for the target user device through port forwarding, and the second communication network may be established based on a hotspot function of the relay user device, as sketched below.
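The following is a minimal sketch, not part of the original disclosure, of how a target user device might open such a port-forwarded stream over RTSP. It assumes OpenCV is available on the target user device; the relay IP address, forwarded port, and stream path are hypothetical placeholders.

```python
import cv2  # OpenCV; assumed to be available on the target user device

# Hypothetical values: in practice these would be the relay user device's
# address and the port it forwards to the image processing apparatus.
RELAY_IP = "203.0.113.10"   # placeholder address
FORWARDED_PORT = 8554       # placeholder port set by the relay user device
STREAM_PATH = "live"        # placeholder stream name

rtsp_url = f"rtsp://{RELAY_IP}:{FORWARDED_PORT}/{STREAM_PATH}"

# Connect over the first communication network and display frames in real time.
capture = cv2.VideoCapture(rtsp_url)
while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    cv2.imshow("shared image", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
cv2.destroyAllWindows()
```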
  • FIG. 3 is a conceptual diagram illustrating a method of transmitting an image based on a user device according to an embodiment of the present invention.
  • Referring to FIG. 3, a method of performing communication between an image processing apparatus and a user device through at least one access point (AP) is disclosed.
  • the image processing apparatus 300 may start an operation and capture an image (eg, omnidirectional image).
  • AP1 310 and AP2 320 may be connected in a wireless distribution system (WDS) manner.
  • The image processing apparatus 300 may be connected to AP1 310, and the user device 330 may be connected to AP2 320.
  • The user device 330 may enter the internal IP address and port number of the image processing apparatus 300 in the application.
  • The user device 330 may then view the image captured by the image processing apparatus using RTSP.
  • Alternatively, the internal IP address and port number of the image processing apparatus 300 may be entered automatically by the application without additional input on the user device.
  • FIG. 4 is a conceptual diagram illustrating a method of transmitting an image through a user device according to an embodiment of the present invention.
  • Referring to FIG. 4, a method for reducing the size of transmitted image data in the user-device-based image transmission method is disclosed.
  • Hereinafter, it is assumed that the image processing apparatus is an omnidirectional image processing apparatus and that the image is an omnidirectional image (360-degree image).
  • Specifically, a method is disclosed for reducing the amount of image data transmitted in real time by providing the target user device with only the portion of the omnidirectional image, transmitted to the relay user device, that corresponds to a partial angle of view.
  • the entire image data corresponding to the entire angle of view image (eg, 360 degree image) captured by the image processing apparatus may be transmitted to the relay user apparatus directly connected to the image processing apparatus.
  • However, the image viewed on the target user device may cover only the partial angle of view in which the target user is interested, rather than the entire angle of view. Therefore, the relay user device may transmit only partial image data 450 for the partial field of view of interest to the target user, instead of the entire field-of-view image data. In this way, the burden on the communication network is reduced, and unnecessary frame drops due to data load can be prevented.
  • the target user device may transmit the image information 400 of the output image currently output by the target user device to the relay user device through the application.
  • the image information 400 may include information about an output image angle of view output by the target user device.
  • the target user device may transmit the image information 400 to the relay user device.
  • the relay user device may receive the image information 400 and determine the partial image data 450 to be transmitted to the target user device based on the image information.
  • The relay user device may transmit to the target user device not only the reference image area 420, determined based on the output image angle-of-view information included in the image information 400, but also the peripheral image area 440 surrounding the reference image area 420.
  • For example, the reference image area 420 may be the image area for the angle of view of -30 degrees to +30 degrees indicated by the target user device, and the peripheral image area 440 may be the peripheral area (e.g., the image areas of -60 degrees to -30 degrees and +30 degrees to +60 degrees) obtained by extending that angle of view based on a threshold percentage.
  • The threshold percentage for determining the peripheral image area may change according to the communication state and/or the frequency of screen movement by the user. The threshold percentage may be relatively increased as the communication state is relatively good, and relatively decreased as the communication state is relatively poor. Likewise, the threshold percentage may be relatively increased as the screen movement frequency of the user of the target user device is relatively high, and relatively decreased as that frequency is relatively low.
  • In other words, the original image data captured and generated by the image processing apparatus is image data of a first angle of view, the transmitted image data is image data of a second angle of view, the first angle of view is greater than the second angle of view, and the second angle of view may be determined by the target user device.
  • The target user device generates image information; the image information includes output image angle-of-view information indicating the second angle of view; the output image angle-of-view information includes information on the output image angle of view to be output from the target user device; and the output image angle of view may be determined based on the direction of the image processing apparatus or on an absolute direction (see the sketch below).
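As an illustration only, the sketch below shows one way a relay user device could widen the angle of view requested by the target user device by a threshold percentage before deciding which image data to transmit. The mapping from communication quality and screen-movement frequency to a threshold percentage is a hypothetical assumption, not part of the disclosure.

```python
def peripheral_margin(base_span_deg: float,
                      link_quality: float,    # assumed metric, 0.0 (poor) to 1.0 (good)
                      movement_freq: float    # assumed metric, screen moves per second
                      ) -> float:
    """Extra margin (degrees) added on each side of the requested angle of view."""
    # Hypothetical mapping: a better link and more frequent view changes
    # both increase the threshold percentage.
    threshold_pct = 0.2 + 0.3 * link_quality + 0.3 * min(movement_freq / 2.0, 1.0)
    return base_span_deg * threshold_pct

def region_to_transmit(view_center_deg: float, view_span_deg: float,
                       link_quality: float, movement_freq: float):
    """Reference area (requested view) plus peripheral area, as (start, end) in degrees."""
    margin = peripheral_margin(view_span_deg, link_quality, movement_freq)
    half = view_span_deg / 2.0 + margin
    return (view_center_deg - half, view_center_deg + half)

# Example: a -30..+30 degree view over a good link with moderate screen movement
# expands to roughly -67..+67 degrees.
print(region_to_transmit(view_center_deg=0.0, view_span_deg=60.0,
                         link_quality=0.9, movement_freq=1.0))
```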
  • FIG. 5 is a conceptual diagram illustrating a method of transmitting an image through a user device according to an embodiment of the present invention.
  • Referring to FIG. 5, a method for reducing the size of transmitted image data in the user-device-based image transmission method is disclosed.
  • Hereinafter, it is assumed that the image processing apparatus is an omnidirectional image processing apparatus and that the image is an omnidirectional image (360-degree image).
  • the image information 500 for reducing the size of the image data may include output image direction information, output image horizontal angle of view information, and output image vertical angle of view information.
  • The output image direction information is information on the imaging direction of the image processing apparatus and may be determined based on at least one camera among the plurality of cameras implemented in the image processing apparatus. For example, when a user wears a neckband-type image processing apparatus, a specific direction captured by a specific camera included in the image processing apparatus (e.g., the user's forward direction) may be the output image direction.
  • the output image horizontal angle of view information may indicate an angle of view in the horizontal direction based on the direction indicated by the output image direction information.
  • the horizontal direction may be a longitudinal direction.
  • the output image horizontal angle of view information may include information about the horizontal angle of view selected by the target user.
  • the output image vertical angle of view information may indicate the angle of view of the vertical direction based on the direction indicated by the output image direction information.
  • the vertical direction may be a latitude direction.
  • the output image vertical angle of view information may include information about the vertical angle of view selected by the target user.
  • The relay user device may receive the image information 500, which includes the output image direction information, the output image horizontal angle-of-view information, and the output image vertical angle-of-view information, and may determine, based on the image information 500, the partial image data 550 to be transmitted to the target user device.
  • The relay user device may transmit to the target user device the reference image area 520, determined based on the output image direction information, the output image horizontal angle-of-view information, and the output image vertical angle-of-view information included in the image information 500, together with the peripheral image area 540 surrounding the reference image area 520.
  • For example, the reference image area 520 may be the image area for the first direction, the output image horizontal angle of view (-30 degrees to +30 degrees), and the output image vertical angle of view (-30 degrees to +30 degrees) indicated by the target user device.
  • The peripheral image area 540 may be the area obtained by extending, with respect to the first direction, the output image horizontal angle of view (-30 degrees to +30 degrees) by a first threshold percentage and the output image vertical angle of view (-30 degrees to +30 degrees) by a second threshold percentage.
  • When the direction the target user wants to see or the angle of view changes, information about the changed direction and the changed angle of view may be transmitted from the target user device to the relay user device, and the reference image area 520 and the peripheral image area 540 may be changed accordingly.
  • The first threshold percentage and/or the second threshold percentage for determining the peripheral image area 540 may be changed according to the communication state and/or the frequency of screen movement by the user.
  • A relatively good communication state may relatively increase the first and/or second threshold percentages, and a relatively poor communication state may relatively decrease them.
  • Likewise, the first threshold percentage and/or the second threshold percentage may be relatively increased as the screen movement frequency of the user of the target user device is relatively high, and relatively decreased as that frequency is relatively low, as in the sketch below.
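The sketch below illustrates, under the same caveats, how the image information 500 might be represented and how a relay user device could derive the partial region from it using separate horizontal and vertical threshold percentages. The field names and message layout are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class ImageInfo:
    """Hypothetical layout of the image information 500 sent by the target user device."""
    direction_deg: float           # output image direction information
    h_fov: tuple[float, float]     # output image horizontal angle of view, e.g. (-30, 30)
    v_fov: tuple[float, float]     # output image vertical angle of view, e.g. (-30, 30)

def partial_region(info: ImageInfo,
                   first_threshold_pct: float,
                   second_threshold_pct: float) -> dict:
    """Reference area 520 widened into the peripheral area 540 by the two thresholds."""
    h_lo, h_hi = info.h_fov
    v_lo, v_hi = info.v_fov
    h_margin = (h_hi - h_lo) * first_threshold_pct / 2.0
    v_margin = (v_hi - v_lo) * second_threshold_pct / 2.0
    return {
        "direction_deg": info.direction_deg,
        "horizontal": (h_lo - h_margin, h_hi + h_margin),
        "vertical": (v_lo - v_margin, v_hi + v_margin),
    }

# Example: a -30..+30 degree view widened by 50% horizontally and 25% vertically.
info = ImageInfo(direction_deg=0.0, h_fov=(-30.0, 30.0), v_fov=(-30.0, 30.0))
print(partial_region(info, first_threshold_pct=0.5, second_threshold_pct=0.25))
# horizontal becomes (-45.0, 45.0) and vertical becomes (-37.5, 37.5)
```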
  • FIG. 6 is a conceptual diagram illustrating a method of checking an image through a user device located at a far distance according to an embodiment of the present invention.
  • Referring to FIG. 6, a method of automatically adjusting the screen of a target user device for the viewing convenience of a remotely located user is disclosed.
  • an image captured by the image processing apparatus may be delivered to the target user apparatus through the relay user apparatus or the AP as described above.
  • the target user device may provide an image to the target user based on a reference point set by the target user.
  • an image may be provided to the target user device based on the reference point 600 set by the target user.
  • the reference point 600 may be set such that an image of a specific direction, a specific horizontal angle of view, and a specific vertical angle of view is positioned in the middle of the display of the target user device. In this case, even when the position of the image processing apparatus changes or is shaken, an image maintaining a specific horizontal / vertical angle of view and a specific direction may be provided on the target user device.
  • the target user may set a specific direction and a specific horizontal / vertical angle of view in the omnidirectional image reproduced on the target user device.
  • For example, the target user may set a direction and a horizontal/vertical angle of view in the omnidirectional image by using the touch screen of the target user device, and a specific touch through the user interface may generate the output image direction information 620 and the output image angle-of-view information 640 as screen setting information.
  • The target user device may output the portion of the omnidirectional image indicated by the screen setting information. For example, the user may set a horizontal angle of view of -60 degrees to +60 degrees and a vertical angle of view of -45 degrees to +45 degrees based on the reference point 600, and may set an x direction, defined relative to the image processing apparatus, as the selected direction.
  • In this case, the screen determined based on the screen setting information is output from the omnidirectional image. That is, only a specific region of the omnidirectional image is output on the target user device, and in this way the target user device may continuously receive a screen of a specific field of view regardless of the real-time movement of the image processing apparatus, as sketched below.
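The following sketch, again only an illustration, shows how a fixed-direction view could be cropped out of an omnidirectional frame so that the output keeps the selected direction and horizontal/vertical angles of view even when the image processing apparatus turns. It assumes the omnidirectional image is stored in an equirectangular layout, which the disclosure does not specify.

```python
import numpy as np  # assumed available

def stable_viewport(equirect_frame: np.ndarray,
                    view_yaw_deg: float,      # selected direction (reference point)
                    h_fov_deg: float,         # e.g. 120 for -60..+60 degrees
                    v_fov_deg: float,         # e.g. 90 for -45..+45 degrees
                    device_yaw_deg: float):   # current yaw of the capturing apparatus
    """Crop a fixed-direction view out of an equirectangular omnidirectional frame.

    Subtracting the device yaw keeps the output pointing at the same world
    direction even when the image processing apparatus turns or shakes.
    """
    height, width = equirect_frame.shape[:2]
    # World-fixed yaw of the desired view, wrapped to [0, 360).
    yaw = (view_yaw_deg - device_yaw_deg) % 360.0
    x_center = int(yaw / 360.0 * width)
    x_half = int(h_fov_deg / 360.0 * width / 2)
    y_center = height // 2                      # 0-degree pitch at the equator
    y_half = int(v_fov_deg / 180.0 * height / 2)

    rows = slice(max(0, y_center - y_half), min(height, y_center + y_half))
    cols = np.arange(x_center - x_half, x_center + x_half) % width  # wrap around 360 degrees
    return equirect_frame[rows][:, cols]

# Example: keep a 120 x 90 degree view centred on world yaw 0 while the wearer turns.
frame = np.zeros((960, 1920, 3), dtype=np.uint8)  # dummy equirectangular frame
view = stable_viewport(frame, view_yaw_deg=0.0, h_fov_deg=120.0,
                       v_fov_deg=90.0, device_yaw_deg=30.0)
print(view.shape)  # (480, 640, 3)
```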
  • FIG. 7 is a conceptual diagram illustrating a method for receiving an image of a specific field of view selected by a target user according to an embodiment of the present invention.
  • Referring to FIG. 7, when a screen of a specific field of view to be tracked by the target user is selected, a method for extracting direction and angle-of-view information for that field of view is disclosed. This may be applied to the determination of the output image direction information, the output image horizontal angle-of-view information, and the output image vertical angle-of-view information described above with reference to FIGS. 4 and 5.
  • First, the output image direction information may be determined by matching the omnidirectional image with the selected screen (step S710); that is, the output image direction information may be determined based on the selection screen.
  • The omnidirectional image may be shaped like a sphere, and the output image direction in the omnidirectional image may be set in various ways.
  • The output image direction may be generated based on the imaging directions of the plurality of cameras implemented in the image processing apparatus. Assuming that the image processing apparatus is at the center of the sphere, direction information over 360 degrees may be determined based on the direction of the imaging line of a first camera of the image processing apparatus, and the output image direction information may be determined based on which of these 360 degrees the selection screen selected by the user indicates.
  • Here, the imaging line may be a line extending in the direction indicated by the center of the camera lens.
  • Alternatively, the output image direction may be determined relative to a specific direction (for example, north) rather than to the direction of the image processing apparatus.
  • the output image horizontal angle of view information / output image vertical angle of view information may be determined by matching the omnidirectional image with the selection screen (step S720).
  • the output image horizontal angle of view information and the output image vertical angle of view information may be determined as the horizontal angle of view and the vertical angle of view based on the output image direction.
  • The horizontal angle of view may be measured along the longitude direction on the sphere, and the vertical angle of view along the latitude direction, as in the sketch below.
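A minimal sketch of steps S710 and S720 follows, assuming the image-matching step has already expressed the selected screen as longitude/latitude bounds on the sphere; the matching itself is not shown.

```python
def extract_view_parameters(lon_min_deg: float, lon_max_deg: float,
                            lat_min_deg: float, lat_max_deg: float):
    """Turn a selected region of the spherical image into output image direction
    and horizontal/vertical angle-of-view information (steps S710 and S720)."""
    # Handle selections that wrap across the 0/360-degree seam.
    if lon_max_deg < lon_min_deg:
        lon_max_deg += 360.0
    direction_deg = ((lon_min_deg + lon_max_deg) / 2.0) % 360.0  # output image direction
    h_fov_deg = lon_max_deg - lon_min_deg                        # horizontal angle of view
    v_fov_deg = lat_max_deg - lat_min_deg                        # vertical angle of view
    return direction_deg, h_fov_deg, v_fov_deg

# Example: a selection spanning longitudes 330..30 and latitudes -30..+30 degrees
# maps to direction 0 degrees with 60-degree horizontal and vertical angles of view.
print(extract_view_parameters(330.0, 30.0, -30.0, 30.0))  # (0.0, 60.0, 60.0)
```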
  • the target user may be provided with a similar range of view despite the movement of the image capturing apparatus.
  • the image direction selected by the user may be set based on an absolute position (for example, north), or may be set to a relative position (front of the user, first camera image pickup line, etc.).
  • When the absolute position is selected as the image direction, the image for the specific direction may be provided continuously regardless of direction changes by the user wearing the image processing apparatus.
  • Conversely, when the relative position is selected as the image direction, an image reflecting the change in direction of the user wearing the image processing apparatus may be provided continuously.
  • In addition, the image output from the target user device may be changed in consideration of direction changes of the target user device itself. For example, when the sensor of the target user device detects that the device has turned 30 degrees to the left, the angle of view of the currently output screen may be adjusted by -30 degrees so that the screen output from the user device reflects the movement of the target user device, as in the sketch below.
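A small sketch of this compensation, with a hypothetical yaw reading from the target user device's sensor:

```python
def adjust_output_direction(current_direction_deg: float,
                            device_rotation_deg: float) -> float:
    """Shift the output direction opposite to the target device's own rotation.

    A 30-degree turn to the left (reported here as +30 by an assumed yaw sensor)
    shifts the output angle of view by -30 degrees, as described above.
    """
    return (current_direction_deg - device_rotation_deg) % 360.0

print(adjust_output_direction(current_direction_deg=0.0, device_rotation_deg=30.0))  # 330.0
```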
  • the target user device may also provide a function of rotating the captured image.
  • After capturing the omnidirectional image that is provided in real time, the omnidirectional image may be rotated by touch, and through this rotation, angles of view and points in time that the target user had not yet checked may be provided.
  • In this case, omnidirectional image data for the entire captured image may be transmitted to the target user device after the capture.
  • While the omnidirectional image is provided in real time, if the target user resets the capture function, the omnidirectional image from the time after the reset may again be provided to the target user.
  • FIG. 8 is a conceptual diagram illustrating a method of checking an image in a target user device according to an embodiment of the present invention.
  • Referring to FIG. 8, a method of recommending a region of interest to the target user device is disclosed.
  • Hereinafter, it is assumed that the image processing apparatus is an omnidirectional image processing apparatus and that the image is an omnidirectional image.
  • A recommendation screen may be provided to the target user device so that the target user may receive images of various angles of view. For example, when the target user is currently provided with a screen having an angle of view of -30 degrees to +30 degrees as the selection screen, a screen for another angle of view in the omnidirectional image may be provided as the recommendation screen.
  • the relay user apparatus or the image processing apparatus may determine an angle of view in which a lot of change occurs and provide a screen for the angle of view to the target user as a recommendation screen.
  • In this way, an optimal field of view may be found and provided to the target user.
  • the default screen 850 may be provided to the target user with the same view as that of the user wearing the omnidirectional image processing apparatus.
  • The field of view of the wearer of the omnidirectional image processing apparatus may change according to the wearer's motion, and a more natural default screen 850 may be provided to the target user through learning about such changes.
  • the current state 800 of the user wearing the omnidirectional image processing apparatus may be determined.
  • For example, it may be determined whether the user wearing the omnidirectional image processing apparatus (hereinafter, the wearer of the omnidirectional image processing apparatus) is standing, sitting, moving, or climbing stairs. The determination of the motion (or current state 800) of the wearer of the omnidirectional image processing apparatus may be performed based on a sensor implemented in the omnidirectional image processing apparatus. Alternatively, the determination of the motion (or current state 800) of the wearer may be performed based on image changes in the omnidirectional image processing apparatus.
  • Learning about the motion of the wearer of the omnidirectional image processing apparatus may be performed, and based on the learning result, the current motion (or current state 800) of the wearer may be determined more accurately.
  • the default screen 850 to be provided to the current target user may be determined based on the current state 800 of the wearer of the omnidirectional image processing apparatus.
  • For example, a direction of -60 to +60 degrees, referenced to the imaging line of the first camera and based on the front of the wearer of the omnidirectional image processing apparatus, may correspond to the current field of view of the wearer of the omnidirectional image processing apparatus.
  • By reflecting the field of view of the wearer and the current state 800 of the wearer of the omnidirectional image processing apparatus, a screen whose field of view is the same as or similar to that of the wearer may be provided to the target user device as the default screen 850.
  • the determination of the default screen 850 may be performed by at least one of the omnidirectional image processing apparatus, the relay user apparatus, and the target user apparatus.
  • For example, the current state 800 of the wearer of the omnidirectional image processing apparatus may be determined based on the motion sensing information transmitted by the omnidirectional image processing apparatus and on screen changes in the omnidirectional image, and the default screen 850 may be provided accordingly, as in the sketch below.
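The sketch below illustrates the idea with a toy wearer-state classifier over motion-sensing information and a per-state default view; the sensor features, thresholds, and view offsets are all invented for illustration and are not from the disclosure.

```python
from enum import Enum

class WearerState(Enum):
    STANDING = "standing"
    SITTING = "sitting"
    MOVING = "moving"
    CLIMBING_STAIRS = "climbing_stairs"

def classify_wearer_state(accel_variance: float, pitch_deg: float) -> WearerState:
    """Toy classifier over motion-sensing information; thresholds are made up."""
    if accel_variance > 2.0 and pitch_deg > 10.0:
        return WearerState.CLIMBING_STAIRS
    if accel_variance > 1.0:
        return WearerState.MOVING
    if pitch_deg < -20.0:
        return WearerState.SITTING
    return WearerState.STANDING

def default_screen(state: WearerState):
    """Pick a default view (yaw offset, vertical offset in degrees) per state."""
    views = {
        WearerState.STANDING: (0.0, 0.0),          # straight ahead
        WearerState.SITTING: (0.0, 10.0),          # tilt slightly upward
        WearerState.MOVING: (0.0, -5.0),           # favour the path ahead
        WearerState.CLIMBING_STAIRS: (0.0, 15.0),  # look up the stairs
    }
    return views[state]

state = classify_wearer_state(accel_variance=1.4, pitch_deg=3.0)
print(state, default_screen(state))  # WearerState.MOVING (0.0, -5.0)
```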
  • Embodiments according to the present invention described above can be implemented in the form of program instructions that can be executed by various computer components, and recorded in a computer-readable recording medium.
  • the computer-readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the computer readable recording medium may be specially designed and configured for the present invention, or may be known and available to those skilled in the computer software field.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, flash memory, and the like.
  • Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • A hardware device may be configured to operate as one or more software modules in order to perform the processing according to the present invention, and vice versa.

Abstract

The present invention relates to an image sharing method and device. The image sharing method may comprise the steps of: connecting, by a target user device, to a relay user device based on a first communication network; and receiving, by the target user device, in real time and based on the first communication network, image data generated by image capturing by an image processing device, wherein the image processing device and the relay user device may be connected based on a second communication network.
PCT/KR2019/006916 2018-08-09 2019-06-09 Procédé et dispositif de partage d'image WO2020032371A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/250,109 US20210235164A1 (en) 2018-08-09 2019-06-09 Image sharing method and device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20180092938 2018-08-09
KR10-2018-0092938 2018-08-09
KR10-2018-0109022 2018-09-12
KR1020180109022A KR102101382B1 (ko) 2018-08-09 2018-09-12 영상 공유 방법 및 장치

Publications (1)

Publication Number Publication Date
WO2020032371A1 true WO2020032371A1 (fr) 2020-02-13

Family

ID=69413674

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/006916 WO2020032371A1 (fr) 2018-08-09 2019-06-09 Procédé et dispositif de partage d'image

Country Status (1)

Country Link
WO (1) WO2020032371A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150039352A (ko) * 2013-10-02 2015-04-10 엘지전자 주식회사 전자 디바이스 및 그 제어방법
KR20160122702A (ko) * 2014-02-17 2016-10-24 소니 주식회사 정보 처리 장치, 정보 처리 방법 및 프로그램
KR20170056346A (ko) * 2015-11-13 2017-05-23 아바드(주) 모바일 기반의 웨어러블 카메라의 정보 전송방법 및 웨어러블 카메라
KR20170084636A (ko) * 2016-01-12 2017-07-20 쿨클라우드(주) 사물 인터넷 네트워크 시스템
KR101843335B1 (ko) * 2017-03-31 2018-03-29 링크플로우 주식회사 영상 기반의 거래 방법 및 이러한 방법을 수행하는 장치

Similar Documents

Publication Publication Date Title
WO2013115541A1 (fr) Terminal, serveur de commande de communication d'image et système ainsi que procédé de communication d'image l'utilisant
WO2020122488A1 (fr) Appareil de lunettes de réalité mixte basé sur une caméra et procédé d'affichage de réalité mixte
WO2016186458A1 (fr) Système de collecte d'informations d'images et procédé de collecte d'informations d'images sur des objets mobiles
WO2011139070A2 (fr) Procédé et appareil pour reconnaître la localisation d'un utilisateur
WO2015050288A1 (fr) Système de service social à réalité augmentée et procédé de service social à réalité augmentée
WO2019194529A1 (fr) Procédé et dispositif de transmission d'informations sur un contenu tridimensionnel comprenant de multiples points de vue
WO2018030567A1 (fr) Hmd et son procédé de commande
WO2015160052A1 (fr) Procédé de correction d'image d'un objectif à grand angle et dispositif associé
WO2017119575A1 (fr) Dispositif de prise d'image photographique et procédé de prise d'image photographique
WO2018164316A1 (fr) Procédé et dispositif de capture d'image omnidirectionnelle permettant de mettre en oeuvre un procédé
WO2020054898A1 (fr) Procédé et dispositif de partage d'image
WO2018092926A1 (fr) Système de caméra de support d'auto-photographie en extérieur basé sur l'internet des objets
WO2020032371A1 (fr) Procédé et dispositif de partage d'image
WO2018101533A1 (fr) Dispositif et procédé de traitement d'image
WO2019004531A1 (fr) Procédé de traitement de signal d'utilisateur et dispositif d'exécution dudit procédé
WO2018092929A1 (fr) Système d'appareil de prise de vues prenant en charge des autoportraits à l'intérieur fondé sur l'internet des objets
WO2018164317A1 (fr) Procédé de production d'informations de direction d'une image omnidirectionnelle et dispositif pour la mise en œuvre de ce procédé
WO2019098729A1 (fr) Procédé et dispositif de surveillance de véhicule
JP2001204015A (ja) 周囲カメラ・システム、周囲カメラの撮像画像に基づいて周囲画像を生成する方法、隣接カメラによる撮像画像の接続処理装置及び方法、並びに、隣接カメラを用いた距離測定装置及び方法
WO2017204598A1 (fr) Terminal et procédé de configuration de protocole de données pour une image photographiée
WO2019083068A1 (fr) Système d'acquisition d'informations tridimensionnelles à l'aide d'une pratique de lancement, et procédé de calcul de paramètres de caméra
WO2017179912A1 (fr) Appareil et procédé destiné à un dispositif d'affichage transparent de vidéo augmentée d'informations tridimensionnelles, et appareil de rectification
WO2021101148A1 (fr) Dispositif électronique comprenant un ois d'inclinaison et procédé de capture d'image et de traitement d'image capturée
WO2018117325A1 (fr) Procédé de liaison d'un système de gestion intégré et système de sécurité vidéo
WO2019098450A1 (fr) Système destiné à guider le regard d'un utilisateur au moyen d'un dispositif mobile et procédé associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19848738

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19848738

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.10.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19848738

Country of ref document: EP

Kind code of ref document: A1