WO2021133053A1 - Electronic device and control method therefor - Google Patents

Electronic device and control method therefor

Info

Publication number
WO2021133053A1
Authority
WO
WIPO (PCT)
Prior art keywords
size
image
depth
display
displayed
Prior art date
Application number
PCT/KR2020/018980
Other languages
English (en)
Korean (ko)
Inventor
조승현
송석우
이요한
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자주식회사)
Publication of WO2021133053A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the present disclosure relates to an electronic device and a control method thereof, and more particularly, to an electronic device displaying an AR object and a control method thereof.
  • AR (augmented reality) is a technology that combines and displays a virtual object (or information) on an image captured of a physical space in the real environment. Through the virtual object shown on the display, the virtual object may appear to exist together in the real space, and useful information related to a real object may be provided to the user.
  • In real space, the user may recognize the perspective (or distance) of an object because an object located far from the user appears small and an object located close to the user appears large.
  • Likewise, in the AR space (i.e., a virtual space grafted onto reality), the size at which a virtual object is displayed varies according to the distance between the virtual object and the user (or electronic device).
  • the present disclosure has been made in response to the above-mentioned necessity, and an object of the present disclosure is to provide an electronic device for adjusting the size of an AR object and a method for controlling the same.
  • According to an embodiment of the present disclosure, an electronic device includes a camera, a display, and a processor configured to control the display to display an image acquired through the camera and an AR object on the image, to adjust, in a state in which the mode of the electronic device is a normal mode, the size at which the AR object is displayed on the display based on a changed depth when the depth of the point at which the AR object is located on the image is changed, and to control, in a state in which the mode of the electronic device is an arrangement mode, the size at which the AR object is displayed on the display to be maintained when the depth of the point at which the AR object is located on the image is changed.
  • the depth of the point at which the AR object is located may be changed as the point at which the camera is located or the point at which the AR object is located on the image moves according to the depth direction.
  • The processor may control the display to display the image and the AR object on the image based on the size of the AR object, and, in the arrangement mode, when the depth of the point at which the AR object is located on the image is changed, the processor may change the size of the AR object so that the size at which the AR object is displayed on the display is maintained.
  • While the mode of the electronic device is the arrangement mode, the processor may increase the size of the AR object to maintain the size at which the AR object is displayed on the display when the depth of the point at which the AR object is located increases, and may reduce the size of the AR object to maintain the size at which the AR object is displayed on the display when the depth of the point at which the AR object is located decreases.
  • The processor may control the display to display the image and the AR object on the image based on the size of the AR object, and, in a state in which the mode of the electronic device is the normal mode, may adjust the size at which the AR object is displayed on the display based on the depth of the point at which the AR object is located and the attribute assigned to the AR object.
  • While the mode of the electronic device is the normal mode, the processor may control the size at which the AR object is displayed on the display to be maintained when the attribute assigned to the AR object is a fixed-size attribute, and may adjust the size at which the AR object is displayed on the display based on the changed depth when the attribute assigned to the AR object is a variable-size attribute.
  • Specifically, while the mode of the electronic device is the normal mode, when the attribute assigned to the AR object is a fixed-size attribute, the processor may maintain the size at which the AR object is displayed on the display even if the depth of the AR object on the image is changed. When the attribute assigned to the AR object is a variable-size attribute, the processor may adjust the size at which the AR object is displayed on the display so that the size decreases when the depth of the point at which the AR object is located on the image increases, and so that the size increases when the depth of the point at which the AR object is located decreases.
  • According to an embodiment of the present disclosure, a method of controlling an electronic device includes displaying an image acquired through a camera and an AR object on the image; in a state in which the mode of the electronic device is a normal mode, when the depth of the point at which the AR object is located on the image is changed, adjusting the size at which the AR object is displayed based on the changed depth; and, in a state in which the mode of the electronic device is an arrangement mode, when the depth of the point at which the AR object is located on the image is changed, controlling the size at which the AR object is displayed to be maintained.
  • the depth of the point at which the AR object is located may be changed as the point at which the camera is located or the point at which the AR object is located on the image moves according to the depth direction.
  • In the displaying, the image and the AR object on the image may be displayed based on the size of the AR object, and in the controlling, in the arrangement mode, when the depth of the point at which the AR object is located on the image is changed, the size of the AR object may be changed so that the size at which the AR object is displayed is maintained.
  • The controlling may include increasing the size of the AR object to maintain the size at which the AR object is displayed when the depth of the point at which the AR object is located increases in the arrangement mode, and reducing the size of the AR object to maintain the size at which the AR object is displayed when the depth of the point at which the AR object is located decreases in the arrangement mode.
  • In the displaying, the image and the AR object on the image may be displayed based on the size of the AR object, and in the adjusting, in the normal mode, the size at which the AR object is displayed may be adjusted based on the depth of the point at which the AR object is located and the attribute assigned to the AR object.
  • The adjusting may include controlling the size at which the AR object is displayed to be maintained when the attribute assigned to the AR object is a fixed-size attribute in the normal mode, and adjusting the size at which the AR object is displayed to be changed based on the changed depth when the attribute assigned to the AR object is a variable-size attribute in the normal mode.
  • The adjusting may include controlling the size at which the AR object is displayed to be maintained when the depth of the AR object on the image is changed in a case where the attribute assigned to the AR object is a fixed-size attribute in the normal mode, and, in a case where the attribute assigned to the AR object is a variable-size attribute in the normal mode, adjusting the size at which the AR object is displayed so that the size decreases when the depth of the point at which the AR object is located on the image increases and so that the size increases when the depth of the point at which the AR object is located decreases.
  • the visibility of the AR object and the operation convenience of the AR object may be improved.
  • FIG. 1 is a diagram for describing an electronic device according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • FIG. 3 is a block diagram illustrating an additional configuration of an electronic device according to an embodiment of the present disclosure.
  • FIG. 4A is a diagram for explaining the size of an AR object according to an embodiment of the present disclosure.
  • FIG. 4B is a diagram for explaining the size of an AR object according to an embodiment of the present disclosure.
  • FIG. 5 is a diagram for explaining a method of adding an AR object to an image according to an embodiment of the present disclosure.
  • FIG. 6 is a diagram for explaining a method of adding an AR object to an image according to an embodiment of the present disclosure.
  • FIG. 7 is a diagram for explaining a method of adding an AR object to an image according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram for explaining a method of adding an AR object to an image according to an embodiment of the present disclosure.
  • FIG. 9 is a diagram for explaining a method of adding an AR object to an image according to an embodiment of the present disclosure.
  • FIG. 10 is a diagram for explaining a method of adding an AR object to an image according to an embodiment of the present disclosure.
  • FIG. 11 is a diagram for explaining a method of adding an AR object to an image according to an embodiment of the present disclosure.
  • FIG. 12 is a view for explaining an arrangement mode according to an embodiment of the present disclosure.
  • FIG. 13A is a diagram for describing an arrangement mode according to an embodiment of the present disclosure.
  • FIG. 13B is a diagram for describing an arrangement mode according to an embodiment of the present disclosure.
  • FIG. 13C is a diagram for describing an arrangement mode according to an embodiment of the present disclosure.
  • FIG. 14 is a diagram for explaining a normal mode according to an embodiment of the present disclosure.
  • FIG. 15A is a diagram for describing a normal mode according to an embodiment of the present disclosure.
  • FIG. 15B is a diagram for explaining a normal mode according to an embodiment of the present disclosure.
  • FIG. 16 is a flowchart for describing a control method according to an embodiment of the present disclosure.
  • expressions such as “A or B,” “at least one of A and/and B,” or “one or more of A or/and B” may include all possible combinations of the items listed together.
  • For example, “A or B,” “at least one of A and B,” or “at least one of A or B” may refer to any of the cases of (1) including at least one A, (2) including at least one B, or (3) including both at least one A and at least one B.
  • When a component (e.g., a first component) is described as being “coupled with/to (operatively or communicatively)” or “connected to” another component (e.g., a second component), the component may be directly connected to the other component or may be connected through yet another component (e.g., a third component).
  • The expression “a device configured to” may mean that the device is “capable of” performing an operation together with other devices or components.
  • For example, “a processor configured (or set) to perform A, B, and C” may refer to a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a generic-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.
  • FIG. 1 is a diagram for describing an electronic device according to an embodiment of the present disclosure.
  • the electronic device 100 is a device capable of providing augmented reality (AR), and may be implemented as a smart phone or the like.
  • augmented reality may refer to combining a virtual object (hereinafter, augmented reality object; AR object) with an image based on a real environment (space or object).
  • the electronic device 100 may display an image and an AR object on the image. That is, the electronic device 100 may combine an image and an AR object, and display the AR object by superimposing it on the image.
  • the image may include a photographed subject.
  • the subject refers to the real environment 10 located in the photographing area.
  • the image can be recognized as a three-dimensional space through simultaneous localization and mapping (SLAM).
  • the image may be a plurality of image frames photographed in units of a preset frame rate.
  • the AR object may be an image in which an existing object is rendered in 2D or 3D.
  • the AR object may be implemented in the same form as a 2D or 3D rendered image of objects such as a TV, a digital picture frame, a sound bar, a refrigerator, a washing machine, furniture, a car, a building, and a tree.
  • However, the AR object is not limited thereto, and may be implemented in the form of various information such as text, an image, a photo, a video, a document, a dashboard, and the like.
  • the AR object may be displayed in a translucent state.
  • the AR object may be located at a specific point on an image recognized as a 3D space.
  • a specific point on the image may be represented by three-dimensional space coordinates.
  • Here, the x-axis represents the horizontal direction, the y-axis represents the vertical direction, and the z-axis represents the depth direction.
  • a point on the image where the AR object is located may have a relative positional relationship with a point where a subject (eg, a floor, a plane, a wall, an object, etc.) included in the image is located.
  • For example, when the point at which the subject is located is changed (or moved), the point at which the AR object is located may be changed (or moved) in the same direction.
  • the depth may indicate a distance between a point where the electronic device 100 (or camera) is positioned and a point where the AR object is positioned in a three-dimensional space on an image along the z-axis (depth direction).
  • the size (default size) of the AR object may be preset for each AR object.
  • the size of the AR object may indicate a size displayed in the standard depth, and the size may collectively refer to various concepts such as area, volume, diagonal length, horizontal and vertical length, radius, and diameter.
  • the standard depth may indicate a distance (eg, a distance on a z-axis) that is a reference so that the AR object is displayed in a preset size.
  • the standard depth may be preset to a distance of 1 m, which is an example and may be changed to various distances.
  • the size of the AR object may be changed according to a user command.
  • The electronic device 100 may display the AR object in its preset size. For example, assuming that the size of an AR object such as a 100-inch TV is preset to 220 x 125 (width x height), the electronic device 100 may display the AR object (100-inch TV) at the preset size of 220 x 125 when the depth of the point at which the AR object is located on the image is 1 m, which is the standard depth.
  • the electronic device 100 may adjust or maintain the displayed size of the AR object according to the mode of the electronic device.
  • Here, the mode may be one of a normal mode and an arrangement mode.
  • the normal mode is a mode in which the size at which the AR object is displayed may be changed according to the depth of the point where the AR object is disposed (the distance between the electronic device (or camera) and the AR object).
  • the arrangement mode is a mode in which a point where an AR object is located in an image can be changed according to a user command, and the size at which the AR object is displayed can be maintained regardless of the depth of the point where the AR object is placed. That is, the normal mode is a mode to which the perspective for the AR object is applied, and the arrangement mode is a mode to which the perspective to the AR object is not applied.
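  • The following minimal sketch (not part of the patent text; the simple 1/depth pinhole-style scaling, the function names, and the 1 m standard depth are assumptions) illustrates the difference between the two modes: in the normal mode the on-screen size scales with depth, while in the arrangement mode the object's intrinsic size is scaled to compensate so that the on-screen size stays constant.

    // Standard depth at which an AR object is shown at its preset (default) size.
    const val STANDARD_DEPTH_M = 1.0

    // Normal mode: perspective is applied, so the on-screen size shrinks as the
    // depth (camera-to-object distance) grows and grows as the depth shrinks.
    fun displayedSizeNormalMode(presetSize: Double, depthM: Double): Double =
        presetSize * (STANDARD_DEPTH_M / depthM)

    // Arrangement mode: the on-screen size must stay constant, so the object's
    // intrinsic size is scaled to cancel the perspective scaling at the new depth.
    fun intrinsicSizeArrangementMode(presetSize: Double, depthM: Double): Double =
        presetSize * (depthM / STANDARD_DEPTH_M)

    fun main() {
        val tvWidth = 220.0 // hypothetical 100-inch TV AR object, preset width
        println(displayedSizeNormalMode(tvWidth, 3.0))      // ~73.3: looks smaller at 3 m
        println(intrinsicSizeArrangementMode(tvWidth, 3.0)) // 660.0: enlarged so it still appears as 220
    }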
  • In this way, the present disclosure provides an electronic device for adjusting the size of an AR object and a method for controlling the same.
  • the visibility of the AR object and the operation convenience of the AR object may be improved.
  • the electronic device 100 is illustrated as a smartphone in FIG. 1 , this is only an example, and the electronic device 100 may be implemented as a wearable device that a user can wear.
  • For example, the wearable device may be of an accessory type (e.g., a watch, ring, bracelet, anklet, necklace, eyeglasses, contact lens, or head-mounted device (HMD)), a textile- or garment-integrated type (e.g., electronic clothing), a body-attached type, or a bio-implantable circuit.
  • the electronic device 100 may include a tablet PC, a speaker, a mobile phone, a telephone, an e-book reader, a desktop PC, a laptop PC, etc.
  • the electronic device 100 may be implemented as a device having a transparent display or a flexible display in some cases.
  • Here, AR refers to augmented reality, VR to virtual reality, MR to mixed reality, and XR to extended reality.
  • FIG. 2 is a block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure
  • FIG. 3 is a block diagram illustrating an additional configuration of an electronic device according to an embodiment of the present disclosure.
  • the electronic device 100 may include a camera 110 , a display 120 , and a processor 130 .
  • the camera 110 may acquire an image. Specifically, the camera 110 may acquire an image by photographing a subject existing within a field of view (FoV) of the camera 110 at a specific point of view (PoV) of the camera 110 . Also, the camera 110 may sequentially acquire a plurality of images through continuous shooting.
  • FoV field of view
  • PoV point of view
  • the image acquired through the camera 110 or metadata separate from the image may include information about a frame rate, time, viewpoint, angle of view, etc. captured by the camera 110 .
  • the frame rate represents the number of frames (the number of images) acquired per second (or per minute)
  • The angle of view represents a value determined according to the focal length of the lens of the camera 110 and the size (e.g., a diagonal length) of the image sensor (not shown) of the camera 110.
  • the viewpoint may be detected by a sensor (eg, a gyro sensor, an acceleration sensor, etc.) provided inside or outside the camera 110 .
  • the camera 110 may be implemented as an RGB camera or a stereo camera.
  • the RGB camera may include a lens (not shown), an image sensor (not shown), and an image processor (not shown).
  • The lens focuses or diverges light reflected from a subject onto the image sensor, the image sensor divides the transmitted light into pixels and generates an electrical signal by detecting the R (red), G (green), and B (blue) colors of each pixel, and the image processor processes each pixel according to the electrical signal detected by the image sensor to obtain an image representing the color, shape, contrast, etc. of the subject.
  • the image is a projection of a real three-dimensional space onto a virtual two-dimensional plane, and each point (pixel) on the two-dimensional plane constituting the image contains two-dimensional position information (eg, position on the x-axis, position on the y-axis).
  • Meanwhile, the image processor may assign a depth (e.g., a position on the z-axis) to each point (pixel) constituting the image by using a programming library for real-time computer vision analysis (e.g., ARCore, ARToolKit, AR SDK, Unity, OpenCV (Open Source Computer Vision), Python, etc.) on the shading, contrast, point cloud, color, etc. of the image, together with various algorithms such as SLAM.
  • the image acquired through the camera 110 may include 3D position information (eg, an x-axis position, a y-axis position, and a z-axis position).
  • the stereo camera includes a plurality of the above-described RGB cameras, and the plurality of RGB cameras may be disposed to be spaced apart from each other.
  • the description of the RGB camera may be applied.
  • the stereo camera may acquire a plurality of images by simultaneously photographing a subject at different positions at the same point of time.
  • In this case, the image processor calculates a disparity by stereo matching the plurality of images, and may calculate the depth (or distance) between the camera 110 and the subject based on the disparity, the focal length of the lens, and the baseline.
  • In addition, the image processor (or the processor 130) may combine the two-dimensional position information (e.g., an x-axis position and a y-axis position) and the depth information (e.g., a z-axis position) of a reference image among the plurality of images to acquire three-dimensional position information (e.g., an x-axis position, a y-axis position, and a z-axis position) of a subject included in the reference image.
  • stereo matching refers to matching the same subject included in a plurality of images for the same viewpoint through various methods such as global matching and local matching.
  • The disparity indicates a position difference (e.g., a position difference on the x-axis or y-axis) of the same subject included in the plurality of images, and the greater the focal length or the baseline, the greater the disparity.
  • the focal length may refer to a distance between the image sensor and the lens.
  • the baseline may refer to an interval at which a plurality of RGB cameras are spaced apart.
  • the reference image may refer to an image captured by one preset RGB camera among a plurality of RGB cameras.
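  • As a hedged illustration of the relationship described above (a sketch under assumed names and units, not code from the patent; a real stereo pipeline would also need rectification and sub-pixel matching), the depth follows from the disparity, focal length, and baseline as:

    // Depth from a stereo pair: depth = focal length x baseline / disparity.
    // focalLengthPx: lens focal length expressed in pixels
    // baselineM: spacing between the two RGB cameras, in meters
    // disparityPx: horizontal position difference of the same subject in the two images, in pixels
    fun stereoDepthMeters(focalLengthPx: Double, baselineM: Double, disparityPx: Double): Double {
        require(disparityPx > 0.0) { "disparity must be positive" }
        return focalLengthPx * baselineM / disparityPx
    }

    fun main() {
        // e.g. f = 800 px, baseline = 6 cm, disparity = 16 px -> depth = 3.0 m
        println(stereoDepthMeters(800.0, 0.06, 16.0))
    }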
  • the camera 110 may be implemented as an RGB-D (Depth) camera.
  • the RGB-D (Depth) camera may acquire an image by photographing a subject, and detect a depth (or distance) between the camera 110 and the subject.
  • In this case, the processor 130 may combine the two-dimensional position information (e.g., the x-axis position and the y-axis position) and the depth information (e.g., the z-axis position) of the image frame to acquire three-dimensional position information (e.g., an x-axis position, a y-axis position, and a z-axis position) of a subject included in the image frame.
  • the RGB-D camera may include a sensor (eg, a Time Of Flight (TOF) sensor, a LIDAR sensor, etc.) coupled to an RGB camera or a stereo camera.
  • The processor 130 may control the display 120 to display an AR object at a specific location on the image while building a map using SLAM-based tracking technology with the camera 110.
  • the processor 130 constructs a map of a three-dimensional space through a subject, a feature point, or a point cloud of each of the plurality of images.
  • In addition, the processor 130 may compare the map with the three-dimensional space of the image most recently acquired through the camera (hereinafter, the current image), and control the display 120 to display the AR object at the point on the current image corresponding to the point on the map at which the AR object is located. Accordingly, the point at which the AR object is displayed can be synchronized in real time.
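  • As a minimal sketch of what displaying the AR object at the corresponding point involves (the pinhole projection model, the function names, and the intrinsic parameters below are assumptions, not details given in the patent), a 3D anchor expressed in the current camera's coordinate frame could be projected to pixel coordinates as follows:

    data class Point3(val x: Double, val y: Double, val z: Double)

    // Project a 3D anchor point, given in the current camera's coordinate frame
    // (z = depth along the viewing direction), onto 2D pixel coordinates with a
    // simple pinhole model. fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
    fun projectToImage(p: Point3, fx: Double, fy: Double, cx: Double, cy: Double): Pair<Double, Double>? {
        if (p.z <= 0.0) return null // point is behind the camera, nothing to draw
        return Pair(cx + fx * p.x / p.z, cy + fy * p.y / p.z)
    }

    fun main() {
        // Anchor 0.5 m to the right, 0.2 m above, 2 m in front of the camera.
        println(projectToImage(Point3(0.5, -0.2, 2.0), 800.0, 800.0, 640.0, 360.0)) // (840.0, 280.0)
    }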
  • the display 120 is a device for visually outputting information or data.
  • the display 120 may display the image and the AR object in all or part of the display area.
  • the display area may refer to a pixel unit area in which information or data is visually displayed.
  • the display 120 may be implemented as a flexible display.
  • In this case, the display 120 has a flexible property such that it can be bent or folded, and can display the image and the AR object even in a bent or folded state.
  • Alternatively, the display 120 may be implemented as a transparent display, and an object located behind the display 120 may be seen through the display 120 due to its transparent nature.
  • The processor 130 controls the display 120 to display the image acquired through the camera 110 and the AR object on the image. In a state in which the mode of the electronic device 100 is the normal mode, when the depth of the point at which the AR object is located on the image is changed, the processor 130 adjusts the size at which the AR object is displayed on the display 120 based on the changed depth. In a state in which the mode of the electronic device 100 is the arrangement mode, when the depth of the point at which the AR object is located on the image is changed, the processor 130 may control the display 120 to maintain the size at which the AR object is displayed.
  • the processor 130 may control the display 120 to display the AR object on the image and the image acquired through the camera 110 .
  • the processor 130 may control the display 120 to display the image and the AR object on the image based on the size of the AR object.
  • the size of the AR object is set to a size displayed at the standard depth for each AR object as a default, and may be changed according to the mode.
  • the size may collectively refer to various concepts such as area, volume, diagonal length, horizontal and vertical length, radius, and diameter.
  • When the depth of the point at which the AR object is located on the image is changed, the processor 130 may adjust the size at which the AR object is displayed on the display 120 based on the changed depth.
  • the depth of the point where the AR object is positioned may be changed as the point where the camera 110 is positioned is moved according to the depth direction.
  • the depth direction may be a z-axis direction.
  • the point at which the camera 110 is located may be changed.
  • In this case, the distance (depth) between the point at which the camera 110 is positioned and the point at which the AR object is positioned may be changed. This can apply not only in the normal mode but also in the arrangement mode.
  • Specifically, in a state in which the mode of the electronic device 100 is the normal mode, the processor 130 may adjust the size at which the AR object is displayed on the display 120 based on the depth of the point at which the AR object is located and the attribute assigned to the AR object.
  • In a state in which the mode of the electronic device 100 is the normal mode, when the attribute assigned to the AR object is a fixed-size attribute, the processor 130 may maintain the size at which the AR object is displayed on the display 120 even if the depth of the AR object on the image is changed. When the attribute assigned to the AR object is a variable-size attribute, the processor 130 may adjust the size at which the AR object is displayed on the display 120 so that the size decreases when the depth of the point at which the AR object is located on the image increases, and so that the size increases when the depth of the point at which the AR object is located decreases.
  • the attribute assigned to the AR object may be one of a fixed size attribute and a variable size attribute.
  • Each AR object may be assigned an attribute selected from between the fixed-size attribute and the variable-size attribute according to a user command (e.g., a touch gesture, a motion gesture, a user's voice, a mouse click, etc.) received through the input interface 170 (see FIG. 3).
  • the processor 130 may consider that a size-variable attribute is assigned to the AR object when a fixed-size attribute is not assigned to the AR object.
  • the processor 130 may assign a fixed size attribute to the AR object when the AR object is text-type information such as letters, numbers, and symbols, even without a user command (ie, automatically).
  • the attribute assigned to the AR object may be stored in the memory 150 .
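  • A compact sketch of the resulting behavior (the enum and function names are assumptions introduced only for illustration; they do not appear in the patent) is shown below: the displayed size follows depth only in the normal mode with a variable-size attribute, and text-type objects default to the fixed-size attribute.

    enum class Mode { NORMAL, ARRANGEMENT }
    enum class SizeAttribute { FIXED, VARIABLE }

    // The arrangement mode and the fixed-size attribute both keep the displayed size
    // constant; only the normal mode combined with a variable-size attribute lets the
    // displayed size change with depth.
    fun displayedSizeFollowsDepth(mode: Mode, attribute: SizeAttribute): Boolean =
        mode == Mode.NORMAL && attribute == SizeAttribute.VARIABLE

    // Text-type AR objects (letters, numbers, symbols) are given the fixed-size
    // attribute automatically; otherwise the variable-size attribute is assumed.
    fun defaultAttributeFor(isTextType: Boolean): SizeAttribute =
        if (isTextType) SizeAttribute.FIXED else SizeAttribute.VARIABLE

    fun main() {
        println(displayedSizeFollowsDepth(Mode.NORMAL, defaultAttributeFor(isTextType = true)))  // false
        println(displayedSizeFollowsDepth(Mode.NORMAL, defaultAttributeFor(isTextType = false))) // true
    }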
  • Meanwhile, the user command of the present disclosure may be, for example, various types of input such as a touch gesture, a motion gesture, a user's voice, a mouse click, a keyboard input, a button input, and the like. Furthermore, the user command is not limited thereto and may be modified, as technology develops, into any type of input through which the user can interact with the electronic device. Hereinafter, the touch gesture is described as a representative example.
  • First, when the attribute assigned to the AR object is a variable-size attribute, the processor 130 may adjust the size at which the AR object is displayed on the display 120 so that it changes based on the changed depth. That is, in this case, perspective may be applied to the AR object as shown in FIG. 4A.
  • Specifically, in a state in which the mode of the electronic device 100 is the normal mode and the attribute assigned to the AR object is a variable-size attribute, the processor 130 may adjust the AR object so that the size at which the AR object is displayed on the display 120 decreases when the depth of the point at which the AR object is located on the image increases, and increases when the depth of the point at which the AR object is located on the image decreases.
  • For example, referring to FIG. 4A, assume that the depth of the AR objects 415 and 425 is changed from 1 m to 3 m as the viewpoint of the camera 110 moves in the normal mode.
  • In this case, the processor 130 may control the display 120 to display the AR objects 415 and 425 on the images 410 and 420 based on the sizes and depths (1 m, 3 m) of the AR objects 415 and 425.
  • the AR objects 415 and 425 have the same preset size (default size).
  • the size of the AR objects 415 and 425 may be a size displayed at a standard depth (eg, 1 m).
  • The processor 130 may control the display 120 to display the AR object 415 on the image 410 at the preset size of the AR object 415 at the depth of 1 m, which is the standard depth.
  • Meanwhile, the processor 130 may determine whether the attribute assigned to the AR object is the fixed-size attribute or the variable-size attribute.
  • When the attribute assigned to the AR object is the variable-size attribute, the processor 130 may control the display 120 to display the AR object 425 on the image 420 at a size smaller than the preset size of the AR object 425, due to perspective, at the depth of 3 m, which is greater than the standard depth.
  • That is, the processor 130 may decrease the size at which the AR objects 415 and 425 are displayed on the images 410 and 420 according to an increased depth, and conversely may increase the size at which the AR objects 415 and 425 are displayed on the images 410 and 420 according to a decreased depth.
  • Meanwhile, when the attribute assigned to the AR object is the fixed-size attribute while the mode of the electronic device 100 is the normal mode, the processor 130 may control the display 120 to maintain the size at which the AR object is displayed. That is, in this case, the perspective for the AR object may be ignored as shown in FIG. 4B.
  • Specifically, in a state in which the mode of the electronic device 100 is the normal mode and the attribute assigned to the AR object is the fixed-size attribute, the processor 130 may control the size at which the AR object is displayed on the display 120 to be maintained even if the depth of the point at which the AR object is located on the image is changed.
  • For example, referring to FIG. 4B, assume that the depth of the AR objects 435 and 445 is changed from 1 m to 3 m as the viewpoint of the camera 110 moves in the normal mode.
  • In this case, the processor 130 may control the display 120 to display the AR objects 435 and 445 on the images 430 and 440 based on the sizes and depths (1 m, 3 m) of the AR objects 435 and 445.
  • the AR objects 435 and 445 have the same preset size (default size).
  • the size of the AR objects 435 and 445 may be a size displayed at a standard depth (eg, 1 m).
  • The processor 130 may control the display 120 to display the AR object 435 on the image 430 at the preset size of the AR object 435 at the depth of 1 m, which is the standard depth.
  • Meanwhile, the processor 130 may determine whether the attribute assigned to the AR object is the fixed-size attribute or the variable-size attribute.
  • When the attribute assigned to the AR object is the fixed-size attribute, instead of displaying the AR object 445 at a size smaller than its preset size at the depth of 3 m, which is greater than the standard depth, the processor 130 may increase the preset size of the AR object 445 according to the depth of the point at which the AR object 445 is disposed so that the size at which the AR object 445 is displayed is maintained. Accordingly, perspective according to the depth of the AR object may be ignored.
  • Here, size(A) represents the size when the AR object is located at point A, size(B) represents the size when the AR object is located at point B, and distance(AB) represents the depth difference (or distance) between point A and point B.
  • That is, the processor 130 may keep the size at which the AR object is displayed on the display 120 constant by adjusting the predetermined intrinsic size (i.e., the original size) of the AR object.
  • Accordingly, the size at which the AR object is displayed on the display 120 may be maintained so that it appears neither larger nor smaller. The visibility of information may thus be improved in that the user can view an AR object to which the fixed-size attribute is assigned, such as text-type information, at an appropriate size regardless of depth (distance). In addition, user convenience may be improved in that the user does not need to separately adjust the size of the AR object.
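  • As a hedged sketch of the compensation just described (the exact relation between size(A), size(B), and distance(AB) is not spelled out in the available text; the linear pinhole-style scaling and the function name below are assumptions), scaling the intrinsic size by the ratio of the new depth to the old depth keeps the on-screen size constant:

    // Keep the on-screen size of a fixed-size AR object constant when its depth
    // changes from depthA to depthB: under pinhole perspective the displayed size
    // is proportional to intrinsicSize / depth, so scaling the intrinsic size by
    // depthB / depthA cancels the change.
    fun compensatedIntrinsicSize(sizeAtA: Double, depthA: Double, depthB: Double): Double {
        require(depthA > 0.0 && depthB > 0.0)
        return sizeAtA * (depthB / depthA)
    }

    fun main() {
        // An object sized 220 at a depth of 1 m must grow to 660 when moved to 3 m
        // for its on-screen size to stay the same.
        println(compensatedIntrinsicSize(220.0, 1.0, 3.0)) // 660.0
    }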
  • Meanwhile, in a state in which the mode of the electronic device 100 is the arrangement mode, when the depth of the point at which the AR object is located on the image is changed, the processor 130 may control the size at which the AR object is displayed on the display 120 to be maintained. That is, in this case, the perspective for the AR object may be ignored.
  • Here, the arrangement mode may include not only an edit mode for correcting (editing) the point at which an AR object already displayed on the image is located, but also an addition mode for placing an AR object not yet displayed on the image at a specific point on the image.
  • the depth of the point at which the AR object is located may be changed according to the movement of the point at which the camera 110 is located or the point at which the AR object is located on the image according to the depth direction.
  • the point at which the AR object is located on the image may be changed.
  • the distance (depth) between the point where the camera 110 is positioned and the point where the AR object is positioned may be changed. This can be applied only in the arrangement mode where the point at which the AR object is placed can be changed.
  • Specifically, when the depth of the point at which the AR object is located on the image is changed in the arrangement mode, the processor 130 may change the size of the AR object so that the size at which the AR object is displayed on the display 120 is maintained.
  • That is, when the depth of the point at which the AR object is located increases, the processor 130 may increase the size of the AR object to maintain the size at which the AR object is displayed on the display 120, and when the depth of the point at which the AR object is located decreases, the processor 130 may reduce the size of the AR object to maintain the size at which the AR object is displayed on the display 120.
  • For example, referring to FIG. 4B, assume that the depth of the AR objects 435 and 445 is changed from 1 m to 3 m by moving the point at which the AR object is located in the arrangement mode.
  • In this case, the processor 130 may control the display 120 to display the AR objects 435 and 445 on the images 430 and 440 based on the sizes and depths (e.g., 1 m, 3 m) of the AR objects 435 and 445.
  • the sizes of the AR objects 435 and 445 may be changed from the same preset size (default size).
  • The processor 130 may control the display 120 to display the AR object 435 on the image 430 at the preset size of the AR object 435 at the depth of 1 m, which is the standard depth.
  • Meanwhile, at the depth of 3 m, which is greater than the standard depth, the processor 130 may increase the preset size of the AR object 445 according to the depth of the point at which the AR object 445 is disposed so that the size at which the AR object 445 is displayed is maintained. Accordingly, perspective according to the depth of the AR object may be ignored.
  • Here, size(A) represents the size when the AR object is located at point A, size(B) represents the size when the AR object is located at point B, and distance(AB) represents the depth difference (or distance) between point A and point B.
  • the processor 130 may maintain the size at which the AR object is displayed on the display 120 at a constant size by adjusting a predetermined unique size of the AR object.
  • As described above, the electronic device 100 may maintain the size at which the AR object is displayed on the display 120, ignoring perspective, while the position at which the AR object is disposed is changed in the arrangement mode. Accordingly, user convenience in manipulating the AR object may be improved in that the displayed size does not change depending on the depth of the point at which the AR object is located, and the user can more easily predict the actual size of the AR object.
  • Meanwhile, referring to FIG. 3, in addition to the camera 110, the display 120, and the processor 130, the electronic device 100 may further include at least one of a speaker 140, a memory 150, a communication unit 160, and an input interface 170.
  • the processor 130 may control the electronic device 100 by executing at least one instruction stored in the memory 150 .
  • the processor 130 may be connected to the camera 110 , the display 120 , and the memory 150 to control the electronic device 100 .
  • The processor 130 may read and interpret instructions and determine a sequence for data processing, and may thereby control the operation of another device by providing the other device with timing and control signals that control its operation.
  • the processor 130 may control the electronic device 100 by executing at least one instruction stored in a memory (not shown) provided in the processor 130 .
  • The memory provided in the processor 130 may include a ROM (e.g., NOR or NAND flash memory), a RAM (e.g., dynamic RAM (DRAM), synchronous DRAM (SDRAM), or double data rate SDRAM (DDR SDRAM)), a volatile memory, and the like.
  • The processor 130 may include one or a plurality of processors, and may be implemented as a general-purpose processor such as a central processing unit (CPU) or an application processor (AP), a graphics-dedicated processor such as a graphics processing unit (GPU) or a vision processing unit (VPU), or an artificial-intelligence-dedicated processor such as a neural processing unit (NPU).
  • the processor 130 may include a GPU and a CPU, and the GPU and the CPU may perform the operations of the present disclosure in connection.
  • the GPU may process an image frame among data
  • the CPU may process the remaining data (eg, instructions, code, etc.).
  • For example, the GPU may be implemented in a structure with hundreds or thousands of cores specialized for a parallel processing method that processes multiple commands or data simultaneously, and the CPU may be implemented in a structure with several cores specialized for a serial processing method that processes commands or data in the order in which they are input.
  • For example, the GPU may detect a plurality of fingers from a plurality of first image frames acquired through the camera 110; when the detected poses of the plurality of fingers correspond to a trigger pose, the CPU may enter a character input mode; the GPU may detect a motion of one of the plurality of fingers from a plurality of second image frames acquired through the camera 110 in the character input mode; and the CPU may identify a key corresponding to the motion among a plurality of keys mapped to the finger, based on the position of the finger according to the motion and the position of a reference point set for the finger, and may control the display 120 to display information corresponding to the identified key.
  • The speaker 140 may directly output various notification sounds or voice messages as well as various audio data on which processing operations such as decoding, amplification, and noise filtering have been performed by an audio processing unit (not shown), and may be built into the electronic device 100 or implemented as a separate external device.
  • The speaker 140 may also be implemented as a directional speaker that transmits sound only to a specific location or area.
  • The memory 150 may refer to hardware that stores information such as data in an electrical or magnetic form so that the camera 110, the processor 130, etc. can access it, and may be implemented as at least one hardware component among a non-volatile memory, a volatile memory, a flash memory, a hard disk drive (HDD), a solid state drive (SSD), RAM, ROM, and the like.
  • At least one instruction, program, or data required for the operation of the electronic device 100 or the processor 130 may be stored in the memory 150 .
  • the instruction is a code unit for instructing the operation of the electronic device 100 or the processor 130 , and may be written in machine language, which is a language that a computer can understand.
  • a program may be a set of instructions that perform a specific task in a unit of work.
  • the data may be status information in units of bits or bytes that can represent characters, numbers, images, and the like.
  • The memory 150 may store an image frame acquired by the camera 110, information corresponding to a key identified by the processor 130, and the like.
  • the memory 150 is accessed by the processor 130 , and reading/writing/modification/deletion/update of instructions, programs, or data may be performed by the processor 130 .
  • the communication unit 160 may transmit/receive various types of data to/from various types of external devices (eg, servers) according to various wired or wireless communication methods.
  • the communication unit 160 may perform direct communication with an external device or may communicate with an external device via (or relay) other external devices through various communication networks.
  • the communication unit 160 may include circuitry according to each communication method, and may further include an antenna or the like in the case of a wireless communication method.
  • the communication unit 160 may receive information from an external device and transmit the received information to the processor 130 . Also, the communication unit 160 may transmit information to an external device under the control of the processor 130 .
  • For example, the communication unit 160 may include at least one of a Wi-Fi chip using a Wi-Fi communication method, a Bluetooth chip using a Bluetooth communication method, an NFC chip using a near field communication (NFC) communication method, a wireless communication chip using a mobile communication method (e.g., long-term evolution (LTE), LTE Advance (LTE-A), 5th generation (5G), code division multiple access (CDMA), or wideband CDMA (WCDMA)), and an infrared chip using an infrared communication method. Furthermore, the communication unit 160 may include at least one of an Ethernet module (not shown) and a USB module (not shown) for performing wired communication.
  • the communication unit 160 may include a network interface or a network chip according to a wired/wireless communication method.
  • the communication method of the communication unit 160 is not limited to the above-described example, and may include a communication method that appears newly according to the development of technology.
  • The input interface 170 may receive various user inputs and transmit them to the processor 130.
  • the input interface 170 may include, for example, at least one of a touch panel (not shown), a pen sensor (not shown), a key (not shown), and a microphone (not shown).
  • the touch panel may use, for example, at least one of a capacitive type, a pressure-sensitive type, an infrared type, and an ultrasonic type, and for this, the touch panel may include a control circuit.
  • the touch panel may further include a tactile layer to provide a tactile response to the user.
  • the pen sensor may be, for example, a part of the touch panel or may include a separate recognition sheet.
  • the key may include, for example, a physical button, an optical key, or a keypad.
  • The microphone may directly receive the user's voice, and an audio signal may be obtained by digitally converting the user's voice, which is an analog signal, with a digital converter (not shown).
  • Such an input interface 170 may be embedded in the electronic device 100 or implemented as a separate external device (not shown) such as a keyboard, mouse, external microphone, remote control, or the like.
  • FIGS. 5 to 11 are diagrams for explaining a method of adding an AR object to an image according to an embodiment of the present disclosure.
  • the processor 130 of the electronic device 100 may control the display 120 to display an image 520 acquired through the camera 110 .
  • the image 520 may include a subject photographed through the camera 110 , where the subject may include a real environment 510 (space or object) located within the angle of view from the viewpoint of the camera 110 .
  • the processor 130 may recognize each of a plurality of pixels included in the image 520 as 3D spatial coordinates including a depth by using the SLAM-based tracking technology of the camera 110 . That is, the processor 130 may recognize the image as a three-dimensional space. Furthermore, the processor 130 may generate a plurality of images 520 sequentially acquired through the camera 110 as a map of a three-dimensional space and store it in the memory 150 .
  • the processor 130 may control the display 120 to display UIs 530 and 550 for selecting the arrangement mode.
  • the mode of the electronic device 100 may be set to the arrangement mode.
  • The processor 130 may set the mode of the electronic device 100 to the addition mode among the arrangement modes and, as shown in FIG. 6, display a list of a plurality of pre-stored AR objects 631 to 633 on the display 120.
  • Alternatively, the processor 130 may set the mode of the electronic device 100 to the edit mode among the arrangement modes and, according to a user command, change the position of one of the AR objects displayed on the image. Specific details thereof will be described later with reference to FIGS. 12 to 13C.
  • The plurality of AR objects 631 to 633 are AR objects pre-stored in the memory 150, and each may be a rendered image or a text box into which text can be input.
  • For example, the AR objects may be 2D- or 3D-rendered images of objects such as TVs, digital picture frames, sound bars, refrigerators, washing machines, furniture, cars, buildings, and trees, or may take the form of various information such as text, images, photos, videos, documents, dashboards, and the like.
  • The processor 130 may control the display 120 to display a selected third AR object 733 on the image 720, as shown in FIG. 7.
  • the added AR object 733 may be displayed with a preset size. That is, the perspective for the AR object 733 may be ignored.
  • the AR object 733 displayed on the display 120 may be a text box, and the AR object 733 may include text input according to a user command.
  • the processor 130 may change a point at which the AR object 833 added on the image 820 is located in the addition mode according to a user command (eg, touch drag, etc.).
  • the point at which the AR object 833 is located may be moved along the 3D space axis (x, y, z axis) in the image 820 according to a user command.
  • the processor 130 may change the preset size of the AR object 933 added to the image 920 in the addition mode according to a user command (eg, pinch zoom).
  • the size of the AR object 933 displayed on the display 120 may be changed according to a user command.
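  • A minimal sketch of such a pinch-zoom adjustment (the function name, the spacing-ratio scale factor, and the clamping bounds are assumptions for illustration, not details from the patent) could look like this:

    // Apply a pinch-zoom gesture to the preset size of an AR object in the addition mode.
    // The scale factor is the ratio of the current finger spacing to the spacing when the
    // gesture started; the bounds simply keep the object at a usable size.
    fun pinchZoomedSize(presetSize: Double, startSpacingPx: Double, currentSpacingPx: Double,
                        minScale: Double = 0.1, maxScale: Double = 10.0): Double {
        val scale = (currentSpacingPx / startSpacingPx).coerceIn(minScale, maxScale)
        return presetSize * scale
    }

    fun main() {
        // Fingers spread from 120 px apart to 180 px apart -> the object is scaled by 1.5x.
        println(pinchZoomedSize(220.0, 120.0, 180.0)) // 330.0
    }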
  • When a user command (e.g., a touch longer than a preset time) for the AR object 1033 added to the image 1020 is received in the addition mode, the processor 130 may control the display 120 to display a UI 1035 for assigning an attribute to the AR object 1033 and a UI 1037 for deleting the AR object 1033.
  • In this case, the processor 130 may control the display 120 to display a UI for selecting one of the fixed-size attribute and the variable-size attribute as the attribute for the AR object 1033.
  • the attribute assigned to the AR object may be changed according to a user command.
  • Meanwhile, when a user command for selecting the UI 1037 is received, the processor 130 may remove (or delete) the AR object 1033 displayed on the image 1020.
  • Meanwhile, the processor 130 may control the display 120 to display a first UI 740, 840, 940, 1040, 1140 for saving the set state (size, location, attribute, etc.) so that the AR object 733, 833, 933, 1033, 1133 added according to a user command is displayed in that state in the normal mode, and a second UI 770, 870, 970, 1070, 1170 for canceling the addition so that the AR object 733, 833, 933, 1033, 1133 added according to the user command is not displayed in the normal mode.
  • That is, the processor 130 may cause the mode of the electronic device 100 to enter the normal mode according to a user command for selecting the first UI 740, 840, 940, 1040, 1140, and may control the display 120 to display the AR object 733, 833, 933, 1033, 1133 according to the set state.
  • Alternatively, the processor 130 may cause the mode of the electronic device 100 to enter the normal mode according to a user command for selecting the second UI 770, 870, 970, 1070, 1170, and may remove (or delete) the AR object 733, 833, 933, 1033, 1133 so that it is not displayed in the normal mode.
  • FIGS. 12 to 13C are diagrams for explaining an arrangement mode according to an embodiment of the present disclosure.
  • the processor 130 of the electronic device 100 may control the display 120 to display UIs 1230 and 1250 for selecting a layout mode.
  • the processor 130 may set (or enter) the mode of the electronic device 100 to the addition mode among the arrangement modes, which has been described above with reference to FIG. 5 and the related figures.
  • the processor 130 may set (or enter) the mode of the electronic device 100 to the edit mode among the arrangement modes. In this case, the processor 130 may change the arrangement of the AR object 1233, among the image 1220 and the AR object 1233 displayed on the display 120, according to a user command.
  • the points at which the AR objects 1333A and 1333B are located on the images 1320A and 1320B may be changed according to a user command.
  • when the depth of the point at which the AR object 1333A is located on the image 1320A decreases, the processor 130 may decrease the preset size of the AR object 1333A so that the size at which the AR object 1333A is displayed is maintained.
  • when the depth of the point at which the AR object 1333B is located on the image 1320B increases, the processor 130 may increase the preset size of the AR object 1333B so that the size at which the AR object 1333B is displayed is maintained.
  • the processor 130 may control the size at which the AR objects 1333A and 1333B are displayed on the display 120 to remain the size at which the AR object 1233 is displayed on the display 120 in FIG. 12. That is, the size of the AR objects 1333A and 1333B displayed on the display 120 may be maintained as the size at which the AR object 1233 was displayed on the display 120 when the edit mode was set.
  • in other words, the size change that would otherwise be caused by perspective may be offset.
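  • As a sketch of the compensation described above, and under the simplifying assumption of a pinhole-style projection in which the on-screen size is proportional to the object size divided by its depth, the preset size could be rescaled by the depth ratio so that the displayed size stays constant (all names are illustrative):

```kotlin
// Minimal sketch: in the arrangement (edit) mode, the preset size is rescaled by the
// ratio of the new depth to the old depth, so the displayed size does not change.
data class ArObject(var presetSize: Float, var depth: Float)

fun onDepthChangedInArrangementMode(obj: ArObject, newDepth: Float) {
    require(newDepth > 0f && obj.depth > 0f)
    obj.presetSize *= newDepth / obj.depth // larger depth -> larger preset size, and vice versa
    obj.depth = newDepth
}

// Assumed projection model: on-screen size ~ presetSize / depth.
fun displayedSize(obj: ArObject): Float = obj.presetSize / obj.depth

fun main() {
    val obj = ArObject(presetSize = 1.0f, depth = 2.0f)
    val before = displayedSize(obj)
    onDepthChangedInArrangementMode(obj, newDepth = 4.0f)
    println(before == displayedSize(obj)) // true: the displayed size is maintained
}
```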
  • FIGS. 14 to 15B are diagrams for explaining a normal mode according to an embodiment of the present disclosure.
  • the processor 130 of the electronic device 100 may control the display 120 to display an image 1420 obtained through the camera 110 and AR objects 1431 and 1432.
  • the AR objects 1431 and 1432 may be arranged on the image 1420 through the above-described arrangement mode.
  • the electronic device 100 may be moved from the state shown in FIG. 14 to a point where the depth is changed, as shown in FIGS. 15A and 15B.
  • the first AR objects 1531A and 1531B are assigned the variable-size property, and the second AR objects 1532A and 1532B are assigned the fixed-size property.
  • the processor 130 may determine the property assigned to each of the first and second AR objects 1531A and 1532A.
  • with respect to the first AR object 1531A, to which the variable-size property is assigned, the processor 130 may adjust the first AR object 1531A so that the size at which the first AR object 1531A is displayed on the image 1520A increases as the depth decreases.
  • with respect to the second AR object 1532A, to which the fixed-size property is assigned, the processor 130 may reduce the preset size of the second AR object 1532A so that the size at which the second AR object 1532A is displayed on the image 1520A is maintained.
  • the processor 130 may determine the property assigned to each of the first and second AR objects 1531B and 1532B.
  • with respect to the first AR object 1531B, to which the variable-size property is assigned, the processor 130 may adjust the first AR object 1531B so that the size at which the first AR object 1531B is displayed on the image 1520B decreases as the depth increases.
  • with respect to the second AR object 1532B, to which the fixed-size property is assigned, the processor 130 may increase the preset size of the second AR object 1532B so that the size at which the second AR object 1532B is displayed on the image 1520B is maintained.
  • that is, in the normal mode, perspective is applied to the first AR object 1531A and ignored for the second AR object 1532A according to the property assigned to each AR object.
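  • The behavior of FIGS. 15A and 15B can be sketched as follows, again assuming a pinhole-style projection (on-screen size proportional to preset size divided by depth); the property enum and function names are assumptions made for this example, not the disclosed implementation:

```kotlin
// Minimal sketch of normal-mode sizing: perspective is applied to variable-size
// objects and offset for fixed-size objects by rescaling their preset size.
enum class SizeProperty { FIXED, VARIABLE }

data class ArObject(var presetSize: Float, var depth: Float, val property: SizeProperty)

fun onDepthChangedInNormalMode(obj: ArObject, newDepth: Float) {
    require(newDepth > 0f && obj.depth > 0f)
    if (obj.property == SizeProperty.FIXED) {
        // Keep the displayed size constant by scaling the preset size with the depth.
        obj.presetSize *= newDepth / obj.depth
    }
    // For VARIABLE objects the preset size is untouched, so the displayed size grows
    // as the depth decreases and shrinks as the depth increases.
    obj.depth = newDepth
}

// Assumed projection model: on-screen size ~ presetSize / depth.
fun displayedSize(obj: ArObject): Float = obj.presetSize / obj.depth

fun main() {
    val variable = ArObject(presetSize = 1.0f, depth = 2.0f, property = SizeProperty.VARIABLE)
    val fixed = ArObject(presetSize = 1.0f, depth = 2.0f, property = SizeProperty.FIXED)
    listOf(variable, fixed).forEach { onDepthChangedInNormalMode(it, newDepth = 1.0f) } // camera moved closer
    println(displayedSize(variable)) // 1.0 -> larger than the initial 0.5
    println(displayedSize(fixed))    // 0.5 -> unchanged
}
```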
  • FIG. 16 is a diagram for describing a flowchart according to an embodiment of the present disclosure.
  • the control method of the electronic device 100 includes displaying an image acquired through the camera 110 and an AR object on the image (S1610), adjusting the size at which the AR object is displayed based on the changed depth when the depth of the point where the AR object is located on the image is changed while the mode of the electronic device 100 is the normal mode (S1620), and controlling the size at which the AR object is displayed to be maintained when the depth of the point where the AR object is located on the image is changed while the mode of the electronic device 100 is the arrangement mode (S1630).
  • first, the image acquired through the camera 110 and the AR object on the image may be displayed (S1610). Here, the AR object may be displayed on the image acquired through the camera 110 based on the size of the AR object.
  • the size of the AR object may be set, as a default, to the size at which the AR object is displayed at a standard depth for each AR object, and may be changed according to the mode.
  • the size may collectively refer to various concepts such as area, volume, diagonal length, horizontal and vertical length, radius, and diameter.
  • the size at which the AR object is displayed may be adjusted based on the changed depth ( S1620).
  • the depth of the point where the AR object is positioned may be changed as the point where the camera 110 is positioned is moved according to the depth direction.
  • the size at which the AR object is displayed may be adjusted based on the depth of the point where the AR object is located and the property assigned to the AR object.
  • if the property assigned to the AR object is the fixed-size property, the size at which the AR object is displayed is maintained, and if the property assigned to the AR object is the variable-size property, the size at which the AR object is displayed may be adjusted based on the changed depth.
  • the size at which the AR object is displayed may be controlled to be maintained.
  • in the normal mode, if the property assigned to the AR object is the variable-size property, the size at which the AR object is displayed may be adjusted so that it decreases when the depth of the point where the AR object is located on the image increases, and increases when the depth of the point where the AR object is located decreases.
  • next, while the mode of the electronic device 100 is the arrangement mode, when the depth of the point where the AR object is located on the image is changed, control may be performed to maintain the size at which the AR object is displayed (S1630).
  • the depth of the point at which the AR object is positioned may be changed as the point at which the camera 110 is positioned or the point at which the AR object is positioned on the image is moved according to the depth direction.
  • the size of the AR object may be changed to maintain the size at which the AR object is displayed.
  • the controlling may include increasing the size of the AR object so as to maintain the size at which the AR object is displayed when the depth of the point where the AR object is located increases in the arrangement mode, and reducing the size of the AR object so as to maintain the size at which the AR object is displayed when the depth of the point where the AR object is located decreases.
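  • Combining the steps S1610 to S1630, a hedged end-to-end sketch of the control method could dispatch on the mode as shown below; the Mode and SizeProperty names and the projection model (on-screen size proportional to preset size divided by depth) are assumptions carried over from the earlier sketches:

```kotlin
// Minimal sketch tying the flowchart together: adjust the displayed size in the
// normal mode (S1620) and keep it constant in the arrangement mode (S1630).
enum class Mode { NORMAL, ARRANGEMENT }
enum class SizeProperty { FIXED, VARIABLE }

data class ArObject(var presetSize: Float, var depth: Float, val property: SizeProperty)

fun onDepthChanged(mode: Mode, obj: ArObject, newDepth: Float) {
    require(newDepth > 0f && obj.depth > 0f)
    when (mode) {
        // S1620: only fixed-size objects are compensated; variable-size objects
        // follow perspective and are not rescaled.
        Mode.NORMAL -> if (obj.property == SizeProperty.FIXED) {
            obj.presetSize *= newDepth / obj.depth
        }
        // S1630: every object is compensated so that its displayed size is kept.
        Mode.ARRANGEMENT -> obj.presetSize *= newDepth / obj.depth
    }
    obj.depth = newDepth
}

// Assumed projection model: on-screen size ~ presetSize / depth.
fun displayedSize(obj: ArObject): Float = obj.presetSize / obj.depth
```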
  • Various embodiments of the present disclosure may be implemented as software including instructions stored in a machine-readable storage medium that is readable by a machine (e.g., a computer).
  • the machine may call the stored instructions from the storage medium and operate according to the called instruction, and may include the electronic device (e.g., the electronic device 100) according to the disclosed embodiments. When the instruction is executed by the processor, the function described in the instruction may be performed by the processor directly, or by using other components under the control of the processor.
  • the instruction may include code generated or executed by a compiler or an interpreter.
  • the machine-readable storage medium may be provided in the form of a non-transitory storage medium, where 'non-transitory' means that the storage medium does not include a signal and is tangible, but does not distinguish whether data is stored semi-permanently or temporarily in the storage medium.
  • the method according to various embodiments may be provided by being included in a computer program product.
  • Computer program products may be traded between sellers and buyers as merchandise.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)) or online through an application store (e.g., Play Store™).
  • in the case of online distribution, at least a part of the computer program product may be temporarily stored or temporarily created in a storage medium such as a memory of a manufacturer's server, an application store's server, or a relay server.
  • Each of the components (e.g., a module or a program) may be composed of a single entity or a plurality of entities, some of the above-described sub-components may be omitted, or other sub-components may be further included in the various embodiments. Alternatively or additionally, some components (e.g., a module or a program) may be integrated into a single entity that performs the same or similar functions performed by each of the corresponding components before integration. According to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repetitively, or heuristically, or at least some operations may be executed in a different order or omitted, or other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Architecture (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present disclosure relates to an electronic device and a control method thereof. The electronic device of the present disclosure comprises a camera, a display, and a processor which: controls the display to display an image obtained through the camera and to display an AR object on the image; when the depth of a point at which the AR object is located on the image is changed in a state in which the electronic device is in a normal mode, adjusts the size of the AR object displayed on the display on the basis of the changed depth; and when the depth of a point at which the AR object is located on the image is changed in a state in which the electronic device is in an arrangement mode, performs control to maintain the size of the AR object displayed on the display.
PCT/KR2020/018980 2019-12-26 2020-12-23 Dispositif électronique et son procédé de commande WO2021133053A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190175578A KR20210083016A (ko) 2019-12-26 2019-12-26 전자 장치 및 그의 제어 방법
KR10-2019-0175578 2019-12-26

Publications (1)

Publication Number Publication Date
WO2021133053A1 true WO2021133053A1 (fr) 2021-07-01

Family

ID=76575597

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/018980 WO2021133053A1 (fr) 2019-12-26 2020-12-23 Dispositif électronique et son procédé de commande

Country Status (2)

Country Link
KR (1) KR20210083016A (fr)
WO (1) WO2021133053A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230092282A1 (en) * 2021-09-23 2023-03-23 Apple Inc. Methods for moving objects in a three-dimensional environment

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102628667B1 (ko) * 2021-09-23 2024-01-24 그리다텍 주식회사 태양계 공전 시스템을 모사한 vr 인터페이스 시스템
KR102578113B1 (ko) * 2023-04-10 2023-09-14 전남대학교 산학협력단 3차원 객체 형상 획득 시스템 및 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101227255B1 (ko) * 2010-03-17 2013-01-28 에스케이플래닛 주식회사 마커 크기 기반 인터렉션 방법 및 이를 구현하기 위한 증강 현실 시스템
KR20140071086A (ko) * 2012-12-03 2014-06-11 삼성전자주식회사 증강 현실 컨텐츠 운용 방법 및 이를 지원하는 단말기와 시스템
US20170256096A1 (en) * 2016-03-07 2017-09-07 Google Inc. Intelligent object sizing and placement in a augmented / virtual reality environment
KR101896982B1 (ko) * 2016-10-13 2018-09-10 에이케이엔코리아 주식회사 사용자간 통신을 위한 가상의 사용자 인터페이스 객체의 처리 방법 및 이를 수행하는 시스템
KR20190080243A (ko) * 2017-12-28 2019-07-08 엘에스산전 주식회사 증강 현실 제공 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101227255B1 (ko) * 2010-03-17 2013-01-28 에스케이플래닛 주식회사 마커 크기 기반 인터렉션 방법 및 이를 구현하기 위한 증강 현실 시스템
KR20140071086A (ko) * 2012-12-03 2014-06-11 삼성전자주식회사 증강 현실 컨텐츠 운용 방법 및 이를 지원하는 단말기와 시스템
US20170256096A1 (en) * 2016-03-07 2017-09-07 Google Inc. Intelligent object sizing and placement in a augmented / virtual reality environment
KR101896982B1 (ko) * 2016-10-13 2018-09-10 에이케이엔코리아 주식회사 사용자간 통신을 위한 가상의 사용자 인터페이스 객체의 처리 방법 및 이를 수행하는 시스템
KR20190080243A (ko) * 2017-12-28 2019-07-08 엘에스산전 주식회사 증강 현실 제공 방법

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230092282A1 (en) * 2021-09-23 2023-03-23 Apple Inc. Methods for moving objects in a three-dimensional environment
WO2023049767A3 (fr) * 2021-09-23 2023-04-27 Apple Inc. Procédés pour déplacer des objets dans un environnement tridimensionnel

Also Published As

Publication number Publication date
KR20210083016A (ko) 2021-07-06

Similar Documents

Publication Publication Date Title
WO2021133053A1 (fr) Dispositif électronique et son procédé de commande
WO2017155236A1 (fr) Configuration et fonctionnement de dispositifs d'affichage comprenant une conservation de contenu
WO2018088742A1 (fr) Appareil d'affichage et son procédé de commande
WO2016140545A1 (fr) Procédé et dispositif de synthétisation de contenu d'arrière-plan tridimensionnel
EP3628121A1 (fr) Dispositif électronique pour mémoriser des informations de profondeur en relation avec une image en fonction des propriétés d'informations de profondeur obtenues à l'aide d'une image, et son procédé de commande
WO2015064903A1 (fr) Affichage de messages dans un dispositif électronique
WO2020171429A1 (fr) Dispositif électronique de fourniture d'image animée et procédé correspondant
WO2018048163A1 (fr) Appareil électronique et procédé de commande de celui-ci
WO2019027090A1 (fr) Terminal mobile et procédé de commande associé
WO2018080165A1 (fr) Appareil d'affichage d'image, dispositif mobile et procédé de fonctionnement associé
WO2020149689A1 (fr) Procédé de traitement d'image et dispositif électronique le prenant en charge
WO2019139270A1 (fr) Dispositif d'affichage et procédé de fourniture de contenu associé
WO2018034436A1 (fr) Appareil électronique, et procédé de commande associé
WO2017052150A1 (fr) Dispositif de terminal d'utilisateur, dispositif électronique, et procédé de commande d'un dispositif terminal utilisateur et d'un dispositif électronique
WO2016114432A1 (fr) Procédé de traitement de sons sur la base d'informations d'image, et dispositif correspondant
WO2016126083A1 (fr) Procédé, dispositif électronique et support d'enregistrement pour notifier des informations de situation environnante
WO2021025266A1 (fr) Appareil électronique et son procédé de commande
WO2016080662A1 (fr) Procédé et dispositif de saisie de caractères coréens sur la base du mouvement des doigts d'un utilisateur
WO2020091182A1 (fr) Dispositif électronique pour fournir des données d'image à l'aide de la réalité augmentée et son procédé de commande
WO2021107200A1 (fr) Terminal mobile et procédé de commande de terminal mobile
WO2018070756A1 (fr) Procédé, appareil, et support d'enregistrement pour traiter une image
WO2018093198A1 (fr) Appareil de traitement d'image, et son procédé de commande
WO2020242064A1 (fr) Dispositif mobile, et procédé de commande de dispositif mobile
WO2021225333A1 (fr) Dispositif électronique permettant de fournir un service de réalité augmentée, et son procédé de fonctionnement
WO2020075926A1 (fr) Dispositif mobile et procédé de commande de dispositif mobile

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20906609

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20906609

Country of ref document: EP

Kind code of ref document: A1