WO2021101357A1 - Method and apparatus for drawing bounding box for data labeling - Google Patents
Method and apparatus for drawing bounding box for data labeling
- Publication number
- WO2021101357A1 (PCT/KR2020/017438)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data labeling
- box
- task
- user
- control unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Definitions
- the present embodiment relates to a method and apparatus for drawing a bounding box for data labeling. More specifically, it relates to a method and apparatus for securing high-quality learning data by performing a box drawing operation for data labeling.
- the purpose of the present embodiment is to provide a user interface for interaction with a user to perform a data labeling-related task corresponding to drawing a box on an object in a photo, and to secure high-quality learning data based on this.
- a data labeling apparatus comprising: a display unit for displaying a user interface; and a control unit for controlling the display unit to display the user interface for interaction with a user in order to perform a data labeling-related task corresponding to drawing a box on an object in a photo.
- a method comprising: displaying, on a display unit, a user interface for interaction with a user in order to perform a data labeling-related task corresponding to drawing a box on an object in a photo; receiving a user input; and controlling the task to be performed such that a bounding box surrounding the target object is generated based on the user input and the user interface.
- according to the present embodiment, a user interface for interaction with a user is provided to perform a data labeling-related task corresponding to drawing a box on an object in a photo, and high-quality learning data can be secured based on this.
- FIG. 1 is a block diagram schematically showing a terminal equipped with a data labeling application according to the present embodiment.
- FIG. 2 is a block diagram schematically illustrating a data labeling apparatus according to the present embodiment.
- FIG. 3 is a diagram illustrating an operation related to data labeling according to the present embodiment.
- FIGS. 4 to 11 are diagrams for explaining a data labeling method using a user interface according to the present embodiment.
- FIG. 12 is a flowchart illustrating a data labeling method according to the present embodiment.
- FIG. 1 is a block diagram schematically showing a terminal equipped with a data labeling application according to the present embodiment.
- the terminal 100 refers to a terminal capable of transmitting and receiving various data via a communication intermediary device, including an AP, according to a user's key operation, and may be a tablet PC, a laptop, a personal computer (PC), a smartphone, a personal digital assistant (PDA), a mobile communication terminal, or the like. That is, the terminal 100 is a terminal that performs voice or data communication using an AP and a communication network, and is equipped with a memory for storing a program or protocol for communicating with an external device via the AP and the communication network, and a microprocessor for executing the corresponding program to perform operations and control.
- the terminal 100 may provide a function of performing an operation related to data labeling by mounting the data labeling application 110.
- the terminal 100 may drive the data labeling application 110 by a user's manipulation or command, and may provide a function of performing an operation related to data labeling through the data labeling application 110.
- the terminal 100 is implemented in a state in which the data labeling application 110 is mounted in an embedded form.
- it may be mounted in an embedded form in an OS (Operating System) mounted in the terminal 100, or installed in the OS in the terminal 100 by a user's manipulation or command.
- the terminal 100 may access the data labeling application 110 through a web connection.
- the terminal 100 provides a menu for selecting an object box drawing operation when the mounted data labeling application 110 is executed.
- the terminal 100 operates to perform an object box drawing operation according to the selected menu. For example, when a corresponding menu is selected, the terminal 100 may operate to additionally output a screen for selecting a task to be performed or to output a preset task to be performed directly on the screen.
- FIG. 3 is a diagram illustrating an operation related to data labeling according to the present embodiment.
- the work related to data labeling according to the present embodiment may be a work of drawing a rectangular box to fit an object in a photo. That is, the work related to data labeling according to the present embodiment may correspond to a process of calculating a bounding box corresponding to the boundary of an object through complete separation of background and object, and labeling what the object in the image is based on that box.
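As a concrete illustration, the "bounding box corresponding to the boundary of an object" above is the tightest axis-aligned rectangle containing every object pixel once background and object are separated. This is only a sketch: the patent does not specify data structures, so the binary-mask representation and the function name are assumptions.

```python
def tight_bounding_box(mask):
    """Smallest axis-aligned box (x_min, y_min, x_max, y_max) containing
    every object pixel; `mask` is a list of rows whose truthy cells
    belong to the object (i.e., a background/object separation)."""
    xs = [x for row in mask for x, v in enumerate(row) if v]
    ys = [y for y, row in enumerate(mask) for v in row if v]
    if not xs:
        return None  # the photo contains no object pixels
    return (min(xs), min(ys), max(xs), max(ys))

# a 4x4 mask with an L-shaped object occupying columns 1-2, rows 1-2
mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
print(tight_bounding_box(mask))  # (1, 1, 2, 2)
```

The labeling task then amounts to having a human reproduce this box by hand, with the in-box/out-box mechanism described below verifying its tightness.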
- the result of performing the task may be used as essential learning data for an AI to recognize a specific object in a photo during training.
- the terminal 100 provides a user interface for interaction with a user to perform the above-described data labeling-related task, and operates to provide high-quality learning data based on this. Meanwhile, in the present embodiment, a detailed description of the user interface for performing the data labeling-related task will be given later with reference to FIG. 2.
- the data labeling apparatus 200 substantially corresponds to the internals of the terminal 100 equipped with the data labeling application 110, but the data labeling apparatus 200 may also be implemented as a device separate from the terminal 100, and the components included in the data labeling apparatus 200 may each be implemented as software or hardware elements. That is, the data labeling apparatus 200 shown in FIG. 2 is an embodiment in which the data labeling application 110 is implemented as a separate device, and the data labeling apparatus 200 according to the present embodiment includes an input unit 210, a display unit 220, a control unit 230, and a memory unit 240. However, the components included in the data labeling apparatus 200 are not necessarily limited thereto.
- the input unit 210 receives user input information on performing a data labeling related task through a user interface. That is, the input unit 210 may receive user touch information and user selection information through a user interface.
- the input unit 210 may be implemented as various input means according to the configuration and function of the data labeling apparatus 200.
- the input unit 210 may be implemented through input means such as a touch screen, a keypad, or a voice recognition sensor, and in the case of a personal computer, through input means such as a mouse or a keyboard.
- the input unit 210 and the display unit 220 may be replaced with a touch screen.
- the input unit 210 transmits a selection command input from the user to the control unit 230.
- the user's 'selection command' for a GUI element may be an event such as 'click', 'drag', or 'mouse over' on the corresponding GUI element when the input unit 210 is implemented with a mouse, and may be an event such as 'tapping', 'drag', 'flicking', or 'press' when the input unit 210 is implemented as a touch sensor of a touch screen.
- the mouse over event refers to an action of placing the mouse cursor on a specific object for a certain period of time.
- the tapping event refers to an action of lightly pressing a selected object (a number, letter, symbol, icon, etc.) once, corresponding to a mouse click on a general PC.
- the drag event is an action of selecting a specific object and moving it to a specific location while keeping it pressed (touched). An object moved while continuously pressed follows the direction of movement and is then fixed at the position where it is released.
- a flicking event is an action in which the contact is released after moving in a specific direction (up, down, left, right, or diagonal) following a touch, and a specific action is processed according to the direction and speed of the flicking motion.
- a flicking event refers to an event for an action that looks like turning a page.
- the press event refers to an event for an operation of continuously pressing a contact point after a touch.
- a release event refers to an event for an operation in which the contact is raised after a touch.
- the display unit 220 performs a function of outputting the results of the operations of the input unit 210 and the control unit 230 on a screen.
- the display unit 220 displays a user interface for interaction with a user to perform data labeling related tasks.
- the user interface according to the present embodiment includes at least one area for performing a task.
- in the present embodiment, the user interface is described as largely including first and second areas, but the present invention is not limited thereto.
- the user interface may be displayed on the display 220 by the controller 230 based on a program stored in the memory 240.
- the first area is a work space for performing the work related to data labeling, and is an area in which work information, including the target object, the work process for the target object, and the work results, is displayed.
- the second area is an area in which functional modules related to task execution are displayed.
- a first function module for switching the current state of the target object to a full view state, a second function module for changing a color state related to the target object, and a third function module for submitting the task results may be displayed in the second area.
- the second area may additionally include a fourth function module for switching to the next target object, depending on or regardless of whether the task succeeded.
- the data labeling apparatus 200 provides a work space and various functions for data labeling through the user interface of the display unit 220, and on this basis, high-quality learning data can be collected more efficiently. Meanwhile, details of the data labeling operation using the user interface according to the present embodiment will be described in the course of describing the operation of the control unit 230.
- the control unit 230 controls the display unit 220 to display a user interface for interaction with a user in order to perform a data labeling related task corresponding to drawing a box on an object in a photo. That is, the control unit 230 interlocks with the input unit 210 and the display unit 220 to perform an overall control function for performing a task related to data labeling.
- when a drag input on the target object displayed in the first area is received, the controller 230 controls a bounding box corresponding to the target object to be generated.
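The drag-to-create behavior can be sketched as a normalization of the gesture's start and end points. The coordinate convention and helper name are assumptions for illustration, not from the patent:

```python
def box_from_drag(start, end):
    """Turn a drag gesture's start and end points into an axis-aligned
    box (x_min, y_min, x_max, y_max), regardless of which direction
    the user dragged."""
    (x0, y0), (x1, y1) = start, end
    return (min(x0, x1), min(y0, y1), max(x0, x1), max(y0, y1))

# dragging from bottom-right to top-left yields the same box
print(box_from_drag((120, 200), (40, 80)))  # (40, 80, 120, 200)
```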
- based on user adjustment information for the generated bounding box, the control unit 230 controls the box drawing operation to be performed so that the outer line of each side of the bounding box is matched with the outermost line of each side of the target object, generating a bounding box surrounding the target object.
- the control unit 230 allows a bounding box composed of an in-box and an out-box to be generated, and controls the box drawing operation to be performed such that the user adjusts each box line so that the edge of the object falls between the two box lines.
- the control unit 230 may control the line to be adjusted even when the area around the outer line of the bounding box is selected.
- this has the advantage that a box line can be adjusted even when the area outside the line is touched, unlike existing tools in which the box line itself must be clicked, making the work much more convenient than with existing work tools.
- the controller 230 divides the screen outside the box into four zones (e.g., top, bottom, left, right) based on the box, and controls the corresponding box line to move when each zone is touched.
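The zone-to-edge mapping above can be sketched as follows. The nearest-edge rule and the function name are assumptions, since the description only states that the area outside the box is divided into four zones:

```python
def edge_for_touch(box, point):
    """Map a touch outside the box to the edge it should move.
    The four zones are approximated here by nearest-edge distance;
    touches inside the box select no edge."""
    x_min, y_min, x_max, y_max = box
    x, y = point
    if x_min <= x <= x_max and y_min <= y <= y_max:
        return None
    distances = {
        "left": abs(x - x_min),
        "right": abs(x - x_max),
        "top": abs(y - y_min),
        "bottom": abs(y - y_max),
    }
    return min(distances, key=distances.get)

print(edge_for_touch((10, 10, 50, 50), (5, 30)))   # left
print(edge_for_touch((10, 10, 50, 50), (30, 60)))  # bottom
```

A controller could then move only the returned edge toward the touch point, which matches the convenience claim: the user never has to hit the thin box line itself.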
- the control unit 230 may determine that the task is complete when it recognizes a bounding box in which the outermost line of each side of the target object is located between the outlines corresponding to the in-box and the out-box, respectively.
- the in-box and out-box are used to extract consistent labeling results. That is, to determine completion of the work, the tightness criterion that the labeling requester has in mind, namely the relationship between the in-box and the out-box, is established as a formula so that the determination can be performed automatically. Since the final bounding box should not lie inside the object boundary, the final labeling result uses the out-box.
- the operator does not apply a subjective criterion to the tightness of the box, but only focuses on placing the object boundary between the in-box and the out-box of a predetermined ratio, so that any operator can produce a consistent labeling result.
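The completion rule described above reduces to a per-side interval test. The box representation and names here are assumptions; the patent establishes the in-box/out-box relationship as a formula without giving one explicitly:

```python
def task_complete(in_box, out_box, object_box):
    """True when every side of the object's tight bounding box lies
    between the corresponding sides of the in-box and the out-box
    (boxes are (x_min, y_min, x_max, y_max); out_box encloses in_box)."""
    ix0, iy0, ix1, iy1 = in_box
    ox0, oy0, ox1, oy1 = out_box
    x0, y0, x1, y1 = object_box
    return (ox0 <= x0 <= ix0 and oy0 <= y0 <= iy0 and
            ix1 <= x1 <= ox1 and iy1 <= y1 <= oy1)

print(task_complete((20, 20, 80, 80), (10, 10, 90, 90), (15, 18, 85, 88)))  # True
```

Because the final bounding box must not cut into the object, the out-box would then be submitted as the labeling result, as the description states.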
- the controller 230 may also determine that the task has been completed.
- the control unit 230 controls to perform a task as described above in relation to all tasks to be performed shown on the first area, and may determine whether the task is completed.
- the controller 230 may control the target object to be enlarged when a user click or swipe motion on the first area is recognized.
- by enlarging the target object, the control unit 230 may allow the task to be performed more accurately.
- the control unit 230 may control the screen to be switched from the enlarged state to the box full view state.
- the controller 230 may control the color of the outline of the bounding box to be changed.
- when click information on an airplane button (e.g., the third function module) among the function modules is recognized while the box drawing operation has been completed through adjustment of the bounding box, the control unit 230 may control the task results to be submitted.
- when the click information on the airplane button is recognized, the control unit 230 provides a screen for selecting or entering correct-answer information corresponding to the target object, and controls the selected or entered correct-answer information to be provided as the task result together with the photo of the object processed with the box.
- the controller 230 may control switching to the next target object regardless of whether the task succeeded.
- the controller 230 may control switching to the next target object when a clear button (e.g., the fourth function module) among the function modules is touched after the task is completed.
- the completion of the work may correspond to the case where the box drawing for all objects in the photo is completed.
- the memory 240 may store various data related to performing a data labeling operation.
- a program for performing a function corresponding to a user input during the data labeling operation may be stored in the memory 240.
- FIG. 12 is a flowchart illustrating a data labeling method according to the present embodiment.
- the data labeling apparatus 200 displays, on the display unit 220, a user interface for interaction with a user in order to perform a data labeling-related task corresponding to drawing a box on an object in a photo (S1202).
- the data labeling apparatus 200 displays, on the display unit 220, a user interface including a first area, which is a work space for performing the data labeling-related work, and a second area, in which function modules related to performing the work are displayed.
- the data labeling apparatus 200 receives user input information regarding performing a data labeling related task through a user interface (S1204).
- the data labeling apparatus 200 controls the task to be performed such that a bounding box surrounding the target object is generated based on the user input information of step S1204 and the user interface of step S1202 (S1206).
- since steps S1202 to S1206 correspond to the operations of the components of the data labeling apparatus 200 described above, further detailed description is omitted.
- in FIG. 12, each process is described as being executed sequentially, but the present invention is not limited thereto. In other words, the processes illustrated in FIG. 12 may be modified and executed, or one or more processes may be executed in parallel, so FIG. 12 is not limited to a time-series order.
- the data labeling method described in FIG. 12 may be implemented as a program and recorded on a computer-readable recording medium (CD-ROM, RAM, ROM, memory card, hard disk, magneto-optical disk, storage device, etc.).
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (11)
- a data labeling apparatus comprising: a display unit for displaying a user interface; and a control unit for controlling the user interface for interaction with a user to be displayed on the display unit in order to perform a data labeling-related task corresponding to drawing a box on an object in a photo.
- the data labeling apparatus of claim 1, wherein the control unit controls work information, including the target object of the data labeling-related task, the work process for the target object, and the work results, to be displayed in a first area of the user interface, and controls at least one function module related to performing the task to be displayed in a second area of the user interface.
- the data labeling apparatus of claim 2, wherein the control unit controls a bounding box corresponding to the target object to be generated when a drag input on the target object displayed in the first area is received.
- the data labeling apparatus of claim 2, wherein the control unit controls the box drawing operation to be performed such that, based on user adjustment information corresponding to the bounding box, the outer line of each side of the bounding box is matched with the outermost line of each side of the target object, generating the bounding box surrounding the target object.
- the data labeling apparatus of claim 4, wherein the control unit receives, as the user adjustment information, selection information on the outline of the bounding box and on the area outside the outline.
- the data labeling apparatus of claim 4, wherein the control unit allows the bounding box composed of an in-box and an out-box to be generated, and determines that the task is complete when a bounding box in which the outermost line of each side of the target object is located between the outlines corresponding to the in-box and the out-box, respectively, is recognized.
- the data labeling apparatus of claim 2, wherein the control unit controls the target object to be enlarged when user click information or a user swipe motion on the first area is recognized.
- the data labeling apparatus of claim 2, wherein the control unit controls at least one of a first function module for switching the current state of the target object to a full view state, a second function module for changing a color state related to the target object, and a third function module for submitting the task results to be displayed in the second area.
- the data labeling apparatus of claim 8, wherein the control unit controls a fourth function module for switching to the next target object, depending on or regardless of whether the task succeeded, to be additionally displayed in the second area.
- the data labeling apparatus of claim 8, wherein, when a user selection of the third function module is recognized, the control unit provides a screen for selecting or entering correct-answer information corresponding to the target object, and controls the selected or entered correct-answer information to be provided together as the task result.
- a data labeling method comprising: displaying, on a display unit, a user interface for interaction with a user in order to perform a data labeling-related task corresponding to drawing a box on an object in a photo; receiving a user input; and controlling the task to be performed such that a bounding box surrounding a target object is generated based on the user input and the user interface.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/771,171 US20220391075A1 (en) | 2019-11-18 | 2020-12-02 | Method and apparatus for drawing bounding box for data labeling |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020190147563A KR102176458B1 (ko) | 2019-11-18 | 2019-11-18 | Method and apparatus for drawing bounding box for data labeling |
KR10-2019-0147563 | 2019-11-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021101357A1 true WO2021101357A1 (ko) | 2021-05-27 |
Family
ID=73429467
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2020/017438 WO2021101357A1 (ko) | 2019-11-18 | 2020-12-02 | Method and apparatus for drawing bounding box for data labeling |
Country Status (3)
Country | Link |
---|---|
US (1) | US20220391075A1 (ko) |
KR (1) | KR102176458B1 (ko) |
WO (1) | WO2021101357A1 (ko) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102176458B1 (ko) * | 2019-11-18 | 2020-11-09 | Select Star Inc. | Method and apparatus for drawing bounding box for data labeling |
KR102310585B1 (ko) * | 2021-02-10 | 2021-10-13 | Infiniq Co., Ltd. | Annotation method for easily designating an object, and computer program recorded on a recording medium for executing the same |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080078752A (ko) * | 2007-02-14 | 2008-08-28 | Heo Hun | 3D model learning system using the Internet and 3D model learning method on a user's computer |
KR20160045714A (ko) * | 2013-08-22 | 2016-04-27 | Samsung Electronics Co., Ltd. | Method of executing an application on a display device, and the display device |
JP2018196046A (ja) * | 2017-05-19 | 2018-12-06 | Yahoo Japan Corporation | Image processing device, image editing device, and program |
KR102030754B1 (ko) * | 2012-03-08 | 2019-10-10 | Samsung Electronics Co., Ltd. | Image editing apparatus and method for selecting a region of interest |
KR20190124559A (ko) * | 2018-04-26 | 2019-11-05 | Superb AI Inc. | Computing device and artificial intelligence-based image processing service system using the same |
KR102176458B1 (ko) * | 2019-11-18 | 2020-11-09 | Select Star Inc. | Method and apparatus for drawing bounding box for data labeling |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2393887A1 (en) * | 2002-07-17 | 2004-01-17 | Idelix Software Inc. | Enhancements to user interface for detail-in-context data presentation |
US9069814B2 (en) * | 2011-07-27 | 2015-06-30 | Wolfram Alpha Llc | Method and system for using natural language to generate widgets |
US9075933B2 (en) * | 2012-10-11 | 2015-07-07 | Adobe Systems Incorporated | 3D transformation of objects using 2D controls projected in 3D space and contextual face selections of a three dimensional bounding box |
US10319412B2 (en) * | 2016-11-16 | 2019-06-11 | Adobe Inc. | Robust tracking of objects in videos |
CN110301136B (zh) * | 2017-02-17 | 2023-03-24 | 交互数字麦迪逊专利控股公司 | 在流传输视频中进行选择性感兴趣对象缩放的系统和方法 |
CN110210624A (zh) * | 2018-07-05 | 2019-09-06 | 第四范式(北京)技术有限公司 | 执行机器学习过程的方法、装置、设备以及存储介质 |
US11120592B2 (en) * | 2018-09-26 | 2021-09-14 | Element Ai Inc. | System and method for oriented bounding box tool defining an orientation of a tilted or rotated object |
US10650576B1 (en) * | 2018-11-12 | 2020-05-12 | Adobe Inc. | Snapping experience with clipping masks |
-
2019
- 2019-11-18 KR KR1020190147563A patent/KR102176458B1/ko active IP Right Grant
-
2020
- 2020-12-02 WO PCT/KR2020/017438 patent/WO2021101357A1/ko active Application Filing
- 2020-12-02 US US17/771,171 patent/US20220391075A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080078752A (ko) * | 2007-02-14 | 2008-08-28 | Heo Hun | 3D model learning system using the Internet and 3D model learning method on a user's computer |
KR102030754B1 (ko) * | 2012-03-08 | 2019-10-10 | Samsung Electronics Co., Ltd. | Image editing apparatus and method for selecting a region of interest |
KR20160045714A (ko) * | 2013-08-22 | 2016-04-27 | Samsung Electronics Co., Ltd. | Method of executing an application on a display device, and the display device |
JP2018196046A (ja) * | 2017-05-19 | 2018-12-06 | Yahoo Japan Corporation | Image processing device, image editing device, and program |
KR20190124559A (ko) * | 2018-04-26 | 2019-11-05 | Superb AI Inc. | Computing device and artificial intelligence-based image processing service system using the same |
KR102176458B1 (ko) * | 2019-11-18 | 2020-11-09 | Select Star Inc. | Method and apparatus for drawing bounding box for data labeling |
Also Published As
Publication number | Publication date |
---|---|
US20220391075A1 (en) | 2022-12-08 |
KR102176458B1 (ko) | 2020-11-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013172607A1 (en) | Method of operating a display unit and a terminal supporting the same | |
US7809214B2 (en) | Device and a method for identifying movement patterns | |
WO2021101357A1 (ko) | Method and apparatus for drawing bounding box for data labeling | |
WO2013070024A1 (en) | Method and apparatus for designating entire area using partial area touch in a portable equipment | |
WO2014003337A1 (ko) | UI adjustment method and user terminal using the same | |
CN108874283 (zh) | Picture recognition method, mobile terminal, and computer-readable storage medium | |
WO2013103275A1 (en) | Method and apparatus for implementing multi-vision system by using multiple portable terminals | |
EP2678756A1 (en) | An apparatus and method for inputting command using gesture | |
CN106412410 (zh) | Mobile terminal and control method thereof | |
WO2013125914A1 (en) | Method and apparatus for object size adjustment on a screen | |
WO2021031843A1 (zh) | Object position adjustment method and electronic device | |
EP3055762A1 (en) | Apparatus and method of copying and pasting content in a computing device | |
WO2010113487A1 (ja) | Information input device and information input method | |
WO2016085186A1 (en) | Electronic apparatus and method for displaying graphical object thereof | |
WO2018131825A1 (ko) | E-book service providing method and computer program therefor | |
WO2019107799A1 (ko) | Method and apparatus for moving an input field | |
CN101910998 (zh) | Information processing device and program | |
WO2019054796A1 (en) | INTERACTION ACTIVATION METHOD USING A DIGITAL FOOTPRINT ON A DISPLAY AND ASSOCIATED ELECTRONIC DEVICE | |
CN109670507 (zh) | Picture processing method and device, and mobile terminal | |
US20150138121A1 (en) | Device for operating graphic arts machines and devices having a display device with a touch operation wall screen | |
EP2950503A1 (en) | Communication system, transfer control device, communication method, and computer program product | |
JP2017200119 (ja) | Image processing device and image processing system | |
WO2014069815A1 (ko) | Learning mask display device and learning mask display method | |
CN103543825 (zh) | Camera cursor system | |
EP3662357A1 (en) | Display apparatus for providing preview ui and method of controlling display apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20889643 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20889643 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08.12.2022) |
|