WO2019058539A1 - Program, information processing method, and information processing device - Google Patents

Program, information processing method, and information processing device

Info

Publication number
WO2019058539A1
Authority
WO
WIPO (PCT)
Prior art keywords
editing
point
image
contour
displayed
Prior art date
Application number
PCT/JP2017/034438
Other languages
English (en)
Japanese (ja)
Inventor
フレデリック フォゲル
Original Assignee
Line株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Line株式会社 filed Critical Line株式会社
Priority to PCT/JP2017/034438 priority Critical patent/WO2019058539A1/fr
Priority to JP2019542940A priority patent/JP6995867B2/ja
Publication of WO2019058539A1 publication Critical patent/WO2019058539A1/fr

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/80: Creating or modifying a manually drawn or painted image using a manual input device, e.g. mouse, light pen, direction keys on keyboard
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformations in the plane of the image

Definitions

  • the present disclosure relates to a program, an information processing method, and an information processing apparatus.
  • Conventionally, there are tools capable of cutting out (trimming) a part of an image by a user operation (see, for example, Patent Document 1 and Non-Patent Document 1).
  • The cutout area is designated by the user by an operation of tracing the outline of the area with a finger, a mouse cursor, or the like.
  • The clipped image can be used, for example, as a stamp transmitted through an instant messaging service, an SNS (Social Networking Service), or the like.
  • the present disclosure has been made in view of the above problems, and an object of the present disclosure is to provide a technique capable of easily designating an area of an object to be cut out from an image with relatively high accuracy.
  • A program according to one aspect of the present disclosure causes an information processing apparatus to execute: a first display step of displaying a first image; a step of acquiring an outline of a cutout region in the first image; a second display step of displaying, on the outline, a plurality of editing points for correcting the outline; a third display step of, in response to a user operation, enlarging the displayed first image and increasing the number of displayed editing points; and a step of cutting out a second image within the contour specified by the plurality of editing points.
  • FIG. 1 is a diagram showing a configuration of a communication system in an embodiment of the present disclosure.
  • the server 10 and the terminals 20 are connected via the network 30.
  • the server 10 provides a service for realizing transmission and reception of messages between the terminals 20 to the terminals 20 owned by the user via the network 30.
  • the number of terminals 20 connected to the network 30 is not limited.
  • the network 30 plays a role of connecting one or more terminals 20 and one or more servers 10. That is, the network 30 refers to a communication network that provides a connection path so that data can be transmitted and received after the terminal 20 connects to the server 10.
  • Network 30 may include, by way of example and not limitation, an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), a part of the Internet, a public switched telephone network (PSTN), a mobile phone network, an ISDN (integrated services digital network), a wireless LAN, LTE (long term evolution), CDMA (code division multiple access), Bluetooth (registered trademark), satellite communication, or a combination of two or more of these.
  • Network 30 may include one or more networks 30.
  • the terminal 20 may be any information processing terminal that can realize the functions described in each embodiment.
  • The terminal 20 may be, by way of example and not limitation, a smartphone, a mobile phone (feature phone), a computer (for example and not limited to a desktop, laptop, or tablet), a media computer platform (for example and not limited to a cable or satellite set-top box or a digital video recorder), a handheld computing device (for example and not limited to a PDA (personal digital assistant) or an e-mail client), a wearable terminal (a glasses-type device, a watch-type device, etc.), another type of computer, or a communication platform.
  • the terminal 20 may be expressed as an information processing terminal.
  • the terminal used by the user X is expressed as the terminal 20X
  • the user information in a predetermined service associated with the user X or the terminal 20X is expressed as the user information X.
  • the user information is information of the user associated with the account used by the user in the predetermined service.
  • The user information includes, by way of example and not limitation, information associated with the user, such as the user's name, the user's icon image, the user's age, the user's gender, the user's address, the user's hobbies and preferences, and the user's identifier, entered by the user or provided by a predetermined service, and may be any one of these or a combination thereof.
  • the server 10 has a function of providing a predetermined service to the terminal 20.
  • the server 10 may be any device as long as it can implement the functions described in each embodiment.
  • The server 10 may be, by way of example and not limitation, a server device, a computer (for example and not limited to a desktop, laptop, or tablet), a media computer platform (for example and not limited to a cable or satellite set-top box or a digital video recorder), a handheld computing device (for example and not limited to a PDA or an e-mail client), another type of computer, or a communication platform.
  • the server 10 may be expressed as an information processing apparatus.
  • The terminal 20 includes a control device 21 (CPU: central processing unit), a storage device 28, a communication I/F 22 (interface), an input/output device 23, a display device 24, a microphone 25, a speaker 26, and a camera 27.
  • the HW components of the terminal 20 are interconnected via the bus B, by way of example and not limitation.
  • the communication I / F 22 transmits and receives various data via the network 30.
  • the communication may be performed by wire or wireless, and any communication protocol may be used as long as communication with each other can be performed.
  • the communication I / F 22 has a function of executing communication with the server 10 via the network 30.
  • the communication I / F 22 transmits various data to the server 10 in accordance with an instruction from the control device 21.
  • the communication I / F 22 receives various data transmitted from the server 10 and transmits the data to the control device 21.
  • the input / output device 23 includes a device for inputting various operations to the terminal 20 and a device for outputting the processing result processed by the terminal 20.
  • the input / output device 23 may be integrated with the input device and the output device, or may be separated into the input device and the output device.
  • the input device is realized by any one or a combination of all types of devices capable of receiving an input from a user and transmitting information related to the input to the control device 21.
  • the input device includes, by way of example and not limitation, hardware keys such as a touch panel, a touch display, and a keyboard, a pointing device such as a mouse, a camera (operation input via a moving image), and a microphone (operation input by voice).
  • the output device is realized by any one or a combination of all types of devices capable of outputting the processing result processed by the control device 21.
  • Output devices include, by way of example and not limitation, touch panels, touch displays, speakers (audio output), lenses (eg, without limitation 3D (three dimensions) output, hologram output), printers, etc.
  • the display device 24 is realized by any one or a combination of all kinds of devices capable of displaying according to the display data written in the frame buffer.
  • The display device 24 is, by way of example and not limitation, a touch panel, a touch display, a monitor (as a non-limiting example, a liquid crystal display or an OELD (organic electroluminescence display)), a head mounted display (HMD), projection mapping, a hologram, or a device capable of displaying images, text information, and the like in the air (which may be a vacuum). Note that these display devices 24 may be able to display the display data in 3D.
  • When the input/output device 23 is a touch panel, the input/output device 23 and the display device 24 may be arranged to face each other with substantially the same size and shape.
  • The control device 21 has a circuit physically structured to execute the functions realized by the codes or instructions contained in a program, and is realized by, as a non-limiting example, a data processing device built into hardware.
  • The control device 21 may include, by way of example and not limitation, a central processing unit (CPU), a microprocessor, a processor core, a multiprocessor, an ASIC (application-specific integrated circuit), or an FPGA (field-programmable gate array).
  • the storage device 28 has a function of storing various programs and various data required for the terminal 20 to operate.
  • the storage device 28 includes various storage media such as a hard disk drive (HDD), a solid state drive (SSD), a flash memory, a random access memory (RAM), and a read only memory (ROM).
  • the terminal 20 stores the program P in the storage device 28, and by executing the program P, the control device 21 executes processing as each unit included in the control device 21. That is, the program P stored in the storage device 28 causes the terminal 20 to realize each function executed by the control device 21.
  • the microphone 25 is used to input audio data.
  • the speaker 26 is used to output audio data.
  • the camera 27 is used to acquire moving image data.
  • the server 10 includes a control device 11 (CPU), a storage device 15, a communication I / F 14 (interface), an input / output device 12, and a display 13.
  • the HW components of the server 10 are interconnected via the bus B, by way of example and not limitation.
  • The control device 11 has a circuit physically structured to execute the functions realized by the codes or instructions contained in a program, and is realized by, as a non-limiting example, a data processing device built into hardware.
  • the control device 11 is typically a central processing unit (CPU), and may be a microprocessor, processor core, multiprocessor, ASIC, or FPGA. However, in the present disclosure, the control device 11 is not limited to these.
  • the storage device 15 has a function of storing various programs and various data required for the server 10 to operate.
  • the storage device 15 is realized by various storage media such as an HDD, an SSD, and a flash memory.
  • the storage device 15 is not limited to these.
  • the communication I / F 14 transmits and receives various data via the network 30.
  • the communication may be performed by wire or wireless, and any communication protocol may be used as long as communication with each other can be performed.
  • the communication I / F 14 has a function of executing communication with the terminal 20 via the network 30.
  • the communication I / F 14 transmits various data to the terminal 20 in accordance with an instruction from the control device 11.
  • the communication I / F 14 receives various data transmitted from the terminal 20 and transmits the data to the control device 11.
  • the input / output device 12 is realized by a device that inputs various operations on the server 10.
  • the input / output device 12 is realized by any one or a combination of all types of devices capable of receiving input from a user and transmitting information related to the input to the control device 11.
  • the input / output device 12 is typically realized by a hardware key represented by a keyboard or the like, or a pointing device such as a mouse.
  • the input / output device 12 may include a touch panel, a camera (operation input via a moving image), and a microphone (operation input by voice) as an example, without limitation. However, in the present disclosure, the input / output device 12 is not limited to these.
  • the display 13 is typically implemented by a monitor (for example, a liquid crystal display or an organic electroluminescence display (OELD) as an example and not by way of limitation).
  • The display 13 may be a head mounted display (HMD) or the like. Note that these displays 13 may be able to display the display data in 3D. However, in the present disclosure, the display 13 is not limited to these.
  • the server 10 stores the program P in the storage device 15, and by executing the program P, the control device 11 executes processing as each unit included in the control device 11. That is, the program P stored in the storage device 15 causes the server 10 to realize each function executed by the control device 11.
  • In the following, each function is described as being realized by the CPU of the terminal 20 and/or the server 10 executing the program P.
  • However, the control device 21 of the terminal 20 and/or the control device 11 of the server 10 may realize each process not only by a CPU but also by a logic circuit (hardware) or a dedicated circuit formed in an integrated circuit (an IC (Integrated Circuit) chip, an LSI (Large Scale Integration), etc.). These circuits may be realized by one or more integrated circuits, and the plurality of processes shown in each embodiment may be realized by a single integrated circuit. Depending on the degree of integration, an LSI may also be called a VLSI, a super LSI, an ultra LSI, or the like.
  • The program P may be stored on a storage medium that is a "non-transitory tangible medium".
  • The storage medium may include, where appropriate, one or more semiconductor-based or other integrated circuits (ICs) (for example but not limited to field programmable gate arrays (FPGAs), application specific ICs (ASICs), and the like).
  • the storage medium may be volatile, non-volatile, or a combination of volatile and non-volatile, as appropriate.
  • the storage medium is not limited to these examples, and may be any device or medium as long as the program P can be stored.
  • the server 10 and / or the terminal 20 can realize the functions of the plurality of functional units shown in each embodiment by reading the program P stored in the storage medium and executing the read program P.
  • the program P of the present disclosure may be provided to the server 10 and / or the terminal 20 via any transmission medium (communication network, broadcast wave, etc.) capable of transmitting the program.
  • the server 10 and / or the terminal 20 implement the functions of the plurality of functional units shown in each embodiment by executing the program P downloaded via the Internet etc. as an example and not by way of limitation.
  • Each embodiment of the present disclosure can also be realized in the form of a data signal embedded in a carrier wave, in which the program P is embodied by electronic transmission.
  • At least part of the processing in the server 10 and / or the terminal 20 may be realized by cloud computing configured by one or more computers.
  • the configuration may be such that at least a part of the processing in the terminal 20 is performed by the server 10. In this case, at least a part of the processing of each functional unit of the control device 21 of the terminal 20 may be performed by the server 10.
  • the terminal 20 may perform at least a part of the processing in the server 10. In this case, at least a part of the processing of each functional unit of the control device 11 of the server 10 may be performed by the terminal 20.
  • The determination steps in the embodiments of the present disclosure are not essential; a predetermined process may be performed when a determination condition is satisfied, or a predetermined process may be performed when the determination condition is not satisfied.
  • The program of the present disclosure may be implemented using, as non-limiting examples, a script language such as ActionScript or JavaScript (registered trademark), an object-oriented programming language such as Objective-C or Java (registered trademark), or a markup language such as HTML5.
  • In the first embodiment, editing points, which are points on the outline of the cutout region (range) in the original image and are used to correct that outline, are displayed, and the number of displayed editing points is increased when the image is enlarged and displayed. According to the first embodiment, the area of the object to be cut out from the image can be specified easily and with relatively high accuracy.
  • the contents described in the first embodiment can be applied to any of the other embodiments described later.
  • the terminal 20 includes a reception unit 201, a display control unit 202, an image processing unit 203, and a control unit 204 as functions implemented by the control device 21.
  • the receiving unit 201 receives an operation from a user.
  • the receiving unit 201 receives, for example, an operation or the like for designating an outline of an area to be cut out in the displayed original image.
  • the display control unit 202 displays a display screen when trimming an image on the screen of the terminal 20 according to an instruction from the image processing unit 203 or the server 10.
  • the display control unit 202 displays, for example, a plurality of editing points which are points for correcting the outline of the cutout area (range) in the image and which are points on the outline. Further, in response to the user's operation, the display control unit 202 enlarges and displays at least a part of the image, and increases and displays the number of editing points.
  • the image processing unit 203 performs image processing to cut out an image in an outline designated by a plurality of editing points.
  • the control unit 204 generates content including the image cut out by the image processing unit 203 and transmits the generated content to the server 10.
  • the server 10 includes an acquisition unit 101, a display control unit 102, and a control unit 103 as functions implemented by the control device 11.
  • the acquisition unit 101 acquires data from the terminal 20.
  • the display control unit 102 controls the display of the screen of the terminal 20.
  • the control unit 103 sells the content received from the terminal 20 to the user of each terminal 20, for example, as a stamp or the like that can be used in an SNS instant messaging service or the like.
  • FIG. 2 is a flowchart showing an example of processing when cutting out a part of an image.
  • FIG. 3A is a view for explaining a display example of an original image on the terminal 20.
  • FIG. 3B is a view for explaining a display example of a confirmation screen of the clipping range in the terminal 20.
  • In step S1, the receiving unit 201 of the terminal 20 receives an operation of selecting an image from the user.
  • the display control unit 202 of the terminal 20 displays the selected image ("first image") on the screen (step S2).
  • In FIG. 3A, for example, an image stored in advance in the terminal 20 or an image captured by the terminal 20 is displayed.
  • In this example, the image of the dog 301 is selected by the user and displayed.
  • the reception unit 201 of the terminal 20 and the display control unit 202 edit the cutout range in the selected image according to the operation from the user (step S3).
  • the image processing unit 203 of the terminal 20 trims the image (“second image”) within the edited clipping range by the operation of the user (step S4).
  • In FIG. 3B, the face portion 301A of the dog 301, which is within the edited clipping range, is highlighted by lowering the luminance of the area outside the edited clipping range.
  • When the confirmation button 302 is tapped, the image of the portion 301A within the edited cutout range is cut out and stored.
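  • As an illustrative sketch of this cutout step (not taken from the disclosure itself), the region inside a traced contour can be extracted by building a polygon mask and making everything outside it transparent; the file names and the `contour` variable below are assumptions.

```python
# Illustrative sketch: cut out the area inside a traced contour as an RGBA "stamp".
# Assumes `contour` is an ordered list of (x, y) pixel coordinates traced by the user.
from PIL import Image, ImageDraw

def cut_out(original: Image.Image, contour: list) -> Image.Image:
    mask = Image.new("L", original.size, 0)
    ImageDraw.Draw(mask).polygon([tuple(p) for p in contour], fill=255)  # 255 inside the contour
    result = original.convert("RGBA")
    result.putalpha(mask)                       # pixels outside the contour become transparent
    return result.crop(mask.getbbox())          # trim to the cutout's bounding box

# Example usage (file names are hypothetical):
# stamp = cut_out(Image.open("dog.png"), traced_contour)
# stamp.save("stamp.png")
```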
  • FIG. 4 is a flowchart showing an example of processing when editing the cutout range.
  • FIG. 5 is a diagram for explaining an example of a method of displaying an editing point before an image is enlarged and displayed.
  • FIG. 6 is a diagram for explaining an example of a method of increasing the number of editing points when an image is enlarged and displayed.
  • FIG. 7 is a diagram for explaining an example of a method of reducing and displaying the number of editing points when an image is enlarged and displayed and then reduced.
  • In step S101, in response to the user's operation, the reception unit 201 acquires the outline of the cutout area in the displayed original image.
  • the display control unit 202 displays a plurality of editing points on the contour (step S102).
  • the display control unit 202 sets a line passing through each pixel traced by the user as the contour.
  • the editing point 401 and the editing point 402 etc. adjacent to each other are displayed on the outline 400.
  • the display control unit 202 may display the editing points on the contour at predetermined intervals on the contour.
  • Alternatively, the editing points may be displayed at positions where the length on the contour between two adjacent editing points is equal to or greater than a first threshold (for example, 6 mm), and each editing point may be displayed at the position of a feature point on the contour.
  • the feature points on the contour will be described later.
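  • One way to realize this spacing rule is sketched below (an assumption, not the disclosed implementation): editing points are kept only where the arc length along the traced contour since the last kept point reaches the first threshold; converting 6 mm into pixels from the display density is left to the caller.

```python
# Sketch: choose editing points along the traced contour with a minimum arc-length spacing.
import numpy as np

def place_editing_points(contour_px: np.ndarray, min_spacing_px: float) -> np.ndarray:
    """contour_px: (N, 2) ordered trace in pixels; returns the selected editing points."""
    seg = np.linalg.norm(np.diff(contour_px, axis=0), axis=1)
    arc = np.concatenate(([0.0], np.cumsum(seg)))      # cumulative length along the contour
    keep = [0]
    for i in range(1, len(contour_px)):
        if arc[i] - arc[keep[-1]] >= min_spacing_px:   # first threshold (e.g. 6 mm in pixels)
            keep.append(i)
    return contour_px[keep]
```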
  • the reception unit 201 receives an operation for enlarging the displayed original image from the user (step S103).
  • The reception unit 201 receives, for example, a pinch-out operation or the like in which a thumb and a forefinger placed on the touch panel screen are spread apart.
  • the display control unit 202 enlarges and displays the displayed original image, and increases and displays the number of editing points (step S104).
  • For example, when the image of FIG. 5 is enlarged and displayed and the length on the contour between the editing point 401 and the editing point 402 is equal to or greater than the above-described first threshold, the display control unit 202 additionally displays an editing point 403 on the contour between the editing point 401 and the editing point 402. In the example of FIG. 6, the display control unit 202 additionally displays the editing point 403 at a predetermined position on the contour that is separated from the editing point 401 and the editing point 402 by a second threshold (for example, 3 mm) or more.
  • the display control unit 202 may add the editing point 403 to the middle of the outline from the editing point 401 to the editing point 402.
  • the display control unit 202 may add the editing point 403 to the feature points on the contour from the editing point 401 to the editing point 402.
  • A feature point on the contour is a point representing a characteristic of the degree of curvature of the contour; for example, a singular point such as a cusp when the contour is represented by an algebraic curve, or an extreme point of the curvature of the algebraic curve, may be used as a feature point on the contour.
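  • A discrete stand-in for such feature points is sketched below; the disclosure speaks of cusps and curvature extrema of an algebraic curve, so using a windowed maximum of discrete curvature is an assumption on our part.

```python
# Sketch: mark points of locally maximal curvature along a sampled contour as feature points.
import numpy as np

def curvature_feature_points(pts: np.ndarray, window: int = 5) -> np.ndarray:
    """pts: (N, 2) ordered contour samples; returns indices of local curvature maxima."""
    dx, dy = np.gradient(pts[:, 0]), np.gradient(pts[:, 1])
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    kappa = np.abs(dx * ddy - dy * ddx) / np.maximum((dx ** 2 + dy ** 2) ** 1.5, 1e-9)
    feature_idx = [i for i in range(window, len(pts) - window)
                   if kappa[i] == kappa[i - window:i + window + 1].max()]
    return np.asarray(feature_idx)
```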
  • the receiving unit 201 receives from the user an operation to move the position of the editing point to an arbitrary position on the screen (step S105).
  • the receiving unit 201 receives, for example, a slide operation for sliding the finger in a state where the editing point on the touch panel screen is touched with the finger.
  • the display control unit 202 moves the position of the operated editing point (step S106).
  • The display control unit 202 moves, for example, the operated editing point to the position to which the finger is moved by the slide operation or the like. One or more other editing points whose length on the outline from the operated editing point is equal to or less than a third threshold (for example, 1 cm) (hereinafter referred to as "peripheral editing points") may also be moved in a direction according to the movement direction of the operated editing point.
  • The display control unit 202 may make the movement distance of each peripheral editing point proportional to the movement distance of the operated editing point and inversely proportional to the length on the contour from the operated editing point to that peripheral editing point.
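  • A sketch of this proportional/inverse-proportional rule follows; the exact weighting is not given in the disclosure, so the 1/(1 + d) factor below is an assumption.

```python
# Sketch: drag one editing point and let nearby points follow, weaker with contour distance.
import numpy as np

def drag_with_neighbors(points: np.ndarray, contour_dist: np.ndarray,
                        dragged: int, delta: np.ndarray, third_threshold: float) -> np.ndarray:
    """points: (N, 2); contour_dist[i]: contour length from the dragged point to point i."""
    moved = points.astype(float).copy()
    moved[dragged] += delta                               # the operated point follows the finger
    for i, d in enumerate(contour_dist):
        if i != dragged and d <= third_threshold:         # peripheral editing points only
            moved[i] += delta / (1.0 + d)                 # same direction, inversely scaled
    return moved
```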
  • the display control unit 202 calculates an outline passing through the moved editing point, and displays the calculated changed outline (step S107).
  • the display control unit 202 may calculate the contour passing through the moved editing point as an algebraic curve passing through each currently displayed editing point.
  • the display control unit 202 may calculate an outline passing through the moved editing point as a straight line passing through each editing point currently displayed.
  • the display control unit 202 adds or deletes the editing point to be displayed based on the editing point currently displayed and the contour after the change (step S108).
  • For example, when the length on the changed contour between two adjacent editing points is equal to or greater than a fourth threshold (for example, 1 cm), the display control unit 202 adds an editing point between them, for example at the position of a feature point on the contour. Conversely, when two adjacent editing points come too close to each other on the changed contour, the display control unit 202 deletes (hides) one of the two adjacent editing points.
  • the receiving unit 201 receives an operation for reducing the displayed original image from the user (step S109).
  • the reception unit 201 receives, for example, a pinch-in operation or the like in which a thumb, a forefinger, and the like are placed on the touch panel screen and a pinching operation is performed by two fingers.
  • the display control unit 202 reduces and displays the original image displayed in an enlarged size, and reduces the number of editing points and displays the reduced size (step S110).
  • For example, the display control unit 202 may continue to display the editing points whose positions were changed by the operation of moving an editing point in step S105, while reducing the number of displayed editing points whose positions were not changed by that operation.
  • In this way, the editing points whose positions were changed by the user's operation, which are feature points of the changed outline and the like, continue to be displayed.
  • The editing points whose positions were changed may also include peripheral editing points automatically moved in accordance with the movement of one editing point in step S106, and editing points added in step S108.
  • In the example of FIG. 7, when the image is displayed in a reduced size, the editing point 603, which was added during the enlarged display and whose position was then changed by the user's operation, continues to be displayed together with the editing point 602 and the like.
  • Alternatively, the display control unit 202 may also reduce the number of displayed editing points whose positions were changed by the operation of moving an editing point in step S105. In this case, even if an editing point displayed as, for example, a blue circle is not displayed, the outline displayed as a white line or the like may still be displayed so that it passes through the editing point whose position was changed. For this purpose, the display control unit 202 may, for example, calculate and store the outline passing through the edited editing points in step S107 as an algebraic curve.
  • In this case, the display control unit 202 erases the display of such an editing point when the image is displayed in a reduced size and, when the image is enlarged and displayed again, displays the editing point whose position was changed again at a feature point of the stored algebraic curve or the like. Thus, an editing point erased during reduced display can be restored when the enlarged display is performed again.
  • As described above, when the image is enlarged and displayed, the number of editing points is automatically increased. As a result, the outline of the object to be cut out from the original image can be specified more accurately with an easier operation. In addition, since the operation becomes easier, the processing load on the terminal 20 can be reduced as a result.
  • FIG. 8 is a diagram showing the configuration of the communication system in the second embodiment. As shown in FIG. 8, the terminal 20 according to the second embodiment further includes a tilt sensor 29.
  • The tilt sensor 29 is, for example, an acceleration sensor, an angular velocity sensor, or the like, and detects the tilt of the housing of the terminal 20.
  • the terminal 20 according to the second embodiment includes a tilt detection unit 205 as a function realized by the control device 21.
  • FIG. 9 is a flowchart showing an example of processing when editing the cutout range according to the second embodiment.
  • FIG. 10 is a diagram for explaining an example of a display screen of the first moving point group.
  • FIG. 11 is a diagram for explaining an example of data in which an original image is edge-detected.
  • FIGS. 12 and 13 are diagrams for explaining the movement of the moving points when the housing of the terminal 20 is tilted.
  • FIG. 14 is a diagram showing an example of the display screen in which the position of the movement point by the user operation is fixed.
  • FIG. 15 is a diagram for explaining an example of a display screen of the second moving point group.
  • In step S201, in response to the user's operation accepted by the accepting unit 201, the display control unit 202 displays, from one side of the screen, a first moving point group ("first plurality of points"), which is one or more moving points for designating the area to be cut out in the displayed original image.
  • In the example of FIG. 10, a plurality of moving points 1001-1, 1001-2 to 1001-N arranged in the horizontal direction near the upper side of the screen of the terminal 20, and a line 1002 (trimming line) connecting the moving points, are displayed.
  • the display control unit 202 detects an edge of the displayed original image (step S202).
  • For example, the display control unit 202 may use a method that first differentiates the image to calculate the gradient magnitude (edge strength), estimates the local direction of an edge from the gradient direction, and searches for local maxima of the gradient in that direction.
  • The display control unit 202 may also use, for example, a method of searching for zero crossings of a second-order derivative computed from the image.
  • FIG. 11 shows an example of data 1101 resulting from edge detection of the original image shown in FIG. 10. In the example of FIG. 11, pixels whose detected edge strength is equal to or greater than a predetermined threshold are shown in black.
  • the display control unit 202 may use only the value greater than or equal to a predetermined threshold value among the detected edge strengths.
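  • A sketch of such an edge-strength map follows (Sobel gradients plus thresholding); the library choice and the threshold value are assumptions, not part of the disclosure.

```python
# Sketch: compute an edge-strength map and keep only values above a threshold (cf. FIG. 11).
import numpy as np
from scipy import ndimage

def edge_strength(gray: np.ndarray, threshold: float = 50.0) -> np.ndarray:
    gx = ndimage.sobel(gray.astype(float), axis=1)   # horizontal gradient
    gy = ndimage.sobel(gray.astype(float), axis=0)   # vertical gradient
    mag = np.hypot(gx, gy)                           # gradient magnitude = edge strength
    return np.where(mag >= threshold, mag, 0.0)      # suppress weak responses
```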
  • When the user tilts the housing, the display control unit 202 moves the plurality of moving points 1001-1, 1001-2 to 1001-N and the line 1002 connecting the moving points in the direction of the tilt and displays them (step S203).
  • Each moving point may be moved and displayed at a higher speed as the user tilts the screen of the housing further from a predetermined position (for example, the horizontal position).
  • The display control unit 202 causes each moving point to be temporarily stopped and displayed at a portion of an edge by a virtual force (an adhesive force or a frictional force) corresponding to the edge strength of the original image (step S204).
  • For example, the display control unit 202 extracts the edge strength value at the pixel position in the edge data shown in FIG. 11 that corresponds to the pixel position in the original image shown in FIG. 10 where each moving point is currently located.
  • the display control unit 202 temporarily stops each moving point by a virtual force corresponding to the extracted edge strength.
  • the display control unit 202 displays, for example, a trimming line connecting the moving points by a curve or a straight line along an edge detected from the original image, which passes through the adjacent moving points.
  • the moving point once stopped is moved again by further tilting the casing.
  • For example, the display control unit 202 calculates the force in the direction in which a moving point slides down due to the gravity virtually applied to it, and a virtual inertial force, resulting from the virtual adhesive or frictional force on the edge corresponding to the edge strength, by which the moving point tries to stay at its position on the edge. Then, when the force in the sliding-down direction becomes larger than the inertial force keeping the moving point at its position on the edge, the display control unit 202 may display the moving point so that it moves in the sliding-down direction.
  • In that case, instead of stopping once on that edge, the moving point may be moved to the edge of the screen of the terminal 20 and displayed there.
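  • A per-frame sketch of this force balance for one moving point follows; the gains and the sine law are assumptions, since the disclosure only states that a stronger tilt moves points faster and a stronger edge holds them harder.

```python
# Sketch: one display-frame update of a moving point under virtual gravity vs. edge adhesion.
import numpy as np

def step_moving_point(pos, tilt_dir, tilt_angle, edge_map,
                      gravity_gain=5.0, hold_gain=0.1):
    """pos: (x, y); tilt_dir: unit vector of the slide direction; edge_map: 2D edge strengths."""
    slide_force = gravity_gain * np.sin(tilt_angle)        # larger tilt -> larger sliding force
    x, y = int(round(pos[0])), int(round(pos[1]))
    hold_force = hold_gain * edge_map[y, x]                # stronger edge -> stronger holding force
    if slide_force <= hold_force:
        return pos                                         # the point stays stuck on the edge
    speed = slide_force - hold_force
    return (pos[0] + speed * tilt_dir[0], pos[1] + speed * tilt_dir[1])
```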
  • The process of moving the moving points in step S203 and the process of temporarily stopping each moving point by a virtual force according to the edge strength in step S204 are repeated alternately as the user continues the operation of tilting the housing.
  • Each moving point is thus temporarily stopped by a virtual force according to the edge strength at its current position.
  • For example, consider the moving point 1201 temporarily stopped on the edge of the ear portion of the dog 301 and the moving point 1202 temporarily stopped on the edge of the body portion of the dog 301 shown in FIG. 12.
  • The moving point 1201 is temporarily stopped by a higher virtual force than the moving point 1202. Therefore, when the screen of FIG. 12 is tilted further, the moving point 1202 is moved in the downward direction of the screen while the moving point 1201 remains stopped. The moving point 1202 is then moved to a position under the face of the dog 301 and temporarily stopped according to the edge strength shown in FIG. 11, like the moving point 1302 shown in FIG. 13.
  • When the user tilts the housing, the display control unit 202 causes each moving point to slide down due to the gravity virtually applied to it. When the length along the edge between adjacent moving points on the line 1002 connecting the moving points becomes equal to or greater than the above-described first threshold, the display control unit 202 adds and displays moving points at positions such as feature points on the edge between those moving points. Further, in the examples of FIGS. 12 and 13, when the user tilts the screen in its vertical direction, the line connecting the moving points is stopped and displayed so that it hangs over the object from above or below, and when the user tilts the screen in its horizontal direction or the like, the line connecting the moving points is stopped and displayed so that it hangs over the object from the left or right.
  • When the receiving unit 201 detects that one of the moving points has been touched (pressed) by the user (step S205), the display control unit 202 fixes the position of that moving point (step S206).
  • the display control unit 202 displays the moving point whose position is fixed in a manner that can be distinguished from the moving point whose position is not fixed. For example, when the movement point whose position is not fixed is a green circle, the movement point whose position is fixed is displayed as a double black circle (round black), a blue circle, or the like.
  • Subsequently, the display control unit 202 displays a second moving point group ("second plurality of points") for specifying the clipping region in the displayed original image, from the side of the screen opposite to the side from which the first moving point group was displayed (step S207).
  • Of the first moving point group, only the moving points whose positions were fixed in the process of step S206 remain displayed.
  • In the example of FIG. 15, a plurality of moving points 1501-1, 1501-2 to 1501-N arranged in the horizontal direction and a line 1502 (trimming line) connecting the moving points are displayed.
  • FIG. 16 is a diagram showing an example of data resulting from edge detection in a roughly designated area. The process of specifying the cutout area with editing points in the first embodiment described above and the process of specifying the cutout area by moving the moving points onto edges according to the inclination of the housing in the second embodiment may be combined.
  • the cutout region is roughly designated as shown in FIG.
  • the edge is detected only in the designated area 1401 in the original image.
  • the process of fixing the position of the moving point by the user's touch operation such as step S205 in FIG. 9 may be omitted.
  • the processing of steps S207 to S211 in FIG. 9 using the second trimming line may be omitted.
  • In a second modified example of the second embodiment, after the moving points have been moved onto edges according to the inclination of the housing and the cutout area has been specified as in the second embodiment, the process of specifying the cutout area with the editing points of the first embodiment may be executed. In this case, the moving points described above are used as the editing points of the first embodiment, as described below.
  • For example, the display control unit 202 moves the moving points onto edges according to the inclination of the housing and specifies the cutout region by the processing illustrated in FIG. 9 of the second embodiment. Then, by the processing of steps S103 to S110 shown in FIG. 4 of the first embodiment described above, the display control unit 202 enlarges and displays the original image, uses the moving points as editing points, and increases the number of displayed editing points. The editing points are then moved by an operation of moving an editing point with a finger or by an operation of tilting the housing.
  • As described above, in the second embodiment, the points for specifying the cutout area in the original image are moved and displayed according to the inclination of the housing. As a result, the outline of the object to be cut out from the original image can be specified more accurately with an easier operation, and, since the operation becomes easier, the processing load on the terminal 20 can be reduced as a result.
  • In the third embodiment, a point for specifying the cutout area in an original image is moved to the position of an edge in the original image and displayed.
  • Thereby, the area of the object to be cut out from the image can be specified easily and with relatively high accuracy. Note that the third embodiment is the same as the first or second embodiment except for some parts, and duplicate description will therefore be omitted as appropriate.
  • FIG. 17 is a flowchart showing an example of processing when editing the cutout range according to the third embodiment.
  • FIG. 18 is a diagram for explaining processing for moving and displaying an editing point by a flick operation.
  • FIG. 19A and FIG. 19B are diagrams for explaining the process of causing the moving speed of the editing point to be attenuated according to the strength of the edge and causing it to be stopped and displayed.
  • In step S301, in response to the user's operation, the reception unit 201 acquires the outline of the cutout region in the displayed original image. Subsequently, the display control unit 202 displays a plurality of editing points on the outline (step S302).
  • In steps S301 and S302, the contour traced by the user with a finger or the like may be acquired in the same manner as in steps S101 and S102 shown in FIG. 4 of the first embodiment described above.
  • an editing point or the like whose number has been increased along with the enlarged display may be displayed at any one of the processing steps from step S103 to step S110 shown in FIG. 4 of the first embodiment described above.
  • the moving point in any of the processing steps of steps S201 to S211 shown in FIG. 9 of the second embodiment described above may be used as the editing point.
  • the receiving unit 201 receives the user's flick operation (or swipe operation) on one editing point (hereinafter referred to as “first editing point”) included in the plurality of editing points (step S303).
  • the flick operation is, for example, an operation of rapidly moving or flipping a finger on a screen having a touch panel.
  • the swipe operation is an operation such as sliding a finger on a screen having a touch panel.
  • The display control unit 202 moves the first editing point and editing points around the first editing point (hereinafter referred to as "second editing points") in the direction of the flick operation (step S304).
  • The display control unit 202 moves the first editing point at a speed according to the speed of the flick operation, and moves each second editing point at a speed according to the speed of the flick operation and the distance on the contour between the first editing point and that second editing point.
  • the direction of the flick operation is, for example, the direction in which the finger is moved by the flick operation.
  • the speed of the flick operation is, for example, the speed at which the finger is moved by the flick operation.
  • In the example of FIG. 18, the display control unit 202 selects, as second editing points, editing points whose distance on the contour from the flicked first editing point 1801 is equal to or less than a predetermined threshold, or editing points within a predetermined number of adjacent points from the first editing point 1801.
  • The predetermined threshold may be a value such that at least the editing points adjacent to the first editing point 1801 are included.
  • The display control unit 202 calculates the distance on the contour between the first editing point 1801 and the selected editing point 1802, and calculates a velocity vector 1802A whose magnitude is a speed according to the speed of the flick operation and the calculated distance, and whose direction is the direction of the flick operation.
  • The display control unit 202 may set the magnitude, for example, so that it is proportional to the speed of the flick operation and inversely proportional to the calculated distance.
  • The display control unit 202 similarly calculates the velocity vector 1803A and the velocity vector 1804A for the selected editing point 1803 and editing point 1804.
  • The display control unit 202 then moves the first editing point 1801 by the velocity vector 1801A, whose magnitude is a speed according to the speed of the flick operation and whose direction is the direction of the flick operation, and moves the editing point 1802, the editing point 1803, and the editing point 1804 by the velocity vector 1802A, the velocity vector 1803A, and the velocity vector 1804A, respectively.
  • the distance on the outline from the first editing point 1801 is longer in the order of the editing point 1802, the editing point 1803, and the editing point 1804.
  • the velocity vector 1802A, the velocity vector 1803A, and the velocity vector 1804A have the same direction, and the velocity decreases in the order of the velocity vector 1802A, the velocity vector 1803A, and the velocity vector 1804A.
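  • A sketch of this velocity assignment follows; the 1/(1 + d) magnitude law is an assumption consistent with "proportional to the flick speed, inversely proportional to the contour distance", not a formula given in the disclosure.

```python
# Sketch: assign flick velocity vectors to the flicked point and its selected neighbors.
import numpy as np

def flick_velocities(flick_speed: float, flick_dir, contour_dist: dict) -> dict:
    """contour_dist: editing-point index -> contour distance from the flicked point (0 for itself)."""
    unit = np.asarray(flick_dir, dtype=float)
    unit /= np.linalg.norm(unit)                           # all vectors share the flick direction
    return {i: (flick_speed / (1.0 + d)) * unit for i, d in contour_dist.items()}
```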
  • Then, a virtual force (an adhesive force or a frictional force corresponding to the edge strength of the original image) causes each moved editing point to be stopped and displayed at a position on an edge (step S305).
  • the display control unit 202 may detect an edge in the original image in advance.
  • the display control unit 202 may detect only an edge in the original image along the movement direction from the movement source position of each moved editing point. Thereby, the processing load on the terminal 20 can be reduced.
  • In response to the flick operation, the display control unit 202 may attenuate the moving speed of each editing point according to the speed of the flick operation and the strength of the edges on the path along which the editing point moves, and stop the editing point accordingly.
  • For example, as shown in FIG. 19A, the display control unit 202 moves the editing point 1901 with the velocity vector 1901A.
  • When the editing point 1901 passes the edge 1903, the display control unit 202 attenuates its speed of movement by a virtual force corresponding to the strength of the edge 1903.
  • Then, as shown in FIG. 19B, the display control unit 202 moves the editing point 1901 with the velocity vector 1901B, whose direction is unchanged and whose magnitude is attenuated from that of the velocity vector 1901A, and stops the editing point 1901 at the position of the edge 1904 in the moving direction 1902 by a virtual force according to the strength of the edge 1904.
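  • A sketch of this attenuate-and-stop behaviour for one launched editing point follows; the step size and damping constant are assumptions, and the point also stops when it leaves the image, mirroring the screen-edge behaviour described for moving points.

```python
# Sketch: advance a flicked editing point, attenuating its speed at every edge it crosses.
import math

def move_until_stopped(pos, velocity, edge_map, damping=0.02, step=1.0):
    """pos, velocity: (x, y) tuples; edge_map: 2D array of edge strengths; returns the final position."""
    speed = math.hypot(*velocity)
    if speed == 0.0:
        return pos
    ux, uy = velocity[0] / speed, velocity[1] / speed      # fixed movement direction
    x, y = pos
    while speed > 0.0:
        x, y = x + step * ux, y + step * uy
        xi, yi = int(round(x)), int(round(y))
        if not (0 <= xi < edge_map.shape[1] and 0 <= yi < edge_map.shape[0]):
            break                                          # reached the screen edge without stopping
        speed = max(0.0, speed - damping * edge_map[yi, xi])  # strong edges slow and stop the point
    return (x, y)
```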
  • the display control unit 202 calculates an outline passing through the moved editing point, and displays the calculated changed outline (step S306). Subsequently, the display control unit 202 adds or deletes the editing point to be displayed based on the editing point currently displayed and the contour after the change (step S307).
  • the processes of steps S306 to S307 may be the same as the processes of steps S107 to S108 in FIG. 4 of the first embodiment described above.
  • In a first modified example of the third embodiment, after the display control unit 202 has moved the editing points to the positions of edges in response to a flick operation or the like, and a pinch-out operation or the like is received, the processing of steps S103 to S110 shown in FIG. 4 of the first embodiment described above may be performed.
  • In that case, the original image is enlarged and displayed, and the number of displayed editing points is increased. Then, in the enlarged display, an editing point is moved, by a slide operation of moving the editing point with a finger, to the position where the finger is released.
  • As described above, in the third embodiment, the points for specifying the cutout region in the original image are moved to the positions of edges in the original image and displayed. As a result, the outline of the object to be cut out from the original image can be specified more accurately with an easier operation. In addition, since the operation becomes easier, the processing load on the terminal 20 can be reduced as a result.
  • Alternatively, the server 10 may perform the image processing. In that case, the reception unit 201 of the terminal 20 notifies the server 10 of the content of the user's operation, the display control unit 102 of the server 10 controls the display screen of the terminal 20, and the control unit 103 of the server 10 performs image processing such as trimming.
  • the reception unit 201 is an example of an “acquisition unit”.
  • the image processing unit 203 is an example of a “cutting unit”.
  • the server 10 and the terminal 20 are examples of the “information processing apparatus”.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Editing Of Facsimile Originals (AREA)

Abstract

The present invention causes an information processing device to execute: a first display step of displaying a first image; a step of acquiring the contour of an area to be cut out in the first image; a second display step of displaying, on the contour, a plurality of editing points for correcting the contour; a third display step of, in response to a user operation, enlarging and displaying the first image and increasing the number of the plurality of editing points displayed; and a step of cutting out a second image within the contour designated by the plurality of editing points.
PCT/JP2017/034438 2017-09-25 2017-09-25 Programme, procédé de traitement d'informations et dispositif de traitement d'informations WO2019058539A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/034438 WO2019058539A1 (fr) 2017-09-25 2017-09-25 Programme, procédé de traitement d'informations et dispositif de traitement d'informations
JP2019542940A JP6995867B2 (ja) 2017-09-25 2017-09-25 プログラム、情報処理方法、及び情報処理装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/034438 WO2019058539A1 (fr) 2017-09-25 2017-09-25 Programme, procédé de traitement d'informations et dispositif de traitement d'informations

Publications (1)

Publication Number Publication Date
WO2019058539A1 true WO2019058539A1 (fr) 2019-03-28

Family

ID=65809624

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/034438 WO2019058539A1 (fr) 2017-09-25 2017-09-25 Programme, procédé de traitement d'informations et dispositif de traitement d'informations

Country Status (2)

Country Link
JP (1) JP6995867B2 (fr)
WO (1) WO2019058539A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09147131A (ja) * 1995-11-24 1997-06-06 Dainippon Screen Mfg Co Ltd Image layout apparatus having an image-component cutout function

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09147131A (ja) * 1995-11-24 1997-06-06 Dainippon Screen Mfg Co Ltd Image layout apparatus having an image-component cutout function

Also Published As

Publication number Publication date
JPWO2019058539A1 (ja) 2020-11-19
JP6995867B2 (ja) 2022-01-17

Similar Documents

Publication Publication Date Title
EP2701152B1 (fr) Recherche d'objets multimedia dans une fenêtre collaborative, édition sur client mobile, rendu en réalité augmentée.
EP3091426B1 (fr) Dispositif de terminal utilisateur permettant une interaction utilisateur et son procédé
KR102137240B1 (ko) 디스플레이 영역을 조절하기 위한 방법 및 그 방법을 처리하는 전자 장치
CN110100251B (zh) 用于处理文档的设备、方法和计算机可读存储介质
EP2811731B1 (fr) Dispositif électronique d'édition d'image double et procédé associé
KR20130123171A (ko) 화면을 이동시키기 위한 방법 및 그 전자 장치
KR20160141807A (ko) 적응형 사용자 인터페이스 창 관리자
USRE47812E1 (en) Adaptive determination of information display
US9357132B2 (en) Video rolling shutter correction for lens movement in optical image stabilization cameras
US10810801B2 (en) Method of displaying at least one virtual object in mixed reality, and an associated terminal and system
CN110908554B (zh) 长截图的方法及终端设备
CN104461312A (zh) 一种显示控制方法及电子设备
KR20140113056A (ko) 전자 장치에서 줌 기능을 제어하기 위한 방법 및 장치
KR102113509B1 (ko) 가상 키패드 제어 방법 및 그 전자 장치
US10261602B2 (en) Hop navigation
TW201926968A (zh) 程式、資訊處理方法及資訊處理裝置
US20140111551A1 (en) Information-processing device, storage medium, information-processing method, and information-processing system
JP7029913B2 (ja) プログラム、情報処理方法、及び情報処理装置
CN111010528A (zh) 视频通话方法、移动终端及计算机可读存储介质
JP6995867B2 (ja) プログラム、情報処理方法、及び情報処理装置
JP6918660B2 (ja) プログラム、情報処理方法、及び情報処理装置
KR20130121370A (ko) 터치 반응성 개선 방법 및 그 전자 장치
CN113168286A (zh) 终端、用于该终端的控制方法以及记录用于实现该方法的程序的记录介质
KR20200033640A (ko) 단말기, 이의 제어 방법 및 상기 방법을 구현하기 위한 프로그램을 기록한 기록 매체
WO2023210352A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17926090

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019542940

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17926090

Country of ref document: EP

Kind code of ref document: A1