WO2021167365A2 - Electronic device and method for tracking the movement of an object - Google Patents

Electronic device and method for tracking the movement of an object

Info

Publication number
WO2021167365A2
Authority
WO
WIPO (PCT)
Prior art keywords
processor
electronic device
image
information
color
Application number
PCT/KR2021/002061
Other languages
English (en)
Korean (ko)
Other versions
WO2021167365A3 (fr)
Inventor
김민철
김상호
김형진
이성준
허창원
임연욱
Original Assignee
삼성전자 주식회사
Application filed by 삼성전자 주식회사
Publication of WO2021167365A2
Publication of WO2021167365A3

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/285: Analysis of motion using a sequence of stereo image pairs
    • G06T 7/292: Multi-camera tracking
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/90: Determination of colour characteristics

Definitions

  • Various embodiments of the present document relate to a technique for tracking the motion of an object moving on the ground by independently extracting the object.
  • The sports industry and the health care industry are gradually expanding into areas that take various age groups and various sports into account, centering on individual activities.
  • Accordingly, related technology industries are also developing, and cutting-edge technologies are gradually being grafted onto them. This is due to the diversification of consumer demand for sporting goods, facilities, and services.
  • As motion tracking technologies, there are a method of extracting an object on the screen as 3D information by analyzing the phase difference with a stereo camera, a method of tracking the user's motion by extracting the user's area with a general RGB camera, and the like.
  • Various embodiments of the present document may provide an electronic device and method capable of detecting all movement motions without a posture limitation and obtaining joint motion information by scanning the user's body regions in various postures.
  • According to various embodiments, the electronic device includes at least one camera module, a memory, and a processor operatively connected to the at least one camera module and the memory. The processor may photograph an image of a first object, obtain color information of the photographed first object, start, in response to a video recording command, video recording of the first object and of a second object moving on the first object, determine, based on the color information, a first object image corresponding to the first object in the data obtained by the video recording, change the three-dimensional coordinates of the area determined as the first object image in the data based on a specified distance value, and track the movement of the second object based on the data in which the three-dimensional coordinates of the first object image have been changed.
  • According to various embodiments, the method of operating an electronic device may include: an operation of photographing an image of a first object; an operation of obtaining color (RGB) information of the photographed first object; an operation of initiating, in response to a video recording command, video recording of the first object and of a second object moving on the first object; an operation of determining, based on the color information of the first object, a first object image corresponding to the first object in the data obtained by the video recording; an operation of changing the three-dimensional coordinates of the area determined as the first object image in the data based on a specified distance value; and an operation of tracking the movement of the second object based on the data in which the three-dimensional coordinates of the first object image have been changed. A minimal sketch of this flow is given below.
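  • As an illustration only, the following Python sketch walks through this claimed flow on synthetic arrays; the array shapes, the color threshold, and the z-offset are assumptions for the example, not values from the disclosure.

```python
import numpy as np

# Hypothetical end-to-end sketch of the claimed flow; all values and
# array layouts below are assumptions, not taken from the disclosure.

H, W = 120, 160
rgb = np.ones((H, W, 3), np.float32)            # white background frame
depth = np.full((H, W), 3.0, np.float32)        # per-pixel distance (m)
rgb[40:80, 50:110] = (0.5, 0.2, 0.6)            # purple "mat" region

# Operations 1-2: representative color obtained from a prior photo
# of the first object (the mat).
mat_color = np.array([0.5, 0.2, 0.6], np.float32)

# Operation 3: determine the first-object image by color similarity.
mat_mask = np.linalg.norm(rgb - mat_color, axis=-1) < 0.05

# Operation 4: change the 3D coordinates of the first-object area by a
# specified distance value (push it away along the z-axis).
Z_OFFSET = 1.0                                  # assumed distance value
depth_mod = depth.copy()
depth_mod[mat_mask] += Z_OFFSET

# Operation 5: the second object (the person) can now be tracked on the
# modified data, since its depth is well separated from the mat's.
person_candidates = ~mat_mask & (depth_mod < depth_mod[mat_mask].min())
print(int(mat_mask.sum()), "mat pixels pushed back")
```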
  • According to various embodiments, motion information can be extracted through a simple image pre-processing step while keeping the existing motion tracking solution intact.
  • According to various embodiments, movement tracking or motion tracking is possible without restrictions on the type of movement motion, so diverse and effective programs and contents can be configured.
  • FIG. 1 is a diagram illustrating a system for tracking a motion of an object according to an exemplary embodiment.
  • FIG. 2 is a flowchart illustrating an overall flow of tracking a motion of an object in an electronic device, according to an exemplary embodiment.
  • FIG. 3 is a block diagram of an electronic device 301 in a network environment 300 according to various embodiments of the present disclosure.
  • FIG. 4 is a block diagram 400 illustrating a camera module 380, according to various embodiments.
  • FIG. 5 is a flowchart illustrating a flow of pre-processing an image of a first object in an electronic device according to an exemplary embodiment.
  • FIG. 6A is a diagram illustrating a state in which an image of a first object is pre-processed in an electronic device according to an exemplary embodiment.
  • FIG. 6B is a diagram illustrating a state in which an image of a first object is pre-processed in an electronic device according to an exemplary embodiment.
  • FIG. 7 is a flowchart illustrating a flow of determining a first object image in an electronic device according to an exemplary embodiment.
  • FIG. 8 is a flowchart illustrating a flow of adjusting a distance value d of a first object image and tracking a motion of a second object based on location information in an electronic device, according to an exemplary embodiment.
  • FIG. 9A is a diagram illustrating a concept of tracking a motion of a second object based on location information in an electronic device according to an exemplary embodiment.
  • FIG. 9B is a diagram illustrating a concept of tracking a motion of a second object based on location information in an electronic device according to an exemplary embodiment.
  • FIG. 9C is a diagram illustrating a concept of tracking a motion of a second object based on location information in an electronic device according to an exemplary embodiment.
  • FIG. 10 is a diagram illustrating a concept of tracking a motion of a second object by using a change in a distance value d according to various embodiments of the present disclosure.
  • FIG. 1 is a diagram illustrating a system for tracking a motion of an object according to an exemplary embodiment.
  • According to an embodiment, under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may use a camera module (eg, the camera module 380 of FIG. 4 ) to photograph objects including at least the first object 130 and the second object 140 .
  • the first object 130 may be a yoga mattress, an exercise mattress, a blanket, or the like.
  • the second object 140 may be a moving subject (eg, a person, an animal, etc.).
  • For example, under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may film a person who practices yoga on a yoga mattress using a camera module (eg, the camera module 380 of FIG. 4 ).
  • In various embodiments, under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may use a camera module (eg, the camera module 380 of FIG. 4 ) to photograph surrounding objects such as a yoga mattress, a desk, a vase, and a lamp in addition to a person.
  • According to an embodiment, the electronic device 110 may obtain color (RGB) information and location information on the first object 130 and the second object 140 under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • For example, the electronic device 110 may photograph a first object (eg, a yoga mattress, an exercise mattress, a blanket, etc.) under the control of a processor (eg, the processor 320 of FIG. 3 ) and obtain its color information.
  • For example, the electronic device 110 may obtain color information and location information of a first object and of a second object (eg, a person, an animal, etc.) by shooting a video based on the camera module 380 under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • In various embodiments, the electronic device 110 may perform pre-processing for motion tracking using the color information and location information of the objects (eg, the first object 130 , the second object 140 , etc.) acquired under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • In an embodiment, the electronic device 110 may determine the first object image by using the color information of the first object obtained under the control of a processor (eg, the processor 320 of FIG. 3 ). Also, under the control of a processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may change the distance value of the first object image in the data by using the acquired location information. In various embodiments, the electronic device 110 may change the distance value of the first object image so that it is separated from the second object, under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the electronic device 110 may transmit data related to the motion tracking of the second object to the display device 120 under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • In an embodiment, the display device 120 may display a motion image of the second object based on the motion tracking data of the second object received from the electronic device 110 , under the control of its processor (eg, a processor such as the processor 320 of FIG. 3 ).
  • For example, the electronic device 110 may track the movement of a person performing a yoga motion under the control of a processor (eg, the processor 320 of FIG. 3 ), and may transmit data on the person's yoga movement to the display device 120 .
  • the display device 120 may implement a figure of a person doing yoga in 3D based on the data on the movement under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the electronic device 110 or the display device 120 may have the same or similar structure to the electronic device 301 of FIG. 3 to be described later.
  • the display device 120 may be a computer as shown in FIG. 1 or various electronic devices such as a TV and a tablet PC not shown in FIG. 1 .
  • FIG. 2 is a flowchart illustrating an overall flow of independently extracting an object from an electronic device to track a motion, according to an exemplary embodiment.
  • the electronic device 110 may execute an application.
  • the electronic device 110 may execute the application upon obtaining an input for executing the application from the user under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • the application may be an application for photographing and tracking the movement of a moving subject.
  • In various embodiments, an input for executing the application may take various forms, such as an input of touching an icon or a voice input.
  • the electronic device 110 may photograph the first object under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the first object may be, for example, a yoga mattress, an exercise mattress, a blanket, a living room rug, or the like.
  • the shooting may be a photo shooting or a video shooting.
  • the electronic device 110 may photograph the floor of a living room, a room, a gym floor, etc. under the control of a processor (eg, the processor 320 of FIG. 3 ). Even in the case of photographing the ground, all operations disclosed in this document may be applicable to the electronic device 110 .
  • the electronic device 110 may detect and store color information of the first object under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the color information may include RGB information.
  • the RGB information may include information such as R, G, and B color values.
  • The electronic device 110 may store the data related to the color information in a memory under the control of a processor (eg, the processor 320 of FIG. 3 ). This will be described in detail with reference to FIG. 5 .
  • the electronic device 110 may photograph objects.
  • the objects may include a first object and a second object moving on the first object.
  • the electronic device 110 may photograph the yoga mattress (eg, the first object) and the user (eg, the second object) under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • the shooting may be moving picture shooting.
  • For example, under the control of a processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may be capturing, in real time, a video of a user doing yoga on a yoga mattress according to the composition framed by the user.
  • the user can check the state of doing yoga in real time through the screen displayed on the display device 120 .
  • For example, the electronic device 110 may capture a video of a user's yoga movement under the control of a processor (eg, the processor 320 of FIG. 3 ), and the display device 120 may implement the movement in 3D under the control of its processor (eg, a processor such as the processor 320 of FIG. 3 ).
  • the electronic device 110 may photograph objects based on an RGB camera and a stereo camera under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the electronic device 110 may include an RGB camera and a stereo camera.
  • In various embodiments, the electronic device 110 may obtain color information (eg, RGB information) of objects from the RGB camera under the control of a processor (eg, the processor 320 of FIG. 3 ), and may obtain position information (eg, 3D information) of objects from the stereo camera.
  • In various embodiments, the electronic device 110 may correct the image or video acquired by the RGB camera and the stereo camera so that they match or correspond to each other, based on the predefined differences between the two configurations (eg, mounting position, mounting angle, distance between the configurations, etc.). For example, when there is a difference in the positions or compositions of objects between the image or video acquired by the RGB camera and that acquired by the stereo camera, the electronic device 110 may, under the control of the processor (eg, the processor 320 of FIG. 3 ), process them so that the positions or compositions of the objects match or correspond. For another example, the electronic device 110 may process the images so that the pixel positions of the objects in the image or video acquired by the RGB camera and the stereo camera match or correspond, under the control of the processor (eg, the processor 320 of FIG. 3 ). A minimal alignment sketch is given below.
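  • A minimal alignment sketch, assuming the RGB and stereo frames differ only by a fixed, pre-calibrated pixel offset; the DX and DY values are illustrative stand-ins for calibration data, and real calibration with intrinsics and extrinsics is more involved and not specified in this document.

```python
import numpy as np

# Shift the depth frame by a fixed pixel offset so its pixels line up
# with the RGB frame; DX and DY stand in for factory calibration data.

DX, DY = 4, 2                                   # assumed offsets (pixels)

def align_depth_to_rgb(depth: np.ndarray) -> np.ndarray:
    """Return a copy of `depth` shifted by (DY, DX); uncovered border
    pixels are left as NaN (no depth available there)."""
    h, w = depth.shape
    aligned = np.full_like(depth, np.nan)
    aligned[max(DY, 0):h + min(DY, 0), max(DX, 0):w + min(DX, 0)] = \
        depth[max(-DY, 0):h + min(-DY, 0), max(-DX, 0):w + min(-DX, 0)]
    return aligned

depth = np.random.rand(120, 160).astype(np.float32)
aligned = align_depth_to_rgb(depth)
print(int(np.isnan(aligned).sum()), "border pixels have no depth")
```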
  • In various embodiments, the electronic device 110 may acquire color information and location information of the objects by photographing the objects under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • In an embodiment, the electronic device 110 may obtain color information of objects from the RGB camera and may obtain position information of objects from the stereo camera.
  • the color information may include at least RGB information.
  • the location information may include location information of a first object (eg, a yoga mattress), location information of a second object (eg, a person), and location information of other objects (eg, a vase, a lamp, etc.).
  • the electronic device 110 may track the movement of the second object.
  • the electronic device 110 may acquire joint information based on location information of a second object (eg, a person) under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • For example, under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may obtain joint information including information on a person's neck joint, shoulder joint, wrist joint, pelvic joint, knee joint, ankle joint, and the like.
  • In an embodiment, under the control of a processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may identify the first object in the data being photographed, based on the color information of the first object obtained through operations 220 and 230 .
  • the first object may be spatially separated from the remaining objects (eg, the second object).
  • In an embodiment, the electronic device 110 may change the value of one of the spatial coordinate axes of the first object (eg, the Z-axis) under the control of the processor (eg, the processor 320 of FIG. 3 ), so that the first object and the second object are sufficiently spaced apart along the Z-axis.
  • By moving the first object and the second object apart in spatial coordinates under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may effectively track the movement of the second object (eg, a person) even if the second object is actually performing an exercise such as a push-up or yoga in close contact with the first object (eg, the mat).
  • the electronic device 110 may acquire location information including the joint information by using a stereo camera under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the camera module (eg, 380 ) of the electronic device 110 may include a stereo camera as a configuration.
  • the electronic device 110 may acquire location information of a second object (eg, a person) through a stereo camera under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • In an embodiment, the location information may include the coordinates of a neck joint, a shoulder joint, a wrist joint, a pelvic joint, a knee joint, and an ankle joint, and changes in the coordinates.
  • In various embodiments, the electronic device 110 may track the motion of the second object based on location information including joint information under the control of a processor (eg, the processor 320 of FIG. 3 ). For example, the electronic device 110 may express the positions of the various joints included in the joint information as coordinates based on the x-axis, the y-axis, and the z-axis under the control of the processor (eg, the processor 320 of FIG. 3 ). Also, the electronic device 110 may detect changes in the coordinates of the joints under the control of a processor (eg, the processor 320 of FIG. 3 ), and may track the movement by tracking the coordinates; a minimal sketch is given below.
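  • A minimal joint-tracking sketch, assuming joints arrive as (x, y, z) coordinates per frame; the joint names and the synthetic motion are illustrative assumptions, not data from the disclosure.

```python
import numpy as np

# Track per-joint displacement between consecutive frames; each frame
# is a dict mapping a joint name to its (x, y, z) coordinates.

JOINTS = ["neck", "shoulder", "wrist", "pelvis", "knee", "ankle"]

def track(frames):
    """Yield, for each frame after the first, the distance every joint
    moved since the previous frame."""
    prev = None
    for coords in frames:
        if prev is not None:
            yield {j: float(np.linalg.norm(coords[j] - prev[j]))
                   for j in JOINTS}
        prev = coords

# Two synthetic frames: every joint moves 0.1 along the x-axis.
f0 = {j: np.zeros(3) for j in JOINTS}
f1 = {j: np.array([0.1, 0.0, 0.0]) for j in JOINTS}
for delta in track([f0, f1]):
    print(delta["wrist"])                       # 0.1
```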
  • FIG. 3 is a block diagram of an electronic device 301 in a network environment 300 according to various embodiments of the present disclosure.
  • Referring to FIG. 3 , in the network environment 300 , the electronic device 301 may communicate with the electronic device 302 through a first network 398 (eg, a short-range wireless communication network), or may communicate with the electronic device 304 or the server 308 through a second network 399 (eg, a long-distance wireless communication network). According to an embodiment, the electronic device 301 may communicate with the electronic device 304 through the server 308 .
  • The electronic device 301 may include a processor 320 , a memory 330 , an input device 350 , a sound output device 355 , a display device 360 , an audio module 370 , a sensor module 376 , an interface 377 , a haptic module 379 , a camera module 380 , a power management module 388 , a battery 389 , a communication module 390 , a subscriber identification module 396 , or an antenna module 397 . In some embodiments, at least one of these components (eg, the display device 360 or the camera module 380 ) may be omitted, or one or more other components may be added to the electronic device 301 . In some embodiments, some of these components may be implemented as a single integrated circuit. For example, the sensor module 376 (eg, a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented while being embedded in the display device 360 (eg, a display).
  • The processor 320 may execute, for example, software (eg, a program 340 ) to control at least one other component (eg, a hardware or software component) of the electronic device 301 connected to the processor 320 , and may perform various data processing or operations. According to an embodiment, as at least part of data processing or operations, the processor 320 may load commands or data received from other components (eg, the sensor module 376 or the communication module 390 ) into the volatile memory 332 , process the commands or data stored in the volatile memory 332 , and store the resulting data in the non-volatile memory 334 .
  • According to an embodiment, the processor 320 may include a main processor 321 (eg, a central processing unit or an application processor) and an auxiliary processor 323 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can be operated independently of or together with the main processor 321 . Additionally or alternatively, the auxiliary processor 323 may be configured to use less power than the main processor 321 or to be specialized for a designated function. The auxiliary processor 323 may be implemented separately from, or as part of, the main processor 321 .
  • The auxiliary processor 323 may control at least some of the functions or states related to at least one of the components of the electronic device 301 (eg, the display device 360 , the sensor module 376 , or the communication module 390 ), for example, on behalf of the main processor 321 while the main processor 321 is in an inactive (eg, sleep) state, or together with the main processor 321 while the main processor 321 is in an active (eg, application-executing) state. According to an embodiment, the auxiliary processor 323 (eg, an image signal processor or a communication processor) may be implemented as part of another functionally related component (eg, the camera module 380 or the communication module 390 ).
  • the memory 330 may store various data used by at least one component (eg, the processor 320 or the sensor module 376 ) of the electronic device 301 .
  • the data may include, for example, input data or output data for software (eg, the program 340 ) and instructions related thereto.
  • the memory 330 may include a volatile memory 332 or a non-volatile memory 334 .
  • the program 340 may be stored as software in the memory 330 , and may include, for example, an operating system 342 , middleware 344 , or an application 346 .
  • the input device 350 may receive a command or data to be used in a component (eg, the processor 320 ) of the electronic device 301 from the outside (eg, a user) of the electronic device 301 .
  • the input device 350 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • the sound output device 355 may output a sound signal to the outside of the electronic device 301 .
  • the sound output device 355 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls.
  • the receiver may be implemented separately from or as a part of the speaker.
  • the display device 360 may visually provide information to the outside (eg, a user) of the electronic device 301 .
  • the display device 360 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
  • According to an embodiment, the display device 360 may include a touch circuitry configured to sense a touch, or a sensor circuit (eg, a pressure sensor) configured to measure the intensity of a force generated by the touch.
  • The audio module 370 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 370 may acquire a sound through the input device 350 , or may output a sound through the sound output device 355 or an external electronic device (eg, the electronic device 302 ) (eg, a speaker or headphones) directly or wirelessly connected to the electronic device 301 .
  • The sensor module 376 may detect an operating state (eg, power or temperature) of the electronic device 301 or an external environmental state (eg, a user state), and may generate an electrical signal or data value corresponding to the sensed state.
  • According to an embodiment, the sensor module 376 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 377 may support one or more designated protocols that may be used for the electronic device 301 to directly or wirelessly connect with an external electronic device (eg, the electronic device 302 ).
  • the interface 377 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 378 may include a connector through which the electronic device 301 can be physically connected to an external electronic device (eg, the electronic device 302 ).
  • the connection terminal 378 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 379 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 379 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 380 may capture still images and moving images.
  • the camera module 380 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 388 may manage power supplied to the electronic device 301 .
  • the power management module 388 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 389 may supply power to at least one component of the electronic device 301 .
  • the battery 389 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 390 is a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 301 and an external electronic device (eg, the electronic device 302, the electronic device 304, or the server 308). It can support establishment and communication through the established communication channel.
  • the communication module 390 may include one or more communication processors that operate independently of the processor 320 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 390 may include a wireless communication module 392 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 394 (eg, a local area network (LAN) communication module or a power line communication module).
  • Among these communication modules, a corresponding communication module may communicate with an external electronic device via the first network 398 (eg, a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA)) or the second network 399 (eg, a long-range communication network such as a cellular network, the Internet, or a computer network (eg, a LAN or WAN)).
  • These various types of communication modules may be integrated into one component (eg, a single chip) or may be implemented as a plurality of components (eg, multiple chips) separate from each other.
  • The wireless communication module 392 may identify and authenticate the electronic device 301 within a communication network, such as the first network 398 or the second network 399 , using subscriber information (eg, an International Mobile Subscriber Identifier (IMSI)) stored in the subscriber identification module 396 .
  • the antenna module 397 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • According to an embodiment, the antenna module 397 may include one antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, a PCB).
  • According to an embodiment, the antenna module 397 may include a plurality of antennas. In this case, at least one antenna suitable for a communication scheme used in a communication network, such as the first network 398 or the second network 399 , may be selected from the plurality of antennas by, for example, the communication module 390 . A signal or power may be transmitted or received between the communication module 390 and the external electronic device through the selected at least one antenna.
  • According to some embodiments, components (eg, an RFIC) other than the radiator may be additionally formed as a part of the antenna module 397 .
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may mutually exchange signals (eg, commands or data).
  • the command or data may be transmitted or received between the electronic device 301 and the external electronic device 304 through the server 308 connected to the second network 399 .
  • Each of the external electronic devices 302 and 304 may be a device of the same type as, or a different type from, the electronic device 301 .
  • all or part of the operations performed by the electronic device 301 may be executed by one or more of the external electronic devices 302 , 304 , or 308 .
  • For example, when the electronic device 301 needs to perform a function or a service automatically or in response to a request, the electronic device 301 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or the service.
  • the one or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 301 .
  • the electronic device 301 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
  • Referring to FIG. 4 , the camera module 380 may include a lens assembly 410 , a flash 420 , an image sensor 430 , an image stabilizer 440 , a memory 450 (eg, a buffer memory), or an image signal processor 460 .
  • the lens assembly 410 may collect light emitted from a subject to be imaged.
  • the lens assembly 410 may include one or more lenses.
  • the camera module 380 may include a plurality of lens assemblies 410 . In this case, the camera module 380 may form, for example, a dual camera, a 360 degree camera, or a spherical camera.
  • Some of the plurality of lens assemblies 410 may have the same lens properties (eg, angle of view, focal length, auto focus, f number, or optical zoom), or at least one lens assembly may have one or more lens properties different from those of the other lens assemblies.
  • the lens assembly 410 may include, for example, a wide-angle lens or a telephoto lens.
  • the flash 420 may emit light used to enhance light emitted or reflected from the subject.
  • the flash 420 may include one or more light emitting diodes (eg, a red-green-blue (RGB) LED, a white LED, an infrared LED, or an ultraviolet LED), or a xenon lamp.
  • the image sensor 430 may acquire an image corresponding to the subject by converting light emitted or reflected from the subject and transmitted through the lens assembly 410 into an electrical signal.
  • According to an embodiment, the image sensor 430 may include, for example, one image sensor selected from among image sensors having different properties, such as an RGB sensor, a black-and-white (BW) sensor, an IR sensor, or a UV sensor; a plurality of image sensors having the same property; or a plurality of image sensors having different properties.
  • Each image sensor included in the image sensor 430 may be implemented using, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
  • In response to the movement of the camera module 380 or the electronic device 301 including the same, the image stabilizer 440 may move at least one lens included in the lens assembly 410 or the image sensor 430 in a specific direction, or may control operation characteristics of the image sensor 430 (eg, adjust the read-out timing). This makes it possible to compensate for at least some of the negative effects of the movement on the image being captured.
  • According to an embodiment, the image stabilizer 440 may detect such a movement of the camera module 380 or the electronic device 301 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 380 .
  • the image stabilizer 440 may be implemented as, for example, an optical image stabilizer.
  • The memory 450 may temporarily store at least a portion of the image acquired through the image sensor 430 for a next image processing operation. For example, when image acquisition is delayed due to the shutter, or when a plurality of images are acquired at high speed, the acquired original image (eg, a Bayer-patterned image or a high-resolution image) may be stored in the memory 450 , and a copy image corresponding thereto (eg, a low-resolution image) may be previewed through the display device 360 .
  • the memory 450 may be configured as at least a part of the memory 330 or as a separate memory operated independently of the memory 330 .
  • the image signal processor 460 may perform one or more image processing on an image acquired through the image sensor 430 or an image stored in the memory 450 .
  • The one or more image processes may include, for example, depth map generation, three-dimensional modeling, panorama generation, feature point extraction, image synthesis, or image compensation (eg, noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening).
  • Additionally or alternatively, the image signal processor 460 may perform control (eg, exposure time control or read-out timing control) of at least one of the components included in the camera module 380 (eg, the image sensor 430 ).
  • The image processed by the image signal processor 460 may be stored back in the memory 450 for further processing.
  • the image signal processor 460 may be configured as at least a part of the processor 320 or as a separate processor operated independently of the processor 320.
  • When the image signal processor 460 is configured as a processor separate from the processor 320 , at least one image processed by the image signal processor 460 may be displayed through the display device 360 as it is, or after additional image processing, by the processor 320 .
  • the electronic device 301 may include a plurality of camera modules 380 each having different properties or functions.
  • at least one of the plurality of camera modules 380 may be a wide-angle camera, and at least the other may be a telephoto camera.
  • at least one of the plurality of camera modules 380 may be a front camera, and at least the other may be a rear camera.
  • FIG. 5 is a flowchart illustrating a flow of pre-processing an image of a first object in an electronic device according to an exemplary embodiment. Operations according to various embodiments of FIG. 5 will be described with reference to FIGS. 6A and 6B , which show images of the first object being pre-processed in the electronic device.
  • the electronic device 110 may photograph the first object under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the user may photograph the yoga mattress 610 according to the application manual.
  • the electronic device 110 may take a picture or a video of the yoga mattress 610 based on a user's shooting input under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • Under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may take a photo or video so that the first object (eg, the yoga mattress 610 ) is located at the center of the photographing screen, based on the user's photographing motion.
  • the camera module (eg, 380) of the electronic device 110 may include an RGB camera.
  • The electronic device 110 may take a picture or a video of the yoga mattress 610 based on the RGB camera of the camera module (eg, 380 ) under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • the electronic device 110 may detect color information of the first object from the captured image under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • In an embodiment, the electronic device 110 may detect the color information of the yoga mattress 610 from the image of the yoga mattress 610 taken with the camera module (eg, 380 ).
  • Under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may regard, as the same object, the area extending up to the points where the color information of the first object changes rapidly, based on the camera module (eg, 380 ).
  • For example, under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may film the purple yoga mattress 610 against the white surrounding background so that the yoga mattress 610 is positioned at the center of the screen.
  • In an embodiment, the electronic device 110 may move pixel by pixel up/down/left/right starting from the center point of the screen under the control of the processor (eg, the processor 320 of FIG. 3 ), and may detect the RGB color values.
  • The electronic device 110 may treat the points at which the color information (eg, RGB color values) changes rapidly as a boundary, under the control of a processor (eg, the processor 320 of FIG. 3 ), and may determine the area within the boundary to be the same object.
  • For example, while detecting the color information of the yoga mattress 610 under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may identify the points where the color information (eg, RGB color values) changes rapidly at the boundary between the purple yoga mattress 610 and the white background.
  • The electronic device 110 may determine that the purple yoga mattress 610 extends up to the points where the color information changes rapidly, under the control of the processor (eg, the processor 320 of FIG. 3 ). A minimal sketch of this center-out scan is given below.
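  • A minimal sketch of the center-out boundary scan: walk outward from the image center pixel by pixel and stop where the color changes sharply. The synthetic frame and the threshold are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

# Walk outward from the image center and stop at a rapid color change.

H, W = 100, 100
frame = np.ones((H, W, 3), np.float32)          # white background
frame[30:70, 20:80] = (0.5, 0.2, 0.6)           # purple mat at the center

THRESH = 0.3                                    # "rapid change" cutoff

def scan(frame, step):
    """Step from the center in direction `step`; return the last pixel
    before the color changes by more than THRESH (the boundary)."""
    y, x = frame.shape[0] // 2, frame.shape[1] // 2
    while True:
        ny, nx = y + step[0], x + step[1]
        if not (0 <= ny < frame.shape[0] and 0 <= nx < frame.shape[1]):
            return y, x                         # hit the image edge
        if np.abs(frame[ny, nx] - frame[y, x]).max() > THRESH:
            return y, x                         # rapid change: boundary
        y, x = ny, nx

for step in [(-1, 0), (1, 0), (0, -1), (0, 1)]: # up, down, left, right
    print(step, "->", scan(frame, step))
```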
  • the electronic device 110 may determine the representative color of the first object under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • In an embodiment, the RGB color value of the color information in the first object may differ depending on lighting or the shooting angle.
  • color information (eg, purple) of the yoga mattress 610 may be different depending on a light, sunlight, or shooting angle.
  • the electronic device 110 may detect color information according to the above-described operation 520 based on the control of the processor (eg, the processor 320 of FIG. 3 ), and obtain an average value of the detected RGB color values. The average value may be determined as a representative color of the yoga mattress 610 .
  • the average value of the RGB color values of the same object may be the same for every photographing.
  • the average value of the RGB color values of the yoga mattress 610 may be a value corresponding to purple even when shooting with a time difference.
  • In an embodiment, the electronic device 110 may obtain, with a time difference, the average value of the RGB color values of the detected first object (eg, the yoga mattress) under the control of the processor (eg, the processor 320 of FIG. 3 ), and the color corresponding to the average value may be the same as the representative color (eg, purple) even when shooting with a time difference.
  • In various embodiments, the RGB average color value of the image of the first object may be obtained based on Equation 1 (hereinafter, the 'first condition').
  • In various embodiments, the red (R) average color value of the image of the first object may be a value obtained by dividing the sum of the R color values from the center pixel of the image of the first object (eg, the yoga mattress 610 ) to the outermost pixel by the number of pixels.
  • In various embodiments, the green (G) average color value of the image of the first object may be a value obtained by dividing the sum of the G color values from the center pixel of the image of the first object (eg, the yoga mattress 610 ) to the outermost pixel by the number of pixels.
  • In various embodiments, the blue (B) average color value of the image of the first object may be a value obtained by dividing the sum of the B color values from the center pixel of the image of the first object (eg, the yoga mattress 610 ) to the outermost pixel by the number of pixels. A reconstruction of Equation 1 is given below.
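  • A plausible reconstruction of Equation 1 from the description above; the original equation is not reproduced in this text, so the notation is an assumption. Here N is the number of pixels from the center pixel to the outermost pixel of the first object image, and R_i, G_i, B_i are the color values of the i-th pixel.

```latex
\bar{R} = \frac{1}{N}\sum_{i=1}^{N} R_i, \qquad
\bar{G} = \frac{1}{N}\sum_{i=1}^{N} G_i, \qquad
\bar{B} = \frac{1}{N}\sum_{i=1}^{N} B_i
```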
  • the electronic device 110 may store color information and a representative color of the first object under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • For example, under the control of a processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may store in the memory the color information of the yoga mattress (eg, the color of the yoga mattress (eg, purple), the color of the background (eg, white), etc.), the RGB information, and the representative color data.
  • FIG. 7 is a flowchart illustrating a flow of determining a first object image in an electronic device according to an exemplary embodiment.
  • the electronic device 110 may photograph objects under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the objects may include at least a first object and a second object moving on the first object.
  • the electronic device 110 may record a video of the user 140 doing yoga on the yoga mattress 130 under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • When recording a video under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may record not only the user 140 and the yoga mattress 130 but also the background including the floor, a vase, a lamp, and the like.
  • The electronic device 110 may compare the color information of the first object stored in the memory with the color information of the photographed objects under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • In an embodiment, the electronic device 110 may detect the color information of the objects on the screen being photographed, including the user 140 , the yoga mattress 130 , and the like. Based on the color information corresponding to the color (eg, purple) of the yoga mattress 130 stored in operation 540 , the electronic device 110 may, under the control of the processor (eg, the processor 320 of FIG. 3 ), search the screen for an object having the same color information as that corresponding to the color (eg, purple) of the yoga mattress 130 .
  • For example, under the control of a processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may determine that an object whose RGB average color value is equal to the calculated RGB average color value (eg, the average RGB color value of purple) is the same object.
  • For another example, based on the color information of the color (eg, purple) of the yoga mattress 130 stored in operation 540 , the electronic device 110 may, under the control of the processor (eg, the processor 320 of FIG. 3 ), search the screen for an object having color information corresponding to the color (eg, purple) of the yoga mattress 130 from among the yoga mattress 130 , the floor, the vase, and the lamp.
  • In various embodiments, the electronic device 110 may determine that objects having RGB average color values within a specific range of the calculated RGB average color value are the same object, under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • For example, under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may determine that objects having RGB average color values within 2% or within 5% of the calculated RGB average color value (eg, the purple RGB average color value) are the same object.
  • The 2% and 5% described above are examples, and various ranges may be used according to the setting of the electronic device 110 ; a minimal tolerance-match sketch is given below.
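  • A minimal tolerance-match sketch: a region is treated as the same object when its RGB average is within a relative tolerance of the stored representative color. The 2% figure mirrors the example above; the representative and candidate values are illustrative assumptions.

```python
import numpy as np

# Compare a region's RGB average against the stored representative
# color, channel by channel, with a relative tolerance.

representative = np.array([128.0, 51.0, 153.0])   # stored purple (R, G, B)

def same_object(region_avg, tolerance=0.02):
    """True if every channel deviates by less than `tolerance`
    (relative to the representative color)."""
    region_avg = np.asarray(region_avg, dtype=float)
    return bool(np.all(np.abs(region_avg - representative)
                       <= tolerance * representative))

print(same_object([129.0, 51.5, 152.0]))   # True: within 2% per channel
print(same_object([160.0, 60.0, 170.0]))   # False: a different object
```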
  • In an embodiment, the electronic device 110 may determine an image corresponding to the color information of the first object under the control of a processor (eg, the processor 320 of FIG. 3 ). The image corresponding to the color information of the first object may be referred to as the 'first object image'. For example, based on the color information (eg, purple) of the yoga mattress 610 stored in operation 540 , the electronic device 110 may, under the control of the processor (eg, the processor 320 of FIG. 3 ), determine the yoga mattress 610 image by detecting, among the objects being photographed in operation 710 , the yoga mattress 610 having the same color information.
  • For another example, based on the color information of the color (eg, purple) of the yoga mattress 610 stored in operation 540 , the electronic device 110 may, under the control of the processor (eg, the processor 320 of FIG. 3 ), determine the yoga mattress 610 image by detecting, among the objects being photographed, the yoga mattress 610 having similar color information within a specific range (eg, within a 2% range).
  • FIG. 8 is a flowchart illustrating a flow of adjusting a distance value d of a first object image and tracking a motion of a second object based on location information in an electronic device, according to an exemplary embodiment. Operations according to various embodiments of FIG. 8 will be described with reference to FIGS. 9A, 9B, and 9C illustrating a concept of tracking a motion of a second object based on location information in an electronic device.
  • the electronic device 110 may photograph objects under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the electronic device 110 may capture a moving picture of the yoga mattress 910 , the person 920 , and the like under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the person 920 may be exercising in a state in close contact with the yoga mattress 910 (eg, a prone motion, a lying motion, etc.).
  • the electronic device 110 may acquire location information of objects under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the objects may include a first object (eg, the yoga mattress 910) and a second object (eg, a person) moving on the first object.
  • In an embodiment, the electronic device 110 may obtain position information of the yoga mattress 910 and the person 920 using the stereo camera of the camera module 380 , under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • the location information may include distance information (eg, a distance value), coordinate information, coordinate movement information, joint information, coordinate axis information, and the like.
  • For example, the joint information of the person 920 may include the coordinates of a neck joint, a shoulder joint, a wrist joint, a finger joint, a pelvic joint, a knee joint, an ankle joint, and the like.
  • the location information may include coordinate information of a shoulder joint, a wrist joint, and the like, and motion information of the coordinates.
  • location information including various coordinate information of objects, coordinate movement information, etc. may be obtained based on the coordinate axes (x-axis, y-axis, and z-axis) with respect to the electronic device 110 .
  • The electronic device 110 may adjust the distance value d of the image corresponding to the color information of the first object under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • In an embodiment, the electronic device 110 may determine an image of the yoga mattress 910 in operation 730 under the control of a processor (eg, the processor 320 of FIG. 3 ), and may acquire location information of the yoga mattress 910 , the person 920 , and the like.
  • the electronic device 110 may acquire distance information from the electronic device 110 to the yoga mattress 910 under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the distance information may include a distance value from the electronic device 110 to the yoga mattress 910, a coordinate value for the distance, and the like.
  • the distance value from the electronic device 110 to the person 920 may be expressed as d.
  • For example, under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may set coordinates (eg, (3, 4, 0)) corresponding to the distance value d (eg, 5) of the person 920 with respect to the electronic device 110 .
  • In various embodiments, under the control of a processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may change the position of the yoga mattress 910 image along a specific coordinate axis, based on a set value (α) or a specific distance value (β).
  • the set value ⁇ may be an arbitrary value preset for distance adjustment in the electronic device 110 .
  • the specific distance value ⁇ may be a distance value obtained while capturing a video by the electronic device 110 . Detailed descriptions of the set value ( ⁇ ) and the specific distance value ( ⁇ ) are separately described in each operation.
  • For example, under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may change the distance value (d) so that the yoga mattress 910 image moves further away from its original position in the -z-axis direction by the value α preset in the electronic device 110 .
  • the yoga mattress 910 may be moved further away from the original position of the image by a specific distance value ⁇ in the -z-axis direction.
  • Accordingly, the distance value (d) of the image of the yoga mattress 1010 may be changed (eg, to a distance value (d')), and the coordinates (eg, (a, b, c)) may also be changed to different coordinates (eg, (a, b, c')).
  • Under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may store in the memory, as specific data, the data in which the distance value d of the image of the first object (eg, the yoga mattress 910 ) has been changed based on a specific coordinate axis; a minimal z-shift sketch is given below.
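  • A minimal z-shift sketch: the first-object points are pushed away along the -z axis by a preset value α so they separate from the person in 3D. The point layout and the value of α are illustrative assumptions, not data from the disclosure.

```python
import numpy as np

# Push the mat's points further away along the -z axis by alpha.

alpha = 1.5                                    # preset offset (assumed)

# Synthetic point cloud: columns are (x, y, z); first two points are mat.
points = np.array([[0.0, 0.0, 2.0],            # mat
                   [0.5, 0.0, 2.0],            # mat
                   [0.2, 1.0, 2.1]])           # person's wrist, say
mat_mask = np.array([True, True, False])

shifted = points.copy()
shifted[mat_mask, 2] -= alpha                  # move mat away in -z

# The mat and the person are now clearly separable by depth alone.
print(shifted[:, 2])                           # [0.5 0.5 2.1]
```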
  • the electronic device 110 may acquire joint information of the second object based on location information of the objects under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • In an embodiment, the electronic device 110 may acquire the joint information of the second object based on the camera module, in the data in which the first object (eg, the yoga mattress 910 ) has been moved further away based on a specific coordinate axis.
  • For example, under the control of a processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may obtain joint information including the neck joint, shoulder joint, wrist joint, pelvic joint, knee joint, ankle joint, etc. of the person 920 .
  • In this case, the joint information 930 of the person 920 may be obtained in a state in which the yoga mattress 910 is not located under the person 920 (eg, a state in which the person 920 and the yoga mattress 910 are separated).
  • The electronic device 110 may track the motion of the second object based on the joint information under the control of a processor (eg, the processor 320 of FIG. 3 ). For example, the electronic device 110 may obtain joint information of each joint, including coordinate information, when the person 920 is exercising in a prone position, under the control of a processor (eg, the processor 320 of FIG. 3 ). Also, when the person 920 performs push-ups in the prone position, the electronic device 110 may acquire, in real time, the coordinate information of each joint while the person 920 changes posture during the push-up operation, under the control of the processor (eg, the processor 320 of FIG. 3 ). The electronic device 110 may track the movement of the person 920 based on the coordinate information and the coordinate movement information under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • Under the control of the processor (eg, the processor 320 of FIG. 3 ), the electronic device 110 may transmit, to the display device 120 , the location information of the first object image and the data related to the motion tracking of the second object.
  • the display device 120 may display a motion image of the second object based on the first object and data about the second object under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • the electronic device 110 determines the distance value d of the image of the first object (eg, the yoga mattress 910) based on a specific coordinate axis under the control of the processor (eg, the processor 320 of FIG. 3 ). It is possible to transmit original data, not specific data changed to , to the display device 120 . Also, the electronic device 110 may transmit data tracking the motion of the second object to the display device 120 under the control of a processor (eg, the processor 320 of FIG. 3 ). In this case, the display device 120 receiving the above-described data displays an image of the person 920 exercising on the yoga mattress 910 under the control of the processor (eg, the processor 320 of FIG. 3 ). can do.
• According to another embodiment, the electronic device 110 may transmit to the display device 120 the specific data in which the distance value (d) of the image of the first object (eg, the yoga mattress 910) has been changed based on a specific coordinate axis under the control of the processor (eg, the processor 320 of FIG. 3).
  • the electronic device 110 may transmit data tracking the motion of the second object to the display device 120 under the control of a processor (eg, the processor 320 of FIG. 3 ).
• In this case, the display device 120 that has received the above-described data may, under the control of the processor (eg, the processor 320 of FIG. 3), display an image of the person 920 exercising in a state where the yoga mattress 910 is not located below the person 920. A combined sketch of both transmission modes follows below.
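• The two transmission modes described above could be combined as in the following sketch. The helper and its inputs are hypothetical, not from the patent; `original_frames` and `shifted_frames` stand for the data before and after the distance value of the first object image is changed.

```python
# Minimal sketch (hypothetical helper, not the patent's implementation):
# select which dataset accompanies the motion-tracking data sent to the
# display device.
def build_display_payload(original_frames, shifted_frames, motion_data, hide_mat):
    """hide_mat=False -> the viewer sees the person exercising on the mat;
    hide_mat=True  -> the viewer sees the person with the mat visually separated."""
    frames = shifted_frames if hide_mat else original_frames
    return {"frames": frames, "motion": motion_data}
```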
• FIG. 10 is a diagram illustrating a concept of tracking a motion of a second object by using a change in a distance value (d) according to various embodiments of the present disclosure.
  • the electronic device 110 may capture a moving picture of the yoga mattress 1010 and the person 1020 under the control of a processor (eg, the processor 320 of FIG. 3 ).
  • the electronic device 110 may photograph objects including the vase 1040 and the lamp 1050 under the control of a processor (eg, the processor 320 of FIG. 3 ).
• Under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may obtain location information of the yoga mattress 1010 and the person 1020 based on the stereo camera of the camera module 380. Also, the electronic device 110 may acquire location information of objects such as the vase 1040 and the lamp 1050 based on the camera module under the control of the processor (eg, the processor 320 of FIG. 3).
  • the location information may include distance information (eg, a distance value), coordinate information, coordinate motion information, joint information, coordinate axis information, and the like.
• The joint information 1030 of the person 1020 may include coordinates of a neck joint, a shoulder joint, a wrist joint, a finger joint, a pelvic joint, a knee joint, an ankle joint, and the like.
  • the location information may include coordinate information of a shoulder joint, a wrist joint, and the like, and motion information of the coordinates.
• Under the control of a processor (eg, the processor 320 of FIG. 3), the electronic device 110 may obtain location information, including various coordinate information and coordinate motion information of the objects, based on the coordinate axes (x-axis, y-axis, and z-axis) defined with respect to the electronic device 110.
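• For reference, the sketch below shows the textbook triangulation relation by which a calibrated, rectified stereo pair yields such (x, y, z) coordinates. None of this code is taken from the patent; the focal length, baseline, and principal point are assumed to come from calibration.

```python
# Minimal sketch: triangulating a 3D point from a matched pixel pair of a
# calibrated, rectified stereo camera. f is the focal length in pixels,
# baseline the camera separation in meters, (cx, cy) the principal point.
def pixel_to_3d(u_left, u_right, v, f, baseline, cx, cy):
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("pixels must be a valid left/right match")
    z = f * baseline / disparity   # depth along the optical axis
    x = (u_left - cx) * z / f      # horizontal offset from the optical center
    y = (v - cy) * z / f           # vertical offset from the optical center
    return (x, y, z)
```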
• Under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may determine the image of the yoga mattress 1010 and acquire location information of the yoga mattress 1010, the person 1020, the vase 1040, the lamp 1050, and the like.
• Under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may change the distance value (eg, the distance value (d)) of the image corresponding to the color information of the first object (eg, the yoga mattress 1010). In addition, the electronic device 110 may change the position of the yoga mattress 1010 image based on various coordinate axes, using a value (δ) set under the control of the processor (eg, the processor 320 of FIG. 3) or a specific distance value (δ) obtained from the photographed image. A sketch of how the color information could select the mat pixels is given below.
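• The following sketch illustrates how the color information obtained from the still image could select the pixels belonging to the first object, assuming OpenCV and an HSV range derived from that image. The function and its inputs are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch: mask of the pixels whose color falls inside the range
# measured for the first object (eg, the yoga mat) in the still image.
import cv2
import numpy as np

def first_object_mask(frame_bgr, hsv_low, hsv_high):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    return cv2.inRange(hsv, np.asarray(hsv_low, np.uint8),
                       np.asarray(hsv_high, np.uint8)) > 0
```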
• For example, under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may change the distance value (d) so that the position of the yoga mattress 1010 image moves further away in the y-axis direction, using the distance value from the original position to the vase 1040.
• As another example, under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may change the distance value (d) of the image of the yoga mattress 1010 so that it moves further away in the -z-axis direction, using the distance value from the electronic device 110 at the original position to the lamp 1050.
• When the distance values to the vase 1040 and the lamp 1050 are used, the electronic device 110 may change the distance value (d) under the control of a processor (eg, the processor 320 of FIG. 3) by adjusting the yoga mattress 1010 image so that it does not overlap the image of the person 1020.
• Under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may obtain the distance value (δ) of the object (eg, the vase 1040) determined to be the furthest away on the camera screen based on the y-axis.
• Under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may move the coordinates (eg, (a, b, c)) of the original first object image (eg, the yoga mattress 1010 image) further away in the y-axis direction using the distance value (δ).
• Accordingly, the distance value (d) of the first object image may be changed (eg, to a distance value (d')), and the coordinates (eg, (a, b, c)) may also be changed to other coordinates (eg, (a, b+δ', c)), as in the sketch below.
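• A minimal sketch of this coordinate change, assuming the segmented first-object pixels have already been lifted to an (N, 3) array of 3D points and that axis index 1 stands for the y-axis:

```python
# Minimal sketch: push the first object's 3D points away along one axis,
# eg (a, b, c) -> (a, b + delta, c), so the mat no longer overlaps the person.
import numpy as np

def shift_object_points(points, delta, axis=1):
    shifted = np.asarray(points, dtype=float).copy()
    shifted[:, axis] += delta
    return shifted
```

• Here `delta` would correspond to the distance value (δ) obtained from the farthest object on the screen (eg, the vase 1040 in FIG. 10).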
• In order to separate the first object (eg, the yoga mattress 1010) and the second object (eg, the person 1020) by a specific distance value under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may use various methods other than the above-described one.
• According to an embodiment, the electronic device 110 may change the distance value (d) of the image of the first object (eg, the yoga mattress 1010) based on a specific coordinate axis under the control of the processor (eg, the processor 320 of FIG. 3) and store the changed value as specific data in the memory.
  • the electronic device 110 may acquire joint information of the second object based on location information of the objects under the control of a processor (eg, the processor 320 of FIG. 3 ).
• In an embodiment, after moving the first object (eg, the yoga mattress 1010) further away in the data based on a specific coordinate axis, the electronic device 110 may acquire the joint information of the second object using the stereo camera.
• Under the control of a processor (eg, the processor 320 of FIG. 3), the electronic device 110 may obtain joint information 1030 including a neck joint, a shoulder joint, a wrist joint, a pelvic joint, a knee joint, an ankle joint, and the like of the person 1020, as shown in FIG. 10.
• That is, under the control of the processor (eg, the processor 320 of FIG. 3), the joint information 1030 of the person 1020 may be obtained in a state in which the yoga mattress 1010 is not positioned under the person 1020.
  • the electronic device 110 may track the movement of the second object 1020 based on the joint information 1030 under the control of a processor (eg, the processor 320 of FIG. 3 ).
• For example, the electronic device 110 may obtain joint information 1030 of each joint, including coordinate information, when the person 1020 stands in an upright posture under the control of a processor (eg, the processor 320 of FIG. 3).
  • the person 1020 may perform a yoga movement in which the right knee is bent and the right hand is stretched toward the sky.
  • the electronic device 110 may acquire coordinate information of each joint in real time while the person 1020 is changing from the normal posture to the yoga posture under the control of the processor (eg, the processor 320 of FIG. 3 ).
  • the electronic device 110 may track the movement of the person 1020 based on the coordinate information and the coordinate movement information under the control of a processor (eg, the processor 320 of FIG. 3 ).
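• Coordinate information of this kind also supports simple posture measures. The sketch below is illustrative only (the joint names are assumptions): it estimates the bend at a knee, such as the right-knee bend in the yoga pose above, from three 3D joint coordinates.

```python
# Minimal sketch: the angle at the knee, in degrees, computed from the 3D
# coordinates of the hip, knee, and ankle joints.
import numpy as np

def joint_angle(hip, knee, ankle):
    a = np.asarray(hip, float) - np.asarray(knee, float)
    b = np.asarray(ankle, float) - np.asarray(knee, float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))
```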
• Under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may transmit, to the display device 120, data related to the location information of the first object image and the motion tracking of the second object.
• The display device 120 may display a motion image of the second object based on the data about the first object and the second object under the control of the processor (eg, the processor 320 of FIG. 3).
• According to an embodiment, under the control of the processor (eg, the processor 320 of FIG. 3), the electronic device 110 may transmit to the display device 120 the original data, rather than the specific data in which the distance value (d) of the image of the first object (eg, the yoga mattress 1010) has been changed based on a specific coordinate axis. Also, the electronic device 110 may transmit data tracking the movement of the second object 1020 to the display device 120 under the control of a processor (eg, the processor 320 of FIG. 3). In this case, the display device 120 receiving the above-described data may display, for example, an image of the person 1020 doing yoga on the yoga mattress 1010.
• According to another embodiment, the electronic device 110 may transmit to the display device 120 the specific data in which the distance value (d) of the image of the first object (eg, the yoga mattress 1010) has been changed based on a specific coordinate axis under the control of the processor (eg, the processor 320 of FIG. 3).
  • the electronic device 110 may transmit data tracking the movement of the second object 1020 to the display device 120 under the control of a processor (eg, the processor 320 of FIG. 3 ).
• In this case, the display device 120 receiving the above-described data may display, under the control of the processor (eg, the processor 320 of FIG. 3), an image of the person 1020 doing yoga in a state where the yoga mattress 1010 is not positioned below the person 1020.
  • Electronic devices may be devices of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
• Terms such as “first” and “second” may simply be used to distinguish one component from other components in question, and do not limit the components in other aspects (eg, importance or order). When it is said that one (eg, first) component is “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
• The term “module” may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit.
• A module may be an integrally formed part, or a minimum unit or a part thereof that performs one or more functions.
• For example, according to an embodiment, the module may be implemented in the form of an application-specific integrated circuit (ASIC).
• Various embodiments of this document may be implemented as software (eg, the program 340) including one or more instructions stored in a storage medium readable by a machine.
• For example, the processor (eg, the processor 320) of the device (eg, the electronic device 301) may call at least one of the one or more instructions stored in the storage medium and execute it. This makes it possible for the device to be operated to perform at least one function according to the at least one instruction called.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
• Here, 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored.
  • the method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
• The computer program product may be distributed in the form of a device-readable storage medium (eg, compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online, either through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
• In the case of online distribution, at least a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a manufacturer's server, a server of an application store, or a relay server.
• According to various embodiments, each component (eg, a module or a program) of the above-described components may include a singular entity or a plurality of entities.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
• According to various embodiments, a plurality of components (eg, modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration.
• According to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)

Abstract

An electronic device according to one embodiment comprises: at least one camera module; a memory; and a processor operatively connected to the at least one camera module and the memory, wherein the processor may: capture an image of a first object; obtain color information of the captured first object; start recording a video of the first object and a second object moving on the first object; determine, on the basis of the color information of the first object, a first object image corresponding to the first object from data obtained by recording the video; change three-dimensional coordinates of an area determined as the first object image in the data on the basis of a specified distance value; and track a movement of the second object on the basis of the data in which the three-dimensional coordinates of the first object image are changed.
PCT/KR2021/002061 2020-02-21 2021-02-18 Dispositif électronique et procédé de suivi du mouvement d'un objet WO2021167365A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200021685A KR20210106763A (ko) 2020-02-21 2020-02-21 Electronic device and method for tracking movement of an object
KR10-2020-0021685 2020-02-21

Publications (2)

Publication Number Publication Date
WO2021167365A2 true WO2021167365A2 (fr) 2021-08-26
WO2021167365A3 WO2021167365A3 (fr) 2021-10-14

Family

ID=77391106

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/002061 WO2021167365A2 (fr) 2020-02-21 2021-02-18 Dispositif électronique et procédé de suivi du mouvement d'un objet

Country Status (2)

Country Link
KR (1) KR20210106763A (fr)
WO (1) WO2021167365A2 (fr)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9011293B2 (en) * 2011-01-26 2015-04-21 Flow-Motion Research And Development Ltd. Method and system for monitoring and feed-backing on execution of physical exercise routines
KR101815975B1 * (ko) 2011-07-27 2018-01-09 Samsung Electronics Co., Ltd. Apparatus and method for detecting object pose
KR20160146567A * (ko) 2015-06-12 2016-12-21 Shin Dong-han Method and apparatus for detecting a variably fast-moving object
KR20180058139A * (ko) 2016-11-23 2018-05-31 Korea University of Technology and Education Industry-University Cooperation Foundation Smart health service system and smart health service method
KR20190056458A * (ko) 2017-11-10 2019-05-27 Korea Electronics Technology Institute Method for separating overlapping objects in object tracking

Also Published As

Publication number Publication date
WO2021167365A3 (fr) 2021-10-14
KR20210106763A (ko) 2021-08-31

Similar Documents

Publication Publication Date Title
WO2020171540A1 (fr) Electronic device for providing a shooting mode based on a virtual character, and operating method thereof
WO2020080845A1 (fr) Electronic device and method for obtaining images
EP3895130A1 (fr) Electronic device for generating an avatar animation and method therefor
WO2020130654A1 (fr) Camera module having a multi-cell structure, and portable communication device comprising same
WO2020130281A1 (fr) Electronic device and method for providing an avatar on the basis of a user's emotional state
WO2019066373A1 (fr) Method for correcting an image on the basis of the category and recognition rate of an object included in the image, and electronic device implementing same
WO2019139404A1 (fr) Electronic device and corresponding image processing method
WO2019039870A1 (fr) Electronic device capable of controlling an image display effect, and image display method
WO2020116844A1 (fr) Electronic device and method for acquiring depth information using at least one of cameras or a depth sensor
WO2019035551A1 (fr) Apparatus for composing objects using a depth map, and method therefor
WO2021133025A1 (fr) Electronic device comprising an image sensor and operating method thereof
WO2021158057A1 (fr) Electronic device and method for displaying an image on the electronic device
WO2021215795A1 (fr) Color filter for an electronic device, and electronic device comprising same
WO2019160237A1 (fr) Electronic device and method for controlling the display of images
WO2019168374A1 (fr) Method for generating multiple pieces of information by using a camera detecting multiple wavelength bandwidths, and apparatus therefor
WO2021080307A1 (fr) Camera control method and corresponding electronic device
WO2019066370A1 (fr) Electronic device for controlling a camera on the basis of external light, and associated control method
WO2021167365A2 (fr) Electronic device and method for tracking the movement of an object
WO2019172577A1 (fr) Image processing device and method of an electronic device
WO2019182359A1 (fr) Electronic device for notifying of an image signal processing update, and operating method thereof
WO2021162263A1 (fr) Image generation method and electronic device therefor
WO2021246736A1 (fr) Image processing apparatus and image processing method
WO2021162241A1 (fr) Method and device for controlling an image sensor
WO2022196993A1 (fr) Electronic device and method for capturing an image by using the angle of view of a camera module
WO2015009112A9 (fr) Method and apparatus for displaying images on a portable terminal

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21756461

Country of ref document: EP

Kind code of ref document: A2