US20190132502A1 - Method of operating a wearable lifelogging device - Google Patents
Method of operating a wearable lifelogging device
- Publication number
- US20190132502A1
- Authority
- US
- United States
- Prior art keywords
- image data
- rules
- application
- received
- connected device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F1/3287—Power saving characterised by the action undertaken by switching off individual functional units in the computer system
- H04N23/62—Control of parameters via user interfaces
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
- H04N23/651—Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
- H04N7/188—Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
- H04N21/4223—Cameras
- H04N5/23203, H04N5/23206, H04N5/23216, H04N5/23222, H04N5/23241, H04N5/23245, H04N5/23258
Definitions
- The device may be operable in four different states.
- In a ready state 100, the camera may be active in the sense that it can provide a signal indicating at least an amount of light received at the camera's image sensor, while the remainder of the device is in a low power mode.
- In the low power mode, one or more parts of the device may not be active, or may be active but operating at a lower clock frequency as compared to normal operation.
- The device may further be operable in a take photo state 200, wherein the device wakes up from the low power mode, the camera takes a photo and optionally all sensors (e.g. accelerometer, GPS sensor, etc.) are read.
Abstract
The present disclosure provides a method of operating a wearable life logging device (2) comprising a data processing unit (9), a camera unit (7), and at least one motion sensor (4, 5). The method comprises selectively operating the device in a take photo state (200), wherein a photo is captured by means of the camera unit (7), and selectively operating the device in a sleep state (300), wherein the camera unit (7) is in a low power mode. The method further comprises causing the device to transition (120, 330, 430) to the take photo state (200) in response to a signal from the motion sensor (4, 5).
Description
- Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57.
- The present disclosure relates to a method for lifelogging and to a system which implements such a method.
- Lifelogging is the process of digitally recording life experiences. A lifelogging system usually comprises a wearable lifelogging device which automatically and continuously captures the user's activities in the form of text and/or sensor data, such as image, audio or video recordings which are stored and organized for future use.
- People may want to log their activities for their own enjoyment, for example for keeping a diary or being able to retrieve and share personal experiences, including unanticipated ones, with others. Lifelogging may also be used for medical purposes, for example as an aid for people suffering from memory impairment.
- Another application for lifelogging devices is one of personal security; a lifelogging device may be used to provide evidence in the event the user is harassed or attacked.
- Yet another application is one of quality assurance; a lifelogging device may be used to document the user's activity. For example a police officer, nurse, physician, fireman, corrections officer, keeper or caretaker may use a lifelogging device to provide documentation of work performed and/or as evidence against accusations of malpractice or abuse.
- For a lifelog not to be inconvenient and cumbersome to produce, it is important that the lifelogging system be user friendly and interfere as little as possible with the user's daily routine. It is desirable to improve existing lifelogging systems in these respects.
- On the other hand, there remains a need to interact with the lifelogging system, e.g. in order to prevent it from taking photos in situations when this would be inappropriate or forbidden.
- There is also a need to manage battery capacity of the lifelogging device such that it will have the capability of taking photos for a sufficient period of time before it needs charging.
- It is an objective to improve the user experience of lifelogging systems. A particular object is to provide a lifelogging system which is easy, efficient and satisfying to use.
- The invention is defined by the independent claims. Embodiments are set forth in the dependent claims and in the descriptions and drawings.
- According to a first aspect, there is provided a method of operating a wearable life logging device comprising a data processing unit, a camera unit and at least one motion sensor. The method comprises selectively operating the device in a take photo state, wherein a photo is captured by means of the camera unit, and selectively operating the device in a sleep state, wherein the camera unit is in a low power mode. The method further comprises causing the device to transition to the take photo state in response to a signal from the motion sensor.
- A lifelogging device is defined as a device which is configured for continuous or intermittent capture of images of the user and/or the user's experiences. Images thus captured may be associated with data indicating time/date and/or geographic position information for the respective image. The lifelogging device may be configured with an integrated intermediate image and data storage device. In the alternative, the lifelogging device may be configured for online transmission of image and data to a network-accessible storage device.
- The operating of the device in the take photo state may further comprise reading at least the motion sensor and optionally causing the device to transition from the low power mode.
- The method may further comprise causing the device to transition from the take photo state to the sleep state if the signal from the motion sensor is below a first threshold level or if a signal from the camera unit represents a light level lower than a predetermined second threshold value.
- The first threshold level may be set so as to indicate whether the device is stationary on a substantially flat surface or whether it is in motion.
- In one embodiment, the device may be caused to transition from the take photo state to the sleep state if the signal from the motion sensor has been below the first threshold value for a sufficient period of time or if the signal from the camera unit represents a light level lower than the second threshold value for a sufficient period of time.
- The periods of time may be the same or different.
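As an illustration only, the "sufficient period of time" conditions above can be sketched as follows; the class name, hold periods and threshold values are assumptions, not taken from the disclosure:

```python
import time

class TransitionTimer:
    """Tracks how long a condition has been continuously true."""

    def __init__(self, hold_seconds):
        self.hold_seconds = hold_seconds
        self._since = None  # when the condition first became true

    def update(self, condition_true, now=None):
        """Return True once the condition has held for hold_seconds."""
        now = time.monotonic() if now is None else now
        if not condition_true:
            self._since = None
            return False
        if self._since is None:
            self._since = now
        return (now - self._since) >= self.hold_seconds

# Separate timers allow the two periods to be the same or different.
motion_still = TransitionTimer(hold_seconds=30.0)
too_dark = TransitionTimer(hold_seconds=10.0)

def should_enter_sleep(motion_level, light_level, now,
                       motion_threshold=0.05, light_threshold=5.0):
    # Transition to sleep if EITHER condition has held for its period.
    still = motion_still.update(motion_level < motion_threshold, now)
    dark = too_dark.update(light_level < light_threshold, now)
    return still or dark
```

Any interruption of the condition resets the corresponding timer, so only a continuously still (or continuously dark) device triggers the transition.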
- The method may further comprise determining a gravity vector and causing the device to transition to the sleep state only if the gravity vector is within a predetermined range.
- In this context, the term “range” is to be understood as a range of directions.
- Hence, it is possible to determine the orientation of the device, and to condition the transition to the sleep state on the device having a specific orientation, such as horizontal.
- The term “horizontal” should be construed as horizontal+/−10°, +/−5° or +/−1°.
- In this manner, a user of the device is able to make the device transition to the sleep state by placing the device in a specific orientation, such as horizontal on a surface. Thereby, the device is, in a user friendly manner, prevented from taking photos in situations when it is not appropriate. Moreover, the user and others will recognize that, when the device is in the specific orientation, it will not take pictures.
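A minimal sketch of the gravity-vector condition, assuming a 3-axis accelerometer whose z axis is normal to the device's front face; the axis convention and the 10° tolerance are illustrative assumptions:

```python
import math

def is_face_horizontal(ax, ay, az, tolerance_deg=10.0):
    """Check whether the device lies flat, i.e. the measured gravity
    vector is within tolerance_deg of the device's z axis.

    (ax, ay, az) is a raw 3-axis accelerometer sample in any
    consistent unit.
    """
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    if norm == 0.0:
        return False  # no usable gravity estimate
    # Angle between gravity and the z axis; 0 degrees when lying flat.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, abs(az) / norm))))
    return angle <= tolerance_deg

def may_enter_sleep(ax, ay, az):
    # Condition the sleep transition on a (near) horizontal orientation.
    return is_face_horizontal(ax, ay, az, tolerance_deg=10.0)
```

Taking the absolute value of the z component means both face-up and face-down placement count as horizontal, which matches placing the device face down on a table.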
- The method may further comprise selectively operating the device in a ready state, wherein the camera is active and wherein the data processing unit is in a low power mode.
- The method may further comprise causing the device to transition from the take photo state to the ready state if the signal from the motion sensor is above the first threshold value and/or if the signal from the camera unit represents a light level higher than the second threshold value.
- Optionally, the transition to the sleep state may require a lower second threshold value than a transition to the ready state. For example, it may be desirable to transition to sleep state only when there is substantially no light, while remaining in ready state when there is little light, although not sufficient to provide an acceptable photo.
- The device may be caused to transition from the ready state to the take photo state in response to a clock signal.
- The clock signal may be a real time clock signal provided by a device clock.
- The method may further comprise causing the device to transition from the ready state to the take photo state if the signal from the motion sensor represents a predetermined motion pattern, such as a particular sequence of motions.
- The predetermined motion pattern may be composed of one or more absolute values, which may occur within a predetermined time period. For example, two values exceeding a predetermined threshold value and received within a predetermined time interval may be understood as a “double tap”, i.e. the user taps the device twice with his/her finger in order to trigger it to take an ad hoc photo.
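A sketch of such double-tap recognition, assuming timestamped absolute accelerometer values; the threshold and the minimum/maximum tap spacing are illustrative assumptions:

```python
def detect_double_tap(samples, threshold=2.5, max_gap=0.4, min_gap=0.05):
    """Detect two above-threshold accelerometer peaks close in time.

    samples: iterable of (timestamp_s, absolute_value) pairs.
    min_gap debounces a single long spike; max_gap bounds the
    predetermined time interval between the two taps.
    """
    last_tap = None
    for t, value in samples:
        if value < threshold:
            continue
        if last_tap is not None and min_gap <= (t - last_tap) <= max_gap:
            return True  # second tap arrived within the allowed interval
        last_tap = t
    return False
```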
- The method may further comprise selectively operating the device in a snooze state, wherein the device checks at least one transition indicator.
- Such a transition indicator may be an indicator indicating that a particular event has occurred, such as the device having been shaken or having received a clock signal.
- The method may further comprise causing the device to transition from the snooze state to the sleep state if the signal from the motion sensor is below a first threshold level or if the signal from the camera unit represents a light level lower than a predetermined second threshold value.
- The first threshold level may be set so as to indicate whether the device is stationary on a substantially flat surface or whether it is in motion.
- In the method, the device may be caused to transition from the snooze state to the sleep state if the signal from the motion sensor has been below the first threshold value for a sufficient period of time or if the signal from the camera unit represents a light level lower than the second threshold value for a sufficient period of time.
- The periods of time may be the same or different.
- The method may further comprise determining a gravity vector and causing the device to transition to the sleep state only if the gravity vector is within a predetermined range.
- Hence, it is possible to determine the orientation of the device, and to condition the transition to the sleep state on the device having a specific orientation, such as horizontal.
- The method may further comprise causing the device to transition from the snooze state to the take photo state if the signal from the motion sensor exceeds the first threshold value and/or if the signal from the camera unit represents a light level higher than the second threshold value.
- The method may further comprise causing the device to transition from the sleep state to the snooze state if the signal from the motion sensor represents an absolute value larger than a third threshold level.
- For example, this third threshold value could be set such that it will indicate that the device is in motion, or it may be slightly higher, such that it will indicate that the device is being shaken, i.e. that the user deliberately shakes it to cause it to “wake up”.
- The method may further comprise increasing the third threshold level if a signal from the camera unit represents a light level lower than the predetermined second threshold level.
- Hence, it is possible to cause the camera to increase its tolerance for movements when it is too dark to take acceptable photos.
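This light-dependent adjustment of the wake-up (third) threshold might be sketched as follows; all constants, including the factor of two, are assumptions:

```python
def wake_threshold(light_level, base_threshold=1.5,
                   dark_light_level=5.0, dark_factor=2.0):
    """Return the motion level needed to leave the sleep state.

    When the camera reports less light than dark_light_level, the
    threshold is raised so that ordinary movement in the dark does
    not wake the device; only a deliberate shake will.
    """
    if light_level < dark_light_level:
        return base_threshold * dark_factor
    return base_threshold

def should_wake(motion_level, light_level):
    return motion_level > wake_threshold(light_level)
```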
- The method may further comprise causing the device to transition from the sleep state to the snooze state in response to a clock signal.
- The method may further comprise causing the device to transition from the sleep state to the take photo state if the signal from the motion sensor represents a predetermined motion pattern, such as a particular sequence of motions, and/or if the signal from the motion sensor exceeds a first threshold value. Hence, the device may be caused to wake up to immediately take a photo upon recording a predetermined motion pattern and/or a sufficiently large/fast motion.
- The predetermined motion pattern may be composed of one or more absolute values, which may occur within a predetermined time period. For example, two values exceeding a predetermined threshold value and received within a predetermined time interval may be understood as a “double tap”, i.e. the user taps the device twice with his/her finger in order to trigger it to take an ad hoc photo.
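Collecting the transitions described above, a simplified sketch of the four-state logic could look like this; the threshold constants are placeholder values, and the time-hysteresis and gravity-vector conditions are omitted for brevity:

```python
from enum import Enum

class State(Enum):
    READY = "ready"
    TAKE_PHOTO = "take_photo"
    SLEEP = "sleep"
    SNOOZE = "snooze"

# Illustrative threshold constants (not taken from the disclosure).
MOTION_LOW = 0.05  # first threshold: stationary vs. in motion
LIGHT_LOW = 5.0    # second threshold: too dark for photos
SHAKE = 1.5        # third threshold: deliberate shake / wake-up

def next_state(state, motion, light, clock_tick=False, tap_pattern=False):
    """One evaluation of the transition rules described above."""
    if state is State.READY:
        if clock_tick or tap_pattern:
            return State.TAKE_PHOTO
    elif state is State.TAKE_PHOTO:
        if motion < MOTION_LOW or light < LIGHT_LOW:
            return State.SLEEP
        return State.READY
    elif state is State.SLEEP:
        if tap_pattern:
            return State.TAKE_PHOTO  # wake directly on a motion pattern
        if clock_tick or motion > SHAKE:
            return State.SNOOZE
    elif state is State.SNOOZE:
        if motion > MOTION_LOW and light >= LIGHT_LOW:
            return State.TAKE_PHOTO
        if motion < MOTION_LOW or light < LIGHT_LOW:
            return State.SLEEP
    return state
```

Each call corresponds to one check of the transition indicators; in the device, such a function would run on each sensor read or clock event.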
- The method may further comprise setting control parameters for the device by using a communication device having a first interface for communicating with the lifelogging device and a second interface for communicating with a network-based storage service.
- Thereby, the user may determine settings for the lifelogging device by using a communication device such as a Smartphone, a tablet computer, a desktop computer, a laptop computer etc., which is able to control settings of the lifelogging device. For example, the rate at which photos are taken can be adjusted. Hence, a better user experience and higher user friendliness are achieved.
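As an illustration of such settings, a companion application might relay a payload like the following; every field name and value here is a hypothetical example, not part of the disclosure:

```python
import json

# Hypothetical control-parameter payload relayed from a companion
# application to the lifelogging device.
control_parameters = {
    "capture_interval_s": 30,  # e.g. two photos a minute
    "schedule": [
        # A higher capture rate during a chosen period of a chosen day.
        {"day": "saturday", "from": "09:00", "to": "17:00",
         "capture_interval_s": 10},
    ],
    "share_photos": False,  # whether other users may access the photos
}

payload = json.dumps(control_parameters)  # e.g. sent over HTTPS
```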
- The method may further comprise sending a captured photo from the device to a storage device or a storage service via at least one wireless communication interface. The wireless interface may, as non-limiting examples, be a Bluetooth® interface or a WiFi interface to which the lifelogging device is connected. Another interface may be a wired interface, such as an electronic or optical fiber based interface, or a wireless interface, such as a cellular phone/data based interface.
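One possible shape for a transferred record bundling a photo with its associated data; the record layout itself is an assumption, since the disclosure only requires that these data be associated with the photo:

```python
import base64
import json
import time

def make_photo_record(image_bytes, latitude, longitude,
                      orientation_deg, timestamp=None):
    """Bundle a captured photo with location, orientation and time
    data before transfer to a storage device or storage service."""
    return {
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "latitude": latitude,
        "longitude": longitude,
        "orientation_deg": orientation_deg,
        "timestamp": time.time() if timestamp is None else timestamp,
    }

record = make_photo_record(b"\xff\xd8", 59.33, 18.06, 90.0,
                           timestamp=1700000000)
wire_format = json.dumps(record)  # e.g. as an HTTPS POST body
```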
- The photo taken by the lifelogging device may be a single photo, a sequence of photos or a video sequence.
- According to a second aspect, there is provided a wearable lifelogging device comprising a data processing unit, a camera unit, and at least one motion sensor. The device is configured to perform the method described above.
- The lifelogging device may have a front face (i.e. the face with the camera lens) which is designed such that when placing the device face down on a horizontal surface, the amount of light reaching the camera lens will be sufficiently low for the camera to detect a light level lower than the second threshold value. For example, the camera may have a substantially planar front surface, with the camera lens being flush with, or retracted from, the front surface.
-
FIG. 1 is a schematic illustration of components of a lifelogging system. -
FIGS. 2a-2c are schematic illustrations of different configurations of a lifelogging system. -
FIG. 3 is a state diagram, which schematically illustrates operation of the lifelogging device 2. -
FIG. 1 illustrates schematically components of a lifelogging system 1. The system 1 comprises a wearable lifelogging device 2 which has a weather protected housing 3 which encloses a GPS unit 4, an accelerometer 5, a timer 6 (which may have the form of a real-time clock (RTC)), a camera 7, a storage device 8 which may comprise volatile and non-volatile memory, a CPU 9, a rechargeable battery 10, a communication interface 11 and a user interface 12. A clip 13 is arranged on the outside of the housing. The system 1 also comprises a first communication device 20 which has interfaces for communicating with the lifelogging device 2 and a network-based storage service 30, respectively. The first communication device 20 can have a controller 23 and a charger 24. The network-based storage service 30 may comprise a storage service 31 and a computing service 32. The network-based storage service 30 may also be connected to a second communication device 40 via the latter's interface 41. The second communication device 40 also comprises application software 42. - Dashed lines represent
connections. - When a user wears the
lifelogging device 2, for example by attaching it to a piece of clothing or a necklace, the camera 7 may be preset to automatically take two photographs a minute. It is also possible to allow the user to trigger the camera to take a photograph by performing a predetermined gesture, such as tapping the camera once, twice or three times within a predetermined time period and possibly in a predetermined direction. - When the lifelogging device takes a photo it may be a single photograph, a sequence of photos or a video sequence. A video sequence normally means a frame rate of more than 20 frames per second, whereas a sequence of photos may have a frame rate between 1 and 20 frames per second. A single photo refers to photos with a frame rate less than 1 frame per second, preferably on the order of one frame every 5-240 seconds, more preferably on the order of one frame every 20-120 seconds.
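The three capture categories can be expressed as a small classification helper; the handling of the exact boundary values (1 and 20 frames per second) is an assumption, since the text gives open ranges:

```python
def classify_capture(frame_rate_fps):
    """Classify a capture according to the frame-rate ranges above."""
    if frame_rate_fps > 20:
        return "video sequence"
    if frame_rate_fps >= 1:
        return "sequence of photos"
    return "single photo"

# Two photographs a minute, as in the preset described above:
classify_capture(2 / 60)  # a "single photo" rate
```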
- The
CPU 9 may be programmed to stop taking photographs, power off and/or cause the camera 7 to hibernate or go into a low-power mode (e.g. by turning off one or more other sensors), if it receives a predetermined sensor signal from the camera indicating that the photograph is darker than a predetermined level. - The
CPU 9 may also be programmed to stop taking photographs, power off and/or cause the camera 7 to hibernate or go into a low-power mode (e.g. by turning off one or more other sensors), if it receives a sensor signal from the accelerometer 5 indicating that the lifelogging device 2 has moved less than a predetermined amount during a predetermined amount of time. - With each photograph, the
CPU 9 may associate location data from the GPS unit 4, orientation data from the accelerometer 5 and time (optionally including date) data from the timer 6. The image, location, orientation and time data of the photograph are stored in non-volatile memory on the storage device 8 and transferred via the first communication device 20 to the storage service 31 of the network-based storage service 30 when a connection 50 is established between the lifelogging device 2 and the first communication device 20. The transfer may occur during charging of the battery 10 by the charger 24 and the storage may be encrypted. - The
first communication device 20 may be a docking station, a Smartphone, a tablet computer, a desktop computer or a laptop computer. - The
connection 50, which connects the lifelogging device 2 to the first communication device 20 via the interfaces, may be a wired connection, such as a micro-USB, USB or wired LAN connection, or a wireless connection, such as a wireless LAN, Bluetooth, NFC, IR, CDMA, GSM, 3G or 4G connection. - The protocol used for communication between the
lifelogging device 2 and the first communication device 20 may be the USB mass storage device protocol or a device-specific protocol. A device-specific protocol may comprise features making communication possible only between a specific lifelogging device 2 and a specific communication device 20, which would make it more difficult for an unauthorized person to retrieve the data stored on the lifelogging device 2. - The
connection 51, which connects the first communication device 20 to the network-based storage service 30 via the interface 22, may be a wired connection, such as a wired LAN connection, or a wireless connection, such as a wireless LAN, CDMA, GSM, 3G or 4G connection. - The protocol used for communication between the
first communication device 20 and the network-based storage service 30 may be TCP/IP, HTTP, HTTPS, SSL or TLS. - The network-based
storage service 30 may be a REST service. - The
computing service 32 of the network-based storage service 30 may analyze and organize the photographs based on their image, location, orientation and/or time data. The photographs may be organized on a timeline and into groups of photographs fulfilling certain criteria, such as being from the same surroundings. The criteria may be user defined and the colors of the images may be used to determine which photographs were taken in the same surroundings. The photographs may be analyzed and their relevance assessed using criteria based on, for example, motion blur, contrast, composition, light, face recognition and object recognition. A group of photographs fulfilling certain criteria may be analyzed in order to select a photograph which is particularly representative, according to some criteria, of that group of photographs. The selected photograph may be used to give the user a rough idea of what the photographs in the group of photographs are showing and when they were taken. - The user can use a
second communication device 40, for example a Smartphone, a tablet computer, a desktop computer or a laptop computer, to access the photographs and to set control parameters for the lifelogging device 2, the network-based storage service 30 and the first communication device 20. - Control parameters may, for example, determine the image capture rate and the time dependence of the image capture rate. For example, a user may set a higher image capture rate during a certain time period of a certain day. Control parameters may also determine whether a photograph stored on
storage service 31 can be accessed by other users. - The user may use the
second communication device 40 to perform computing functions of the computing service 32. A computing function may be photo editing. - The
application software 42, which, for example, can be a web browser or an application for smart phones or tablet computers, may be used to perform the computing functions and to set control parameters. - The
connection 52, which connects the second communication device 40 to the network-based storage service 30 via the interface 41, may be a wired connection, such as a wired LAN connection, or a wireless connection, such as a wireless LAN, CDMA, GSM, 3G or 4G connection. - The protocol used for communication between the
second communication device 40 and the network-based storage service 30 may be TCP/IP, HTTP, HTTPS, SSL or TLS. - The network-based
storage service 30 may send push notifications to the second communication device 40, for example when photographs taken by the lifelogging device 2 have been transferred to the network-based storage service 30. - The network-based
storage service 30 may send data to and receive data from devices which are not a part of the lifelogging system 1. For example, data captured by the GPS unit 4 may be sent to a third party which analyses the data and sends the coordinates represented by the data to the network-based storage service 30. The network-based storage service 30 may send and receive data, for example image data, to and from other network-based services, for example social-networking services. -
FIG. 2a illustrates schematically a lifelogging system 1 in which the lifelogging device 2 is configured to send image, location, orientation and time data to a first communication device 20 over a connection 50 which may be a wired connection, such as a micro-USB, USB or wired LAN connection, or a wireless connection, such as a wireless LAN, Bluetooth, NFC, IR, CDMA, GSM, 3G or 4G connection. The protocol used for communicating over the connection 50 may be the USB mass storage device protocol, TCP/IP, HTTP, HTTPS, SSL or TLS or a device-specific protocol. - The
first communication device 20 may be a docking station, a Smartphone, a tablet computer, a desktop computer or a laptop computer. - The
first communication device 20 is configured to send the image, location, orientation and time data to the network-based storage service 30 over a connection 51 which may be a wired connection, such as a wired LAN connection, or a wireless connection, such as a wireless LAN, CDMA, GSM, 3G or 4G connection. The protocol used for communicating over the connection 51 may be TCP/IP, HTTP, HTTPS, SSL or TLS. - A user can access the data stored on the network-based
storage service 30 through a second communication device 40 which is also configured to send control parameters to the network-based storage service 30 over a connection 52. The connection 52 may be a wired connection, such as a wired LAN connection, or a wireless connection, such as a wireless LAN, CDMA, GSM, 3G or 4G connection. The protocol used for communicating over the connection 52 may be TCP/IP, HTTP, HTTPS, SSL or TLS. - The control parameters sent by the
second communication device 40 may comprise control parameters for the network-based storage service 30, the first communication device 20 and the wearable lifelogging device 2. The network-based storage service 30 transfers, over the connection 51, control parameters to the first communication device 20 which, in turn, transfers the control parameters to the wearable lifelogging device 2 over the connection 50. -
FIG. 2b illustrates schematically a lifelogging system 1 in which a wearable lifelogging device 2 is configured to communicate directly with a second communication device 40 and a network-based storage service 30 over connections 54 and 53, respectively. - The
second communication device 40 may be a Smartphone, a tablet computer, a desktop computer or a laptop computer. - The
connection 53 may be a wired connection, such as a wired LAN connection, or a wireless connection, such as a wireless LAN, CDMA, GSM, 3G or 4G connection. The protocol used for communicating over the connection 53 may be TCP/IP, HTTP, HTTPS, SSL or TLS. - The
connection 54 may be a wired connection, such as a micro-USB, USB or wired LAN connection, or a wireless connection, such as a wireless LAN, Bluetooth, NFC, IR, CDMA, GSM, 3G or 4G connection. The protocol used for communicating over the connection 54 may be the USB mass storage device protocol, TCP/IP, HTTP, HTTPS, SSL or TLS, or a device-specific protocol.
- If a
first communication device 20 is provided, the lifelogging device 2 may also communicate with the first communication device 20, which may be configured to communicate with the network-based storage service 30.
- The
first communication device 20 may be a docking station, a Smartphone, a tablet computer, a desktop computer or a laptop computer. - The
first communication device 20 may communicate with the lifelogging device 2 over a wired connection, such as a micro-USB, USB or wired LAN connection, or a wireless connection, such as a wireless LAN, Bluetooth, NFC, IR, CDMA, GSM, 3G or 4G connection. The protocol used for communicating may be the USB mass storage device protocol, TCP/IP, HTTP, HTTPS, SSL or TLS, or a device-specific protocol.
- The
first communication device 20 may communicate with the network-based storage service 30 over a wired connection, such as a wired LAN connection, or a wireless connection, such as a wireless LAN, CDMA, GSM, 3G or 4G connection. The protocol used for communicating may be TCP/IP, HTTP, HTTPS, SSL or TLS.
-
FIG. 2c illustrates schematically a lifelogging system 1 in which a wearable lifelogging device 2 and a network-based storage service 30 are configured to communicate over a connection 53, which may be a wired connection, such as a wired LAN connection, or a wireless connection, such as a wireless LAN, CDMA, GSM, 3G or 4G connection. The protocol used for communicating over the connection 53 may be TCP/IP, HTTP, HTTPS, SSL or TLS.
- Communication also occurs between the network-based
storage service 30 and a communication device 40 over a connection 52, which may be a wired connection, such as a wired LAN connection, or a wireless connection, such as a wireless LAN, CDMA, GSM, 3G or 4G connection. The protocol used for communicating over the connection 52 may be TCP/IP, HTTP, HTTPS, SSL or TLS.
- The
communication device 40 may be a Smartphone, a tablet computer, a desktop computer or a laptop computer. - Referring to
FIG. 3, a method of operating the wearable lifelogging device will now be described.
- The method may be implemented as a finite state machine.
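A minimal sketch of such a state machine, assuming Python-style naming: the four states and the numbered transitions are taken from the description below and FIG. 3, while the event names (clock, tap, shake, flat surface, low light) are illustrative labels rather than terms from the application.

```python
from enum import Enum, auto

class State(Enum):
    READY = auto()       # camera senses light level, rest in low power
    TAKE_PHOTO = auto()  # device wakes, captures a photo, reads sensors
    SLEEP = auto()       # entire device in low power
    SNOOZE = auto()      # device checks whether it should wake up

class Event(Enum):
    CLOCK = auto()          # periodic clock signal
    TAP = auto()            # tap sequence detected by the motion sensor
    SHAKE = auto()          # acceleration above threshold T3
    FLAT_SURFACE = auto()   # stationary, e.g. lying flat on a surface
    LOW_LIGHT = auto()      # light level below threshold T2
    CONDITIONS_OK = auto()  # neither flat surface nor low light detected

# Transition table mirroring the numbered transitions of FIG. 3.
TRANSITIONS = {
    (State.READY, Event.CLOCK): State.TAKE_PHOTO,           # 110
    (State.READY, Event.TAP): State.TAKE_PHOTO,             # 120
    (State.TAKE_PHOTO, Event.FLAT_SURFACE): State.SLEEP,    # 210
    (State.TAKE_PHOTO, Event.LOW_LIGHT): State.SLEEP,       # 220
    (State.TAKE_PHOTO, Event.CONDITIONS_OK): State.READY,   # 230
    (State.SLEEP, Event.SHAKE): State.SNOOZE,               # 310
    (State.SLEEP, Event.CLOCK): State.SNOOZE,               # 320
    (State.SLEEP, Event.TAP): State.TAKE_PHOTO,             # 330
    (State.SNOOZE, Event.FLAT_SURFACE): State.SLEEP,        # 410
    (State.SNOOZE, Event.LOW_LIGHT): State.SLEEP,           # 420
    (State.SNOOZE, Event.CONDITIONS_OK): State.TAKE_PHOTO,  # 430
}

def step(state: State, event: Event) -> State:
    """Return the next state; stay in the current state if no
    transition is defined for this (state, event) pair."""
    return TRANSITIONS.get((state, event), state)
```

In this sketch the active mode corresponds to cycling between READY and TAKE_PHOTO, and the sleep mode to cycling between SLEEP and SNOOZE.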
- The device may be operable in four different states. In a
ready state 100, the camera may be active in the sense that it can provide a signal indicating at least an amount of light received at the camera's image sensor, while the remainder of the device is in a low power mode. In the low power mode, one or more parts of the device may not be active, or may be active but operating at a lower clock frequency as compared to normal operation. - The device may further be operable in a
take photo state 200, wherein the device wakes up from the low power mode, the camera takes a photo and optionally all sensors (e.g. accelerometer, GPS sensor, etc.) are read. - The device may further be operable in a
sleep state 300, wherein the entire device is in a low power mode. In this mode, the real time clock may be active to provide a clock signal, as may those parts of the device necessary to determine whether the clock signal meets a criterion for a transition to take place. - The device may further be operable in a
snooze state 400, wherein the device checks whether it should wake up (i.e. transition to the take photo state). - The device may further be said to be operable in two modes: active mode, where the device shifts 110, 120, 230 between the
ready state 100 and the take photo state 200, and a sleep mode, where the device shifts 310, 320, 410, 420 between the sleep state 300 and the snooze state 400.
- From the
ready state 100, the following transitions may take place. - On receipt of a signal from the clock, the device may transition 110 to the
take photo state 200. This may be the case where the device is in normal operation, i.e. takes photos at predetermined intervals, such as two photos per minute, etc. - On detection of a specific movement or sequence of movements by the motion sensor or the GPS sensor, the device may transition 120 to the
take photo state 200. This may be the case where the device is arranged to detect the user tapping on the device with his/her finger in order to trigger the device to take an ad hoc photo. Hence, in this specific example, the motion sensor would sense two or more accelerations exceeding a particular threshold value within a predetermined time period, such as 1-3 seconds. Optionally, the durations of the accelerations may be very short, hence indicating that the device did not move too much, e.g. that it moved less than 10 cm or less than 5 cm as a consequence of the tapping.
- From the
take photo state 200, the following transitions may take place. - When detecting that the device is placed stationary and optionally with a specific orientation, such as horizontally, the device may transition 210 to the
sleep state 300. In this case, the device may detect an acceleration vector which is below a first threshold value T1 (thus indicating that the device is stationary) and optionally also an acceleration vector which may indicate that the device has been given a specific orientation, such as flat (with front or back side down) on a surface. This condition may be referred to as a "flat surface detection". This transition 210 may be conditioned on the device having the specific orientation during a predetermined time period, such as e.g. 15, 30 or 45 seconds or 1-5 minutes.
- Similarly, when the camera receives light below a predetermined second threshold value T2, the device may transition 220 to the
sleep state 300. This transition may be referred to as a "low light detection". This transition 220 may also be conditioned on the device receiving the low level of light during a predetermined time period, such as e.g. 15, 30 or 45 seconds or 1-5 minutes.
- If, when in the
take photo state 200, neither flat surface nor low light is detected, the device may transition 230 to the ready state 100.
- From the
sleep state 300, the following transitions may take place.
- When detecting an acceleration with an absolute value larger than a third threshold value T3, the device may transition 310 to the
snooze state 400. The third threshold value T3 may be a value indicating that the user is deliberately shaking the device in order for it to wake up. Hence, an absolute value of an accelerometer signal may be used. The third threshold value T3 may typically be higher than the first threshold value T1, since the first threshold value merely distinguishes movement from a stationary condition. In addition, it is possible to set a flag F1 to indicate that a "wake up on shake" event has occurred.
- The device may also transition 320 from the
sleep state 300 to the snooze state based on a clock signal. This may be the case where the device is in normal operation, i.e. takes photos at predetermined intervals, such as two photos per minute, etc. - The device may also transition 330 from the
sleep state 300 on detection of a specific movement or sequence of movements by the motion sensor or the GPS sensor; this procedure may be substantially the same as the transition 120 to the take photo state 200 described above. Hence, an interrupt could be provided by the motion sensor, which may immediately cause the device to transition 330 from the sleep state 300 to the take photo state 200, without going through the snooze state 400.
- From the
snooze state 400, the following transitions may take place. - When detecting that the device is placed stationary and optionally with a specific orientation, as described above with respect to the
transition 210, the device may transition 410 to the sleep state 300.
- Similarly, when detecting a low light condition, the device may transition 420 to the
sleep state 300, as described above with respect to the transition 220 to the sleep state.
- If the "wake up on shake" flag F1 is set, thus implying that the device has been shaken, and the light level is too low, the device may increase the third threshold value T3, thus making the device less sensitive to shaking, and then clear the "wake up on shake" flag F1.
- If neither flat surface nor low light is detected, the device may transition 430 to the
take photo state 200. In this case, the third threshold value T3 may be reset, i.e. to normal sensitivity to shaking, and the “wake up on shake” flag F1 may be cleared. - It would be possible to provide a separate light sensor in or near the camera opening, which would enable detection of sufficient light level without making use of the camera. This could be an option for further reducing the battery consumption.
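The threshold-based checks described above (tap detection, flat-surface detection, low-light detection, wake-on-shake, and the snooze-state adjustment of T3) can be sketched as follows. All numeric values are illustrative placeholders: the description names the thresholds T1, T2 and T3 and the flag F1 but fixes no concrete numbers, and it only suggests hold periods such as 15-45 seconds or 1-5 minutes.

```python
class DetectionLogic:
    """Illustrative sketch of the detections in the description.
    T1, T2, T3 and F1 come from the text; all numbers are assumptions."""

    def __init__(self, t1=0.05, t2=10.0, t3=3.0, hold_s=30.0):
        self.t1 = t1            # stationary threshold T1 (g)
        self.t2 = t2            # low-light threshold T2 (lux)
        self.t3 = t3            # shake threshold T3 (g), higher than T1
        self.t3_normal = t3
        self.hold_s = hold_s    # how long a condition must persist
        self.f1 = False         # "wake up on shake" flag F1

    def is_tap(self, peak_times, window_s=2.0):
        """Two or more above-threshold accelerations within the window
        (the description suggests 1-3 seconds)."""
        return any(b - a <= window_s
                   for a, b in zip(peak_times, peak_times[1:]))

    def flat_surface(self, accel_magnitude, held_for_s):
        """Acceleration below T1 (stationary), sustained for the hold time."""
        return accel_magnitude < self.t1 and held_for_s >= self.hold_s

    def low_light(self, light_level, held_for_s):
        """Light below T2, sustained for the hold time."""
        return light_level < self.t2 and held_for_s >= self.hold_s

    def wake_on_shake(self, accel_magnitude):
        """Sleep-to-snooze check (transition 310): absolute acceleration
        above T3 wakes the device and sets flag F1."""
        if abs(accel_magnitude) > self.t3:
            self.f1 = True
            return True
        return False

    def snooze_check(self, flat, dark, t3_raised=6.0):
        """Snooze-state decision: desensitize T3 after a shake in the
        dark, return to sleep on flat/dark (410/420), otherwise reset T3,
        clear F1 and take a photo (430)."""
        if self.f1 and dark:
            self.t3 = t3_raised   # less sensitive to accidental shaking
            self.f1 = False
        if flat or dark:
            return "sleep"
        self.t3 = self.t3_normal  # restore normal shake sensitivity
        self.f1 = False
        return "take_photo"
```

A pocket-carried camera that is shaken in the dark thus gets a raised T3 and goes back to sleep, while a camera shaken in good light proceeds to take a photo with T3 reset.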
Claims (21)
1.-20. (canceled)
21. A personal network connected device configured to automatically capture images, the personal network connected device comprising:
a housing comprising:
a battery;
an image sensor;
a memory;
one or more hardware processors connected with the image sensor and the memory; and
a wireless network interface;
wherein the one or more hardware processors are further configured to:
receive a first test image data from the image sensor at a first time;
apply rules corresponding to picture quality on the received first test image data;
automatically determine whether to capture additional image data for a time period from the first time and store the additional image data in the memory based on the application of rules indicating that the first test image data meets criteria for storing additional image data in the memory;
receive a second test image data from the image sensor at a second time;
apply the rules corresponding to picture quality on the received second test image data; and
automatically turn off one or more components included in the housing based on the application of rules on the received second test image data indicating that conditions are not appropriate for capturing photos,
wherein the wireless network interface is configured to transmit stored image data to a remote device.
22. The personal network connected device of claim 21 , wherein the one or more hardware processors are further configured to capture a video sequence.
23. The personal network connected device of claim 21 , wherein the application of rules comprises detecting a light level in the received image data.
24. The personal network connected device of claim 21 , wherein the application of rules comprises detecting blur in the received image data.
25. The personal network connected device of claim 21 , wherein the application of rules comprises detecting a face in the received image data.
26. The personal network connected device of claim 21 , wherein the application of rules comprises detecting an object in the received image data.
27. The personal network connected device of claim 21 , wherein the application of rules comprises detecting an orientation of a face or an object in the received image data.
28. The personal network connected device of claim 21 , wherein the one or more hardware processors are configured to receive an input corresponding to a user tap and capture new image data based on the received input.
29. A method of automatically capturing images in a personal network connected device, wherein the personal network connected device comprises a battery, an image sensor, a memory, one or more hardware processors, and a wireless network interface; the method comprising:
receiving image data from an image sensor at a first time;
applying rules corresponding to picture quality on the received image data; and
automatically determining whether to store the received image data in the memory based on the application of rules, wherein the application of rules comprises detecting a light level directly from first test image data captured from the image sensor; and
automatically turning off one or more components of the personal network connected device based on the application of rules on the received image data indicating that conditions are not appropriate for capturing photos.
30. The method of claim 29 , further comprising determining capture of additional image data from the image sensor for a period of time from the first time based on the application of the rules on the received image data, thereby capturing a video sequence.
31. The method of claim 29 , further comprising stopping capture of additional image data from the image sensor based on the application of rules, thereby conserving the battery.
32. The method of claim 29 , wherein the application of rules comprises detecting blur in the received image data.
33. The method of claim 29 , wherein the application of rules comprises detecting a face in the received image data.
34. The method of claim 29 , wherein the application of rules comprises detecting an orientation of a face or an object in the received image data.
35. A personal network connected device configured to automatically capture images, the personal network connected device comprising:
a housing comprising:
a battery;
an image sensor configured to detect image data;
a memory;
one or more hardware processors connected with the image sensor and the memory; and
a wireless network interface;
wherein the one or more hardware processors are further configured to determine whether to store the detected image data in the memory based on an application of rules corresponding to picture quality on the detected image data and determine whether to automatically turn off one or more components included in the housing based on the application of rules on the received image data indicating that conditions are not appropriate for capturing photos,
wherein the wireless network interface is configured to transmit stored image data to a remote device.
36. The personal network connected device of claim 35, wherein the one or more hardware processors are further configured to automatically stop detecting image data from the image sensor based on the application of rules, thereby conserving the battery.
37. The personal network connected device of claim 35, wherein the one or more hardware processors are further configured to automatically determine storage of additional image data from the image sensor for a period of time based on the application of the rules on the received image data, thereby capturing a video sequence.
38. The personal network connected device of claim 35, wherein the application of rules comprises detecting a light level in the received image data.
39. The personal network connected device of claim 35, wherein the application of rules comprises detecting a face in the received image data.
40. The personal network connected device of claim 35, wherein the one or more hardware processors are further configured to automatically select a photograph based on selection rules, wherein the selected photograph is representative of the additional image data.
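The capture-decision loop recited in claims 21 and 29 (apply picture-quality rules to a test image, then either store further image data or turn components off) can be illustrated as follows. The specific rules, a mean-brightness check and a crude gradient-based sharpness check, plus all thresholds, are hypothetical stand-ins for the claimed "rules corresponding to picture quality"; the claims do not define them.

```python
def apply_quality_rules(frame):
    """Apply two illustrative picture-quality rules to a test frame and
    report whether conditions are appropriate for capturing photos.

    `frame` is a 2D list of grayscale pixel values (0-255). The brightness
    rule stands in for the claimed light-level detection and the gradient
    rule for the claimed blur detection; thresholds are assumptions.
    """
    pixels = [p for row in frame for p in row]
    mean_brightness = sum(pixels) / len(pixels)

    # Crude sharpness proxy: mean absolute horizontal gradient.
    grads = [abs(row[i + 1] - row[i]) for row in frame
             for i in range(len(row) - 1)]
    sharpness = sum(grads) / len(grads) if grads else 0.0

    return mean_brightness > 20 and sharpness > 1.0

def capture_step(test_frame, storage):
    """One iteration of the claimed loop: evaluate a test image, then
    either store image data or signal that components may be switched
    off to conserve the battery."""
    if apply_quality_rules(test_frame):
        storage.append(test_frame)  # keep capturing and storing image data
        return "capture"
    return "power_down"             # conditions not appropriate for photos
```

Running the step on a dark test frame yields a power-down decision; a bright, textured frame is stored and capture continues.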
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/053,660 US20190132502A1 (en) | 2013-10-14 | 2018-08-02 | Method of operating a wearable lifelogging device |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1351216-5 | 2013-10-14 | ||
SE1351216 | 2013-10-14 | ||
PCT/EP2014/072024 WO2015055655A1 (en) | 2013-10-14 | 2014-10-14 | Method of operating a wearable lifelogging device |
US201615029448A | 2016-04-14 | 2016-04-14 | |
US15/790,557 US10051171B2 (en) | 2013-10-14 | 2017-10-23 | Method of operating a wearable lifelogging device |
US16/053,660 US20190132502A1 (en) | 2013-10-14 | 2018-08-02 | Method of operating a wearable lifelogging device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/790,557 Continuation US10051171B2 (en) | 2013-10-14 | 2017-10-23 | Method of operating a wearable lifelogging device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190132502A1 true US20190132502A1 (en) | 2019-05-02 |
Family
ID=51753203
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/029,448 Active US9800782B2 (en) | 2013-10-14 | 2014-10-14 | Method of operating a wearable lifelogging device |
US15/790,557 Active US10051171B2 (en) | 2013-10-14 | 2017-10-23 | Method of operating a wearable lifelogging device |
US16/053,660 Abandoned US20190132502A1 (en) | 2013-10-14 | 2018-08-02 | Method of operating a wearable lifelogging device |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/029,448 Active US9800782B2 (en) | 2013-10-14 | 2014-10-14 | Method of operating a wearable lifelogging device |
US15/790,557 Active US10051171B2 (en) | 2013-10-14 | 2017-10-23 | Method of operating a wearable lifelogging device |
Country Status (5)
Country | Link |
---|---|
US (3) | US9800782B2 (en) |
EP (1) | EP3058715B1 (en) |
JP (2) | JP6529491B2 (en) |
CN (1) | CN105850111A (en) |
WO (1) | WO2015055655A1 (en) |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9800782B2 (en) | 2013-10-14 | 2017-10-24 | Narrative AB | Method of operating a wearable lifelogging device |
JP6356552B2 (en) * | 2014-09-16 | 2018-07-11 | 東芝メモリ株式会社 | Information processing device |
JP6596328B2 (en) * | 2014-12-26 | 2019-10-23 | アサヒリサーチ株式会社 | Wearable camera |
US9930299B2 (en) * | 2015-12-15 | 2018-03-27 | BOT Home Automation, Inc. | Video on demand for audio/video recording and communication devices |
EP3391650B1 (en) * | 2015-12-15 | 2021-03-03 | Amazon Technologies Inc. | Video on demand for audio/video recording and communication devices |
WO2018099968A1 (en) | 2016-11-29 | 2018-06-07 | Ablynx N.V. | Treatment of infection by respiratory syncytial virus (rsv) |
JP2018195905A (en) * | 2017-05-15 | 2018-12-06 | オリンパス株式会社 | Data processing unit |
WO2019065454A1 (en) * | 2017-09-28 | 2019-04-04 | キヤノン株式会社 | Imaging device and control method therefor |
JP6766086B2 (en) | 2017-09-28 | 2020-10-07 | キヤノン株式会社 | Imaging device and its control method |
WO2019124055A1 (en) | 2017-12-18 | 2019-06-27 | キヤノン株式会社 | Image capturing device, control method therefor, program, and storage medium |
CN114827458A (en) | 2017-12-18 | 2022-07-29 | 佳能株式会社 | Image pickup apparatus, control method thereof, and storage medium |
JP7233162B2 (en) | 2017-12-18 | 2023-03-06 | キヤノン株式会社 | IMAGING DEVICE AND CONTROL METHOD THEREOF, PROGRAM, STORAGE MEDIUM |
US11032468B2 (en) | 2017-12-26 | 2021-06-08 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, image capturing apparatus, and storage medium |
JP7292853B2 (en) | 2017-12-26 | 2023-06-19 | キヤノン株式会社 | IMAGING DEVICE, CONTROL METHOD AND PROGRAM THEREOF |
GB2570792B (en) | 2017-12-26 | 2020-09-30 | Canon Kk | Image capturing apparatus, method of controlling the same, and storage medium |
JP2019186630A (en) | 2018-04-03 | 2019-10-24 | キヤノン株式会社 | Imaging apparatus, control method thereof, and program |
US11463617B2 (en) | 2018-10-25 | 2022-10-04 | Canon Kabushiki Kaisha | Information processing apparatus, information processing system, image capturing apparatus, information processing method, and memory |
JP6852141B2 (en) | 2018-11-29 | 2021-03-31 | キヤノン株式会社 | Information processing device, imaging device, control method of information processing device, and program |
JP7222683B2 (en) | 2018-12-06 | 2023-02-15 | キヤノン株式会社 | IMAGING DEVICE AND CONTROL METHOD THEREOF, PROGRAM, STORAGE MEDIUM |
JP7348754B2 (en) | 2019-06-03 | 2023-09-21 | キヤノン株式会社 | Image processing device and its control method, program, storage medium |
JP7527769B2 (en) | 2019-09-30 | 2024-08-05 | キヤノン株式会社 | Imaging device, control method thereof, program, and storage medium |
CN110933371B (en) * | 2019-12-02 | 2021-07-13 | 广州小鹏汽车科技有限公司 | Monitoring method and device and vehicle |
JP7500194B2 (en) | 2019-12-27 | 2024-06-17 | キヤノン株式会社 | Imaging device, control method thereof, program, and storage medium |
JP2022070684A (en) | 2020-10-27 | 2022-05-13 | キヤノン株式会社 | Imaging device, control method thereof, and program |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090110386A1 (en) * | 2007-10-31 | 2009-04-30 | Sony Corporation | Photographic apparatus and photographic method |
US9307112B2 (en) * | 2013-05-31 | 2016-04-05 | Apple Inc. | Identifying dominant and non-dominant images in a burst mode capture |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3299741B2 (en) * | 1990-03-14 | 2002-07-08 | オリンパス光学工業株式会社 | Automatic shooting system |
US7086165B2 (en) * | 2004-12-27 | 2006-08-08 | Lenovo Pte. Ltd. | System and method for managing power in an electronic device |
JP2006203395A (en) * | 2005-01-19 | 2006-08-03 | Konica Minolta Holdings Inc | Moving body recognition system and moving body monitor system |
JP2006337689A (en) * | 2005-06-01 | 2006-12-14 | Ricoh Co Ltd | Image photographing apparatus |
EP1793580B1 (en) * | 2005-12-05 | 2016-07-27 | Microsoft Technology Licensing, LLC | Camera for automatic image capture having plural capture modes with different capture triggers |
JP4961914B2 (en) * | 2006-09-08 | 2012-06-27 | ソニー株式会社 | Imaging display device and imaging display method |
US7855743B2 (en) * | 2006-09-08 | 2010-12-21 | Sony Corporation | Image capturing and displaying apparatus and image capturing and displaying method |
JP4810476B2 (en) * | 2007-03-20 | 2011-11-09 | オリンパス株式会社 | Image photographing apparatus and image photographing system |
JP4458151B2 (en) * | 2007-11-06 | 2010-04-28 | ソニー株式会社 | Automatic imaging apparatus, automatic imaging control method, image display system, image display method, display control apparatus, display control method |
US20100315535A1 (en) * | 2008-02-13 | 2010-12-16 | Freescale Semicunductor, Inc. | Reducing power consumption in a portable electronic device with a luminescent element |
JP5015058B2 (en) * | 2008-04-16 | 2012-08-29 | オリンパスイメージング株式会社 | Imaging device |
JP2009267792A (en) * | 2008-04-25 | 2009-11-12 | Panasonic Corp | Imaging apparatus |
KR101477542B1 (en) | 2008-11-12 | 2014-12-30 | 삼성전자주식회사 | Apparatus for processing digital image and method for controlling the same |
JP2010154070A (en) * | 2008-12-24 | 2010-07-08 | Olympus Imaging Corp | Camera |
JP5703806B2 (en) * | 2011-02-08 | 2015-04-22 | 株式会社リコー | Imaging apparatus and imaging method |
CN202261565U (en) * | 2011-09-30 | 2012-05-30 | 深圳市三辰科技有限公司 | Intelligent camera system with timing control function |
US8979398B2 (en) * | 2013-04-16 | 2015-03-17 | Microsoft Technology Licensing, Llc | Wearable camera |
US9800782B2 (en) | 2013-10-14 | 2017-10-24 | Narrative AB | Method of operating a wearable lifelogging device |
- 2014-10-14 US US15/029,448 patent/US9800782B2/en active Active
- 2014-10-14 JP JP2016523979A patent/JP6529491B2/en not_active Expired - Fee Related
- 2014-10-14 WO PCT/EP2014/072024 patent/WO2015055655A1/en active Application Filing
- 2014-10-14 CN CN201480066610.XA patent/CN105850111A/en active Pending
- 2014-10-14 EP EP14786647.9A patent/EP3058715B1/en active Active
- 2017-10-23 US US15/790,557 patent/US10051171B2/en active Active
- 2018-08-02 US US16/053,660 patent/US20190132502A1/en not_active Abandoned
- 2019-05-23 JP JP2019096923A patent/JP2019146268A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CN105850111A (en) | 2016-08-10 |
JP2016536868A (en) | 2016-11-24 |
US10051171B2 (en) | 2018-08-14 |
EP3058715A1 (en) | 2016-08-24 |
EP3058715B1 (en) | 2021-09-01 |
US20160269613A1 (en) | 2016-09-15 |
JP6529491B2 (en) | 2019-06-12 |
US9800782B2 (en) | 2017-10-24 |
WO2015055655A1 (en) | 2015-04-23 |
US20180048802A1 (en) | 2018-02-15 |
JP2019146268A (en) | 2019-08-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10051171B2 (en) | Method of operating a wearable lifelogging device | |
JP2016536868A5 (en) | ||
US20200396383A1 (en) | Methods and apparatus to operate a mobile camera for low-power usage | |
US10638046B2 (en) | Wearable device, control apparatus, photographing control method and automatic imaging apparatus | |
EP3200483B1 (en) | Method and device for acquiring location information | |
US20120081392A1 (en) | Electronic device operation adjustment based on face detection | |
US20150189176A1 (en) | Domain aware camera system | |
US9794456B2 (en) | Camera device and method for controlling a camera device | |
KR101819925B1 (en) | Dynamic control for data capture | |
US9930479B2 (en) | Method, apparatus, and mobile terminal for collecting location information | |
US20160249852A1 (en) | Information processing apparatus, information processing method, and program | |
EP2958315B1 (en) | Camera device and method for controlling a camera device | |
JP2013197625A (en) | Terminal, display screen drawing direction correction method and program, and computer readable recording medium | |
US20150103164A1 (en) | Lifelogging device, lifelogging system and method of controlling a wearable lifelogging device | |
US9552043B2 (en) | Handheld electronic device and operation method of the same | |
US20230075940A1 (en) | Wrist-Wearable Device for Delayed Processing of Images Captured by the Wrist-Wearable Device, And Methods of Use Thereof | |
KR101805347B1 (en) | Lifelog camera and method of controlling in association with an intrapersonal area network | |
US20160004619A1 (en) | Automated maintenance of an electronic device | |
US20170162032A1 (en) | Personal security | |
CN106936461B (en) | A kind of intelligence wearing vest based on IOT platform | |
KR101138345B1 (en) | Digital camera of frame type with wireless communication function | |
KR20150030455A (en) | A Portable Device and A Method for Controlling the Same | |
KR20150042918A (en) | Terminal for ward and guarddian, cloud server, their method of operation and recording medium for safety management of ward | |
WO2016057972A1 (en) | Method, apparatus, and mobile terminal for collecting location information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |