CN107580292B - Electronic device, selection control system, selection method, and recording medium - Google Patents


Info

Publication number
CN107580292B
Authority
CN
China
Prior art keywords
sensor information
action state
sensor
unit
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710505561.0A
Other languages
Chinese (zh)
Other versions
CN107580292A (en)
Inventor
中村优
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Casio Computer Co Ltd
Publication of CN107580292A
Application granted
Publication of CN107580292B
Status: Active
Anticipated expiration

Abstract

The invention provides an electronic device, a selection control system, a selection method, and a recording medium that allow necessary information to be acquired easily in accordance with the action state of a user. The electronic device includes: a sensor unit (16) that acquires a plurality of types of sensor information; an action state determination unit (51) that determines the action state of a subject; a report control unit (53) that, based on the determination result of the action state determination unit (51), selects sensor information corresponding to that result from among the plurality of types of sensor information as the sensor information to be reported to the user; and an output unit (18). The action state determination unit (51) determines the action state of the user based on sensor information such as biological information or external environment information obtained from the sensor unit (16). The report control unit (53) controls the electronic device so that the sensor information corresponding to the user's action state determined by the action state determination unit (51) is reported via the output unit (18).

Description

Electronic device, selection control system, selection method, and recording medium
The present application claims priority based on application 2016-.
Technical Field
The invention relates to an electronic device, a selection control system, a selection method, and a recording medium.
Background
Conventionally, as described in Japanese Patent Application Laid-Open No. 2009-88989, there is a technique by which a mobile phone including a plurality of sensors acquires the user's step count and movement distance and reports them to the user.
However, as the number of types of highly functional sensors mounted on mobile terminals and tablet terminals has recently increased, the user must perform a sensor selection operation every time he or she wishes to check desired sensor information, which is very troublesome.
Disclosure of Invention
The present invention has been made in view of the above problem, and an object of the present invention is to enable a user to check desired sensor information without troublesome operations.
An electronic device according to a first aspect of the present invention includes: an acquisition unit that acquires a plurality of types of sensor information; a determination unit that determines an action state of a subject; and a selection unit that, based on the determination result of the determination unit, selects sensor information corresponding to that result from among the plurality of types of sensor information as the sensor information to be reported to a user.
A selection control system according to a second aspect of the present invention is a selection control system that transmits arbitrary information between a server and an electronic device via a network, the server including: a determination unit that determines the action state of a subject based on received sensor information; a selection unit that selects sensor information based on the determination result of the action state; and a transmission unit that transmits the selection result to the electronic device; the electronic device including: a plurality of sensors; an acquisition unit that acquires a plurality of types of sensor information; a transmission unit that transmits the sensor information to the server; and a selection unit that, based on the determination result of the determination unit, selects sensor information corresponding to that result from among the plurality of types of sensor information as the sensor information to be reported to a user.
A selection method according to a third aspect of the present invention includes: an acquisition step of acquiring a plurality of types of sensor information; a determination step of determining an action state of a subject; and a selection step of selecting, based on the determination result of the determination step, sensor information corresponding to that result from among the plurality of types of sensor information as the sensor information to be reported to a user.
A computer-readable recording medium according to a fourth aspect of the present invention stores a program for causing a computer to execute: an acquisition process of acquiring a plurality of types of sensor information; a determination process of determining an action state of a subject; and a selection process of selecting, based on the determination result of the determination process, sensor information corresponding to that result from among the plurality of types of sensor information as the sensor information to be reported to a user.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with a general description given above and the detailed description of the embodiments given below, serve to explain the principles of the invention.
A further understanding of the present application can be obtained by considering the following detailed description in conjunction with the following drawings.
Fig. 1 is a block diagram showing a hardware configuration of an electronic device according to an embodiment of the present invention.
Fig. 2A is a diagram showing an example of a display pattern of an electronic device according to an embodiment of the present invention.
Fig. 2B is a diagram showing an example of a display pattern of the electronic device according to the embodiment of the present invention.
Fig. 2C is a diagram showing an example of a display pattern of the electronic device according to the embodiment of the present invention.
Fig. 2D is a diagram showing an example of a display pattern of the electronic device according to the embodiment of the present invention.
Fig. 3A is a diagram showing an example of a data structure of an input data table (motion state).
Fig. 3B is a diagram showing an example of a data structure of an input data table (external environment).
Fig. 4A is a table showing action patterns used for action estimation in the present embodiment.
Fig. 4B is a table showing combinations of sensor information set in accordance with the action pattern.
Fig. 5 is a functional block diagram relating to the report processing of the present invention.
Fig. 6 is a flowchart showing the overall flow of the sensor information selection process.
Fig. 7 is a detailed flowchart related to a portion of sensor information acquisition suitable for the action state in the flowchart of fig. 6.
Fig. 8 is a block diagram of a server according to an embodiment of the present invention.
Fig. 9 is a flowchart showing a flow of processing in the selection control system according to the present invention.
Detailed Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
Fig. 1 is a block diagram showing a hardware configuration of an electronic device 1 according to an embodiment of the present invention.
The electronic device 1 is configured as a portable terminal such as a smartphone or a wrist terminal, for example.
The electronic device 1 includes: a processor (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a bus 14, an input/output interface 15, a sensor unit 16, an input unit 17, an output unit 18, a storage unit 19, a communication unit 20, and a drive 21.
The processor 11 executes various processes in accordance with a program recorded in the ROM12 or a program downloaded from the storage unit 19 to the RAM 13.
The RAM13 also stores data and the like necessary for the processor 11 to execute various processes.
The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. The bus 14 is also connected to an input/output interface 15. The input/output interface 15 is connected to a sensor unit 16, an input unit 17, an output unit 18, a storage unit 19, a communication unit 20, and a drive 21.
The sensor unit 16 includes a biosensor 111, an environment sensor 112, and an imaging sensor 113. The biosensor 111 includes: a 6-axis acceleration sensor capable of detecting the direction of movement, a gyroscope sensor capable of detecting orientation, a magnetic sensor capable of detecting a combination of orientation and rotation, and a plurality of sensors for measuring biological information such as the pulse rate (heart rate), blood pressure, and body temperature. The measured biological information is stored in the storage unit 19. The environment sensor 112 includes a plurality of sensors for measuring information on the environment in which the user is located, such as position, air temperature, air pressure, humidity, ultraviolet amount, and noise. For example, it includes a GPS unit that receives GPS signals from a plurality of GPS satellites via a GPS receiving antenna. The environment information measured by the environment sensor 112 is stored in the storage unit 19.
The imaging sensor 113 includes an optical lens unit and an image sensor, not shown, and acquires biological information and environment information of the user by analyzing the captured content.
For example, the action state of the user can be determined with higher accuracy by estimating the intensity of the user's movement from the degree of blur of the captured image, or by determining, when a mountain or the sea appears in the image, that the user is in the mountains or near water.
The optical lens unit is configured by light-collecting lenses, such as a focus lens and a zoom lens, for capturing an image of a subject. The focus lens forms a subject image on the light-receiving surface of the image sensor. The zoom lens freely changes the focal length within a certain range. The optical lens unit is also provided, as necessary, with peripheral circuits for adjusting setting parameters such as focus, exposure, and white balance.
The image sensor is configured by a photoelectric conversion element, an AFE (Analog Front End), and the like. The photoelectric conversion element is formed of, for example, a CMOS (Complementary Metal Oxide Semiconductor) photoelectric conversion element. A subject image is incident on the photoelectric conversion element from the optical lens unit. The photoelectric conversion element photoelectrically converts (captures) the subject image, accumulates the resulting image signal for a certain period of time, and sequentially supplies the accumulated image signal to the AFE as an analog signal.
The AFE performs various kinds of signal processing, such as analog/digital (A/D) conversion, on the analog image signal, generating a digital signal that is output as the output signal of the imaging sensor 113.
Hereinafter, such an output signal of the imaging sensor 113 is referred to as "captured image data". The captured image data is supplied as appropriate to the processor 11, an image processing unit not shown, and the like.
The input unit 17 is configured by various buttons, a touch panel provided in the display, and the like, and inputs various information in accordance with an instruction operation by the user.
The output unit 18 is configured by a display, a speaker, and the like, and outputs images and sounds.
The storage unit 19 is configured by a hard disk, a DRAM (Dynamic Random Access Memory), or the like, and stores data of various images.
The communication unit 20 is configured to be able to communicate with external devices and external terminals using short-range wireless communication such as BLE (Bluetooth (registered trademark) Low Energy) or an IEEE 802.11-compliant wireless LAN (Local Area Network).
A removable medium 31 configured as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 21 as appropriate. A program read from the removable medium 31 by the drive 21 is installed in the storage unit 19 as needed. Like the storage unit 19, the removable medium 31 can also store various data, such as the image data stored in the storage unit 19.
Figs. 2A to 2D are schematic diagrams showing examples of display patterns in the output unit 18 according to the embodiment of the present invention.
As shown in Fig. 2A, the electronic device 1 according to the present embodiment notifies the user by displaying necessary information in the display area 181 and the display area 182 in accordance with the action state of the user.
In the present embodiment, the electronic device 1 is exemplified by a smartwatch, so the display area 181 and the display area 182 are arranged within the watch-face area used for timekeeping, and the sensor information is displayed there.
The processor 11 determines the motion state of the user and the external environment from the input data tables shown in Figs. 3A and 3B, described later, using the biological information of the user acquired by the biosensor 111 and the external environment data acquired by the environment sensor 112. The biological information of the user and external environment data acquired by the imaging sensor 113 may additionally be used in this determination.
The determined motion state of the user and external environment are then used to determine the action state of the user from the action state determination table shown in Fig. 4A, described later, and sensor information is selected based on the sensor information selection table shown in Fig. 4B, described later.
Then, the selected sensor information is displayed in the display area 181 and the display area 182 shown in fig. 2A, whereby necessary sensor information is reported in accordance with the action state of the user.
For example, when the action state of the user is determined to be "walking" from the user's motion state data and external environment data with reference to the action state determination table shown in Fig. 4A, described later, the sensor information "number of steps" and "calories burned" is selected from the sensor information selection table.
Then, as in the display state shown in Fig. 2B, the "number of steps" information is displayed in the display area 181 of Fig. 2A and the "calories burned" information is displayed in the display area 182, thereby reporting them to the user.
The display mode of the output unit 18 is not limited to display alongside analog hands as shown in Fig. 2A; the display processing may also be performed on a digital display without hands, as shown in Fig. 2D, for example.
In Fig. 2A, two display areas are set in the area of the output unit 18, but the present invention is not limited to this; more display areas may be set in the area of the output unit 18.
With the above configuration, the user can visually recognize more sensor information at a time.
Fig. 3A shows an example of the data structure of an input data table 121a in which the motion state of the user, determined based on biological information acquired from the biosensor 111, is set, and Fig. 3B shows an example of the data structure of an input data table 121b in which information based on the external environment acquired from the environment sensor 112 is set.
Using the input data table 121a of Fig. 3A, the motion state of the user, such as "walking" or "still", can be obtained from biological information such as the heart rate, pulse, and motion history acquired from the biosensor 111; these pieces of information are set in the input data table 121a of the storage unit 19 under the control of the processor 11. Further, with the environment information acquired from the environment sensor 112, or a GPS track and map data, the altitude and other geographic attributes of a given position can be looked up. For example, it can be determined that the user is currently on a road, or in the mountains, at a lake, at the sea, and so on, and further that the user is at an altitude of ○○ m or more, on a road, in a building, near △ mountain, or at ○○ lake; this information is set in the input data table 121b of the storage unit 19 under the control of the processor 11.
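As a rough illustration of how the two input data tables described above might be populated, the sketch below maps raw readings to a motion state label and an external environment label. All function names, thresholds, and labels are hypothetical examples introduced here, not values disclosed in the patent.

```python
# Hypothetical lookup logic for input data tables 121a (motion state) and
# 121b (external environment). Thresholds are illustrative only.

def classify_motion_state(step_cadence_spm, heart_rate_bpm):
    """Return a motion state label, as set in input data table 121a."""
    if step_cadence_spm == 0:
        return "still"
    if step_cadence_spm < 130 and heart_rate_bpm < 120:
        return "on foot"
    return "running"

def classify_external_environment(terrain, altitude_m):
    """Return an external environment label, as set in input data table 121b."""
    if terrain == "road":
        return "on road"
    if terrain == "mountain" or altitude_m >= 1000:
        return "mountain"
    return terrain  # e.g. "lake" or "sea"
```

In practice the terrain argument would itself come from a GPS track cross-referenced with map data, as the text describes.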
Fig. 4A shows an example of the data structure of the action state determination table 122a stored in the storage unit 19, and Fig. 4B likewise shows an example of the data structure of the sensor information selection table 122b stored in the storage unit 19. Fig. 4A is a table formed as a matrix of the motion states and external environments stored in the input data tables, and is used to determine the current action state of the user. Fig. 4B is a table used to select the combination of sensor information (display information 1 and display information 2) that the user needs in that action state. The action state is determined or predicted based on the motion state (input data table 121a) and the external environment (input data table 121b) set in the input data tables.
According to the action state determination table 122a of Fig. 4A, if the motion state in the input data table 121a is "on foot" and the external environment in the input data table 121b is "on road", the action state of the user can be determined to be "walking"; if the motion state is "on foot" and the external environment is "mountain" or the rate of increase or decrease in altitude is ○○ m/h or more, the action state of the user is determined to be "hiking". These determinations are made by the processor 11. When the motion state in the input data table 121a is "still", the action state of the user is determined to be "rest" regardless of the external environment in the input data table 121b. Further, when the motion state in the input data table 121a is "a specific arm motion", the action state of the user is determined to be "fishing" for each of the external environments "mountain", "lake", and "sea" in the input data table 121b.
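The matrix lookup described above can be sketched as a small mapping keyed by (motion state, external environment). The entries below cover only the examples named in the text, and the "x" return value stands for the combinations the table treats as not normally occurring; a real table would be more complete.

```python
# Sketch of the action state determination table 122a (Fig. 4A) encoded as a
# matrix keyed by (motion state, external environment). Labels are taken from
# the examples in the text; the table itself is an illustrative reconstruction.

ACTION_STATE_TABLE = {
    ("on foot", "on road"): "walking",
    ("on foot", "mountain"): "hiking",
    ("specific arm motion", "mountain"): "fishing",
    ("specific arm motion", "lake"): "fishing",
    ("specific arm motion", "sea"): "fishing",
}

def determine_action_state(motion_state, environment):
    # "still" maps to "rest" regardless of the external environment.
    if motion_state == "still":
        return "rest"
    # "x" marks combinations that do not normally occur (the x symbol in Fig. 4A).
    return ACTION_STATE_TABLE.get((motion_state, environment), "x")
```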
Note that in the action state determination table 122a shown in Fig. 4A, the symbol × indicates a combination that does not normally occur.
However, this is given only as an example; the action states to which the present invention can be applied are not limited to those described in the action state determination table 122a, and the table may be configured so that an action state can be entered arbitrarily at the positions marked ×.
The sensor information selection table 122b shown in Fig. 4B is a table for selecting the sensor information corresponding to each action state. That is, when the determined or predicted action state of the user is "walking", the sensor information inferred to be necessary for the user is "number of steps" and "calories burned", so these two items of data are reported to the user. When the action state is determined to be "running", the sensor information inferred to be necessary for the user is "speed" and "calories burned", so the table is set so that these two items of data are reported to the user.
Likewise, if the action state of the user is "fishing", the sensor information inferred to be necessary for the user is "weather" and "air pressure", so the table is set so that these two items of data are reported to the user. In the sensor information selection table 122b shown in Fig. 4B, the combinations of sensor information necessary for the user are determined in advance, but the combinations may also be set arbitrarily by the user; with such a configuration, the sensor information the user needs can be reported more reliably.
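The selection table just described pairs each action state with two display items. A minimal sketch, using only the combinations named in the text, might look like the following; the dictionary contents are illustrative, not the full table of Fig. 4B.

```python
# Sketch of the sensor information selection table 122b (Fig. 4B): each action
# state maps to a pair (display information 1, display information 2).

SENSOR_SELECTION_TABLE = {
    "walking": ("number of steps", "calories burned"),
    "running": ("speed", "calories burned"),
    "fishing": ("weather", "air pressure"),
}

def select_sensor_information(action_state):
    """Return the pair of sensor information items to report for an action state."""
    return SENSOR_SELECTION_TABLE[action_state]
```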
With the above configuration, when the sensor information suited to the action state of the user has been selected from the sensor information selection table 122b, the electronic device 1 may be configured to start the sensors necessary for acquiring the selected sensor information and stop the unnecessary sensors.
In the sensor control configuration described above, in order to repeat the sensor information selection/reporting process, sensors necessary for determining the action state of the user may be configured not to be stopped even if they are unnecessary for acquiring the selected sensor information.
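The power-control rule above — keep the sensors needed for the selected information plus the sensors needed for continued action state determination, stop everything else — can be sketched as follows. The information-to-sensor mapping and sensor names are hypothetical.

```python
def plan_sensor_power(selected_info, info_to_sensor, determination_sensors):
    """Split all known sensors into (keep active, stop).

    Sensors needed for the selected sensor information stay on, and so do the
    sensors needed to keep determining the action state, so that the
    selection/reporting process can repeat.
    """
    keep = {info_to_sensor[item] for item in selected_info}
    keep |= set(determination_sensors)
    all_sensors = set(info_to_sensor.values()) | set(determination_sensors)
    return sorted(keep), sorted(all_sensors - keep)
```

For example, if "number of steps" comes from an accelerometer and the heart rate sensor is needed for the determination, those two stay on while an unused barometer is stopped.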
Fig. 5 is a functional block diagram showing a functional configuration for executing the report processing among the functional configurations of the electronic apparatus 1.
The report processing is a series of processes for estimating the action state of the user and reporting the information the user needs, out of the information the electronic device 1 acquires from its sensors or from outside.
The "reporting process" includes an "action state determination process based on the action state and the environmental information" and a "sensor information selection process based on the action state".
When the report processing is executed, as shown in Fig. 5, an action state determination unit 51, a sensor control unit 52, and a report control unit 53 function in the processor 11 to carry out the reporting to the user.
In addition, an action state storage unit 71 and a sensor information storage unit 72 are set in one area of the storage unit 19.
The action state storage unit 71 stores an action state determination table 122a for determining the action state of the user based on the sensor information acquired from the sensor unit 16, and also stores information on the trend of the action state determination result of the user.
The sensor information storage unit 72 stores the sensor information selection table for selecting the sensor information to be reported in accordance with the action state of the user, and, when the user changes a combination of sensor information stored in advance, stores that change as information on the user's change tendencies.
The action state determination unit 51 analyzes the state of the electronic device 1 based on the sensor information acquired by the sensor unit 16, and determines the action state of the user carrying the electronic device 1 based on the action state determination table stored in the action state storage unit 71.
The sensor control unit 52 performs control so as to activate the sensors necessary for acquiring the sensor information corresponding to the action state of the user determined by the action state determination unit 51, and to deactivate sensors related neither to that acquisition nor to the action state determination.
The report control unit 53 acquires sensor information corresponding to the action state of the user determined by the action state determination unit 51 from the sensor information storage unit 72, and outputs and reports the sensor information through the output unit 18.
The operation (report processing) of the electronic device 1 according to the embodiment of the present invention shown in Figs. 1 to 6 will be described in detail below with reference to the flowcharts of Figs. 6 and 7. The steps described below can be implemented by causing a computer to execute a corresponding program.
(basic action)
Fig. 6 is a flowchart of the basic operation of the electronic device 1 according to the embodiment of the present invention. Referring to Fig. 6, the biosensor 111 of the sensor unit 16 of the electronic device 1 detects the user's physical condition data (biological information), and the processor 11 acquires the data and passes it to the action state determination unit 51 (step S100). Next, the environment sensor 112 detects external environment data (environment information), and the processor 11 acquires the data and passes it to the action state determination unit 51 (step S101). At this time, the action state determination unit 51 also acquires data other than GPS data detected by the environment sensor 112. The biological information and the environment information may be measured and acquired in any order, for example in the order in which the respective sensors detect them.
The action state determination unit 51 of the processor 11 determines the action state of the user based on the action state determination table 122a stored in the action state storage unit 71 (step S102). In this determination, the action state determination unit 51 searches the motion states and external environments shown in the input data tables 121a and 121b for entries matching the user's state, based on the biological information and environment information acquired from the sensor unit 16. It then indexes the action state determination table 122a with the motion state and external environment found in the input data tables. For example, when the motion state set in the input data table 121a is "on foot" and the external environment set in the input data table 121b is "on road", the action state determination unit 51 determines the action state of the user to be "walking" based on the action state determination table 122a.
Next, the action state determination unit 51 performs an action history learning process based on the determination result (step S105). The action state determination unit 51 performs control so that the learning result is reflected in the action states stored in the action state determination table 122a, storing them as the user's action tendencies. That is, the action history is updated successively in accordance with the user's actions, and by referring to the updated action tendencies the action state can be predicted with higher accuracy.
Next, the report control unit 53 receives the determination result from the action state determination unit 51, and performs a selection process of sensor information corresponding to the determined action state of the user (step S105).
A specific process flow will be described later with reference to the flowchart of fig. 7.
After the report control unit 53 completes the sensor information selection process, the sensor control unit 52 performs the sensor control process for the sensor unit 16, starting the sensors necessary for the report processing and stopping the sensors unnecessary for it (step S106).
For example, the sensor control unit 52 starts the sensors necessary for the report processing based on the sensor information selected by the report control unit 53. It also checks whether any unnecessary sensor, related neither to the acquisition of the selected sensor information nor to the action state determination, is active, and stops any such sensor that is determined to be active.
The report control unit 53 outputs the sensor information selected in step S105 to the output unit 18, thereby reporting to the user (step S107).
For example, as shown in fig. 2B, the report control unit 53 displays the selected sensor information in the display areas 181 and 182 of the display, thereby reporting the necessary sensor information according to the action state of the user.
After the report processing to the user is performed, the action state determination unit 51 determines whether or not there is a change in the action state of the user based on the sensor information from the sensor unit 16 (step S108).
If it is determined in step S108 that the action state of the user has changed, the process returns to step S101, and the reporting process is performed again.
If it is determined in step S108 that the action state of the user has not changed, the process proceeds to step S109.
In step S109, it is determined whether an operation to end the report processing has been detected.
For example, when the user inputs an operation to end the report processing to the input unit 17, or when no operation input from the user and no action information of the user have been detected for a long time, the electronic device 1 ends the report processing.
Further, in a case where the above-described end operation is not detected, the process returns to step S107.
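The basic operation described above (steps S100 through S107) can be sketched as one pass through a pipeline of pluggable callables. The function names are hypothetical stand-ins for the units in Fig. 5, and the step mapping follows the flowchart description above.

```python
def report_once(read_bio, read_env, determine, select, display):
    """One pass of the reporting flow: acquire, determine, select, report.

    Each argument is a callable supplied by the caller, standing in for the
    corresponding sensor or processing unit described in the text.
    """
    bio = read_bio()             # S100: acquire biological information
    env = read_env()             # S101: acquire environment information
    state = determine(bio, env)  # S102: determine the action state
    info = select(state)         # S105: select sensor information to report
    display(info)                # S107: report via the output unit 18
    return state, info
```

In the actual device this pass repeats whenever a change in the action state is detected (steps S108 and S109).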
(sensor information selection processing)
The flow of the selection process of the sensor information according to the action state of the user by the report control unit 53 will be described in detail with reference to the flowchart shown in fig. 7.
First, sensor information corresponding to the action state of the user determined by the action state determination unit 51 in step S102 is selected (step S200).
For example, the report control unit 53 refers to the sensor information selection table 122b stored in the sensor information storage unit 72 and selects the sensor information corresponding to the determined action state of the user. For example, when the action state of the user is determined to be "running", the combination of sensor information corresponding to running, "speed" and "calories burned", is selected from the sensor information selection table 122b.
Next, it is determined whether a change history by the user is stored for the selected combination of sensor information (step S201).
For example, the report control unit 53 refers to the sensor information storage unit 72 and checks whether there is change-history information indicating that the user has previously changed the selected combination of sensor information.
If it is determined that the change history is recorded, the process proceeds to step S202.
If it is determined that the change history is not recorded, the process returns to the basic operation flow.
In step S202, the report control unit 53 changes the combination of sensor information based on the user's sensor information change history.
For example, the report control unit 53 refers to the user's change history, changes the predetermined combination of sensor information to the combination the user previously changed it to, and then returns to the basic operation flow.
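Steps S201 and S202 amount to an override lookup: use the user's recorded change if one exists, otherwise keep the predetermined combination. A minimal sketch, with hypothetical data shapes:

```python
def apply_change_history(action_state, default_combo, change_history):
    """Steps S201-S202 in miniature.

    If the user previously changed the combination for this action state
    (recorded in change_history), return that combination; otherwise return
    the predetermined default combination.
    """
    return change_history.get(action_state, default_combo)
```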
With the above configuration, the action state determination unit 51 of the electronic device 1 according to the embodiment of the present invention refers to the action state storage unit 71 storing the action states, so that sensor information suitable for the action state of the user can be selected and output to the output unit 18, and the information the user needs can be provided automatically in accordance with the user's action state. Therefore, the user can automatically obtain necessary information according to his or her various actions without manually selecting it from among various kinds of information.
In the present embodiment, the predetermined combination of sensor information is changed to the combination changed by the user with reference to the user's change history. However, a combination of sensor information for which no change history is recorded may also be changed to a combination matching the user's preference by reflecting the change history recorded for another combination of sensor information.
For example, when the combination of "number of steps" and "calories burned" corresponding to "walking" is frequently changed by the user to "number of steps" and "air temperature", the combination reported when the user's action is determined to be "running" may likewise be changed to "speed" and "air temperature".
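One way to realize this carry-over of preferences is to extract the element substitution the user applied to one combination and apply the same substitution to another. The function below is a sketch under that assumption; the patent does not specify the mechanism.

```python
def transfer_preference(predetermined, source_history):
    """Reflect a change history recorded for another combination.

    source_history is a pair (old_combination, new_combination) the user
    frequently applied for some other action state; the same element
    substitutions (e.g. "calories burned" -> "air temperature") are
    applied to this action state's predetermined combination.
    """
    old, new = source_history
    # Build the per-item substitutions the user made.
    substitutions = {o: n for o, n in zip(old, new) if o != n}
    return tuple(substitutions.get(item, item) for item in predetermined)
```

Given the "walking" history (steps/calories changed to steps/air temperature), the "running" combination ("speed", "calories burned") would become ("speed", "air temperature").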
In the present embodiment, the reported sensor information is switched each time a change in the user's action state is detected: the sensor information to be reported is selected and displayed, and then a further change in the action state is detected again.
For example, after a change in the user's action state has been detected once and the reported sensor information has been selected and displayed, further changes in the action state may be ignored for a certain period of time.
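This hold-off behavior can be sketched as a small state machine. The class, the 60-second default in the usage below, and the injectable clock are illustrative assumptions; the patent only says "a certain period of time".

```python
import time

class ActionStateReporter:
    """Suppress re-selection of reported sensor information for a
    hold-off period after a change in action state has been handled."""

    def __init__(self, holdoff_seconds, clock=time.monotonic):
        self.holdoff = holdoff_seconds
        self.clock = clock          # injectable for testing
        self._last_switch = None
        self._current = None

    def on_action_state(self, state):
        """Return True when the reported sensor information should switch."""
        now = self.clock()
        if state == self._current:
            return False            # no change in action state
        if self._last_switch is not None and now - self._last_switch < self.holdoff:
            return False            # within hold-off: ignore the change
        self._current = state
        self._last_switch = now
        return True                 # switch the reported sensor information
```

A change arriving 30 seconds after a switch (with a 60-second hold-off) is ignored; the same change 90 seconds later is honored.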
In the present embodiment, the combination of sensor information corresponding to the determined action state is displayed. Alternatively, all of the displayable sensor information, or a plurality of items of it, may be displayed, with the sensor information corresponding to the action state highlighted among them.
For example, among the plurality of displayed sensor information items, the display area of the sensor information corresponding to the user's action state may be made wider than the others, displayed with higher brightness than the others, or placed at a more conspicuous position (the center or upper portion of the display area, etc.) than the others.
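Such highlighting amounts to assigning display attributes per item. The sketch below uses illustrative placeholder values ("wide", "high", "center") invented here; the patent specifies only the relative emphasis, not concrete attribute values.

```python
def layout_sensor_display(all_sensor_info, highlighted):
    """Assign display attributes so that sensor information matching the
    action state gets a wider area, higher brightness, and a more
    conspicuous position than the other items (values are placeholders)."""
    layout = {}
    for info in all_sensor_info:
        if info in highlighted:
            layout[info] = {"width": "wide", "brightness": "high", "position": "center"}
        else:
            layout[info] = {"width": "normal", "brightness": "normal", "position": "edge"}
    return layout
```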
The sensor unit 16 in the present embodiment may be configured to include biosensors and environment sensors having different accuracies. For example, when the determined action state of the user is one for which highly accurate sensor information is desired, the sensor information acquired by the highly accurate sensor may be preferentially displayed. Conversely, when the determined action state is one for which a low-power sensor is preferred, information acquired by a lower-accuracy, lower-power sensor may be preferentially displayed.
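The accuracy-versus-power trade-off above can be expressed as a simple prioritized choice. The dictionary fields and numeric ratings below are assumptions introduced for illustration.

```python
def choose_sensor(sensors, prefer):
    """Pick one sensor from candidates measuring the same quantity.

    Each sensor is described by hypothetical 'accuracy' and 'power'
    ratings (higher = more accurate / more power-hungry).
    prefer='accuracy' selects the most accurate sensor;
    prefer='power' selects the lowest-power one.
    """
    if prefer == "accuracy":
        return max(sensors, key=lambda s: s["accuracy"])
    return min(sensors, key=lambda s: s["power"])
```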
Further, the change in the action state of the user may be notified to the user by sound or light.
For example, when it is determined that the action state of the user has changed from "walking" to "running", the change in action state and the change in the reported sensor information may be reported to the user by an audio announcement such as "the action state has changed" or by light emission from a light-emitting member provided in the electronic device.
In the present embodiment, the sensors that are unnecessary for acquiring the sensor information selected based on the user's action state are stopped; however, instead of stopping the unnecessary sensors, the frequency with which they acquire information (the sensing frequency) may be reduced.
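The two power-saving policies — stopping unneeded sensors outright versus merely lowering their sampling rate — can be sketched as one configuration function. The sampling frequencies (50 Hz, 1 Hz) are illustrative assumptions, not values from the patent.

```python
def configure_sensors(all_sensors, needed, mode="stop"):
    """Configure sensors after the reported sensor information is selected.

    Sensors in `needed` (required for the selected information or for
    action-state determination) run at full rate. The rest are either
    stopped (mode='stop') or kept running at a reduced sensing
    frequency (mode='reduce'). Frequency values are illustrative.
    """
    config = {}
    for sensor in all_sensors:
        if sensor in needed:
            config[sensor] = {"active": True, "hz": 50}
        elif mode == "stop":
            config[sensor] = {"active": False, "hz": 0}
        else:  # "reduce": lower the frequency instead of stopping
            config[sensor] = {"active": True, "hz": 1}
    return config
```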
In the present embodiment, the output unit reports the sensor information to the user by displaying it on the display, but the present invention is not limited to this; the sensor information desired by the user may instead be reported by outputting it as voice from a speaker provided in the output unit.
In the present embodiment, the above-described functions are implemented in one electronic device, but the processing may instead be divided among a plurality of devices: the sensor information acquired by the electronic device is transmitted to a server, which determines the action state, and the electronic device having the output unit performs the report processing of the sensor information based on the result.
In the present embodiment, the sensor information corresponding to the determination result among the plurality of types of sensors is selected as the sensor information to be reported to the user, but the sensor information may be transmitted to the outside via the communication unit.
Fig. 8 is a block diagram of the functional configuration of the server 2 for executing the action determination process.
The action determination process is a series of processes for estimating the action of the user from the sensor information.
The server 2 includes: control unit 81, storage unit 82, communication unit 83, and determination unit 84.
When the action determination process is executed, as shown in fig. 8, the control unit 81, the storage unit 82, the communication unit 83, and the determination unit 84 function to perform the action determination process of the user.
The control unit 81 functions as a processor and controls the server 2, managing data communication among the storage unit 82, the communication unit 83, and the determination unit 84 and controlling the processing of each unit.
The storage unit 82 is configured by a hard disk, a DRAM (Dynamic Random Access Memory), or the like, and stores: an input data table 121a in which motion states of the user determined from the received biological information are set; an input data table 121b set based on the received information on the external environment; an action state determination table 122a for determining the action state of the user from the received sensor information; and a sensor information selection table 122b for selecting sensor information corresponding to the action state. It also stores information on trends in the results of the user's action state determination.
The communication unit 83 is configured to be able to communicate with an external device or external terminal by using short-range wireless communication such as BLE (Bluetooth (registered trademark) Low Energy) or a wireless LAN (Local Area Network) based on IEEE 802.11.
The determination unit 84 analyzes the state of the user based on the received sensor information, and determines the action state of the user based on the action state determination table 122a stored in the storage unit 82.
Hereinafter, the process of the selection control system according to the present invention will be described in detail with reference to the flowchart of fig. 9. The steps described below can be implemented by causing a computer to execute a corresponding program.
Fig. 9 is a flowchart showing the report process of the electronic device 1 of the selection control system according to the present invention. Referring to fig. 9, the electronic device 1 acquires, as sensor information, the user's physical condition data (biological information) obtained by the biological sensor 111 of the sensor unit 16, external environment data (environmental information) obtained by the environment sensor 112, and data such as GPS data (step S300). The biological information and the environmental information may be measured in any order, and are acquired in the order in which the respective sensors detect them.
Next, the electronic device 1 transmits the acquired sensor information to the server 2 via the communication unit 20 (step S301).
Next, the server 2 receives the sensor information via the communication unit 83 (step S400), and the determination unit 84 determines the action state of the user from the received sensor information based on the action state determination table 122a stored in the storage unit 82 (step S401). In determining the action state, data matching the state of the user is searched for, based on the acquired sensor information, from the motion states and external environments shown in the input data tables 121a and 121b. Then, the action state determination table 122a is indexed based on the motion state and the external environment detected from the input data tables 121a and 121b.
Next, the server 2 performs a process of selecting sensor information corresponding to the determined action state of the user (step S402). Then, the result of the sensor selection is transmitted to the electronic apparatus 1 via the communication unit 83 (step S403).
Next, the electronic device 1 receives the determination result via the communication unit 20 (step S303) and outputs sensor information corresponding to the determined action state of the user (step S304). For example, the sensor control unit 52 starts the sensors necessary for the report processing based on the sensor information selected by the report control unit 53. Further, it checks whether any unnecessary sensor — one involved neither in acquisition of the selected sensor information nor in determination of the action state — is activated, and stops any such sensor that is determined to be activated.
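The device–server exchange of fig. 9 can be sketched end to end as follows. The threshold, table contents, and field names are illustrative assumptions, and the network round trip between device and server is elided into a direct function call.

```python
# Hypothetical sketch of the Fig. 9 flow: the device sends raw sensor
# information (steps S300-S301), the server determines the action state and
# selects the sensor information (steps S400-S403), and the device reports
# the selected items (steps S303-S304).
ACTION_STATE_TABLE = {"low": "walking", "high": "running"}     # stands in for table 122a
SELECTION_TABLE = {"walking": ("number of steps", "calories burned"),
                   "running": ("speed", "calories burned")}    # stands in for table 122b

def server_process(sensor_information):
    """Server side (steps S400-S402): determine the action state and
    select the corresponding combination of sensor information."""
    motion = "high" if sensor_information["acceleration"] > 1.5 else "low"
    action_state = ACTION_STATE_TABLE[motion]
    return SELECTION_TABLE[action_state]

def device_report(sensor_information):
    """Device side (steps S300-S304): send the sensor information, receive
    the selection result, and report only the selected items."""
    selection = server_process(sensor_information)  # network round trip elided
    return {name: sensor_information.get(name) for name in selection}
```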
In the present embodiment, information is communicated between the electronic device 1 and the server 2 via a network, and the division of processing between the electronic device 1 and the server 2 may be changed in consideration of the communication status of the network. For example, although the server 2 performs the process of selecting the sensor information corresponding to the determined action state of the user (step S402), the determination result may instead be transmitted from the server 2 to the electronic device 1, and the electronic device 1 may perform the selection process based on the received determination result.
In the present embodiment, the display shape is illustrated as a circle, but the display shape is not limited to this, and the display shape of the output unit 18 may be a polygon or a curved shape, and the shape and size are not limited.
In the present embodiment, the configuration is such that a single display displays a plurality of sensor information items, but the present invention is not limited to this, and the configuration may be such that a plurality of displays display sensor information items in a distributed manner in a reporting system including a plurality of displays.
In the present embodiment, a plurality of sensor information items are displayed on one display, but the present invention is not limited to this, and only 1 type of sensor information item suitable for the action state of the user may be reported on one display.
The present invention has been described above with reference to the embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes or modifications can be made to the above embodiments. It is to be understood that the embodiments to which such changes or improvements are applied are also encompassed in the technical scope of the present invention, as described in the claims.

Claims (17)

1. An electronic device is characterized by comprising:
an acquisition unit that acquires information of a plurality of sensors;
a determination unit that determines an action state of the subject;
a selection unit that selects sensor information corresponding to a determination result among the plurality of types of sensors as sensor information to be reported to a user, based on the determination result of the determination unit; and
a changing means for changing a combination of the action state and the sensor information stored in association with each other,
the selection means selects the predetermined sensor information based on a combination of the action state and the sensor information changed by the change means.
2. The electronic device of claim 1,
the determination unit determines the action state of the subject person based on the sensor information acquired by the acquisition unit.
3. The electronic device of claim 1,
the plurality of sensors includes: a 1 st sensor that acquires information on a motion state of the subject person, and a 2 nd sensor that acquires information on an external environment of the subject person,
the determination means determines the action state of the subject person based on the information acquired by the 1 st sensor or the 2 nd sensor.
4. The electronic device of claim 1,
further provided with: a storage unit for storing a plurality of action states of the subject person,
the determination means determines the action state of the subject based on the plurality of action states stored in the storage means.
5. The electronic device of claim 4,
the electronic device further includes:
a storage control unit that associates the plurality of action states with sensor information of at least one of the plurality of sensors and stores the associated action states in the storage unit,
the selection means selects sensor information corresponding to the action state determined by the determination means, based on the storage means.
6. The electronic device of claim 1,
the selection unit comprises a display unit for displaying the sensor information,
the display unit is provided with a plurality of display areas for displaying the sensor information.
7. The electronic device of any of claims 1-6,
the selection unit selects the predetermined sensor information based on the frequency of the combination change by the change unit.
8. The electronic device of claim 1,
the selection unit preferentially reports sensor information acquired by a sensor corresponding to the determination among the plurality of types of sensors to a user.
9. The electronic device of claim 1,
the selection unit selects sensor information acquired by a sensor corresponding to the determination from among the plurality of types of sensors, and reports the sensor information acquired by the selected sensor to a user.
10. The electronic device of claim 1,
storing the plurality of action states in association with one sensor information among the plurality of sensors,
based on the action state of the subject person, sensor information corresponding to the determined action state is selected.
11. The electronic device of claim 1,
storing one action state among the plurality of action states in association with the plurality of sensor information,
based on the action state of the subject person, sensor information corresponding to the determined action state is selected.
12. The electronic device of claim 1,
the sensor information is information acquired by the sensor.
13. The electronic device of claim 1,
based on the determination result, which sensor among the plurality of sensors is used is specified.
14. The electronic device of claim 1,
the electronic device is provided with a communication unit,
the selection means transmits the sensor information from the communication unit to the outside via a network.
15. A selection control system for transmitting arbitrary information between a server and an electronic device via a network,
the server is provided with:
a determination unit that determines the action state of the subject person based on the received sensor information;
a selection unit that selects a sensor based on a determination result of the action state; and
a transmitting unit that transmits the selected selection result to the electronic device,
the electronic device is provided with:
a plurality of sensors;
an acquisition unit that acquires information of a plurality of sensors;
a transmitting unit that transmits the sensor information to the server;
a selection unit that selects, as sensor information to be reported to a user, sensor information corresponding to a determination result among the plurality of types of sensors, based on the determination result of the determination unit; and
a changing means for changing a combination of the action state and the sensor information stored in association with each other,
the selection means selects the predetermined sensor information based on a combination of the action state and the sensor information changed by the change means.
16. A method of selection, comprising the steps of:
an acquisition step of acquiring information of a plurality of types of sensors;
a determination step of determining an action state of the subject;
a selection step of selecting, based on a determination result of the determination step, sensor information corresponding to the determination result among the plurality of types of sensors as sensor information to be reported to a user; and
a changing step of changing a combination of the action state and the sensor information stored in association with each other,
in the selecting step, the predetermined sensor information is selected based on a combination of the action state and the sensor information changed in the changing step.
17. A storage medium that is a computer-readable recording medium storing a program for causing a computer to function as:
an acquisition process of acquiring a plurality of types of sensor information;
a determination process of determining an action state of the subject;
a selection process of selecting, as sensor information to be reported to a user, sensor information corresponding to a determination result among the plurality of types of sensors based on the determination result of the determination process; and
a change process of changing a combination of the action state and the sensor information stored in association with each other,
in the selection process, the predetermined sensor information is selected based on a combination of the action state changed by the change process and the sensor information.
CN201710505561.0A 2016-07-04 2017-06-27 Electronic device, selection control system, selection method, and recording medium Active CN107580292B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016-132684 2016-07-04
JP2016132684 2016-07-04
JP2017094317A JP6642515B2 (en) 2016-07-04 2017-05-11 Electronic device, selection control system, selection method, and program
JP2017-094317 2017-05-11

Publications (2)

Publication Number Publication Date
CN107580292A CN107580292A (en) 2018-01-12
CN107580292B true CN107580292B (en) 2020-08-25

Family

ID=60995593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710505561.0A Active CN107580292B (en) 2016-07-04 2017-06-27 Electronic device, selection control system, selection method, and recording medium

Country Status (2)

Country Link
JP (1) JP6642515B2 (en)
CN (1) CN107580292B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11868405B2 (en) * 2018-01-23 2024-01-09 Sony Corporation Information processor, information processing method, and recording medium
JP6987662B2 (en) * 2018-02-07 2022-01-05 京セラ株式会社 Electronics, control methods and programs
JP7178931B2 (en) * 2019-03-11 2022-11-28 本田技研工業株式会社 Acquisition device for sensor arrangement mode

Citations (4)

Publication number Priority date Publication date Assignee Title
CN101518442A (en) * 2008-07-05 2009-09-02 杭州义盛祥通信技术有限公司 Sport quantization watch and sport quantitative analysis method
CN104287327A (en) * 2014-10-09 2015-01-21 广东小天才科技有限公司 Method and device for reflecting motion state by virtue of sports bracelets
CN105204636A (en) * 2015-09-16 2015-12-30 宇龙计算机通信科技(深圳)有限公司 Information display method, information display device and intelligent watch
CN105204647A (en) * 2015-10-09 2015-12-30 联想(北京)有限公司 Information-processing method and electronic equipment

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
FI113403B (en) * 2000-10-06 2004-04-15 Polar Electro Oy wrist device
JP2005342963A (en) * 2004-06-01 2005-12-15 Canon Inc Printing processor and method of setting print mode
JP4759304B2 (en) * 2005-04-07 2011-08-31 オリンパス株式会社 Information display system
JP5265141B2 (en) * 2007-06-15 2013-08-14 オリンパス株式会社 Portable electronic device, program and information storage medium
JP5795584B2 (en) * 2009-08-31 2015-10-14 アボット ダイアベティス ケア インコーポレイテッドAbbott Diabetes Care Inc. Medical device
US10143405B2 (en) * 2012-11-14 2018-12-04 MAD Apparel, Inc. Wearable performance monitoring, analysis, and feedback systems and methods
JP6166079B2 (en) * 2013-03-27 2017-07-19 ラピスセミコンダクタ株式会社 Semiconductor device, electronic device, and determination method
US20150127298A1 (en) * 2013-11-04 2015-05-07 Invensense, Inc. Activity detection and analytics
CN104949707B (en) * 2014-03-24 2018-07-24 深圳市埃微信息技术有限公司 Motion monitoring device and method based on information push
JP6413574B2 (en) * 2014-10-01 2018-10-31 セイコーエプソン株式会社 Activity state information detection apparatus and control method for activity state information detection apparatus


Also Published As

Publication number Publication date
JP2018010620A (en) 2018-01-18
JP6642515B2 (en) 2020-02-05
CN107580292A (en) 2018-01-12

Similar Documents

Publication Publication Date Title
JP6340477B2 (en) Distance image acquisition device and distance image acquisition method
CN108604432B (en) Electronic device and method for controlling the same
CN107580292B (en) Electronic device, selection control system, selection method, and recording medium
CN103702029B (en) The method and device of focusing is pointed out during shooting
EP2846135B1 (en) Portable Electronic Device with Environmental Sensor
US20170124837A1 (en) Communication method, apparatus, system and computer-readable medium for wearable device
CN109348125A (en) Video correction method, apparatus, electronic equipment and computer readable storage medium
US10067594B2 (en) Method and device for controlling touch screen
CN104303130A (en) Electronic system with augmented reality mechanism and method of operation thereof
KR20180102331A (en) Electronic device including camera module and method for controlling thereof
JP5262928B2 (en) Imaging device, portable terminal device, and focusing mechanism control method
US20220246015A1 (en) Fall detection method and apparatus, and wearable device
JP2020118699A (en) Electronic apparatus, calibration control method, and program
JP2017174212A (en) Action analysis device, action analyzing method and program
KR20170014919A (en) Electronic apparatus and method for detecting skin condition in electronic apparatus
EP2635003B1 (en) Mobile terminal device, notification method, and program
US8754767B2 (en) Geographic localization system
US10574756B2 (en) Electronic device, selection control system, selection method, and recording medium
JP2009027607A (en) Photography device, mobile communication device, and alarm system
CN114209298A (en) PPG sensor control method and device and electronic equipment
JP2019075007A (en) Portable terminal device and control program
JP2018186005A (en) Light reception amount measurement system
CN114979362A (en) Falling detection method and electronic equipment
JP2017200090A (en) Information processing system, picture processing device, information processing program, and information processing method
US20180264322A1 (en) Exercise Support Device, Exercise Support Method, and Storage Medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant