WO2017214490A1 - Wearable emotional feedback apparatus for autism spectrum disorder - Google Patents

Wearable emotional feedback apparatus for autism spectrum disorder

Info

Publication number
WO2017214490A1
Authority
WO
WIPO (PCT)
Prior art date
Application number
PCT/US2017/036725
Other languages
French (fr)
Inventor
Helen KOO
Tingrui Pan
Daniel FONG
Susan RIVERA
Original Assignee
The Regents Of The University Of California
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Regents Of The University Of California
Publication of WO2017214490A1

Classifications

    • A61B 5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A61B 5/6806: Gloves (sensor mounted on worn items)
    • G06F 1/16: Constructional details or arrangements
    • G06F 3/014: Hand-worn input/output arrangements, e.g. data gloves
    • A61B 5/0002: Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/024: Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/02405: Determining heart rate variability
    • A61B 5/053: Measuring electrical impedance or conductance of a portion of the body
    • A61B 5/0533: Measuring galvanic skin response
    • A61B 5/16: Devices for psychotechnics; testing reaction times; devices for evaluating the psychological state
    • A61B 5/4088: Diagnosing or monitoring cognitive diseases, e.g. Alzheimer's, prion diseases or dementia

Definitions

  • present disclosure encompasses multiple embodiments which include, but are not limited to, the following list of embodiments:
  • A wearable emotional feedback apparatus comprising: (a) a wearable sensing mesh configured for wearing by a user; (b) a plurality of physiological sensors integrated into said wearable sensing mesh, wherein said sensors are configured for determining response to emotional stimuli; (c) a controller configured for receiving inputs from said physiological sensors and for detecting fluctuations in the emotional arousal state of the user wearing said wearable sensing mesh; and (d) a communication circuit configured for wirelessly communicating the detected signs to a mobile device configured for displaying the physiological information.
  • apparatus is configured for detecting emotional arousal states in the user wearing said wearable sensing mesh associated with autism spectrum disorder (ASD).
  • apparatus is configured for detecting states in the user wearing said wearable sensing mesh associated with Alzheimer's disease.
  • wearable sensing mesh comprises at least one wearable glove, or sock.
  • wearable sensing mesh comprises a wearable device retained in proximity to face, armpits, and/or crotch area of the user wearing said wearable sensing mesh.
  • heart rate (HR) and heart rate variability (HRV) are detected using a pulse oximeter for obtaining a photo-plethysmograph of the user wearing said wearable sensing mesh.
  • galvanic skin response (GSR) sensors comprise conductive elements spanning one or more areas of said wearable mesh so that changes in skin resistance can be measured in response to an amount of sweat produced by the user wearing said wearable sensing mesh.
  • controller is configured for measuring said changes in skin resistance from said galvanic skin response (GSR) sensors by applying differential voltages between said conductive elements to measure resistance.
  • Wheatstone bridge input to a differential amplifier whose output is either directly sent to an analog input of said controller, or directed to a circuit for extracting information and passing it to said controller.
  • plurality of physiological sensors includes an acceleration sensor.
  • plurality of physiological sensors includes an audio sensor or microphone input, wherein said apparatus transmits collected audio to the mobile device.
  • plurality of physiological sensors includes a global positioning sensor (GPS) configured for communicating location of the user wearing said wearable sensing mesh to said controller for communication to the mobile device.
  • apparatus further comprises one or more output annunciators for
  • mobile device comprises a mobile phone, touch pad, or laptop computer.
  • apparatus is configured for generating push notifications to alert a user of the mobile device which is receiving physiological information from said wearable sensing mesh that threshold conditions are exceeded for the user wearing said wearable sensing mesh.
  • An emotional feedback apparatus for use during treatment of autism spectrum disorder (ASD), comprising: (a) a wearable sensing mesh configured for wearing by a user; (b) a plurality of physiological sensors integrated into said wearable sensing mesh, wherein said sensors are configured for determining response to emotional stimuli; (c) a controller configured for receiving inputs from said physiological sensors and for detecting fluctuations in the emotional arousal state of the wearer which are associated with impulsivity which could lead to self-harm by the user wearing said wearable sensing mesh having autism spectrum disorder (ASD); (d) a heartrate (HR) sensor, and galvanic skin response (GSR) sensor within said physiological sensors for detecting heart rate, heart rate variability, and level of perspiration of the user wearing the wearable sensing mesh; (e) wherein said controller is configured to correlate heartrate, heart rate variability, and level of perspiration, to determine a level of emotional arousal state; (f) a communication circuit configured for wirelessly communicating the emotional arousal state to a
  • apparatus further comprises one or more output annunciators for communicating visual or audio information from the user of the mobile device to the user wearing the wearable sensing mesh.
  • plurality of physiological sensors includes a fall sensing detector configured for communicating a suspected fall of the user wearing the wearable sensing mesh to said controller for communication to the mobile device.
  • mobile device comprises a mobile phone, touch pad, or laptop computer.
  • (Fragmentary code excerpts from Table 1, the embedded microcontroller program, and Table 2, the iOS mobile application; the full listings appear at the end of the specification.)

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Psychiatry (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)

Abstract

A wearable emotional feedback apparatus, "AutiSense", for collecting and displaying emotional feedback information for use in working with those having Autism Spectrum Disorder (ASD). A wearable sensing mesh (e.g., glove or sock) is configured with physiological sensors, including skin response and heart rate, for detecting response to emotional stimuli and communicating the information to a mobile device for display and optional storage and additional processing.

Description

WEARABLE EMOTIONAL FEEDBACK APPARATUS
FOR AUTISM SPECTRUM DISORDER
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to, and the benefit of, U.S. provisional patent application serial number 62/348,726 filed on June 10, 2016, and U.S. provisional patent application serial number 62/363,157 filed on July 15, 2016, each incorporated herein by reference in its entirety.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
[0002] Not Applicable
INCORPORATION-BY-REFERENCE OF
COMPUTER PROGRAM APPENDIX
[0003] Not Applicable
NOTICE OF MATERIAL SUBJECT TO COPYRIGHT PROTECTION
[0004] A portion of the material in this patent document may be subject to copyright protection under the copyright laws of the United States and of other countries. The owner of the copyright rights has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office publicly available file or records, but otherwise reserves all copyright rights whatsoever. The copyright owner does not hereby waive any of its rights to have this patent document maintained in secrecy, including without limitation its rights pursuant to 37 C.F.R. § 1.14.
BACKGROUND
[0005] 1. Technical Field
[0006] The technology of this disclosure pertains generally to portable Autism Spectrum Disorder (ASD) devices, and more particularly to a sensing mesh configured for collecting and communicating ASD information to a mobile device.
[0007] 2. Background Discussion
[0008] Autism Spectrum Disorder (ASD) is a group of disorders commonly characterized by atypical behavioral responses to social situations. The disorder impairs patients' ability to communicate and interact, and patients often suffer from impulsivity, repetitive movements, and even self-harm. A child with ASD often develops a learning disability or speech delay. It is therefore often important to assess the arousal state of the patient during treatment or normal activities.
[0009] Accordingly, a need exists for determining the behavioral extent of
Autism Spectrum Disorder. The present disclosure fulfills that need while providing additional benefits.
BRIEF SUMMARY
[0010] This disclosure describes a technology embodied in a glove referred to herein as "AutiSense". The "AutiSense" glove is an emotional feedback wearable device for Autism Spectrum Disorder (ASD).
[0011] By integrating sensors into a glove made of stretchable fabric, the device can measure physiological signals in response to emotional stimuli and send the collected real-time information to a mobile device. It is designed to be an early interaction and intervention tool for people with ASD, especially children, to help better inform their primary care giver so that timely corrective action can be taken. This type of glove also has potential value for collecting emotional feedback information for people with Alzheimer's disease.
[0012] Further aspects of the technology described herein will be brought out in the following portions of the specification, wherein the detailed description is for the purpose of fully disclosing preferred embodiments of the technology without placing limitations thereon.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
[0013] The technology described herein will be more fully understood by reference to the following drawings which are for illustrative purposes only:
[0014] FIG. 1A and FIG. 1B are top and underside views of a glove
configured as an autism spectrum disorder wearable emotional feedback apparatus according to an embodiment of the present disclosure.
[0015] FIG. 2A and FIG. 2B are GSR and HR sensors utilized within an autism spectrum disorder wearable emotional feedback apparatus according to an embodiment of the present disclosure.
[0016] FIG. 3A through FIG. 3C is a schematic of a wireless communication circuit utilized for the autism spectrum disorder wearable emotional feedback apparatus according to an embodiment of the present disclosure.
[0017] FIG. 4 is a schematic of a microcontroller circuit utilized for the
autism spectrum disorder wearable emotional feedback apparatus according to an embodiment of the present disclosure.
[0018] FIG. 5 is a schematic of a connector and miscellaneous circuits
utilized for the autism spectrum disorder wearable emotional feedback apparatus according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0019] For Autism Spectrum Disorder (ASD) patients, early intervention is a key principle of treatment. The present disclosure can be similarly configured for working with Alzheimer's patients toward improving their safety and lifestyle during treatment. In order to provide improved information to the patient, or their primary care giver, a wearable device is disclosed which provides emotional feedback for the wearer and/or primary caretaker, as an aid to understanding how the wearer's emotional arousal state is fluctuating so that early corrective actions may be taken. Information of this nature can be collected by measuring physiological changes from the autonomic nervous system and transmitted to a mobile platform for display.
[0020] The physiological parameters measured include galvanic skin response (GSR), heart rate (HR), and heart rate variability (HRV). GSR, also known as skin conductance, detects activity in the sympathetic branch of the autonomic nervous system by measuring the change in skin resistance due to sweat produced by the eccrine sweat glands, which are densely populated in the volar regions of the hand. In this way, when the apparatus measures skin conductance at the fingertip, the measurement reflects the emotional and sympathetic responses of the subject. HR and HRV are complementary parameters measured in this case to make the emotional feedback more precise.
[0021] FIG. 1A and FIG. 1B illustrate an example embodiment 10 of a mesh garment 12 of a wearable emotional feedback apparatus for autism spectrum disorder. In a preferred embodiment, the garment comprises a glove device, although in other embodiments it may comprise a sock device. It should also be appreciated that the garment may comprise any garment, or be incorporated to attach to a garment, which is configured for retention proximal to the face, armpits, and/or crotch area of a user, where physiological sensing is preferred for assessing emotional states regarding ASD or Alzheimer's disease. The top 14 of this glove 10 is shown in FIG. 1A with the underside 16 shown in FIG. 1B, each of which by way of example has glove portions 18a, 18b, 18c, 18d, and 18e configured for receiving each finger of a wearer/user. Sensors are shown on underside 16 of glove 12, comprising contacts 20a, 20b for a GSR sensor to monitor skin resistance, and a pulse oximeter 22 for HR monitoring.
[0022] The example embodiment is fabricated from a stretchable material (e.g., stretchable fabric) that contains control circuitry 26, such as a microcontroller circuit, application specific integrated circuit (ASIC), or similar device able to read, store and communicate data from the sensors. A conductive path (e.g., textile wire) 24 is shown connecting the sensors to control circuitry 26.
[0023] In at least one embodiment, control circuitry 26 incorporates at least one acceleration sensor, such as for sensing accelerations in all three dimensions. It should be appreciated that the disclosed apparatus is configured to utilize this optional acceleration sensor for discerning hand movement of the wearer to improve assessment of autism spectrum disorder (ASD). In addition, the acceleration sensor can be generally utilized for sensing a fall of the person wearing the sensing apparatus. In at least one embodiment control circuitry 26 incorporates a global positioning sensor (GPS) for communicating current location of the wearer to the mobile device. Location information can be utilized for discerning if the wearer is moving outside of a desired range from the person holding the mobile device while collecting data. By way of example and not limitation, that desired range may comprise the communication range between the wearable feedback apparatus and the mobile device. In addition, it should be appreciated that embodiments of the apparatus may comprise other sensors, such as skin temperature sensors, ambient temperature sensors, pressure sensors, fall sensors (detecting a fall by the person wearing the apparatus), audio sensing means, or even one or more image capture devices (cameras).
[0024] In addition, the control circuit 26 of the wearable apparatus may include output annunciators, such as indicators (e.g., LEDs), displays (e.g., LCD), and/or audio amplification and transduction. The use of indicators allows information to be communicated to the wearer of the apparatus, such as indicating the assessed arousal state associated with autism spectrum disorder (ASD). The assessed value being annunciated may be generated by the processor within the apparatus, and/or it may be received as feedback from the mobile device; for example, an indicator may be displayed when a threshold sensing state is exceeded. In at least one embodiment the annunciators are utilized for displaying information and directives from the person holding the mobile device which is collecting the information on arousal states for autism spectrum disorder (ASD). This annunciation can aid in giving the wearer of the apparatus some autonomy while still being able to receive direction and emotional help when needed. For example, the use of an audio microphone and audio amplification and transduction on the sensory mesh apparatus allows the user of the mobile device to hear verbalizations from the wearer which may indicate the need for directions or other intervention, to which the mobile device user can speak directives or give emotional comfort to the wearer of the sensor apparatus.
[0025] GSR measurement was conducted using a GSR sensor coupled to two contacts 22a, 22b. By way of example and not limitation, each said GSR sensor can be implemented comprising two conductive pads 22a, 22b that will be in contact with the palmar side of the hand at the distal phalanges, as shown in FIG. 1B. A small voltage, under 1.5 V, is applied across these conductive pads, and the resulting resistance is measured, such as exemplified herein by utilizing a Wheatstone bridge circuit in conjunction with a differential amplifier; the buffered signal is then analyzed using an embedded board that contains signal analysis circuitry and a processing circuit with programming configured for performing the analysis.
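By way of illustration only, the following minimal C++ sketch shows how a raw ADC reading could be converted back to an estimated skin resistance. It assumes a simple divider model (skin contacts in series with a known reference resistor across the excitation voltage), a 10-bit ADC, a 1.5 V excitation, and a 1 MOhm reference value; these are assumptions made for the sketch, and the actual Wheatstone-bridge and differential-amplifier topology of FIG. 2B would change the conversion formula.

    // Illustrative only: estimate skin resistance from an ADC reading, assuming
    // a simple divider model (skin in series with a known reference resistor).
    // The 10-bit ADC, 1.5 V excitation and 1 MOhm reference are assumptions.
    #include <cstdio>

    constexpr double kVexc   = 1.5;     // excitation voltage across the divider (V)
    constexpr double kRref   = 1.0e6;   // known reference resistance (ohms)
    constexpr int    kAdcMax = 1023;    // full-scale count of a 10-bit ADC

    // Convert a raw ADC count (voltage across the skin contacts) to ohms.
    double skinResistanceOhms(int adcCount) {
        double vSkin = kVexc * static_cast<double>(adcCount) / kAdcMax;
        if (vSkin >= kVexc) return 1.0e12;  // guard against an open-circuit reading
        // Divider relation: Vskin = Vexc * Rskin / (Rskin + Rref), solved for Rskin.
        return kRref * vSkin / (kVexc - vSkin);
    }

    int main() {
        // A roughly mid-scale reading corresponds to Rskin close to Rref.
        std::printf("R_skin ~ %.0f ohms\n", skinResistanceOhms(512));
        return 0;
    }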
[0026] Initial experiments of GSR function were conducted utilizing a
somewhat painful stimulus. For example, a pinprick can elicit a
sympathetic response by the sweat glands, increasing secretion. In one scenario we tested the GSR sensor while the volunteer wore the glove on the left hand and was mildly punched on the right palm, with the action resulting in an obvious drop of the skin resistance at the fingertip. This was to test changes in sympathetic response akin to those experienced by an ASD patient in response to environmental and social interaction. The reason the skin resistance drops is that sweat contains water and electrolytes; when the subject experienced a painful stimulus, sweat secretion increased, which increased electrical conductivity and ultimately lowered the skin's electrical resistance. Even though the change in sweat secretion was small, it resulted in an obvious change in the skin resistance.
[0027] The pulse oximeter 22 is used to obtain a photo-plethysmograph to detect the heart rate and heart rate variability of the volunteer. This sensor detects pulse rate by directing a green light-emitting diode (LED) onto the finger and analyzing the reflected light intensity pattern using a
photodetector and other signal analysis circuitry. This intensity pattern is then analyzed by a processor (e.g., microcontroller, ASIC, microchip) executing signal analysis routines to extract the heart-rate and heart-rate variability measurements. The LED, photodetector, and other circuit elements are located on a circuit board (e.g., printed circuit board(s), or flexible circuit board(s)) contained in or upon, or connected to glove 12.
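As one concrete illustration of the kind of signal analysis described above, the following C++ sketch detects beats in a photo-plethysmograph by a simple upward threshold crossing, converts the inter-beat intervals (IBIs) to beats per minute, and computes RMSSD as one common heart-rate-variability metric. The 2 ms sample period follows the disclosure, but the fixed threshold value and the choice of RMSSD are illustrative assumptions rather than details taken from the patent.

    // Illustrative beat detection and HR/HRV calculation from PPG samples.
    // Assumptions: 2 ms sampling, beats marked by an upward crossing of a
    // fixed threshold, and RMSSD used as the HRV metric.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    constexpr double kSamplePeriodMs = 2.0;  // 2 ms sampling, per the disclosure
    constexpr int    kThreshold      = 550;  // illustrative ADC threshold for a beat

    // Return inter-beat intervals (ms) found by upward threshold crossings.
    std::vector<double> interBeatIntervals(const std::vector<int>& ppg) {
        std::vector<double> ibis;
        double lastBeatMs = -1.0;
        for (size_t i = 1; i < ppg.size(); ++i) {
            if (!(ppg[i - 1] < kThreshold && ppg[i] >= kThreshold)) continue;
            double tMs = i * kSamplePeriodMs;
            if (lastBeatMs >= 0.0) ibis.push_back(tMs - lastBeatMs);
            lastBeatMs = tMs;
        }
        return ibis;
    }

    // Heart rate in beats per minute from the mean inter-beat interval.
    double bpmFromIbis(const std::vector<double>& ibis) {
        if (ibis.empty()) return 0.0;
        double sum = 0.0;
        for (double ibi : ibis) sum += ibi;
        return 60000.0 / (sum / ibis.size());
    }

    // RMSSD: root mean square of successive IBI differences (a common HRV metric).
    double rmssd(const std::vector<double>& ibis) {
        if (ibis.size() < 2) return 0.0;
        double sumSq = 0.0;
        for (size_t i = 1; i < ibis.size(); ++i) {
            double d = ibis[i] - ibis[i - 1];
            sumSq += d * d;
        }
        return std::sqrt(sumSq / (ibis.size() - 1));
    }

    int main() {
        // Synthetic PPG: one spike every 400 samples (800 ms), i.e. 75 BPM.
        std::vector<int> ppg(4000, 500);
        for (size_t i = 400; i < ppg.size(); i += 400) ppg[i] = 600;
        std::vector<double> ibis = interBeatIntervals(ppg);
        std::printf("BPM ~ %.1f, RMSSD ~ %.1f ms\n", bpmFromIbis(ibis), rmssd(ibis));
        return 0;
    }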
[0028] Data is gathered from these two sensors through analog-to-digital converters (ADCs) within the control circuit; for example, these ADCs may be within, or coupled to, a processor (e.g., microcontroller, ASIC, and/or other processor-enabled circuitry). In a prototype glove circuit, a
BlendMicro® board was utilized with Arduino® for programming. The microcontroller in this example is Bluetooth 4.0 (Bluetooth Low Energy, BLE) compatible, which is beneficial in view of the connection to the mobile device being made via BLE.
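A minimal Arduino-style sketch of this acquisition step is shown below: it simply reads the two analog channels every 2 ms under software pacing and streams the raw values over the serial port for inspection. The analog pin assignments (A0 for the pulse-oximeter output, A1 for the GSR output) are assumptions for illustration, and the actual prototype code of Table 1 paces its sampling from a timer interrupt rather than this polling loop.

    // Simplified acquisition loop (Arduino-style C++): sample the pulse-oximeter
    // and GSR analog outputs every 2 ms. Pin assignments are illustrative only;
    // the prototype in Table 1 drives its sampling from a timer interrupt.
    #include <Arduino.h>

    const uint8_t PULSE_OX_PIN = A0;   // assumed wiring of PULSE_OX_ANALOG_OUT
    const uint8_t GSR_PIN      = A1;   // assumed wiring of GSR_ANALOG_OUT

    unsigned long lastSampleUs = 0;

    void setup() {
      Serial.begin(9600);              // stream raw samples for inspection
    }

    void loop() {
      unsigned long nowUs = micros();
      if (nowUs - lastSampleUs < 2000UL) return;   // enforce the 2 ms sample period
      lastSampleUs = nowUs;

      int pulseRaw = analogRead(PULSE_OX_PIN);     // 10-bit photo-plethysmograph sample
      int gsrRaw   = analogRead(GSR_PIN);          // 10-bit skin-response sample

      Serial.print(pulseRaw);
      Serial.print(',');
      Serial.println(gsrRaw);
    }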
[0029] In addition to the glove components, including sensors and
processing circuitry, the present disclosure includes a mobile application hosted on a mobile device for receiving, processing and displaying the collected autism (or Alzheimer's) data. In this example, the data is wirelessly communicated to an external mobile device, of which a prototype has been developed for Apple iOS products (e.g., iPhone, iPad) that connects to the mesh sensors through a wireless Bluetooth Low Energy capable microchip located on the embedded circuit board. This allows the physiological signals (i.e., heart rate, heart rate variability, galvanic skin response) to be transmitted to the mobile device for display and analysis.
[0030] In at least one embodiment, the exposed circuit portion of the glove is sewn in and covered by one or more material layers (e.g., a thinner layer of cloth) in order to enhance the user's experience; this covering was in place before testing was performed with the volunteers. In response to a slightly painful stimulus, the response signal information collected at the glove is communicated to and displayed on a remote device, preferably a mobile device (e.g., cellular smart phone), showing the rise in HR and the drop in skin resistance caused by additional sweat secretion.
[0031] FIG. 2A and FIG. 2B illustrate sensor example embodiments, in particular a pulse oximeter 30 in FIG. 2A and an electrodermal response sensor, or galvanic skin response (GSR) sensor, in FIG. 2B.
[0032] The sensors are configured to measure physiological signals due to activity of the sympathetic branch of the autonomic nervous system, in response to emotional stimuli. In a preferred embodiment, two
physiological signals are collected by the device: the galvanic skin response (GSR) and the pulse oximeter output (photo-plethysmograph).
[0033] The pulse oximeter is used to obtain a photo-plethysmograph to
detect the heart rate and heart rate variability. This is achieved due to the optical absorption and scattering effects of hemoglobin. In one
embodiment, a green light-emitting diode (LED) is directed onto the finger and the reflected light intensity pattern is detected with a photodetector and amplified through an operational amplifier. Numerous designs for this can be utilized without departing from the present disclosure. FIG. 2A depicts a circuit 30 having sensor 32 coupled through three lines: power (VDD) 34a, ground (GND) 34b, and a pulse output signal (PULSE_OX_ANALOG_OUT) 34c. It should be appreciated that various forms of oxygen detection sensors may be utilized without departing from the teachings of the present disclosure. The circuit of FIG. 2A, containing elements such as the LED and photodetector within the pulse oximeter, is configured for attachment to a finger of the disclosed glove structure, such as to the index finger of the glove. The output of the photodetector is a photo-plethysmograph and is acquired by the controller circuit, such as by analog-to-digital voltage conversion.
[0034] The galvanic skin response (GSR) sensor 50 of FIG. 2B measures the change in skin resistance due to sweat produced by the eccrine sweat glands, densely populated in the volar regions of the hands and feet. In one embodiment, this is accomplished by using two conductive pads, each approximately one inch in diameter, retained (e.g., sewn) inside the fingertips of the middle and ring fingers 18c, 18b (as seen in FIG. 1A and FIG. 1B) of the glove, which attach to a small circuit 26 by utilizing flexible conductive pathways (e.g., conductive thread) 24.
[0035] In one embodiment of the GSR, the sensor has a bridge circuit 52 comprising a 1 MOhm resistor 54a and two 10 kOhm resistors 54b, 54c, which are coupled into a Wheatstone-bridge configuration with the conductive pads, to allow the circuit to readily discern changes in skin resistance 56 between the two pads and produce output signal 58.
[0036] In at least one preferred embodiment, signal 58 is amplified and
filtered, shown in the figure as a unity-gain amplifier 60, depicted with negative feedback 63 coupled from output 62. Output from this unity-gain amplifier is received at an inverting amplifier 66 having input resistor 64 and feedback resistor 67, to produce an analog output signal 68 (GSR_ANALOG_OUT). Although no noise filtering (e.g., low-pass filtering) is shown, for the sake of simplicity of illustration, it can be incorporated as needed for a given application. Output signal 68 is preferably converted from an analog waveform to a digital form, such as by storing information about the waveform and/or points along the waveform. In at least one embodiment, the conversion to digital storage of the data is performed by an analog-to-digital converter (ADC) within a microcontroller, an ADC coupled to a processor of the controller circuit, or an ADC otherwise coupled to circuitry of the controller for reading and storing the GSR information.
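Where hardware filtering is omitted, a modest amount of smoothing can also be performed in software once the signal has been digitized. The following C++ sketch applies an exponential moving average to successive GSR resistance readings and reports the result as conductance in microsiemens; the smoothing factor and the choice to report conductance are illustrative assumptions, not details specified in the patent.

    // Illustrative software smoothing for the digitized GSR signal: an
    // exponential moving average of resistance readings, reported as
    // conductance in microsiemens. Parameter values are assumptions.
    #include <cstdio>

    class GsrSmoother {
    public:
        explicit GsrSmoother(double alpha) : alpha_(alpha) {}

        // Feed one resistance reading (ohms); returns smoothed conductance (uS).
        double update(double resistanceOhms) {
            double conductanceUs = 1.0e6 / resistanceOhms;   // ohms -> microsiemens
            if (!initialized_) {
                smoothed_ = conductanceUs;
                initialized_ = true;
            } else {
                smoothed_ = alpha_ * conductanceUs + (1.0 - alpha_) * smoothed_;
            }
            return smoothed_;
        }

    private:
        double alpha_;               // 0 < alpha <= 1; smaller means heavier smoothing
        double smoothed_ = 0.0;
        bool   initialized_ = false;
    };

    int main() {
        GsrSmoother smoother(0.1);                                      // illustrative factor
        const double readings[] = {900e3, 880e3, 860e3, 700e3, 690e3};  // ohms
        for (double r : readings) {
            std::printf("%.3f uS\n", smoother.update(r));
        }
        return 0;
    }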
[0037] FIG. 3A through FIG. 3C illustrate an example embodiment 70 of a wireless communication circuit, shown by way of example and not limitation as a Bluetooth Low Energy (BLE) integrated circuit (chip), for providing wireless communication capabilities between the glove apparatus and a nearby electronic device configured for displaying information from the glove apparatus, and preferably also performing additional data collection and processing functionality. It should be appreciated that other wireless circuitry and communication protocols (e.g., IEEE 802.11x, ZigBee, WiFi, QAMx, ASK, and other short range wireless technologies) may be utilized in the present disclosure instead of Bluetooth for communicating with a nearby electronic device, such as a mobile device, typically a smart phone having application programming configured for processing the signals from the disclosed glove apparatus.
[0038] In embodiment 70 a Bluetooth integrated circuit 72 (e.g., nRF8001 by Nordic Semiconductors®) is shown with analog supply filtering 74, capacitors on regulated supply outputs 76, crystal oscillator circuitry 78, analog supplies and oscillator 80, antenna circuitry 82, thermal pad 84, and interface signals 86 for connection to a control circuit, such as the
microcontroller of FIG. 4.
[0039] FIG. 4 illustrates an example microcontroller circuit embodiment 110, shown by way of example as an ATmega32U4 (Atmel Corporation®) microcontroller 112 with oscillator crystal circuit 114, and interface signals 116. In at least one embodiment, the control functionality is performed using this microcontroller in conjunction with a wireless communication circuit. The main purpose of the microcontroller is to acquire the signals 118 obtained by the sensor subsystem and perform signal analysis before sending the information through interface 118 to a wireless communication circuit to be output over the wireless communication path (e.g., BLE) to a mobile device which is hosting application programming for receiving, processing and displaying the information from the glove apparatus. It is seen in the figure that a first ADC (PF0(ADC0)) is coupled to the oxygen sensor signal line (PULSE_OX_ANALOG_OUT) as was seen being output from FIG. 2A, while a second ADC (PF1(ADC1)) is coupled to the GSR signal line (GSR_ANALOG_OUT) as was seen in FIG. 2B. Signals REQN, SCK, MOSI and MISO 116 are configured for coupling to control and communicate with the wireless communication circuit of FIG. 3.
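The patent does not specify the data format used between the microcontroller and the wireless circuit, but the following C++ sketch illustrates one plausible way a GSR reading and a heart-rate value could be packed into a small fixed-size payload before being handed off over the Serial Peripheral Interface; the 4-byte little-endian layout is purely an assumption for illustration.

    // Hypothetical payload packing for the microcontroller-to-BLE-chip link.
    // The 4-byte little-endian layout (uint16 GSR counts, then uint16 BPM) is
    // an assumption; the patent does not define an on-air data format.
    #include <cstdint>
    #include <cstdio>

    struct SensorReading {
        uint16_t gsrCounts;   // raw or smoothed GSR ADC counts
        uint16_t bpm;         // most recently computed heart rate
    };

    // Serialize into a caller-provided 4-byte buffer, little-endian.
    void packReading(const SensorReading& r, uint8_t out[4]) {
        out[0] = static_cast<uint8_t>(r.gsrCounts & 0xFF);
        out[1] = static_cast<uint8_t>(r.gsrCounts >> 8);
        out[2] = static_cast<uint8_t>(r.bpm & 0xFF);
        out[3] = static_cast<uint8_t>(r.bpm >> 8);
    }

    int main() {
        SensorReading r{512, 72};
        uint8_t buf[4];
        packReading(r, buf);
        std::printf("%02X %02X %02X %02X\n", buf[0], buf[1], buf[2], buf[3]);
        return 0;
    }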
[0040] FIG. 5 illustrates miscellaneous circuitry in relation to the use of the circuits shown in FIG. 3 and FIG. 4. In particular, a set of signals 152 is coupled to connector 154, a VDD voltage regulator circuit 156 and a buffer 158 for buffering the signals between the microcontroller and wireless communication circuit.
[0041] By way of example and not limitation, the wireless communication circuit of FIG. 3 and the microcontroller circuit of FIG. 4 in this example are configured for communicating through a Serial Peripheral Interface (Mode 0) with several external hardware interrupts being provided to manage flow control. In a preferred embodiment, these two integrated circuits (chips) are interconnected on a single PCB and connect to the sensors via conductive pathways (e.g., sewn conductive thread) integrated into the glove.
[0042] A thin and well-fitted glove holds the main components of the device.
The glove's fit should be such that each of the sensors is held with sufficient compression so as not to move around at the fingertip, yet is not so snug that it occludes circulation. In at least one embodiment, the bulk material of the glove is preferably made of non-conductive thread, including but not limited to cotton or elastane. Conductive material is formed or threaded (sewn) into the glove to provide electrical contacts between the various sensors and the microcontroller described previously.
[0043] The microcontroller sends the data received from the sensors to the BLE chip for wireless communication to a mobile device, such as an Apple iPhone, to be processed by a mobile application. This mobile application receives the data from the microcontroller and displays information regarding the specific emotional status of the person wearing the glove, as related to autism spectrum disorder or Alzheimer's disease. This can include, but is not limited to, graphs displaying skin conductance, heart rate, and heart rate variability, as well as push notifications regarding sudden increases or threshold-crossing events in skin conductance, heart rate, or heart rate variability.
[0044] When those with autism spectrum disorder are feeling states of high emotional arousal but are unable to communicate and interact, those feelings can often lead to impulsivity and self-harm. Such states of high emotional arousal are often accompanied by increased activity in the sympathetic nervous system, which results in a significant increase in skin conductance and heart rate from the wearer's baseline.
[0045] In at least one embodiment of the present disclosure, push
notifications are generated by the apparatus, either at the controller within the wearable sensor mesh, or from the mobile device, or by a combination of both. These push notifications alert the wearer or primary care giver to arousal situations detected by the system, for example those which may lead to impulsive or self-harming actions. In at least one embodiment, the push notification is generated when sensors in the sensing mesh apparatus exceed thresholds of skin conductance and heart rate, to help primary care givers (e.g., users of the mobile device) gain insight into any external stimuli or triggers that could be causing the increase in emotional arousal and to take timely corrective action. Since each person's emotional response to external stimuli can be unique, these thresholds for sending push notifications should be configured on a user-specific basis.
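To make the threshold behavior concrete, the following C++ sketch flags a notification only when both skin conductance and heart rate exceed user-specific thresholds defined relative to that user's baseline. The particular margin values, and the requirement that both signals be elevated, are illustrative assumptions; the patent only requires that the thresholds be configurable on a user-specific basis.

    // Illustrative per-user arousal-notification check: trigger when both skin
    // conductance and heart rate exceed thresholds set relative to the user's
    // own baseline. Margin values are assumptions, not taken from the patent.
    #include <cstdio>

    struct UserProfile {
        double baselineConductanceUs;   // resting skin conductance (microsiemens)
        double baselineBpm;             // resting heart rate
        double conductanceMarginUs;     // user-specific allowed rise before alerting
        double bpmMargin;               // user-specific allowed rise before alerting
    };

    bool shouldNotify(const UserProfile& u, double conductanceUs, double bpm) {
        bool gsrHigh = conductanceUs > u.baselineConductanceUs + u.conductanceMarginUs;
        bool hrHigh  = bpm > u.baselineBpm + u.bpmMargin;
        return gsrHigh && hrHigh;       // both elevated suggests high emotional arousal
    }

    int main() {
        UserProfile child{2.0, 85.0, 1.5, 20.0};   // example per-user configuration
        std::printf("notify: %s\n", shouldNotify(child, 4.2, 112.0) ? "yes" : "no");
        return 0;
    }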
[0046] Table 1 and Table 2, found at the end of the specification, contain examples of computer program instructions that can be used in
implementing the technology. Table 1 contains a coding example for a microcontroller on the sensor mesh apparatus, while Table 2 contains code written for an iOS mobile application.
[0047] Referring first to Table 1, computer program instructions are provided that can be executed by the microcontroller for operation of the sensor mesh device. The embedded program (uploaded to the ATmega32U4) is called AutiSenseMain_RBL_main.c. This program begins by waiting for a BLE connection to be established with a mobile application. After a connection is established, it polls for data from the GSR sensor and pulse oximeter using the ADC every two milliseconds. The pulse oximetry data is analyzed every two milliseconds to determine whether a heartbeat has occurred and to calculate a heart rate. When a heart rate can be calculated (by finding two consecutive heartbeats), the last GSR reading and heart rate information is sent via the Serial Peripheral Interface to the nRF8001 BLE chip, which then transmits the information to a mobile application via BLE using an onboard (PCB) antenna. The ATmega32U4 and nRF8001 are together referred to herein as the embedded microcontroller, and the instructions (code) running on the embedded microcontroller are referred to herein as the embedded application, to distinguish between the user application running on the mobile device and the code used for data acquisition from the sensors.
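By way of illustration only, the GSR reading and heart rate could be forwarded to the BLE chip as a small packet such as in the following sketch; note that the Table 1 listing as shown sends only the BPM byte via ble_write(), so the packet layout here is an assumption used for the example.

// Illustrative packing of one heart-rate/GSR sample for the BLE link.
void sendReading(int bpm, int gsrAdc) {
  if (ble_connected()) {
    ble_write((unsigned char)(bpm & 0xFF));      // heart rate, one byte
    ble_write((unsigned char)(gsrAdc >> 8));     // GSR reading, high byte
    ble_write((unsigned char)(gsrAdc & 0xFF));   // GSR reading, low byte
  }
}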
[0048] Referring to Table 2, computer program instructions are provided for a mobile application, which, by way of example and not limitation, was written for the Apple iPhone running iOS 7 and is called AutismApp. This code uses the Bluetooth module on the Apple iPhone to search for the embedded microcontroller described above. When found, it attempts to establish a connection. Upon success, it waits for information to be sent to it from the embedded application. Upon receiving information about the heart rate and GSR, the data is graphed on a line plot for visual interpretation by the user. Two external libraries were used: CorePlot, which is licensed under the BSD License, and a Bluetooth LE example from Scott Gruby, released under the MIT License.
[0049] Embodiments of the present technology may be described herein with reference to flowchart illustrations of methods and systems according to embodiments of the technology, and/or procedures, algorithms, steps, operations, formulae, or other computational depictions, which may also be implemented as computer program products. In this regard, each block or step of a flowchart, and combinations of blocks (and/or steps) in a flowchart, as well as any procedure, algorithm, step, operation, formula, or computational depiction can be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions embodied in computer-readable program code. As will be appreciated, any such computer program instructions may be executed by one or more computer processors, including without limitation a general purpose computer or special purpose computer, or other programmable processing apparatus to produce a machine, such that the computer program instructions which execute on the computer processor(s) or other programmable processing apparatus create means for
implementing the function(s) specified.
[0050] Accordingly, blocks of the flowcharts, and procedures, algorithms, steps, operations, formulae, or computational depictions described herein support combinations of means for performing the specified function(s), combinations of steps for performing the specified function(s), and computer program instructions, such as embodied in computer-readable program code logic means, for performing the specified function(s). It will also be understood that each block of the flowchart illustrations, as well as any procedures, algorithms, steps, operations, formulae, or computational depictions and combinations thereof described herein, can be implemented by special purpose hardware-based computer systems which perform the specified function(s) or step(s), or combinations of special purpose hardware and computer-readable program code.
[0051] Furthermore, these computer program instructions, such as
embodied in computer-readable program code, may also be stored in one or more computer-readable memory or memory devices that can direct a computer processor or other programmable processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory or memory devices produce an article of manufacture including instruction means which implement the function specified in the block(s) of the flowchart(s). The computer program instructions may also be executed by a computer processor or other programmable processing apparatus to cause a series of operational steps to be performed on the computer processor or other programmable processing apparatus to produce a computer-implemented process such that the instructions which execute on the computer processor or other programmable processing apparatus provide steps for implementing the functions specified in the block(s) of the flowchart(s), procedure(s), algorithm(s), step(s), operation(s), formula(e), or computational
depiction(s).
[0052] It will further be appreciated that the terms "programming" or
"program executable" as used herein refer to one or more instructions that can be executed by one or more computer processors to perform one or more functions as described herein. The instructions can be embodied in software, in firmware, or in a combination of software and firmware. The instructions can be stored local to the device in non-transitory media, or can be stored remotely such as on a server, or all or a portion of the instructions can be stored locally and remotely. Instructions stored remotely can be downloaded (pushed) to the device by user initiation, or automatically based on one or more factors.
[0053] It will further be appreciated that, as used herein, the terms
processor, hardware processor, computer processor, central processing unit (CPU), and computer are used synonymously to denote a device capable of executing the instructions and communicating with input/output interfaces and/or peripheral devices, and that the terms processor, hardware processor, computer processor, CPU, and computer are intended to encompass single or multiple devices, single core and multicore devices, and variations thereof.
[0054] From the description herein, it will be appreciated that the
present disclosure encompasses multiple embodiments which include, but are not limited to, the following list of embodiments:
[0055] 1. A wearable emotional feedback apparatus, comprising: (a) a
wearable sensing mesh configured for wearing by a user; (b) a plurality of physiological sensors integrated into said wearable sensing mesh, wherein said sensors are configured for determining response to emotional stimuli; (c) a controller configured for receiving inputs from said physiological sensors and for detecting fluctuations in the emotional arousal state of the user wearing said wearable sensing mesh; and (d) a communication circuit configured for wirelessly communicating the detected signs to a mobile device configured for displaying the physiological information.
[0056] 2. The apparatus of any preceding embodiment, wherein said
apparatus is configured for detecting emotional arousal states in the user wearing said wearable sensing mesh associated with autism spectrum disorder (ASD).
[0057] 3. The apparatus of any preceding embodiment, wherein said
apparatus is configured for detecting states in the user wearing said wearable sensing mesh associated with Alzheimer's disease.
[0058] 4. The apparatus of any preceding embodiment, wherein said
wearable sensing mesh comprises at least one wearable glove, or sock.
[0059] 5. The apparatus of any preceding embodiment, wherein said
wearable sensing mesh comprises a wearable device retained in proximity to face, armpits, and/or crotch area of the user wearing said wearable sensing mesh.
[0060] 6. The apparatus of any preceding embodiment, wherein said
plurality of physiological sensors are selected from a group of sensors consisting of: heartrate (HR), heart rate (HR) variability, and galvanic skin response (GSR).
[0061] 7. The apparatus of any preceding embodiment, wherein said
heartrate (HR) and heart rate (HR) variability are detected using a pulse oximeter for obtaining a photo plethysmograph to detect the heart rate and heart rate variability of the user wearing said wearable sensing mesh.
[0062] 8. The apparatus of any preceding embodiment, wherein said pulse oximeter detects heartrate pulse in response to emitting light into the skin of the user of said wearable sensing mesh with reflection amplitudes being recorded and analyzed to determine heart rate (HR) and heart rate (HR) variability.
[0063] 9. The apparatus of any preceding embodiment, wherein said
galvanic skin response (GSR) sensors comprise conductive elements spanning one or more areas of said wearable mesh so that changes in skin resistance can be measured in response to an amount of sweat produced by the user wearing said wearable sensing mesh.
[0064] 10. The apparatus of any preceding embodiment, wherein said
controller is configured for measuring said changes in skin resistance from said galvanic skin response (GSR) sensors by applying differential voltages between said conductive elements to measure resistance.
[0065] 11. The apparatus of any preceding embodiment, wherein said
resistance between said conductive elements is measured using a
Wheatstone bridge input to a differential amplifier whose output is either directly sent to an analog input of said controller, or directed to a circuit for extracting information and passing it to said controller.
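By way of illustration only, the bridge output could be converted back to an approximate skin resistance as in the following sketch (in the same C style as Table 1); the supply voltage, amplifier gain, reference resistance, and quarter-bridge topology are assumptions used for the example, not values taken from the disclosed circuit.

// Illustrative conversion of the amplified bridge voltage to skin resistance.
#define ADC_FULL_SCALE 1023.0f
#define VDD 3.3f            // assumed supply voltage
#define AMP_GAIN 10.0f      // assumed differential-amplifier gain
#define R_REF 1.0e6f        // assumed resistance of the three fixed bridge arms (ohms)

float skinResistanceFromAdc(int adcCounts) {
  float vOut = (adcCounts / ADC_FULL_SCALE) * VDD;   // amplifier output voltage
  float vBridge = vOut / AMP_GAIN;                   // differential bridge voltage
  // Quarter bridge: Vbridge = VDD * (Rskin/(Rskin + R_REF) - 0.5); solve for Rskin.
  float ratio = vBridge / VDD + 0.5f;
  return R_REF * ratio / (1.0f - ratio);
}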
[0066] 12. The apparatus of any preceding embodiment, wherein said
plurality of physiological sensors includes an acceleration sensor.
[0067] 13. The apparatus of any preceding embodiment, wherein said acceleration sensor is configured for sensing accelerations in three- dimensions.
[0068] 14. The apparatus of any preceding embodiment, wherein said
plurality of physiological sensors includes an audio sensor or microphone input, wherein said apparatus transmits collected audio to the mobile device.
[0069] 15. The apparatus of any preceding embodiment, wherein said
plurality of physiological sensors includes a global positioning sensor (GPS) configured for communicating location of the user wearing said wearable sensing mesh to said controller for communication to the mobile device.
[0070] 16. The apparatus of any preceding embodiment, wherein said
plurality of physiological sensors includes a fall sensing detector configured for communicating a suspected fall of the user wearing said wearable sensing mesh to said controller for communication to the mobile device.
[0071] 17. The apparatus of any preceding embodiment, wherein said
apparatus further comprises one or more output annunciators for
communicating visual or audio information from the user of the mobile device to the user wearing the wearable sensing mesh.
[0072] 18. The apparatus of any preceding embodiment, wherein said
mobile device comprises a mobile phone, touch pad, or laptop computer.
[0073] 19. The apparatus of any preceding embodiment, wherein said
apparatus is configured for generating push notifications to alert a user of the mobile device which is receiving physiological information from said wearable sensing mesh that threshold conditions are exceeded for the user wearing said wearable sensing mesh.
[0074] 20. An emotional feedback apparatus for use during treatment of autism spectrum disorder (ASD), comprising: (a) a wearable sensing mesh configured for wearing by a user; (b) a plurality of physiological sensors integrated into said wearable sensing mesh, wherein said sensors are configured for determining response to emotional stimuli; (c) a controller configured for receiving inputs from said physiological sensors and for detecting fluctuations in the emotional arousal state of the wearer which are associated with impulsivity which could lead to self-harm by the user wearing said wearable sensing mesh having autism spectrum disorder (ASD); (d) a heartrate (HR) sensor, and galvanic skin response (GSR) sensor within said physiological sensors for detecting heart rate, heart rate variability, and level of perspiration of the user wearing the wearable sensing mesh; (e) wherein said controller is configured to correlate heartrate, heart rate variability, and level of perspiration, to determine a level of emotional arousal state; (f) a communication circuit configured for wirelessly communicating the emotional arousal state to a mobile device configured for displaying the physiological information; and (g) wherein said apparatus is configured for generating push notifications to generate alerts, to a user of the mobile device, that threshold conditions for emotional arousal state have been exceeded.
[0075] 21. The apparatus of any preceding embodiment, wherein said
plurality of physiological sensors includes an audio sensor or microphone input, wherein said apparatus transmits collected audio to the mobile device.
[0076] 22. The apparatus of any preceding embodiment, wherein said
apparatus further comprises one or more output annunciators for communicating visual or audio information from the user of the mobile device to the user wearing the wearable sensing mesh.
[0077] 23. The apparatus of any preceding embodiment, wherein said
plurality of physiological sensors includes a fall sensing detector configured for communicating a suspected fall of the user wearing the wearable sensing mesh to said controller for communication to the mobile device.
[0078] 24. The apparatus of any preceding embodiment, wherein said
mobile device comprises a mobile phone, touch pad, or laptop computer.
[0079] Although the description herein contains many details, these should not be construed as limiting the scope of the disclosure but as merely providing illustrations of some of the presently preferred embodiments. Therefore, it will be appreciated that the scope of the disclosure fully encompasses other embodiments which may become obvious to those skilled in the art.
In the claims, reference to an element in the singular is not intended to mean "one and only one" unless explicitly so stated, but rather "one or more." All structural and functional equivalents to the elements of the disclosed embodiments that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the present claims. Furthermore, no element,
component, or method step in the present disclosure is intended to be dedicated to the public regardless of whether the element, component, or method step is explicitly recited in the claims. No claim element herein is to be construed as a "means plus function" element unless the element is expressly recited using the phrase "means for". No claim element herein is to be construed as a "step plus function" element unless the element is expressly recited using the phrase "step for".
Table 1
Program instructions "AutiSenseMain_RBL_main" for Sensor Mesh Device
/*
AutiSenseMain_RBL_main.c
Modified Example of chat program from RedBearLabs to fit emotion wearable device by Dan Fong
*/

// "SPI.h/Nordic_nRF8001.h/RBL_nRF8001.h" are needed in every new project
#include <SPI.h>
#include <Nordic_nRF8001.h>
#include <RBL_nRF8001.h>

// Variables
int pulsePin = 0;     // Pulse Sensor purple wire connected to analog pin 0
int gsrPin = 1;       // GSR sensor connected to ADC1
int blinkPin = 13;    // pin to blink led at each beat
int fadePin = 5;      // pin to do fancy classy fading blink at each beat
//int fadeRate = 0;   // used to fade LED on with PWM on fadePin
int fadeRate = 100;   // used to fade LED on with PWM on fadePin

// Volatile Variables, used in the interrupt service routine!
volatile int BPM;                   // int that holds raw Analog in 0. updated every 2mS
volatile int Signal;                // holds the incoming raw data
volatile int IBI = 600;             // int that holds the time interval between beats! Must be seeded!
volatile boolean Pulse = false;     // "True" when User's live heartbeat is detected. "False" when not a "live beat".
volatile boolean QS = false;        // becomes true when Arduino finds a beat.

// Regards Serial OutPut -- Set This Up to your needs
static boolean serialVisual = true; // Set to 'true' by Default. Re-set to 'false' to sendDataToSerial instead. :)

#define NOOUTPUT

void setup() {
  // Default pins set to 9 and 8 for REQN and RDYN
  // Set your REQN and RDYN here before ble_begin() if you need
  //ble_set_pins(3, 2);
  pinMode(blinkPin, OUTPUT);   // pin that will blink to your heartbeat!
  pinMode(fadePin, OUTPUT);    // pin that will fade to your heartbeat!
  Serial.begin(115200);        // we agree to talk fast!
  interruptSetup();            // sets up to read Pulse Sensor signal every 2mS
  // UN-COMMENT THE NEXT LINE IF YOU ARE POWERING The Pulse Sensor AT LOW VOLTAGE,
  // AND APPLY THAT VOLTAGE TO THE A-REF PIN
  //analogReference(EXTERNAL);

  // Set your BLE Shield name here, max. length 10
  //ble_set_name("My Name");

  // Init. and start BLE library.
  ble_begin();

  // Enable serial debug
  Serial.begin(57600);
}
unsigned char buf[16] = {0};
unsigned char len = 0;

void loop() {
  if ( ble_available() ) {
    while ( ble_available() ) {
      unsigned char abyte = ble_read();   // read the next byte received over BLE
      Serial.write(abyte);                // echo it to the serial debug port
      ble_write(abyte);                   // and echo it back over BLE
    }
    Serial.println();
  }

  if ( Serial.available() ) {
    delay(5);
    while ( Serial.available() )
      ble_write( Serial.read() );
  }

  ble_do_events();

#ifndef NOOUTPUT
  serialOutput();
#endif

  if (QS == true) {                  // A Heartbeat Was Found
    // BPM and IBI have been Determined
    // Quantified Self "QS" true when arduino finds a heartbeat
    digitalWrite(blinkPin, HIGH);    // Blink LED, we got a beat.
    fadeRate = 255;                  // Set 'fadeRate' Variable to 255 to fade LED with pulse
#ifndef NOOUTPUT
    serialOutputWhenBeatHappens();   // A Beat Happened, Output that to serial.
#endif
    if (ble_connected()) {
      ble_write(BPM);                // send the latest heart rate to the mobile application
    }
    QS = false;                      // reset the Quantified Self flag for next time
  }
  else {
    digitalWrite(blinkPin, LOW);     // There is no beat, turn off pin 13 LED
  }

  ledFadeToBeat();                   // Makes the LED Fade Effect Happen
  delay(20);                         // take a break
}
volatile int rate[10];                       // array to hold last ten IBI values
volatile unsigned long sampleCounter = 0;    // used to determine pulse timing
volatile unsigned long lastBeatTime = 0;     // used to find IBI
volatile int P = 512;                        // used to find peak in pulse wave, seeded
volatile int T = 512;                        // used to find trough in pulse wave, seeded
volatile int thresh = 525;                   // used to find instant moment of heart beat, seeded
volatile int amp = 100;                      // used to hold amplitude of pulse waveform, seeded
volatile boolean firstBeat = true;           // used to seed rate array so we startup with reasonable BPM
volatile boolean secondBeat = false;         // used to seed rate array so we startup with reasonable BPM
volatile int gsrAdc;                         // latest raw GSR reading

// ARDUINO Flora, Fio v3 (or any other board with ATmega32u4 running at 8MHz)
//
// >> Timer 1
//
// Use of Timer 1 interferes with PWM on pins 9 and 10.
void interruptSetup() {
  TCCR1A = 0x00;
  TCCR1B = 0x0C;    // prescaler = 256
  OCR1A = 0x3E;     // count to 62
  TIMSK1 = 0x02;    // enable the output-compare A interrupt
  sei();
}

// The only other thing you will need is the correct ISR vector in the next step.
ISR(TIMER1_COMPA_vect) {
  cli();                                    // disable interrupts while we do this
  Signal = analogRead(pulsePin);            // read the Pulse Sensor
  sampleCounter += 2;                       // keep track of the time in mS with this variable
  int N = sampleCounter - lastBeatTime;     // monitor the time since the last beat to avoid noise
  gsrAdc = analogRead(gsrPin);
  Serial.println(gsrAdc);

  // find the peak and trough of the pulse wave
  if (Signal < thresh && N > (IBI/5)*3) {   // avoid dichrotic noise by waiting 3/5 of last IBI
    if (Signal < T) {                       // T is the trough
      T = Signal;                           // keep track of lowest point in pulse wave
    }
  }

  if (Signal > thresh && Signal > P) {      // thresh condition helps avoid noise
    P = Signal;                             // P is the peak
  }                                         // keep track of highest point in pulse wave

  // NOW IT'S TIME TO LOOK FOR THE HEART BEAT
  // signal surges up in value every time there is a pulse
  if (N > 250) {                            // avoid high frequency noise
    if ( (Signal > thresh) && (Pulse == false) && (N > (IBI/5)*3) ) {
      Pulse = true;                         // set the Pulse flag when we think there is a pulse
      // digitalWrite(blinkPin,HIGH);       // turn on pin 13 LED
      IBI = sampleCounter - lastBeatTime;   // measure time between beats in mS
      lastBeatTime = sampleCounter;         // keep track of time for next pulse

      if (secondBeat) {                     // if this is the second beat, if secondBeat == TRUE
        secondBeat = false;                 // clear secondBeat flag
        for (int i = 0; i <= 9; i++) {      // seed the running total to get a realistic BPM at startup
          rate[i] = IBI;
        }
      }

      if (firstBeat) {                      // if it's the first time we found a beat, if firstBeat == TRUE
        firstBeat = false;                  // clear firstBeat flag
        secondBeat = true;                  // set the second beat flag
        sei();                              // enable interrupts again
        return;                             // IBI value is unreliable so discard it
      }

      // keep a running total of the last 10 IBI values
      word runningTotal = 0;                // clear the runningTotal variable
      for (int i = 0; i <= 8; i++) {        // shift data in the rate array
        rate[i] = rate[i+1];                // and drop the oldest IBI value
        runningTotal += rate[i];            // add up the 9 oldest IBI values
      }
      rate[9] = IBI;                        // add the latest IBI to the rate array
      runningTotal += rate[9];              // add the latest IBI to runningTotal
      runningTotal /= 10;                   // average the last 10 IBI values
      BPM = 60000/runningTotal;             // how many beats can fit into a minute? that's BPM!
      QS = true;                            // set Quantified Self flag
      // QS FLAG IS NOT CLEARED INSIDE THIS ISR
    }
  }

  if (Signal < thresh && Pulse == true) {   // when the values are going down, the beat is over
    // digitalWrite(blinkPin,LOW);          // turn off pin 13 LED
    Pulse = false;                          // reset the Pulse flag so we can do it again
    amp = P - T;                            // get amplitude of the pulse wave
    thresh = amp/2 + T;                     // set thresh at 50% of the amplitude
    P = thresh;                             // reset these for next time
    T = thresh;
  }

  if (N > 2500) {                           // if 2.5 seconds go by without a beat
    thresh = 512;                           // set thresh default
    P = 512;                                // set P default
    T = 512;                                // set T default
    lastBeatTime = sampleCounter;           // bring the lastBeatTime up to date
    firstBeat = true;                       // set these to avoid noise
    secondBeat = false;                     // when we get the heartbeat back
  }

  sei();                                    // enable interrupts when you're done!
} // end isr

void ledFadeToBeat() {
  fadeRate -= 15;                           // set LED fade value
  fadeRate = constrain(fadeRate, 0, 255);   // keep LED fade value from going into negative numbers!
  analogWrite(fadePin, fadeRate);           // fade LED
}
Table 2
Program instructions "AutismApp" for the Mobile Application
======================
// Code below this line was written for the iOS mobile application
//
// BluetoothLEManager.h
// SensorApp
//
// Created by Scott Gruby on 12/12/12.
// Copyright (c) 2012 Scott Gruby. All rights reserved.
//

#import <Foundation/Foundation.h>
#import <CoreBluetooth/CoreBluetooth.h>
@protocol BluetoothLEManagerDelegateProtocol <NSObject>
@required
- (void) didDiscoverPeripheral:(CBPeripheral *) peripheral
advertisementData:(NSDictionary *) advertisementData;
- (void) didConnectPeripheral:(CBPeripheral *) peripheral error: (NSError *) error;
- (void) didDisconnectPeripheral:(CBPeripheral *) peripheral error: (NSError *) error;
- (void) didChangeState:(CBCentralManagerState) newState;
@end
@interface BluetoothLEManager : NSObject <CBPeripheralDelegate>
+ (BluetoothLEManager *) sharedManagerWithDelegate:(id<BluetoothLEManagerDelegateProtocol>)delegate;
+ (BluetoothLEManager *) sharedManager;
- (void) discoverDevices;
- (void) connectPeripheral:(CBPeripheral *) peripheral;
- (void) disconnectPeripheral:(CBPeripheral*)peripheral;
- (void) stopScanning;
@end
//
// BluetoothLEManager.m
// SensorApp
//
// Created by Scott Gruby on 12/12/12.
// Copyright (c) 2012 Scott Gruby. All rights reserved.
//

#import "BluetoothLEManager.h"
@interface BluetoothLEManager () <CBCentralManagerDelegate>
@property (nonatomic, weak) id<BluetoothLEManagerDelegateProtocol> delegate;
@property (nonatomic, strong) CBCentralManager *centralManager;
@property (nonatomic, assign) BOOL pendingInit;
@property (nonatomic, strong) NSMutableArray *foundPeripherals;
@property (nonatomic, copy) NSString *deviceName;
@end

@implementation BluetoothLEManager
+ (BluetoothLEManager *) sharedManager {
return [self sharedManagerWithDelegate:nil];
}
+ (BluetoothLEManager *) sharedManagerWithDelegate:(id<BluetoothLEManagerDelegateProtocol>)delegate {
static dispatch_once_t once;
static id sharedManager;
dispatch_once(&once, ^{
sharedManager = [[self alloc] initWithDelegate:delegate];
});
return sharedManager;
}
- (id) initWithDelegate:(id<BluetoothLEManagerDelegateProtocol>) delegate {
self = [super init];
if (self) {
self.pendingInit = YES;
self.delegate = delegate;
self.centralManager = [[CBCentralManager alloc] initWithDelegate:self queue:dispatch_get_main_queue()];
self.foundPeripherals = [[NSMutableArray alloc] init];
}
return self;
}
#pragma mark - Restoring
/* Settings */

/* Reload from file. */
- (void) loadSavedDevices {
NSArray *storedDevices = [[NSUserDefaults standardUserDefaults] arrayForKey:@"StoredDevices"];
if (![storedDevices isKindOfClass:[NSArray class]]) {
return;
}
NSMutableArray *uuidArray = [[NSMutableArray alloc] init];
for (id deviceUUIDString in storedDevices) {
if (![deviceUUIDString isKindOfClass:[NSString class]]) {
continue;
}
CFUUIDRef uuid = CFUUIDCreateFromString(NULL, (CFStringRef)deviceUUIDString);
if (!uuid) {
continue;
}
if (![uuidArray containsObject:(__bridge id)uuid]) {
[uuidArray addObject:(__bridge id)uuid];
}
CFRelease(uuid);
}
if ([uuidArray count]) {
[self.centralManager retrievePeripherals:uuidArray];
}
}
// If we connect a device with the service we want, add it to our device list
// so that we can automatically restore it later.
- (void) addSavedDevice:(CFUUIDRef) uuid {
NSArray *storedDevices = [[NSUserDefaults standardUserDefaults] arrayForKey:@"StoredDevices"];
NSMutableArray *newDevices = nil;
CFStringRef uuidString = NULL;
if (![storedDevices isKindOfClass:[NSArray class]] && storedDevices != nil) {
return;
}
newDevices = [NSMutableArray arrayWithArray:storedDevices];
uuidString = CFUUIDCreateString(NULL, uuid);
if (uuidString) {
if (![newDevices containsObject:(__bridge NSString *)uuidString]) {
[newDevices addObject:(__bridge NSString *)uuidString];
}
CFRelease(uuidString);
}
/* Store */
[[NSUserDefaults standardUserDefaults] setObject:newDevices forKey:@"StoredDevices"];
[[NSUserDefaults standardUserDefaults] synchronize];
}
// If we explicitly disconnect a device, remove it from our device list
- (void) removeSavedDevice:(CFUUIDRef) uuid {
NSArray *storedDevices = [[NSUserDefaults standardUserDefaults] arrayForKey:@"StoredDevices"];
NSMutableArray *newDevices = nil;
CFStringRef uuidString = NULL;
if ([storedDevices isKindOfClass:[NSArray class]]) {
newDevices = [NSMutableArray arrayWithArray:storedDevices];
uuidString = CFUUIDCreateString(NULL, uuid);
if (uuidString) {
[newDevices removeObject:(__bridge NSString *)uuidString];
CFRelease(uuidString);
}
/* Store */
[[NSUserDefaults standardUserDefaults] setObject:newDevices forKey:@"StoredDevices"];
[[NSUserDefaults standardUserDefaults] synchronize];
}
}
// Callback from retrieveConnectedPeripherals
- (void) centralManager:(CBCentralManager *)central didRetrieveConnectedPeripherals:(NSArray *)peripherals {
/* Add to list. */
for (CBPeripheral *peripheral in peripherals) {
if ([peripheral isConnected]) {
// Basically retain the peripheral
if (![self.foundPeripherals containsObject:peripheral]) {
[self.foundPeripherals addObject:peripheral];
}
[self connectPeripheral:peripheral];
}
}
// After we get all the connected devices, get the devices that
// we stored before
[self loadSavedDevices];
// Nuke the list to clear out any devices that are no longer around.
// This will be rebuilt when devices are connected
[[NSUserDefaults standardUserDefaults] removeObjectForKey:@"StoredDevices"];
[[NSUserDefaults standardUserDefaults] synchronize];
}
// Callback from retrievePeripherals
- (void) centralManager:(CBCentralManager *)central didRetrievePeripherals:(NSArray *)peripherals {
for (CBPeripheral *peripheral in peripherals) {
if (![self.foundPeripherals containsObject:peripheral]) {
[self.foundPeripherals addObject:peripheral];
}
if (![peripheral isConnected]) {
[self connectPeripheral:peripheral];
}
}
}
#pragma mark - Discovery
/* Discovery */

// This assumes that the name is advertised
- (void) discoverDevices {
if (self.delegate == nil) {
return;
}
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO] forKey:CBCentralManagerScanOptionAllowDuplicatesKey];
[self.centralManager scanForPeripheralsWithServices:nil options:options];
}
- (void) stopScanning {
[self.centralManager stopScan];
}
- (void)centralManager:(CBCentralManager *)central
didDiscoverPeripheral:(CBPeripheral *)peripheral advertisementData:(NSDictionary *)advertisementData RSSI:(NSNumber *)RSSI {
if (![self.foundPeripherals containsObject:peripheral]) {
[self.foundPeripherals addObject:peripheral];
}
[self.delegate didDiscoverPeripheral:peripheral advertisementData:advertisementData];
}
#pragma mark - Connection/Disconnection
/* Connection/Disconnection */
- (void) connectPeripheral:(CBPeripheral *) peripheral {
if (! [peripheral isConnected]) {
peripheral.delegate = self;
[self.centralManager connectPeripheral:peripheral options:nil];
}
else {
[self.delegate didConnectPeripheral:peripheral error:nil];
}
}
- (void) disconnectPeripheral:(CBPeripheral*)peripheral {
[self removeSavedDevice:peripheral.UUID]; // Only remove if we explicitly disconnected
[self.centralManager cancelPeripheralConnection:peripheral];
}
- (void) centralManager:(CBCentralManager *)central didConnectPeripheral:(CBPeripheral *)peripheral {
if (![self.foundPeripherals containsObject:peripheral]) {
[self.foundPeripherals addObject:peripheral];
}
[self addSavedDevice:peripheral.UUID];
[self.delegate didConnectPeripheral:peripheral error:nil];
}
- (void) centralManager:(CBCentralManager *)central didFailToConnectPeripheral:(CBPeripheral *)peripheral error:(NSError *)error {
[self.delegate didConnectPeripheral:peripheral error:error];
}
- (void) centralManager:(CBCentralManager *)central didDisconnectPeripheral:(CBPeripheral *)peripheral error:(NSError *)error {
[self.delegate didDisconnectPeripheral:peripheral error:error];
}
- (void) clearDevices {
[self.foundPeripherals removeAllObjects];
}
- (void) centralManagerDidUpdateState:(CBCentralManager *)central {
static CBCentralManagerState previousState = -1;
switch ([self.centralManager state]) {
case CBCentralManagerStatePoweredOff: {
[self clearDevices];
break;
}
case CBCentralManagerStateUnauthorized: {
/* Tell user the app is not allowed. */
break;
}
case CBCentralManagerStateUnknown: {
/* Bad news, let's wait for another event. */
break;
}
case CBCentralManagerStateUnsupported: {
break;
}
case CBCentralManagerStatePoweredOn: {
self.pendingInit = NO;
[self.centralManager retrieveConnectedPeripherals];
break;
}
case CBCentralManagerStateResetting: {
[self clearDevices];
self.pendingInit = YES;
break;
}
}
previousState = [self.centralManager state];
[self.delegate didChangeState:previousState];
}
@end

//
// AppDelegate.h
// MiNI Auti Sense
//
// Created by Daniel Fong
// Copyright (c) 2015 Daniel Fong. All rights reserved.
//

#import <UIKit/UIKit.h>

@class ViewController;

@interface AppDelegate : UIResponder <UIApplicationDelegate>

@property (strong, nonatomic) UIWindow *window;
@property (strong, nonatomic) ViewController *viewController;

@end

//
// AppDelegate.m
// MiNI Auti Sense
//
// Created by Daniel Fong
// Copyright (c) 2015 Daniel Fong. All rights reserved.
//

#import "AppDelegate.h"
#import "ViewController.h"

@implementation AppDelegate

- (BOOL)application:(UIApplication *)application
didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
//self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
// Override point for customization after application launch.
//self.viewController = [[ViewController alloc] initWithNibName:@"ViewController" bundle:nil];
//UINavigationController *navController = [[UINavigationController alloc] initWithRootViewController:self.viewController];
//self.window.rootViewController = navController;
//[self.window makeKeyAndVisible];
return YES;
}
- (void)applicationWillResignActive:(UIApplication *)application
{
// Sent when the application is about to move from active to inactive state. This can occur for certain types of temporary interruptions (such as an incoming phone call or SMS message) or when the user quits the application and it begins the transition to the background state.
// Use this method to pause ongoing tasks, disable timers, and throttle down
OpenGL ES frame rates. Games should use this method to pause the game.
}
- (void)applicationDidEnterBackground:(UIApplication *)application
{
// Use this method to release shared resources, save user data, invalidate timers, and store enough application state information to restore your application to its current state in case it is terminated later.
// If your application supports background execution, this method is called instead of applicationWillTerminate: when the user quits.
}
- (void)applicationWillEnterForeground:(UIApplication *)application
{
// Called as part of the transition from the background to the inactive state; here you can undo many of the changes made on entering the background.
}
- (void)applicationDidBecomeActive:(UIApplication *)application
{
// Restart any tasks that were paused (or not yet started) while the application was inactive. If the application was previously in the background, optionally refresh the user interface.
}
- (void)applicationWillTerminate:(UIApplication *)application
{
// Called when the application is about to terminate. Save data if appropriate. See also applicationDidEnterBackground: .
}
@end
//
// PlotGallery.h
// CorePlotGallery
//
#import "Plotltem.h"
©interface PlotGallery : NSObject
@property (nonatomic, readonly) NSUInteger count;
@property (nonatomic, readonly) NSUInteger numberOfSections;
@property (nonatomic, readonly, strong) NSArray *sectionTitles;
+(PlotGallery *)sharedPlotGallery;
-(void)addPlotItem:(PlotItem *)plotItem;
-(void)sortByTitle;
-(PlotItem *)objectInSection:(NSUInteger)section atIndex:(NSUInteger)index;
-(NSUInteger)numberOfRowsInSection:(NSUInteger)section;
@end
//
// PlotGallery.m
// CorePlotGallery
//
#import "PlotGallery.h" @interface PlotGallery()
@property (nonatomic, readwrite, strong) NSMutableArray *plotItems;
@property (nonatomic, readwrite, strong) NSCountedSet *plotSections;
@end

@implementation PlotGallery

@synthesize plotItems;
@synthesize plotSections;

static PlotGallery *sharedPlotGallery = nil;
+(PlotGallery *)sharedPlotGallery
{
@synchronized(self)
{
if ( sharedPlotGallery == nil ) {
sharedPlotGallery = [[self alloc] init];
}
}
return sharedPlotGallery;
}
+(id)allocWithZone:(NSZone *)zone
{
@synchronized(self)
{
if ( sharedPlotGallery == nil ) {
return [super allocWithZone:zone];
}
}
return sharedPlotGallery;
}
-(id)init
{
Class thisClass = [self class];
@synchronized(thisClass)
{
if ( sharedPlotGallery == nil ) {
if ( (self = [super init]) ) {
sharedPlotGallery = self;
plotItems = [[NSMutableArray alloc] init];
plotSections = [[NSCountedSet alloc] init];
}
}
}
return sharedPlotGallery;
}
-(id)copyWithZone:(NSZone *)zone
{
return self;
}
-(void)addPlotItem:(PlotItem *)plotItem
{
[self.plotItems addObject:plotItem];
NSString *sectionName = plotItem.section;
if ( sectionName ) {
[self.plotSections addObject:sectionName];
}
}
-(NSUInteger)count
{
return self.plotItems.count;
}
-(NSUInteger)numberOfSections
{
return self.plotSections.count;
}
-(NSUInteger)numberOfRowsInSection:(NSUInteger)section
{
return [self.plotSections countForObject:self.sectionTitles[section]];
}
-(PlotItem *)objectInSection:(NSUInteger)section atIndex:(NSUInteger)index
{
NSUInteger offset = 0;
for ( NSUInteger i = 0; i < section; i++ ) {
offset += [self numberOfRowsInSection:i];
}
return self.plotItems[offset + index];
}
-(void)sortByTitle
{
[self.plotItems sortUsingSelector:@selector(titleCompare:)];
}
-(NSArray *)sectionTitles
{
return [[self.plotSections allObjects]
sortedArrayUsingSelector:@selector(caseInsensitiveCompare:)];
}
@end

//
// PlotItem.h
// CorePlotGallery
//

#import <Foundation/Foundation.h>

#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
#import "CorePlot-CocoaTouch.h"
#import <UIKit/UIKit.h>
typedef CGRect CGNSRect;
typedef UIView PlotGalleryNativeView;
#else
#import <CorePlot/CorePlot.h>
typedef NSRect CGNSRect;
typedef NSView PlotGalleryNativeView;
#endif

extern NSString *const kDemoPlots;
extern NSString *const kPieCharts;
extern NSString *const kLinePlots;
extern NSString *const kBarPlots;
extern NSString *const kFinancialPlots;

@class CPTGraph;
@class CPTTheme;

@interface PlotItem : NSObject

@property (nonatomic, readwrite, strong) CPTGraphHostingView *defaultLayerHostingView;
@property (nonatomic, readwrite, strong) NSMutableArray *graphs;
@property (nonatomic, readwrite, strong) NSString *section;
@property (nonatomic, readwrite, strong) NSString *title;
@property (nonatomic, readonly) CGFloat titleSize;

+(void)registerPlotItem:(id)item;

-(void)renderInView:(PlotGalleryNativeView *)hostingView withTheme:(CPTTheme *)theme animated:(BOOL)animated;
-(CPTNativeImage *)image;

#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
#else
-(void)setFrameSize:(NSSize)size;
#endif

-(void)renderInGraphHostingView:(CPTGraphHostingView *)hostingView withTheme:(CPTTheme *)theme animated:(BOOL)animated;
-(void)formatAllGraphs;
-(void)reloadData;
-(void)applyTheme:(CPTTheme *)theme toGraph:(CPTGraph *)graph withDefault:(CPTTheme *)defaultTheme;
-(void)addGraph:(CPTGraph *)graph;
-(void)addGraph:(CPTGraph *)graph toHostingView:(CPTGraphHostingView *)hostingView;
-(void)killGraph;
-(void)generateData;
-(NSComparisonResult)titleCompare:(PlotItem *)other;

@end
//
// PlotItem.m
// CorePlotGallery
//
#import "PlotGallery.h"
#import <tgmath.h>
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
#else
// For IKImageBrowser
#import <Quartz/Quartz.h>
#endif

NSString *const kDemoPlots = @"Demos";
NSString *const kPieCharts = @"Pie Charts";
NSString *const kLinePlots = @"Line Plots";
NSString *const kBarPlots = @"Bar Plots";
NSString *const kFinancialPlots = @"Financial Plots";
@interface PlotItem ()

@property (nonatomic, readwrite, strong) CPTNativeImage *cachedImage;
@end
#pragma mark -
@implementation PlotItem
@synthesize defaultLayerHostingView;
@synthesize graphs;
@synthesize section;
@synthesize title;
@synthesize cachedImage;
@dynamic titleSize;

+(void)registerPlotItem:(id)item
{
NSLog(@"registerPlotItem for class %@", [item class]); Class itemClass = [item class]; if ( itemClass ) {
// There's no autorelease pool here yet...
Plotltem *plotItem = [[itemClass alloc] init];
if ( plotltem ) {
[[PlotGallery sharedPlotGallery] addPlotItem:plotItem];
}
}
}
-(id)init
{
if ( (self = [super init]) ) {
defaultLayerHostingView = nil;
graphs = [[NSMutableArray alloc] init];
section = nil;
title = nil;
}
return self;
}
-(void)addGraph:(CPTGraph *)graph toHostingView:(CPTGraphHostingView *)hostingView
{
[self.graphs addObject:graph];
if ( hostingView ) {
hostingView.hostedGraph = graph;
}
}
-(void)addGraph:(CPTGraph *)graph
{
[self addGraph: graph toHostingView:nil];
}
-(void)killGraph
{
[[CPTAnimation sharedInstance] removeAllAnimationOperations];
// Remove the CPTLayerHostingView
CPTGraphHostingView *hostingView = self.defaultLayerHostingView;
if ( hostingView ) {
[hostingView removeFromSuperview];
hostingView.hostedGraph = nil;
self.defaultLayerHostingView = nil;
}
self.cachedImage = nil;
[self.graphs removeAllObjects];
}
-(void)dealloc
{
[self killGraph];
}
// override to generate data for the plot if needed
-(void)generateData
{
}
-(NSComparisonResult)titleCompare:(PlotItem *)other
{
NSComparisonResult comparisonResult = [self.section caseInsensitiveCompare:other.section];
if ( comparisonResult == NSOrderedSame ) {
comparisonResult = [self.title caseInsensitiveCompare:other.title];
}
return comparisonResult;
}
-(CGFloat)titleSize
{
CGFloat size;
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
switch ( UI_USER_INTERFACE_IDIOM() ) {
case UIUserInterfaceIdiomPad:
size = 24.0;
break;
case UIUserInterfaceIdiomPhone:
size = 16.0;
break;
default:
size = 12.0;
break;
}
#else
size = 24.0;
#endif
return size;
}
-(void)setPaddingDefaultsForGraph:(CPTGraph *)graph
{
CGFloat boundsPadding = self.titleSize;
graph.paddingLeft = boundsPadding;
if ( graph.titleDisplacement.y > 0.0 ) {
graph.paddingTop = graph.titleTextStyle.fontSize * CPTFloat(2.0);
}
else {
graph.paddingTop = boundsPadding;
}
graph.paddingRight = boundsPadding;
graph.paddingBottom = boundsPadding;
}
-(void)formatAllGraphs
{
CGFloat graphTitleSize = self.titleSize;
for ( CPTGraph *graph in self.graphs ) {
// Title
CPTMutableTextStyle *textStyle = [CPTMutableTextStyle textStyle];
textStyle.color = [CPTColor grayColor];
textStyle.fontName = @"Helvetica-Bold";
textStyle.fontSize = graphTitleSize;
graph.title = (self.graphs.count == 1 ? self.title : nil);
graph.titleTextStyle = textStyle;
graph.titleDisplacement = CPTPointMake( 0.0, textStyle.fontSize * CPTFloat(1.5) );
graph.titlePlotAreaFrameAnchor = CPTRectAnchorTop;
// Padding
CGFloat boundsPadding = graphTitleSize;
graph.paddingLeft = boundsPadding;
if ( graph.title.length > 0 ) {
graph.paddingTop = MAX(graph.titleTextStyle.fontSize * CPTFloat(2.0), boundsPadding);
}
else {
graph.paddingTop = boundsPadding;
}
graph.paddingRight = boundsPadding;
graph.paddingBottom = boundsPadding;
// Axis labels
CGFloat axisTitleSize = graphTitleSize * CPTFloat(0.75);
CGFloat labelSize = graphTitleSize * CPTFloat(0.5);
for ( CPTAxis *axis in graph.axisSet.axes ) {
// Axis title
textStyle = [axis.titleTextStyle mutableCopy];
textStyle.fontSize = axisTitleSize;
axis.titleTextStyle = textStyle;
// Axis labels
textStyle = [axis.labelTextStyle mutableCopy];
textStyle.fontSize = labelSize;
axis.labelTextStyle = textStyle;
textStyle = [axis.minorTickLabelTextStyle mutableCopy];
textStyle.fontSize = labelSize;
axis.minorTickLabelTextStyle = textStyle;
}
// Plot labels
for ( CPTPlot *plot in graph.allPlots ) {
textStyle = [plot.labelTextStyle mutableCopy];
textStyle.fontSize = labelSize;
plot.labelTextStyle = textStyle;
}
// Legend
CPTLegend *theLegend = graph.legend;
textStyle = [theLegend.textStyle mutableCopy];
textStyle.fontSize = labelSize;
theLegend.textStyle = textStyle;
theLegend.swatchSize = CGSizeMake( labelSize * CPTFloat(1.5), labelSize * CPTFloat(1.5) );
theLegend.rowMargin = labelSize * CPTFloat(0.75);
theLegend.columnMargin = labelSize * CPTFloat(0.75);
theLegend.paddingLeft = labelSize * CPTFloat(0.375);
theLegend.paddingTop = labelSize * CPTFloat(0.375);
theLegend.paddingRight = labelSize * CPTFloat(0.375);
theLegend.paddingBottom = labelSize * CPTFloat(0.375);
}
}
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
-(UIImage *)image
{
if ( self.cachedImage == nil ) {
CGRect imageFrame = CGRectMake(0, 0, 400, 300);
UIView *imageView = [[UIView alloc] initWithFrame:imageFrame];
[imageView setOpaque:YES];
[imageView setUserInteractionEnabled:NO];
[self renderInView:imageView withTheme:nil animated:NO];
[imageView layoutIfNeeded];
CGSize boundsSize = imageView.bounds.size;
UIGraphicsBeginImageContextWithOptions(boundsSize, YES, 0.0);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetAllowsAntialiasing(context, true);
for ( UIView *subView in imageView.subviews ) {
if ( [subView isKindOfClass:[CPTGraphHostingView class]] ) {
CPTGraphHostingView *hostingView = (CPTGraphHostingView *)subView;
CGRect frame = hostingView.frame;
CGContextSaveGState(context);
CGContextTranslateCTM(context, frame.origin.x, frame.origin.y + frame.size.height);
CGContextScaleCTM(context, 1.0, -1.0);
[hostingView.hostedGraph layoutAndRenderInContext:context];
CGContextRestoreGState(context);
}
}
CGContextSetAllowsAntialiasing(context, false);
self.cachedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
return self.cachedImage;
}
#else // OSX
-(NSImage *)image
{
if ( self.cachedImage == nil ) {
CGRect imageFrame = CGRectMake(0, 0, 400, 300);
NSView *imageView = [[NSView alloc] initWithFrame:NSRectFromCGRect(imageFrame)];
[imageView setWantsLayer:YES];
[self renderInView:imageView withTheme:nil animated:NO];
CGSize boundsSize = imageFrame.size;
NSBitmapImageRep *layerImage = [[NSBitmapImageRep alloc]
initWithBitmapDataPlanes:NULL
pixelsWide:(NSInteger)boundsSize.width
pixelsHigh:(NSInteger)boundsSize.height
bitsPerSample:8
samplesPerPixel:4
hasAlpha:YES
isPlanar:NO
colorSpaceName:NSCalibratedRGBColorSpace
bytesPerRow:(NSInteger)boundsSize.width * 4
bitsPerPixel:32];
NSGraphicsContext *bitmapContext = [NSGraphicsContext graphicsContextWithBitmapImageRep:layerImage];
CGContextRef context = (CGContextRef)[bitmapContext graphicsPort];
CGContextClearRect( context, CGRectMake(0.0, 0.0, boundsSize.width, boundsSize.height) );
CGContextSetAllowsAntialiasing(context, true);
CGContextSetShouldSmoothFonts(context, false);
[imageView.layer renderInContext:context];
CGContextFlush(context);
self.cachedImage = [[NSImage alloc] initWithSize:NSSizeFromCGSize(boundsSize)];
[self.cachedImage addRepresentation:layerImage];
}
return self.cachedImage;
}
#endif
-(void)applyTheme:(CPTTheme *)theme toGraph:(CPTGraph *)graph
withDefault:(CPTTheme *)defaultTheme
{
if ( theme == nil ) {
[graph applyTheme:defaultTheme];
}
else if ( ! [theme isKindOfClass:[NSNull class]] ) {
[graph applyTheme:theme];
}
}
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
#else
-(void)setFrameSize:(NSSize)size
{
}
#endif
-(void)renderInView:(PlotGalleryNativeView *)inView withTheme:(CPTTheme *)theme animated:(BOOL)animated
{
[self killGraph];
CPTGraphHostingView *hostingView = [[CPTGraphHostingView alloc]
initWithFrame:inView.bounds];
[inView addSubview:hostingView];
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
hostingView.translatesAutoresizingMaskIntoConstraints = NO;
[inView addConstraint:[NSLayoutConstraint constraintWithItem:hostingView
attribute:NSLayoutAttributeLeft
relatedBy:NSLayoutRelationEqual
toItem:inView
attribute:NSLayoutAttributeLeading
multiplier:1.0
constant:0.0]];
[inView addConstraint:[NSLayoutConstraint constraintWithItem:hostingView
attribute:NSLayoutAttributeTop
relatedBy:NSLayoutRelationEqual
toItem:inView
attribute:NSLayoutAttributeTop
multiplier:1.0
constant:0.0]];
[inView addConstraint:[NSLayoutConstraint constraintWithItem:hostingView
attribute:NSLayoutAttributeRight
relatedBy:NSLayoutRelationEqual
toItem:inView
attribute:NSLayoutAttributeTrailing
multiplier:1.0
constant:0.0]];
[inView addConstraint:[NSLayoutConstraint constraintWithItem:hostingView
attribute:NSLayoutAttributeBottom
relatedBy:NSLayoutRelationEqual
toItem:inView
attribute:NSLayoutAttributeBottom
multiplier:1.0
constant:0.0]];
#else
[hostingView setAutoresizingMask:NSViewWidthSizable | NSViewHeightSizable];
[hostingView setAutoresizesSubviews:YES];
#endif
[self generateData];
[self renderInGraphHostingView:hostingView withTheme:theme animated:animated];
[self formatAllGraphs];
self.defaultLayerHostingView = hostingView;
}
-(void)renderInGraphHostingView:(CPTGraphHostingView *)hostingView
withTheme:(CPTTheme *)theme animated:(BOOL)animated
{
NSLog(@"PlotItem:renderInLayer: Override me");
}
-(void)reloadData
{
for ( CPTGraph *graph in self.graphs ) {
[graph reloadData];
}
}
#pragma mark -
#pragma mark IKImageBrowserItem methods
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
#else
-(NSString *)imageUID
{
return self.title;
}
-(NSString *)imageRepresentationType
{
return IKImageBrowserNSImageRepresentationType;
}
-(id)imageRepresentation
{
return [self image];
}
-(NSString *)imageTitle
{
return self.title;
}
#endif

@end

//
// RealTimePlot.h
// CorePlotGallery
// #import "Plotltem.h"
©interface RealTimePlot : PlotItem<CPTPlotDataSource>
-(void)newData:(NSTimer *)theTimer;
-(void)addNewData:(NSNumber *)ydata;
@end
//
// RealTimePlot.m
// CorePlotGallery
//
#import "RealTimePlot.h" static const double kFrameRate = 5; // frames per second
static const double kAlpha = 0.25; // smoothing constant static const NSUInteger kMaxDataPoints = 52;
static NSString *const kPlotldentifier = @"Data Source Plot"; static const NSUInteger kYValueMin = 0;
static const NSUInteger kYValueMax = 100; //NSString *const SERVICEJJUID = @"0xl8CE";
//NSString *const SERVICE CHARACTERISTIC UUID = @"0x2A37"; @interface RealTimePlotQ
@property (nonatomic, readwrite, strong) NSMutableArray *plotData;
@property (nonatomic, readwrite, assign) NSUInteger currentIndex;
@property (nonatomic, readwrite, strong) NSTimer *dataTimer;
@end
@implementation RealTimePlot
@synthesize plotData;
@synthesize currentIndex;
@synthesize dataTimer;

+(void)load {
[super registerPlotItem:self];
}
-(id)init
{
if ( (self = [super init]) ) {
plotData = [[NSMutableArray alloc] initWithCapacity:kMaxDataPoints];
dataTimer = nil;
self.title = @"Pressure (mmHg)";
self.section = kLinePlots;
}
return self;
}
-(void)killGraph {
[self.dataTimer invalidate];
self.dataTimer = nil;
[super killGraph];
}
-(void)generateData {
[self.plotData removeAllObjects];
self.currentIndex = 0;
}
-(void)renderInGraphHostingView:(CPTGraphHostingView *)hostingView withTheme:(CPTTheme *)theme animated:(BOOL)animated {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
CGRect bounds = hostingView.bounds;
#else
CGRect bounds = NSRectToCGRect(hostingView.bounds);
#endif
CPTGraph *graph = [[CPTXYGraph alloc] initWithFrame:bounds];
[self addGraph: graph toHostingView:hostingView];
[self applyTheme:theme toGraph:graph withDefault:[CPTTheme themeNamed:kCPTDarkGradientTheme]];
graph.plotAreaFrame.paddingTop = self.titleSize * CPTFloat(0.5);
graph.plotAreaFrame.paddingRight = self.titleSize * CPTFloat(0.5);
graph.plotAreaFrame.paddingBottom = self.titleSize * CPTFloat(2.625);
graph.plotAreaFrame.paddingLeft = self.titleSize * CPTFloat(2.5);
graph.plotAreaFrame.masksToBorder = NO;
// Grid line styles
CPTMutableLineStyle *majorGridLineStyle = [CPTMutableLineStyle lineStyle];
majorGridLineStyle.lineWidth = 0.75;
majorGridLineStyle.lineColor = [[CPTColor colorWithGenericGray:CPTFloat(0.2)] colorWithAlphaComponent:CPTFloat(0.75)];
CPTMutableLineStyle *minorGridLineStyle = [CPTMutableLineStyle lineStyle];
minorGridLineStyle.lineWidth = 0.25;
minorGridLineStyle.lineColor = [[CPTColor whiteColor] colorWithAlphaComponent:CPTFloat(0.1)];
// Axes
// X axis
CPTXYAxisSet *axisSet = (CPTXYAxisSet *)graph.axisSet;
CPTXYAxis *x = axisSet.xAxis;
x.labelingPolicy = CPTAxisLabelingPolicyAutomatic;
x.orthogonalCoordinateDecimal = CPTDecimalFromUnsignedInteger(0);
x.majorGridLineStyle = majorGridLineStyle;
x.minorGridLineStyle = minorGridLineStyle;
x.minorTicksPerInterval = 9;
x.labelOffset = self.titleSize * CPTFloat(0.25);
x.title = @"X Axis";
x.titleOffset = self.titleSize * CPTFloat(1.5);
NSNumberFormatter *labelFormatter = [[NSNumberFormatter alloc] init];
labelFormatter.numberStyle = NSNumberFormatterNoStyle;
x.labelFormatter = labelFormatter; // <— make this a date formatter to show time sample was taken.
// Y axis
CPTXYAxis *y = axisSet.yAxis;
y.labelingPolicy = CPTAxisLabelingPolicyAutomatic;
y.orthogonalCoordinateDecimal = CPTDecimalFromUnsignedInteger(0);
y.majorGridLineStyle = majorGridLineStyle;
y.minorGridLineStyle = minorGridLineStyle;
y.minorTicksPerInterval = 3;
y.labelOffset = self.titleSize * CPTFloat(0.25);
y.title = @"Y Axis";
y.titleOffset = self.titleSize * CPTFloat(100.25);
y.axisConstraints = [CPTConstraints constraintWithLowerOffset:0.0];
// Rotate the labels by 45 degrees, just to show it can be done.
x.labelRotation = CPTFloat(M_PI_4);
// Create the plot
CPTScatterPlot *dataSourceLinePlot = [[CPTScatterPlot alloc] init];
dataSourceLinePlot.identifier = kPlotIdentifier;
dataSourceLinePlot.cachePrecision = CPTPlotCachePrecisionDouble;
CPTMutableLineStyle *lineStyle = [dataSourceLinePlot.dataLineStyle mutableCopy];
lineStyle.lineWidth = 3.0;
lineStyle.lineColor = [CPTColor blueColor];
dataSourceLinePlot.dataLineStyle = lineStyle;
dataSourceLinePlot.dataSource = self;
[graph addPlot:dataSourceLinePlot];
// Plot space
CPTXYPlotSpace *plotSpace = (CPTXYPlotSpace *)graph.defaultPlotSpace;
plotSpace.xRange = [CPTPlotRange
plotRangeWithLocation:CPTDecimalFromUnsignedInteger(0)
length:CPTDecimalFromUnsignedInteger(kMaxDataPoints - 2)];
plotSpace.yRange = [CPTPlotRange
plotRangeWithLocation:CPTDecimalFromUnsignedInteger(kYValueMin)
length:CPTDecimalFromUnsignedInteger(kYValueMax)];
[self.dataTimer invalidate];
if ( animated ) {
self.dataTimer = [NSTimer timerWithTimeInterval:1.0 / kFrameRate
target: self
selector:@selector(newData:)
userInfo:nil
repeats: YES];
[[NSRunLoop mainRunLoop] addTimer:self.dataTimer
forMode:NSRunLoopCommonModes];
}
else {
self.dataTimer = nil;
}
}
-(void)dealloc {
[dataTimer invalidate];
}
#pragma mark -
#pragma mark Timer callback

-(void)newData:(NSTimer *)theTimer {
//[self.plotData addObject:@( (1.0 - kAlpha) * [[self.plotData lastObject] doubleValue] + kAlpha * arc4random() / (double)UINT32_MAX )];
[self addNewDataInternal:[NSNumber numberWithFloat: 100.55]];
}
-(void)addNewData:(NSNumber *)ydata {
[self addNewDataInternal:ydata];
}
-(void) addNewDataInternal:(NSNumber *)ydata {
CPTGraph *theGraph = (self.graphs)[0];
CPTPlot *thePlot = [theGraph plotWithIdentifier:kPlotIdentifier];
if ( thePlot ) {
if ( self.plotData.count >= kMaxDataPoints ) {
[self.plotData removeObjectAtIndex:0];
[thePlot deleteDataInIndexRange:NSMakeRange(0, 1)];
}
CPTXYPlotSpace *plotSpace = (CPTXYPlotSpace *)theGraph.defaultPlotSpace;
NSUInteger location = (self.currentIndex >= kMaxDataPoints ? self.currentIndex - kMaxDataPoints + 2 : 0);
CPTPlotRange *oldRange = [CPTPlotRange plotRangeWithLocation:CPTDecimalFromUnsignedInteger( (location > 0) ? (location - 1) : 0 ) length:CPTDecimalFromUnsignedInteger(kMaxDataPoints - 2)];
CPTPlotRange *newRange = [CPTPlotRange plotRangeWithLocation:CPTDecimalFromUnsignedInteger(location) length:CPTDecimalFromUnsignedInteger(kMaxDataPoints - 2)];
[CPTAnimation animate:plotSpace
property:@"xRange"
fromPlotRange:oldRange
toPlotRange:newRange
duration:CPTFloat(1.0 / kFrameRate)];
self.currentIndex++;
// [self.plotData addObject:@( (1.0 - kAlpha) * [[self.plotData lastObject] doubleValue] + kAlpha * arc4random() / (double)UINT32_MAX )];
// [self.plotData addObject:@( 0.1 )];
[self.plotData addObject:ydata];
[thePlot insertDataAtIndex:self.plotData.count - 1 numberOfRecords:1];
}
}
#pragma mark -
#pragma mark Plot Data Source Methods

-(NSUInteger)numberOfRecordsForPlot:(CPTPlot *)plot {
return self.plotData.count;
}
-(id)numberForPlot:(CPTPlot *)plot field:(NSUInteger)fieldEnum
recordIndex:(NSUInteger)index {
NSNumber *num = nil;
switch ( fieldEnum ) {
case CPTScatterPlotFieldX:
num = @(index + self.currentIndex - self.plotData.count);
break;
case CPTScatterPlotFieldY:
num = self.plotData[index];
break;
default:
break;
}
return num;
}
@end
==============================
//
// ViewController.h
// MiNI Auti Sense
//
// Created by Daniel Fong
// Copyright (c) 2015 Daniel Fong. All rights reserved.
//

#import <UIKit/UIKit.h>
#import "CorePlot-CocoaTouch.h"
#import "Plotltem.h" @interface ViewController : UlViewController
@property (nonatomic, strong) Plotltem *detailltem;
@property (nonatomic, readwrite, strong) NSTimer * batteryUpdateTimer;
@end
===============================
//
// ViewController.m
// MiNI Auti Sense
//
// Created by Daniel Fong
// Copyright (c) 2015 Daniel Fong. All rights reserved.
//
#import "ViewController.h"
#import "BluetoothLEManager.h"
#import "BluetoothLEService.h"
#import "Plotltem.h"
#import "RealTimePlot.h"
@interface ViewController () <BluetoothLEManagerDelegateProtocol,
BluetoothLEServiceProtocol> {
uint16_t bpm;
uint16_t gsr;
CBUUID * pressureServiceUUID;
CBUUID * pressureServiceCharacteristicUUID;
CBUUID * batteryServiceUUID;
CBUUID * batteryServiceCharacteristicUUID;
}
-(CPTTheme *)currentTheme;
-(void)setupView;
@property (nonatomic, copy) NSString *currentThemeName;
@property (nonatomic, assign) CBPeripheral *peripheral; // We only connect with 1 device at a time
@property (nonatomic, strong) BluetoothLEService * service;
@property (strong, nonatomic) IBOutlet UILabel *legendLabel;
@property (strong, nonatomic) IBOutlet UILabel *beatsPerMinute;
@property (strong, nonatomic) IBOutlet UITextView *textView;
@property (strong, nonatomic) IBOutlet UILabel *voltsUnitLabel;
@property (strong, nonatomic) IBOutlet UILabel *voltsValueLabel;
@property (weak, nonatomic) IBOutlet UlView *graphView;
@property (strong, nonatomic) IBOutlet UILabel *appVersionLabel;
//@property (strong, nonatomic) IBOutlet UILabel *batteryLevelLabel;
@end
@implementation ViewController
@synthesize detailltem;
@synthesize graph View;
@synthesize currentThemeName;
//NSString *const SERVICE_UUID = @"0xl8CE";
NSString *const APP_VERSION = @"v20150718";
NSString *const SERVICE_UUID_ADVERTISED = @"0x713D0000-503E-4C75-BA94- 3148F18D941E";// <-- NOTE: it is 0xl80D over here because firmware uploaded on VenoSense boards are improperly advertising the heart-rate monitor service (0x180D), but the service it actually has is 0xl8CE (custom for Minilab emotion... Should make another number in IAR).
NSString *const SERVICE_UUID = @"0x713D0000-503E-4C75-BA94- 3148F18D941E"; // <— 0xl8CE is being used for the emotion app... need to make different.
NSString *const SERVICE_CHARACTERISTIC_UUID = @"0x713D0002-503E-4C75- BA94-3148F18D941E"; // READ
NSString * const BATT SERVICE UUID = @"0xl80F";
NSString *const B ATT SER VICE CH ARAC TERI S TIC UUID = @"0x2A19";
BOOL const ANIMATED
-(void)setupView {
    [self.detailItem renderInView:self.graphView withTheme:[self currentTheme] animated:ANIMATED];
}

-(CPTTheme *)currentTheme {
    CPTTheme *theme;

    theme = [CPTTheme themeNamed:@"Plain White"];

    return theme;
}

-(void)setDetailItem:(PlotItem *)newDetailItem {
    if ( detailItem != newDetailItem ) {
        [detailItem killGraph];

        detailItem = newDetailItem;

        if ( self.graphView ) {
            [detailItem renderInView:self.graphView withTheme:[self currentTheme] animated:ANIMATED];
        }
    }
}

- (void)viewDidLoad {
    [super viewDidLoad];

    bpm = 0;

    pressureServiceUUID = [CBUUID UUIDWithString:SERVICE_UUID];
    pressureServiceCharacteristicUUID = [CBUUID UUIDWithString:SERVICE_CHARACTERISTIC_UUID];
    batteryServiceUUID = [CBUUID UUIDWithString:BATT_SERVICE_UUID];
    batteryServiceCharacteristicUUID = [CBUUID UUIDWithString:BATT_SERVICE_CHARACTERISTIC_UUID];

    self.title = NSLocalizedString(@"Pressure Sensor", nil);

    [self.textView setEditable:NO];
    [self.appVersionLabel setText:APP_VERSION];

    [BluetoothLEManager sharedManagerWithDelegate:self];
    [self setupConnectButton];

    [self setDetailItem:[[RealTimePlot alloc] init]];
}
- (void)drawLineGraphWithContext:(CGContextRef)ctx {
    NSLog(@"HERE");
}

- (void)viewWillTransitionToSize:(CGSize)size withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator
{
    NSLog(@"Rotated: %@", coordinator);
}

- (void) updateBatteryLevel {
    NSLog(@"Update battery level");
    if (self.service && self.peripheral && [self isPeripheralConnected]) {
        [self.service readValueForServiceUUID:BATT_SERVICE_UUID
                        andCharacteristicUUID:BATT_SERVICE_CHARACTERISTIC_UUID];
    }

    if (YES) {
        NSString *a = @"a";
        NSData *aData = [a dataUsingEncoding:NSUTF8StringEncoding];
        [self.service setValueWithoutResponse:aData
                               forServiceUUID:@"0x713D0000-503E-4C75-BA94-3148F18D941E"
                        andCharacteristicUUID:@"0x713D0003-503E-4C75-BA94-3148F18D941E"];
    }
}
- (void) didDiscoverPeripheral:(CBPeripheral *) peripheral advertisementData:(NSDictionary *) advertisementData {
    // Determine if this is the peripheral we want. If it is,
    // we MUST stop scanning before connecting
    CBUUID *heartRateUUID = [CBUUID UUIDWithString:SERVICE_UUID_ADVERTISED];
    //CBUUID *heartRateUUID = [CBUUID UUIDWithString:@"Heart Rate"];

    NSArray *advertisementUUIDs = [advertisementData objectForKey:CBAdvertisementDataServiceUUIDsKey];
    NSLog(@"found advertisement UUIDs %@", advertisementUUIDs);
    //[self.textView setText:[NSString stringWithFormat:@"%@", advertisementUUIDs]];

    for (CBUUID *uuid in advertisementUUIDs) {
        if ([uuid isEqual:pressureServiceUUID] || [uuid isEqual:heartRateUUID]) {
            [self.textView setText:[NSString stringWithFormat:@"Found pressure sensor."]];
            NSLog(@"found heartrate monitor");
            [[BluetoothLEManager sharedManager] stopScanning];
            if (self.peripheral == nil) {
                self.peripheral = peripheral;
                [self setupConnectButton];
            }
            break;
        }
    }
}

- (void) didConnectPeripheral:(CBPeripheral *) peripheral error:(NSError *)error {
    DebugLog(@"didConnectPeripheral: %@ - %@", peripheral, error);
    [UIApplication sharedApplication].idleTimerDisabled = YES;

    self.peripheral = peripheral;
    [self setupConnectButton];

    self.service = [[BluetoothLEService alloc] initWithPeripheral:self.peripheral
                                                 withServiceUUIDs:@[SERVICE_UUID, BATT_SERVICE_UUID]
                                                         delegate:self];
    [self.service discoverServices];

    if (self.batteryUpdateTimer) {
        [self.batteryUpdateTimer invalidate];
        self.batteryUpdateTimer = nil;
    }
    self.batteryUpdateTimer = [NSTimer timerWithTimeInterval:5.0
                                                      target:self
                                                    selector:@selector(updateBatteryLevel)
                                                    userInfo:nil
                                                     repeats:YES];
    [[NSRunLoop mainRunLoop] addTimer:self.batteryUpdateTimer forMode:NSRunLoopCommonModes];

    [UIApplication sharedApplication].idleTimerDisabled = YES;
}
- (void) didDisconnectPeripheral:(CBPeripheral *) peripheral error:(NSError *)error {
    DebugLog(@"didDisconnect: %@ %@", peripheral, error);
    [UIApplication sharedApplication].idleTimerDisabled = NO;

    self.service = nil;
    if (self.batteryUpdateTimer) {
        [self.batteryUpdateTimer invalidate];
        self.batteryUpdateTimer = nil;
    }

    if (self.peripheral != nil && [[NSUserDefaults standardUserDefaults] boolForKey:kAutomaticallyReconnect]) {
        [[BluetoothLEManager sharedManager] connectPeripheral:self.peripheral];
    }
    else {
        [self.textView setText:[NSString stringWithFormat:@"Disconnected! Searching for pressure sensor"]];
        [[BluetoothLEManager sharedManager] discoverDevices];
        self.peripheral = nil;
    }

    [self setupConnectButton];
}

- (void) didChangeState:(CBCentralManagerState) newState {
    DebugLog(@"state changed: %ld", (long)newState);
    if (newState == CBCentralManagerStatePoweredOn) {
        if (self.peripheral == nil) {
            [[BluetoothLEManager sharedManager] discoverDevices];
        }
    }
    else {
        self.peripheral = nil;
    }
    [self setupConnectButton];
}
- (void) didUpdateValue:(BluetoothLEService *) service forServiceUUID:(CBUUID *) serviceUUID withCharacteristicUUID:(CBUUID *) characteristicUUID withData:(NSData *) data {
    NSLog(@"didUpdateValue:%@, serviceUUID: %@, characteristicUUID: %@", data, serviceUUID, characteristicUUID);

    // update was for a battery read. Only update the battery level info.
    if ([batteryServiceUUID isEqual:serviceUUID]) {
        const uint8_t *dataArray = [data bytes]; // <-- note, always need to read out like this.
        // uint8_t aByte = [data bytes]; will give you garbage (the address where the data is located).
        uint8_t battPercentage = dataArray[0];
        NSLog(@"battery: %d", battPercentage);
        self.textView.text = [NSString stringWithFormat:@"Battery Level: %d%%", battPercentage];
        return;
    }

    const uint8_t *reportData = [data bytes];
    //uint16_t bpm = 0;
    uint8_t lowBytes  = reportData[2];
    uint8_t highBytes = reportData[1];
    bpm = ((uint16_t)(highBytes << 8)) | lowBytes;
    //NSLog(@"didUpdateValue:%d", bpm);

    uint16_t rawvolts = bpm;
    float full_voltage = 3.0f;
    float voltage = rawvolts * full_voltage / 8192.f;
    float pressure = (voltage < 1.495f) ? (((-32.434) * voltage) + 55.383) : (-500.f * voltage + 750.f);
    pressure = MAX(0, pressure);

    self.beatsPerMinute.text = [NSString stringWithFormat:@"%.1f", pressure];
    self.voltsValueLabel.text = [NSString stringWithFormat:@"%f", voltage];

    self.beatsPerMinute.hidden = NO;
    self.legendLabel.hidden = NO;
    self.voltsValueLabel.hidden = NO;
    self.voltsUnitLabel.hidden = NO;

    //self.textView.text = [NSString stringWithFormat:@"%@", data];
    self.textView.hidden = NO;

    if ([self.detailItem isKindOfClass:[RealTimePlot class]]) {
        [((RealTimePlot *)self.detailItem) addNewData:[NSNumber numberWithFloat:pressure]];
        //[((RealTimePlot *)self.detailItem) addNewData:[NSNumber numberWithInt:gsr]];
    }
}
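The pressure computation above converts the raw 13-bit reading (0 to 8191) to a voltage on a 3.0 V full scale and then applies a two-segment linear calibration whose slope changes at about 1.495 V. The following standalone helper restates that conversion for clarity; it is an illustrative sketch only, and the function name adcCountsToPressure does not appear in the listing above.

// Illustrative helper only (not part of the original listing): converts a raw
// 13-bit ADC count into the pressure value plotted by the app, using the same
// piecewise-linear calibration as -didUpdateValue:... above.
static float adcCountsToPressure(uint16_t rawCounts)
{
    const float fullScaleVoltage = 3.0f;                      // sensor reference voltage
    float voltage = rawCounts * fullScaleVoltage / 8192.0f;   // 2^13 counts across the full scale

    // Two-segment linear fit; the breakpoint and coefficients match the listing above.
    float pressure = (voltage < 1.495f) ? (-32.434f * voltage + 55.383f)
                                        : (-500.0f * voltage + 750.0f);

    return (pressure < 0.0f) ? 0.0f : pressure;               // clamp negative values to zero
}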
- (void) didDiscoverCharacterisics:(BluetoothLEService *) service {
    [service startNotifyingForServiceUUID:SERVICE_UUID
                    andCharacteristicUUID:SERVICE_CHARACTERISTIC_UUID];
    [service startNotifyingForServiceUUID:SERVICE_UUID
                    andCharacteristicUUID:@"0x713D0002-503E-4C75-BA94-3148F18D941E"];
    [service readValueForServiceUUID:BATT_SERVICE_UUID
               andCharacteristicUUID:BATT_SERVICE_CHARACTERISTIC_UUID];
    DebugLog(@"finished discovering: %@", service);
}

- (void) setupConnectButton {
    //UIBarButtonItem *item = [[UIBarButtonItem alloc] initWithTitle:self.peripheral.isConnected ? NSLocalizedString(@"Disconnect", nil) : NSLocalizedString(@"Connect", nil) style:UIBarButtonItemStyleBordered target:self action:@selector(connectAction:)];
    //self.navigationItem.leftBarButtonItem = item;

    self.navigationItem.leftBarButtonItem.enabled = (self.peripheral != nil);
    [self.navigationItem.leftBarButtonItem setTitle:([self isPeripheralConnected] ? NSLocalizedString(@"Disconnect", nil) : NSLocalizedString(@"Connect", nil))];

    //self.beatsPerMinute.hidden = !self.peripheral.isConnected;
    //self.legendLabel.hidden = NO; //!self.peripheral.isConnected;
    //self.voltsValueLabel.hidden = !self.peripheral.isConnected;
    //self.voltsUnitLabel.hidden = !self.peripheral.isConnected;
}

- (BOOL) isPeripheralConnected {
    if (self.peripheral) {
        return (self.peripheral.state == CBPeripheralStateConnected);
    }
    return NO;
}

- (IBAction)connectAction:(id)sender {
    if (self.peripheral) {
        if ([self isPeripheralConnected]) {
            NSLog(@"Disconnecting from %@", self.peripheral);
            [[BluetoothLEManager sharedManager] disconnectPeripheral:self.peripheral];
            self.peripheral = nil;
        }
        else {
            NSLog(@"Connecting to %@", self.peripheral);
            [[BluetoothLEManager sharedManager] connectPeripheral:self.peripheral];
        }
    }
}

@end
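The reconnection behavior in -didDisconnectPeripheral:error: above is gated by the kAutomaticallyReconnect user default, whose registration is not shown in this listing. The following is a minimal sketch, under the assumption that kAutomaticallyReconnect is the NSString key used above and that the application has a conventional app delegate, of how such a default could be seeded.

// Illustrative sketch only (not part of the original listing): seed the
// auto-reconnect preference the first time the app runs. Assumes
// kAutomaticallyReconnect is the key referenced by -didDisconnectPeripheral:error:.
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    [[NSUserDefaults standardUserDefaults] registerDefaults:@{ kAutomaticallyReconnect : @YES }];
    return YES;
}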
//
// BluetoothLEService.h
// SensorApp
//
// Created by Scott Gruby on 12/13/12.
// Copyright (c) 2012 Scott Gruby. All rights reserved.
//

#import <Foundation/Foundation.h>
#import <CoreBluetooth/CoreBluetooth.h>

@class BluetoothLEService;

@protocol BluetoothLEServiceProtocol <NSObject>
@required
- (void) didDiscoverCharacterisics:(BluetoothLEService *) service;
- (void) didUpdateValue:(BluetoothLEService *) service forServiceUUID:(CBUUID *) serviceUUID withCharacteristicUUID:(CBUUID *) characteristicUUID withData:(NSData *) data;
@end

@interface BluetoothLEService : NSObject

- (id) initWithPeripheral:(CBPeripheral *)peripheral withServiceUUIDs:(NSArray *) serviceUUIDs delegate:(id<BluetoothLEServiceProtocol>) delegate;
- (void) discoverServices;
- (void) startNotifyingForServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID;
- (void) stopNotifyingForServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID;
- (void) setValue:(NSData *) data forServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID;
- (void) setValueWithoutResponse:(NSData *) data forServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID;
- (void) readValueForServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID;

@end
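For context, the service interface above is typically exercised as shown below. This is an illustrative usage sketch only; connectedPeripheral stands for a CBPeripheral already obtained from BluetoothLEManager, and the UUID constants are the ones defined in ViewController.m.

// Illustrative usage sketch only (not part of the original listing).
// Wraps a connected peripheral, discovers its services, and subscribes to the
// pressure characteristic so -didUpdateValue:... fires on every new sample.
BluetoothLEService *service =
    [[BluetoothLEService alloc] initWithPeripheral:connectedPeripheral        // CBPeripheral from the manager
                                  withServiceUUIDs:@[SERVICE_UUID, BATT_SERVICE_UUID]
                                          delegate:self];                     // conforms to BluetoothLEServiceProtocol
[service discoverServices];

// Once -didDiscoverCharacterisics: is called back, enable notifications and
// poll the battery level once:
[service startNotifyingForServiceUUID:SERVICE_UUID
                andCharacteristicUUID:SERVICE_CHARACTERISTIC_UUID];
[service readValueForServiceUUID:BATT_SERVICE_UUID
           andCharacteristicUUID:BATT_SERVICE_CHARACTERISTIC_UUID];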
//
// BluetoothLEService.m
// SensorApp
//
// Created by Scott Gruby on 12/13/12.
// Copyright (c) 2012 Scott Gruby. All rights reserved.
//

#import "BluetoothLEService.h"
#import "BluetoothLEManager.h"

@interface BluetoothLEService () <CBPeripheralDelegate>
@property (nonatomic, weak) CBPeripheral *peripheral;
@property (nonatomic, weak) id<BluetoothLEServiceProtocol> delegate;
@property (nonatomic, strong) NSArray *serviceUUIDs;
@property (nonatomic, assign) NSUInteger remainingServicesToDiscover;
@end

@implementation BluetoothLEService

- (id) initWithPeripheral:(CBPeripheral *)peripheral withServiceUUIDs:(NSArray *) serviceUUIDs delegate:(id<BluetoothLEServiceProtocol>) delegate {
    if (self = [super init]) {
        self.peripheral = peripheral;
        self.delegate = delegate;
        self.serviceUUIDs = serviceUUIDs;
        self.peripheral.delegate = self;
    }
    return self;
}

- (void) dealloc {
    if (self.peripheral) {
        self.peripheral.delegate = [BluetoothLEManager sharedManager];
        self.peripheral = nil;
    }
}

/************************/
/* Service Interactions */
/************************/

- (void) discoverServices {
    NSMutableArray *serviceArray = [NSMutableArray array];

    for (NSString *str in self.serviceUUIDs) {
        [serviceArray addObject:[CBUUID UUIDWithString:str]];
    }

    [self.peripheral discoverServices:serviceArray];
}
- (void) peripheral:(CBPeripheral *)peripheral didDiscoverServices:(NSError *)error {
    if (peripheral != self.peripheral) {
        return;
    }

    // DebugLog(@"discover error: %@", error);
    if (error != nil) {
        return;
    }

    NSArray *services = [peripheral services];
    if (!services || ![services count]) {
        return;
    }

    self.remainingServicesToDiscover = [services count];

    for (CBService *service in services) {
        [peripheral discoverCharacteristics:nil forService:service];
    }
}

- (void) peripheral:(CBPeripheral *)peripheral didDiscoverCharacteristicsForService:(CBService *)service error:(NSError *)error {
    if (error != nil) {
        return;
    }

    //DebugLog(@"discovered: %@", service.UUID);
    self.remainingServicesToDiscover--;
    if (self.remainingServicesToDiscover == 0) {
        [self.delegate didDiscoverCharacterisics:self];
    }
}

#pragma mark - Utilities

- (void) startNotifyingForServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID {
    CBCharacteristic *characteristic = [self findCharacteristicWithServiceUUID:serviceUUID andCharacteristicUUID:charUUID];

    if (characteristic) {
        [self.peripheral setNotifyValue:YES forCharacteristic:characteristic];
    }
}

- (void) stopNotifyingForServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID {
    CBCharacteristic *characteristic = [self findCharacteristicWithServiceUUID:serviceUUID andCharacteristicUUID:charUUID];

    if (characteristic) {
        [self.peripheral setNotifyValue:NO forCharacteristic:characteristic];
    }
}
- (void) setValue:(NSData *) data forServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID {
    CBCharacteristic *characteristic = [self findCharacteristicWithServiceUUID:serviceUUID andCharacteristicUUID:charUUID];

    if (characteristic) {
        [self.peripheral writeValue:data forCharacteristic:characteristic type:CBCharacteristicWriteWithResponse];
    }
}

- (void) setValueWithoutResponse:(NSData *) data forServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID {
    CBCharacteristic *characteristic = [self findCharacteristicWithServiceUUID:serviceUUID andCharacteristicUUID:charUUID];

    if (characteristic) {
        [self.peripheral writeValue:data forCharacteristic:characteristic type:CBCharacteristicWriteWithoutResponse];
    }
}

- (void) readValueForServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID {
    CBCharacteristic *characteristic = [self findCharacteristicWithServiceUUID:serviceUUID andCharacteristicUUID:charUUID];

    if (characteristic) {
        [self.peripheral readValueForCharacteristic:characteristic];
    }
}

- (CBCharacteristic *) findCharacteristicWithServiceUUID:(NSString *) serviceUUID andCharacteristicUUID:(NSString *) charUUID {
    for (CBService *service in [self.peripheral services]) {
        if ([[service UUID] isEqual:[CBUUID UUIDWithString:serviceUUID]]) {
            for (CBCharacteristic *characteristic in [service characteristics]) {
                if ([[characteristic UUID] isEqual:[CBUUID UUIDWithString:charUUID]]) {
                    return characteristic;
                }
            }
        }
    }
    return nil;
}

- (void) peripheral:(CBPeripheral *)peripheral didUpdateValueForCharacteristic:(CBCharacteristic *)characteristic error:(NSError *)error {
    if (peripheral != self.peripheral) {
        return;
    }

    if ([error code] != 0) {
        return;
    }

    [self.delegate didUpdateValue:self
                   forServiceUUID:characteristic.service.UUID
           withCharacteristicUUID:characteristic.UUID
                         withData:[characteristic value]];
}

@end
#import " CPTAnimationCGFloatPeriod.h"
#import "NSNumberExtensions.h" /// @cond
@interface _CPTAnimationCGFloatPeriod()
CGFloat currentFloatValue(id boundObject, SEL boundGetter); @end
/// @endcond
#pragma mark -
@implementation CPTAnimationCGFloatPeriod CGFloat currentFloatValue(id boundObject, SEL boundGetter)
{
NSInvocation invocation = [NSInvocation
invocationWithMethodSignature: [boundObj ect
method SignatureF or S el ector : b oundGetter] ] ;
[invocation setTargetboundObj ect];
[invocati on set S el ector : b oundGetter] ;
[invocation invoke];
CGFloat value;
[invocati on getReturn Value : & value] ; return value;
}
-(void)setStartValueFromObject:(id)boundObject propertyGetter:(SEL)boundGetter
{
self, start Value = @( currentFloatValue(boundObject, boundGetter) );
}
-(BOOL)canStartWithValueFromObject:(id)boundObject
property Getter: (SEL)boundGetter
{
CGFloat current = currentFloatValue(boundObject, boundGetter);
CGFloat start;
CGFloat end;
[self.startValue getValue:& start];
[self.endValue getValue:&end]; return ( (current >= start) && (current <= end) ) || ( (current >= end) && (current <= start) );
}
-(NSValue *)tweenedValueForProgress:(CGFloat)progress
{
CGFloat start;
CGFloat end;
[self.startValue getValue:& start];
[self.endValue getValue:&end];
CGFloat tweenedValue = start + progress * (end - start); return @(tweenedValue);
}
@end
#import " CPTAnimationCGPointPeriod.h" /// @cond
@interface _CPTAnimationCGPointPeriod()
CGPoint currentPointValue(id boundObject, SEL boundGetter);
@end
/// @endcond #pragma mark - implementation CPTAnimationCGPointPeriod
CGPoint currentPointValue(id boundObject, SEL boundGetter)
{
NSInvocation invocation = [NSInvocation
invocationWithMethodSignature: [boundObj ect
method SignatureF or S el ector : b oundGetter] ] ;
[invocation setTargetboundObj ect];
[invocati on set S el ector : b oundGetter] ;
[invocation invoke];
[invocation invoke];
CGPoint value;
[invocati on getReturn Value : & value] ; return value;
}
-(void)setStartValueFromObject:(id)boundObject propertyGetter:(SEL)boundGetter
{
CGPoint start = currentPointValue(boundObject, boundGetter); self, start Value = [NSValue valueWithBytes:& start objCType:@encode(CGPoint)];
}
-(BOOL)canStartWithValueFromObject:(id)boundObject
property Getter: (SEL)b oundGetter
{
CGPoint current = currentPointValue(boundObject, boundGetter);
CGPoint start;
CGPoint end;
[self.startValue getValue:& start];
[self.endValue getValue:&end]; return ( ( (current.x >= start.x) && (current.x <= end.x) ) || ( (current.x >= end.x) && (current.x <= start.x) ) ) &&
( ( (current.y >= start.y) && (current.y <= end.y) ) || ( (current.y >= end.y) && (current.y <= start.y) ) );
} -(NSValue *)tweenedValueForProgress:(CGFloat)progress
{
CGPoint start;
CGPoint end;
[self.startValue getValue:& start];
[self.endValue getValue:&end];
CGFloat tweenedXValue = start.x + progress * (end.x - start.x);
CGFloat tweenedYValue = start.y + progress * (end.y - start.y);
CGPoint tweenedPoint = CGPointMake(tweenedXValue, tweenedYValue); return [NSValue valueWithBytes:&tweenedPoint objCType:@encode(CGPoint)]; }
@end
#import " CPTAnimationCGRectPeriod.h"
/// @cond
@interface _CPTAnimationCGRectPeriod()
CGRect currentRectValue(id boundObject, SEL boundGetter);
@end
/// @endcond #pragma mark -
@implementation CPTAnimationCGRectPeriod
CGRect currentRectValue(id boundObject, SEL boundGetter)
{
NSInvocation invocation = [NSInvocation
invocationWithMethodSignature: [boundObj ect
method SignatureF or S el ector : b oundGetter] ] ;
[invocation setTarget:boundObj ect];
[invocation setSelectonboundGetter];
[invocation invoke];
CGRect value;
[invocation getReturn Value :& value]; return value; }
-(void)setStartValueFromObject:(id)boundObject propertyGetter:(SEL)boundGetter
{
CGRect start = currentRectValue(boundObject, boundGetter); self, start Value = [NS Value valueWithBytes:& start objCType:@encode(CGRect)];
}
-(BOOL)canStartWithValueFromObject:(id)boundObject
property Getter: (SEL)boundGetter
{
CGRect current = currentRectValue(boundObject, boundGetter);
CGRect start;
CGRect end;
[self.startValue getValue:& start];
[self.endValue getValue:&end]; return ( ( (current. on gin.x >= start. origin. x) && (current.origin.x <= end. origin. x) ) || ( (current. ori gin. x >= end. ori gin.x) && (current.origin.x <= start. origin. x) ) ) &&
( ( (current. ori gin.y >= start.origin.y) && (current.origin.y <= end. origin. y) ) || ( (current, ori gin. y >= end. ori gin.y) && (current.origin.y <= start.origin.y) ) ) &&
( ( (current. size.width >= start. size.width) && (current. size.width <=
end. size.width) ) || ( (current. size.width >= end. size.width) && (current. size.width <= start, size.width) ) ) &&
( ( (current, size. height >= start, size. height) && (current. size. height <=
end. size. height) ) || ( (current. size. height >= end. size. height) && (current. size. height <= start. size. height) ) );
}
-(NSValue *)tweenedValueForProgress:(CGFloat)progress
{
CGRect start;
CGRect end;
[self.startValue getValue:& start];
[self.endValue getValue:&end]; CGFloat tweenedXValue = start, ori gin.x + progress * (end.origin.x - start.origin.x); CGFloat tweenedYValue = start.origin.y + progress * (end.origin.y - start.origin.y); CGFloat tweenedWidth = start, size.width + progress * (end. size.width - start, size.width);
CGFloat tweenedHeight = start. size. height + progress * (end. size. height - start, size. height);
CGRect tweenedRect = CGRectMake(tweenedX Value, tweenedYValue, tweenedWidth, tweenedHeight); return [NS Value valueWithBytes:&tweenedRect objCType:@encode(CGRect)];
}
@end
#import " CPTAnimationCGSizePeriod.h" /// @cond
@interface _CPTAnimationCGSizePeriod()
CGSize currentSizeValue(id boundObject, SEL boundGetter);
@end
/// @endcond #pragma mark -
@implementation CPTAnimationCGSizePeriod
CGSize currentSizeValue(id boundObject, SEL boundGetter)
{
NSInvocation invocation = [NSInvocation
invocationWithMethodSignature: [boundObj ect
method SignatureF or S el ector : b oundGetter] ] ;
[invocation setTargetboundObj ect];
[invocati on set S el ector : b oundGetter] ;
[invocation invoke];
CGSize value;
[invocati on getReturn Value : & value] ; return value;
}
-(void)setStartValueFromObject:(id)boundObject propertyGetter:(SEL)boundGetter
{
CGSize start = currentSizeValue(boundObject, boundGetter); self, start Value = [NS Value valueWithBytes:& start objCType:@encode(CGSize)];
}
-(BOOL)canStartWithValueFromObject:(id)boundObject
property Getter: (SEL)b oundGetter {
CGSize current = currentSizeValue(boundObject, boundGetter);
CGSize start;
CGSize end;
[self.startValue getValue:& start];
[self.endValue getValue:&end]; return ( ( (current. width >= start.width) && (current. width <= end.width) ) || ( (current. width >= end.width) && (current. width <= start.width) ) ) &&
( ( (current. height >= start.height) && (current.height <= end. height) ) || ( (current. height >= end. height) && (current.height <= start.height) ) );
}
-(NSValue *)tweenedValueForProgress:(CGFloat)progress
{
CGSize start;
CGSize end; [self.startValue getValue:& start];
[self.endValue getValue:&end];
CGFloat tweenedWidth = start.width + progress * (end.width - start.width); CGFloat tweenedHeight = start.height + progress * (end. height - start.height);
CGSize tweenedSize = CGSizeMake(tweenedWidth, tweenedHeight); return [NSValue valueWithBytes:&tweenedSize objCType:@encode(CGSize)];
}
@end
#import " CPTAnimationNSDecimalPeriod.h" #import "CPTUtilities.h" /// @cond
@interface _CPTAnimationNSDecimalPeriod() NSDecimal currentDecimalValue(id boundObject, SEL boundGetter); @end
/// @endcond #pragma mark -
@implementation CPTAnimationNSDecimalPenod NSDecimal currentDecimalValue(id boundObject, SEL boundGetter)
{
NSInvocation invocation = [NSInvocation
invocationWithMethodSignature: [boundObj ect
method SignatureF or S el ector : b oundGetter] ] ;
[invocation setTargetboundObj ect];
[invocati on set S el ector : b oundGetter] ;
[invocation invoke];
NSDecimal value;
[invocati on getReturn Value : & value] ; return value;
}
-(void)setStartValueFromObject:(id)boundObject propertyGetter:(SEL)boundGetter
{
NSDecimal start = currentDecimalValue(boundObject, boundGetter); self, start Value = [NSDecimalNumber decimalNumberWithDecimal: start];
}
-(BOOL)canStartWithValueFromObject:(id)boundObject
property Getter: (SEL)b oundGetter
{
NSDecimal current = currentDecimalValue(boundObject, boundGetter);
NSDecimal start = [(NSDecimalNumber *)self. start Value decimal Value];
NSDecimal end = [(NSDecimalNumber *)self.endValue decimal Value]; return ( CPTDecimalGreaterThanOrEqualTo(current, start) &&
CPTDecimalLessThanOrEqualTo(current, end) ) ||
( CPTDecimalGreaterThanOrEqualTo(current, end) &&
CPTDecimalLessThanOrEqualTo(current, start) );
}
-(NSValue *)tweenedValueForProgress:(CGFloat)progress
{
NSDecimal start = [(NSDecimalNumber *)self. start Value decimal Value];
NSDecimal end = [(NSDecimalNumber *)self.endValue decimal Value];
NSDecimal length = CPTDecimalSubtract(end, start);
NSDecimal tweenedValue = CPTDecimalAdd( start,
CPTDecimalMultiply(CPTDecimalFromCGFloat(progress), length) ); return [NSDecimalNumber decimalNumberWithDecimal:tweenedValue];
#import " CPTAnimationPlotRangePeriod.h"
#import "CPTPlotRange.h"
#import "CPTUtilities.h"
©implementation CPTAnimationPlotRangePeriod
-(void)setStartValueFromObject:(id)boundObject propertyGetter:(SEL)boundGetter
{
typedef NS Value *(*GetterType)(id, SEL);
Getter Type getterMethod = (Getter Type)[boundObject methodForSelector : boundGetter]; self, start Value = getterMethod(boundObject, boundGetter);
}
-(BOOL)can StartWith ValueFromObj ect : (i d)boundObj ect
property Getter:(SEL)boundGetter
{
typedef CPTPlotRange *(*GetterType)(id, SEL);
Getter Type getterMethod = (Getter Type)[boundObj ect methodForSelector : boundGetter];
CPTPlotRange * current = getterMethod(boundObject, boundGetter);
CPTPlotRange *start = (CPTPlotRange *)self.startValue;
CPTPlotRange *end = (CPTPlotRange *)self.endValue;
NSDecimal currentLoc = current.location;
NSDecimal startLoc = start.location;
NSDecimal endLoc = end. location; return ( CPTDecimalGreaterThanOrEqualTo(currentLoc, startLoc) &&
CPTDecimalLessThanOrEqualTo(currentLoc, endLoc) ) ||
( CPTDecimalGreaterThanOrEqualTo(currentLoc, endLoc) &&
CPTDecimalLessThanOrEqualTo( currentLoc, startLoc) );
}
-(NSValue *)tweenedValueForProgress:(CGFloat)progress
{
CPTPlotRange * start = (CPTPlotRange *)self.startValue;
CPTPlotRange *end = (CPTPlotRange *)self.endValue; NSDecimal progressDecimal = CPTDecimalFromCGFloat(progress);
NSDecimal locationDiff = CPTDecimalSubtract(end. location, start.location);
NSDecimal tweenedLocation = CPTDecimalAdd( start.location,
CPTDecimalMultiply(progressDecimal, locationDiff) );
NSDecimal lengthDiff = CPTDecimalSubtract(end. length, start. length);
NSDecimal tweenedLength = CPTDecimalAdd( start.length,
CPTDecimalMultiply(progressDecimal, lengthDiff) ); return (NSValue *)[CPTPlotRange plotRangeWithLocation:tweenedLocation
1 ength : tweenedLength] ;
}
@end
#import " CPTAnimationTimingFunctions.h"
#import "CPTDefinitions.h"
#import <tgmath.h>
// elapsedTime should be between 0 and duration for all timing functions
#pragma mark Linear
/**
* @brief Computes a linear animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionLinear(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return elapsedTime;
}

#pragma mark -
#pragma mark Back

/**
* @brief Computes a backing in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionBackIn(CGFloat elapsedTime, CGFloat duration)
{
    const CGFloat s = CPTFloat(1.70158);

    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return elapsedTime * elapsedTime * ( ( s + CPTFloat(1.0) ) * elapsedTime - s );
}
/**
* @brief Computes a backing out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionBackOut(CGFloat elapsedTime, CGFloat duration)
{
    const CGFloat s = CPTFloat(1.70158);

    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime = elapsedTime / duration - CPTFloat(1.0);

    if ( elapsedTime >= CPTFloat(0.0) ) {
        return CPTFloat(1.0);
    }

    return elapsedTime * elapsedTime * ( ( s + CPTFloat(1.0) ) * elapsedTime + s ) + CPTFloat(1.0);
}
/**
* @brief Computes a backing in and out animation timing function.
 * @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
 * @param duration The overall duration of the animation in seconds.
 * @return The animation progress in the range zero (@num{0}) to one (@num{1}) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionBackInOut(CGFloat elapsedTime, CGFloat duration)
{
    const CGFloat s = CPTFloat(1.70158 * 1.525);

    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration * CPTFloat(0.5);

    if ( elapsedTime >= CPTFloat(2.0) ) {
        return CPTFloat(1.0);
    }

    if ( elapsedTime < CPTFloat(1.0) ) {
        return CPTFloat(0.5) * ( elapsedTime * elapsedTime * ( ( s + CPTFloat(1.0) ) * elapsedTime - s ) );
    }
    else {
        elapsedTime -= CPTFloat(2.0);

        return CPTFloat(0.5) * ( elapsedTime * elapsedTime * ( ( s + CPTFloat(1.0) ) * elapsedTime + s ) + CPTFloat(2.0) );
    }
}

#pragma mark -
#pragma mark Bounce

/**
* @brief Computes a bounce in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionBounceIn(CGFloat elapsedTime, CGFloat duration)
{
    return CPTFloat(1.0) - CPTAnimationTimingFunctionBounceOut(duration - elapsedTime, duration);
}
/**
* @brief Computes a bounce out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionBounceOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    if ( elapsedTime < CPTFloat(1.0 / 2.75) ) {
        return CPTFloat(7.5625) * elapsedTime * elapsedTime;
    }
    else if ( elapsedTime < CPTFloat(2.0 / 2.75) ) {
        elapsedTime -= CPTFloat(1.5 / 2.75);

        return CPTFloat(7.5625) * elapsedTime * elapsedTime + CPTFloat(0.75);
    }
    else if ( elapsedTime < CPTFloat(2.5 / 2.75) ) {
        elapsedTime -= CPTFloat(2.25 / 2.75);

        return CPTFloat(7.5625) * elapsedTime * elapsedTime + CPTFloat(0.9375);
    }
    else {
        elapsedTime -= CPTFloat(2.625 / 2.75);

        return CPTFloat(7.5625) * elapsedTime * elapsedTime + CPTFloat(0.984375);
    }
}
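As an added illustration (not part of the original listing), the constants above make each parabolic segment rejoin the curve at exactly 1, producing successively smaller bounces. For example, at the halfway point the normalized time 0.5 falls in the second segment, giving 7.5625 * (0.5 - 1.5/2.75)^2 + 0.75 = 0.765625.

// Quick numeric check of the segment arithmetic above (illustrative only):
CGFloat halfway = CPTAnimationTimingFunctionBounceOut( CPTFloat(0.5), CPTFloat(1.0) ); // 0.765625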
/**
* @brief Computes a bounce in and out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionBounceInOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime < duration * CPTFloat(0.5) ) {
        return CPTAnimationTimingFunctionBounceIn(elapsedTime * CPTFloat(2.0), duration) * CPTFloat(0.5);
    }
    else {
        return CPTAnimationTimingFunctionBounceOut(elapsedTime * CPTFloat(2.0) - duration, duration) * CPTFloat(0.5) + CPTFloat(0.5);
    }
}

#pragma mark -
#pragma mark Circular

/**
* @brief Computes a circular in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionCircularIn(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return -( sqrt(CPTFloat(1.0) - elapsedTime * elapsedTime) - CPTFloat(1.0) );
}
/**
* @brief Computes a circular out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionCircularOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime = elapsedTime / duration - CPTFloat(1.0);

    if ( elapsedTime >= CPTFloat(0.0) ) {
        return CPTFloat(1.0);
    }

    return sqrt(CPTFloat(1.0) - elapsedTime * elapsedTime);
}
/**
* @brief Computes a circular in and out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionCircularInOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration * CPTFloat(0.5);

    if ( elapsedTime >= CPTFloat(2.0) ) {
        return CPTFloat(1.0);
    }

    if ( elapsedTime < CPTFloat(1.0) ) {
        return CPTFloat(-0.5) * ( sqrt(CPTFloat(1.0) - elapsedTime * elapsedTime) - CPTFloat(1.0) );
    }
    else {
        elapsedTime -= CPTFloat(2.0);

        return CPTFloat(0.5) * ( sqrt(CPTFloat(1.0) - elapsedTime * elapsedTime) + CPTFloat(1.0) );
    }
}

#pragma mark -
#pragma mark Elastic
/**
 * @brief Computes an elastic in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionElasticIn(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    CGFloat period = duration * CPTFloat(0.3);
    CGFloat s      = period * CPTFloat(0.25);

    elapsedTime -= CPTFloat(1.0);

    return -( pow(CPTFloat(2.0), CPTFloat(10.0) * elapsedTime) * sin( (elapsedTime * duration - s) * CPTFloat(2.0 * M_PI) / period ) );
}
/**
 * @brief Computes an elastic out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionElasticOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    CGFloat period = duration * CPTFloat(0.3);
    CGFloat s      = period * CPTFloat(0.25);

    return pow(CPTFloat(2.0), CPTFloat(-10.0) * elapsedTime) * sin( (elapsedTime * duration - s) * CPTFloat(2.0 * M_PI) / period ) + CPTFloat(1.0);
}
/**
 * @brief Computes an elastic in and out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionElasticInOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration * CPTFloat(0.5);

    if ( elapsedTime >= CPTFloat(2.0) ) {
        return CPTFloat(1.0);
    }

    CGFloat period = duration * CPTFloat(0.3 * 1.5);
    CGFloat s      = period * CPTFloat(0.25);

    elapsedTime -= CPTFloat(1.0);

    if ( elapsedTime < CPTFloat(0.0) ) {
        return CPTFloat(-0.5) * ( pow(CPTFloat(2.0), CPTFloat(10.0) * elapsedTime) * sin( (elapsedTime * duration - s) * CPTFloat(2.0 * M_PI) / period ) );
    }
    else {
        return pow(CPTFloat(2.0), CPTFloat(-10.0) * elapsedTime) * sin( (elapsedTime * duration - s) * CPTFloat(2.0 * M_PI) / period ) * CPTFloat(0.5) + CPTFloat(1.0);
    }
}
#pragma mark -
#pragma mark Exponential
/**
 * @brief Computes an exponential in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionExponentialIn(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return pow( CPTFloat(2.0), CPTFloat(10.0) * ( elapsedTime - CPTFloat(1.0) ) );
}
/**
 * @brief Computes an exponential out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionExponentialOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return -pow(CPTFloat(2.0), CPTFloat(-10.0) * elapsedTime) + CPTFloat(1.0);
}
/**
 * @brief Computes an exponential in and out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionExponentialInOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration * CPTFloat(0.5);
    elapsedTime -= CPTFloat(1.0);

    if ( elapsedTime >= 1.0 ) {
        return CPTFloat(1.0);
    }

    if ( elapsedTime < CPTFloat(0.0) ) {
        return CPTFloat(0.5) * pow(CPTFloat(2.0), CPTFloat(10.0) * elapsedTime);
    }
    else {
        return CPTFloat(0.5) * ( -pow(CPTFloat(2.0), CPTFloat(-10.0) * elapsedTime) + CPTFloat(2.0) );
    }
}

#pragma mark -
#pragma mark Sinusoidal
/**
* @brief Computes a sinusoidal in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionSinusoidalIn(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return -cos( elapsedTime * CPTFloat(M_PI_2) ) + CPTFloat(1.0);
}
/**
* @brief Computes a sinusoidal out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
 **/
CGFloat CPTAnimationTimingFunctionSinusoidalOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return sin( elapsedTime * CPTFloat(M_PI_2) );
}
/**
* @brief Computes a sinusoidal in and out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionSinusoidalInOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return CPTFloat(-0.5) * ( cos(CPTFloat(M_PI) * elapsedTime) - CPTFloat(1.0) );
}

#pragma mark -
#pragma mark Cubic
/**
* @brief Computes a cubic in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}. * @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionCubicIn(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return elapsedTime * elapsedTime * elapsedTime;
}
/**
* @brief Computes a cubic out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionCubicOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime = elapsedTime / duration - CPTFloat(1.0);

    if ( elapsedTime >= CPTFloat(0.0) ) {
        return CPTFloat(1.0);
    }

    return elapsedTime * elapsedTime * elapsedTime + CPTFloat(1.0);
}
/**
* @brief Computes a cubic in and out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}. * @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime} .
**/
CGFloat CPTAnimationTimingFunctionCubicInOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration * CPTFloat(0.5);

    if ( elapsedTime >= CPTFloat(2.0) ) {
        return CPTFloat(1.0);
    }

    if ( elapsedTime < CPTFloat(1.0) ) {
        return CPTFloat(0.5) * elapsedTime * elapsedTime * elapsedTime;
    }
    else {
        elapsedTime -= CPTFloat(2.0);

        return CPTFloat(0.5) * ( elapsedTime * elapsedTime * elapsedTime + CPTFloat(2.0) );
    }
}

#pragma mark -
#pragma mark Quadratic
/**
* @brief Computes a quadratic in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration} .
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime} .
**/
CGFloat CPTAnimationTimingFunctionQuadraticIn(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return elapsedTime * elapsedTime;
}
/**
* @brief Computes a quadratic out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionQuadraticOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return -elapsedTime * ( elapsedTime - CPTFloat(2.0) );
}
/**
* @brief Computes a quadratic in and out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionQuadraticInOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration * CPTFloat(0.5);

    if ( elapsedTime >= CPTFloat(2.0) ) {
        return CPTFloat(1.0);
    }

    if ( elapsedTime < CPTFloat(1.0) ) {
        return CPTFloat(0.5) * elapsedTime * elapsedTime;
    }
    else {
        elapsedTime -= CPTFloat(1.0);

        return CPTFloat(-0.5) * ( elapsedTime * ( elapsedTime - CPTFloat(2.0) ) - CPTFloat(1.0) );
    }
}

#pragma mark -
#pragma mark Quartic
/**
* @brief Computes a quartic in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionQuarticIn(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return elapsedTime * elapsedTime * elapsedTime * elapsedTime;
}

/**
/* *
* @brief Computes a quartic out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionQuarticOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime = elapsedTime / duration - CPTFloat(1.0);

    if ( elapsedTime >= CPTFloat(0.0) ) {
        return CPTFloat(1.0);
    }

    return -( elapsedTime * elapsedTime * elapsedTime * elapsedTime - CPTFloat(1.0) );
}
/**
* @brief Computes a quartic in and out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionQuarticInOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration * CPTFloat(0.5);

    if ( elapsedTime >= CPTFloat(2.0) ) {
        return CPTFloat(1.0);
    }

    if ( elapsedTime < CPTFloat(1.0) ) {
        return CPTFloat(0.5) * elapsedTime * elapsedTime * elapsedTime * elapsedTime;
    }
    else {
        elapsedTime -= CPTFloat(2.0);

        return CPTFloat(-0.5) * ( elapsedTime * elapsedTime * elapsedTime * elapsedTime - CPTFloat(2.0) );
    }
}

#pragma mark -
#pragma mark Quintic

/**
* @brief Computes a quintic in animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionQuinticIn(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration;

    if ( elapsedTime >= CPTFloat(1.0) ) {
        return CPTFloat(1.0);
    }

    return elapsedTime * elapsedTime * elapsedTime * elapsedTime * elapsedTime;
}
/**
* @brief Computes a quintic out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionQuinticOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime = elapsedTime / duration - CPTFloat(1.0);

    if ( elapsedTime >= CPTFloat(0.0) ) {
        return CPTFloat(1.0);
    }

    return elapsedTime * elapsedTime * elapsedTime * elapsedTime * elapsedTime + CPTFloat(1.0);
}
/**
* @brief Computes a quintic in and out animation timing function.
* @param elapsedTime The elapsed time of the animation between zero (@num{0}) and @par{duration}.
* @param duration The overall duration of the animation in seconds.
* @return The animation progress in the range zero (@num{0}) to one (@num{ 1 }) at the given @par{ elapsedTime}.
**/
CGFloat CPTAnimationTimingFunctionQuinticInOut(CGFloat elapsedTime, CGFloat duration)
{
    if ( elapsedTime <= CPTFloat(0.0) ) {
        return CPTFloat(0.0);
    }

    elapsedTime /= duration * CPTFloat(0.5);

    if ( elapsedTime >= CPTFloat(2.0) ) {
        return CPTFloat(1.0);
    }

    if ( elapsedTime < CPTFloat(1.0) ) {
        return CPTFloat(0.5) * elapsedTime * elapsedTime * elapsedTime * elapsedTime * elapsedTime;
    }
    else {
        elapsedTime -= CPTFloat(2.0);

        return CPTFloat(0.5) * ( elapsedTime * elapsedTime * elapsedTime * elapsedTime * elapsedTime + CPTFloat(2.0) );
    }
}
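Each of the timing functions above maps an elapsed time in [0, duration] to a progress value in [0, 1], which an animation period then feeds into its -tweenedValueForProgress: interpolation. The following sketch shows that relationship explicitly; it is illustrative only, and the manual frame loop stands in for Core Plot's own timer-driven animation machinery.

// Illustrative sketch only: sample an easing curve at a few frames and apply
// the resulting progress as a plain linear interpolation between two values,
// mirroring what -tweenedValueForProgress: does for CGFloat periods.
const CGFloat duration   = CPTFloat(1.0);   // seconds
const NSUInteger frames  = 5;               // hypothetical sample count
const CGFloat startValue = CPTFloat(0.0);
const CGFloat endValue   = CPTFloat(100.0);

for ( NSUInteger frame = 0; frame <= frames; frame++ ) {
    CGFloat elapsed  = duration * frame / frames;
    CGFloat progress = CPTAnimationTimingFunctionQuadraticOut(elapsed, duration); // any curve above
    CGFloat tweened  = startValue + progress * (endValue - startValue);

    NSLog(@"t = %.2f  progress = %.3f  value = %.1f", elapsed, progress, tweened);
}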
#import " CPTBorderLayer.h" #import "CPTBorderedLayer.h" /**
* @brief A utility layer used to draw the fill and border of a CPTBorderedLayer.
*
* This layer is always the superlayer of a single CPTBorderedLayer. It draws the fill and
* border so that they are not clipped by the mask applied to the sublayer.
**/
@implementation CPTBorderLayer /** @property CPTBorderedLayer *maskedLayer
* @brief The CPTBorderedLayer masked being masked.
* Its fill and border are drawn into this layer so that they are outside the mask applied to the @par{maskedLayer}.
**/
@synthesize maskedLayer;
#pragma mark -
#pragma mark Init/Dealloc

/// @name Initialization
/// @{
/** @brief Initializes a newly allocated CPTBorderLayer object with the provided frame rectangle.
*
* This is the designated initializer. The initialized layer will have the following properties:
* - @ref maskedLayer = @nil
* - @ref needsDisplayOnBoundsChange = @YES
*
* @param newFrame The frame rectangle.
* @return The initialized CPTBorderLayer object.
**/
-(instancetype)initWithFrame:(CGRect)newFrame
{
    if ( (self = [super initWithFrame:newFrame]) ) {
        maskedLayer = nil;

        self.needsDisplayOnBoundsChange = YES;
    }
    return self;
}

/// @}

/// @cond

-(instancetype)initWithLayer:(id)layer
{
    if ( (self = [super initWithLayer:layer]) ) {
        CPTBorderLayer *theLayer = (CPTBorderLayer *)layer;

        maskedLayer = theLayer->maskedLayer;
    }
    return self;
}
/// @endcond

#pragma mark -
#pragma mark NSCoding Methods

/// @cond

-(void)encodeWithCoder:(NSCoder *)coder
{
    [super encodeWithCoder:coder];

    [coder encodeObject:self.maskedLayer forKey:@"CPTBorderLayer.maskedLayer"];
}

-(instancetype)initWithCoder:(NSCoder *)coder
{
    if ( (self = [super initWithCoder:coder]) ) {
        maskedLayer = [coder decodeObjectForKey:@"CPTBorderLayer.maskedLayer"];
    }
    return self;
}
/// @endcond
#pragma mark -
#pragma mark Drawing

/// @cond

-(void)renderAsVectorInContext:(CGContextRef)context
{
    if ( self.hidden ) {
        return;
    }

    CPTBorderedLayer *theMaskedLayer = self.maskedLayer;

    if ( theMaskedLayer ) {
        [super renderAsVectorInContext:context];
        [theMaskedLayer renderBorderedLayerAsVectorInContext:context];
    }
}
/// @endcond
#pragma mark -
#pragma mark Layout

/// @cond

-(void)layoutSublayers
{
    [super layoutSublayers];

    CPTBorderedLayer *theMaskedLayer = self.maskedLayer;

    if ( theMaskedLayer ) {
        CGRect newBounds = self.bounds;

        // undo the shadow margin so the masked layer is always the same size
        if ( self.shadow ) {
            CGSize sizeOffset = self.shadowMargin;

            newBounds.origin.x -= sizeOffset.width;
            newBounds.origin.y -= sizeOffset.height;
            newBounds.size.width  += sizeOffset.width * CPTFloat(2.0);
            newBounds.size.height += sizeOffset.height * CPTFloat(2.0);
        }

        theMaskedLayer.inLayout = YES;
        theMaskedLayer.frame    = newBounds;
        theMaskedLayer.inLayout = NO;
    }
}

-(NSSet *)sublayersExcludedFromAutomaticLayout
{
    CPTBorderedLayer *excludedLayer = self.maskedLayer;

    if ( excludedLayer ) {
        NSMutableSet *excludedSublayers = [[super sublayersExcludedFromAutomaticLayout] mutableCopy];

        if ( !excludedSublayers ) {
            excludedSublayers = [NSMutableSet set];
        }
        [excludedSublayers addObject:excludedLayer];
        return excludedSublayers;
    }
    else {
        return [super sublayersExcludedFromAutomaticLayout];
    }
}
/// @endcond
#pragma mark -
#pragma mark Accessors

/// @cond

-(void)setMaskedLayer:(CPTBorderedLayer *)newLayer
{
    if ( newLayer != maskedLayer ) {
        maskedLayer = newLayer;
        [self setNeedsDisplay];
    }
}

-(void)setBounds:(CGRect)newBounds
{
    if ( !CGRectEqualToRect(newBounds, self.bounds) ) {
        [super setBounds:newBounds];
        [self setNeedsLayout];
    }
}

/// @endcond

@end
==================#import " CPTConstraintsFixed.h"
#import "NSCoderExtensions.h" /// @cond
@interface _CPTConstraintsFixed()
@property (nonatomic, readwnte) CGFloat offset;
@property (nonatomic, readwnte) BOOL isFixedToLower;
@end /// @endcond #pragma mark - /** @brief Implements a one-dimensional constrained position within a given numeric range.
*
* Supports fixed distance from either end of the range and a proportional fraction of the range.
**/
©implementation CPTConstraintsFixed
@synthesize offset;
@synthesize isFixedToLower;
#pragma mark -
#pragma mark Init/Dealloc
/** @brief Initializes a newly allocated CPTConstraints instance initialized with a fixed offset from the lower bound.
* @param newOffset The offset.
* @return The initialized CPTConstraints object.
**/
-(instancetype)initWithLowerOffset:(CGFloat)newOffset
{
if ( (self = [super init]) ) {
offset = newOffset;
isFixedToLower = YES;
}
return self;
}
/** @brief Initializes a newly allocated CPTConstraints instance initialized with a fixed offset from the upper bound.
* @param newOffset The offset.
* @return The initialized CPTConstraints object.
**/
-(instancetype)initWithUpperOffset:(CGFloat)newOffset
{
if ( (self = [super init]) ) {
offset = newOffset;
isFixedToLower = NO;
}
return self;
}

#pragma mark -
#pragma mark Comparison

-(BOOL)isEqualToConstraint:(CPTConstraints *)otherConstraint
{
    if ( [self class] != [otherConstraint class] ) {
        return NO;
    }
    return (self.offset == ( (_CPTConstraintsFixed *)otherConstraint ).offset) &&
           (self.isFixedToLower == ( (_CPTConstraintsFixed *)otherConstraint ).isFixedToLower);
}
#pragma mark -
#pragma mark Positioning
/** @brief Compute the position given a range of values.
 * @param lowerBound The lower bound; must be less than or equal to the upperBound.
 * @param upperBound The upper bound; must be greater than or equal to the lowerBound.
 * @return The calculated position.
 **/
-(CGFloat)positionForLowerBound:(CGFloat)lowerBound upperBound:(CGFloat)upperBound
{
    NSAssert(lowerBound <= upperBound, @"lowerBound must be less than or equal to upperBound");

    CGFloat position;

    if ( self.isFixedToLower ) {
        position = lowerBound + self.offset;
    }
    else {
        position = upperBound - self.offset;
    }

    return position;
}
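To illustrate the two fixed-offset modes implemented above (an added sketch, not part of the original listing; the numbers are arbitrary): a lower-offset constraint keeps a constant distance from the lower bound, while an upper-offset constraint keeps a constant distance from the upper bound, so the two respond differently when the range widens.

// Illustrative sketch only: the same 20-point offset anchored to opposite ends
// of a 0...100 range. The CPTConstraints convenience constructors return
// fixed-offset instances like the class implemented above.
CPTConstraints *fromLower = [CPTConstraints constraintWithLowerOffset:CPTFloat(20.0)];
CPTConstraints *fromUpper = [CPTConstraints constraintWithUpperOffset:CPTFloat(20.0)];

CGFloat a = [fromLower positionForLowerBound:CPTFloat(0.0) upperBound:CPTFloat(100.0)]; // 20
CGFloat b = [fromUpper positionForLowerBound:CPTFloat(0.0) upperBound:CPTFloat(100.0)]; // 80

// If the range widens to 0...200, the lower-anchored position stays at 20
// while the upper-anchored position moves to 180.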
#pragma mark -
#pragma mark NSCopying Methods

/// @cond

-(id)copyWithZone:(NSZone *)zone
{
    _CPTConstraintsFixed *copy = [[[self class] allocWithZone:zone] init];

    copy.offset         = self.offset;
    copy.isFixedToLower = self.isFixedToLower;

    return copy;
}

/// @endcond

#pragma mark -
#pragma mark NSCoding Methods

/// @cond

-(Class)classForCoder
{
    return [CPTConstraints class];
}

-(void)encodeWithCoder:(NSCoder *)coder
{
    [coder encodeCGFloat:self.offset forKey:@"_CPTConstraintsFixed.offset"];
    [coder encodeBool:self.isFixedToLower forKey:@"_CPTConstraintsFixed.isFixedToLower"];
}
/// @endcond
/** @brief Returns an object initialized from data in a given unarchiver.
* @param coder An unarchiver object.
* @return An object initialized from data in a given unarchiver.
*/
-(instancetype)initWithCoder:(NSCoder *)coder
{
if ( (self = [super init]) ) {
offset = [coder decodeCGFloatForKey:@"_CPTConstraintsFixed.offset"];
isFixedToLower = [coder decodeBoolForKey:@"_CPTConstraintsFixed.isFixedToLower"];
}
return self;
}
@end
#import "CPTXYAxis.h" #import "CPTConstraints.h"
#import "CPTFill.h"
#import "CPTLimitBand.h"
#import "CPTLineCap.h"
#import "CPTLineStyle.h"
#import "CPTMutablePlotRange.h"
#import "CPTPlotArea.h"
#import "CPTPlotSpace.h"
#import "CPTUtilities.h"
#import "CPTXYPlotSpace.h"
#import "NSCoderExtensions.h"
#import <tgmath.h> /// @cond
@interface CPTXYAxis()
-(void)drawTicksInContext:(CGContextRef)context atLocations:(NSSet *)locations withLength:(CGFloat)length inRange:(CPTPlotRange *)labeledRange
isMajor:(BOOL)major;
-(void)orthogonalCoordinateViewLowerBound:(CGFloat *)lower upperBound:(CGFloat *)upper;
-(CGPoint)viewPointForOrthogonalCoordinateDecimal:(NSDecimal)orthogonalCoord axisCoordinateDecimal:(NSDecimal)coordinateDecimalNumber;
@end
/// @endcond
#pragma mark -
/**
* @brief A 2-dimensional cartesian (X-Y) axis class.
* */
@implementation CPTXYAxis
/** @property NSDecimal orthogonalCoordinateDecimal
* @brief The data coordinate value where the axis crosses the orthogonal axis.
* If the @ref axisConstraints is non-nil, the constraints take priority and this property is ignored.
* @see @ref axisConstraints
**/
@synthesize orthogonalCoordinateDecimal;
/** @property CPTConstraints * axisConstraints
* @brief The constraints used when positioning relative to the plot area. * If @nil (the default), the axis is fixed relative to the plot space coordinates,
* crossing the orthogonal axis at @ref orthogonalCoordinateDecimal and moves only
* whenever the plot space ranges change.
* @see @ref orthogonalCoordinateDecimal
* */
@synthesize axisConstraints;
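// Editorial example (not part of the original listing): pinning an axis at a fixed view
// offset instead of at a data coordinate. With axisConstraints set, the
// orthogonalCoordinateDecimal value documented above is ignored and the axis stays 50
// points from the lower view-coordinate bound of the plot area even when the plot ranges
// change. _CPTConstraintsFixed is the concrete CPTConstraints subclass from the earlier
// part of this listing.
CPTXYAxis *pinnedAxis = [[CPTXYAxis alloc] initWithFrame:CGRectZero];
pinnedAxis.coordinate = CPTCoordinateY;
pinnedAxis.axisConstraints = [[_CPTConstraintsFixed alloc] initWithLowerOffset:CPTFloat(50.0)];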
#pragma mark -
#pragma mark Init/Dealloc
/// @name Initialization
/// @{
/** @brief Initializes a newly allocated CPTXYAxis object with the provided frame rectangle.
*
* This is the designated initializer. The initialized layer will have the following properties:
* - @ref orthogonalCoordinateDecimal = @num{0}
* - @ref axisConstraints = @nil
*
* @param newFrame The frame rectangle.
* @return The initialized CPTXYAxis object.
**/
-(instancetype)initWithFrame:(CGRect)newFrame
{
if ( (self = [super initWithFrame: newFrame]) ) {
orthogonalCoordinateDecimal = CPTDecimalFromInteger(0);
axisConstraints = nil;
self.tickDirection = CPTSignNone;
}
return self;
}
/// @}
/// @cond
-(instancetype)initWithLayer:(id)layer
{
if ( (self = [super initWithLayer:layer]) ) {
CPTXYAxis *theLayer = (CPTXYAxis *)layer;
orthogonalCoordinateDecimal = theLayer->orthogonalCoordinateDecimal;
axisConstraints = theLayer->axisConstraints;
}
return self;
}
/// @endcond #pragma mark -
#pragma mark NSCoding Methods
/// @cond
-(void)encodeWithCoder:(NSCoder *)coder
{
[super encodeWithCoder:coder];
[coder encodeDecimal:self.orthogonalCoordinateDecimal forKey:@"CPTXYAxis.orthogonalCoordinateDecimal"];
[coder encodeObject:self.axisConstraints forKey:@"CPTXYAxis.axisConstraints"];
}
-(instancetype)initWithCoder:(NSCoder *)coder
{
if ( (self = [super initWithCoder: coder]) ) {
orthogonalCoordinateDecimal = [coder decodeDecimalForKey:@"CPTXYAxis.orthogonalCoordinateDecimal"];
axisConstraints = [coder
decodeObjectForKey:@"CPTXYAxis.axisConstraints"];
}
return self;
}
/// @endcond #pragma mark -
#pragma mark Coordinate Transforms /// @cond
-(void)orthogonalCoordinateViewLowerBound:(CGFloat *)lower upperBound:(CGFloat *)upper
{
CPTCoordinate orthogonalCoordinate = CPTOrthogonalCoordinate(self.coordinate);
CPTXYPlotSpace *xyPlotSpace = (CPTXYPlotSpace *)self.plotSpace;
CPTPlotRange *orthogonalRange = [xyPlotSpace plotRangeForCoordinate:orthogonalCoordinate];
NSAssert(orthogonalRange != nil, @"The orthogonalRange was nil in orthogonalCoordinateViewLowerBound:upperBound:");
NSDecimal zero = CPTDecimalFromInteger(0);
CGPoint lowerBoundPoint = [self viewPointForOrthogonalCoordinateDecimal:orthogonalRange.location axisCoordinateDecimal:zero];
CGPoint upperBoundPoint = [self viewPointForOrthogonalCoordinateDecimal:orthogonalRange.end axisCoordinateDecimal:zero];
switch ( self.coordinate ) {
case CPTCoordinateX:
*lower = lowerBoundPoint.y;
*upper = upperBoundPoint.y;
break; case CPTCoordinateY:
*lower = lowerBoundPoint.x;
*upper = upperBoundPoint.x;
break; default:
*lower = NAN;
*upper = NAN;
break;
}
}
-(CGPoint)viewPointForOrthogonalCoordinateDecimal:(NSDecimal)orthogonalCoord axisCoordinateDecimal:(NSDecimal)coordinateDecimalNumber
{
CPTCoordinate myCoordinate = self.coordinate;
CPTCoordinate orthogonalCoordinate = CPTOrthogonalCoordinate(myCoordinate);
NSDecimal plotPoint[2];
plotPoint[myCoordinate] = coordinateDecimalNumber;
plotPoint[orthogonalCoordinate] = orthogonalCoord;
CPTPlotArea *thePlotArea = self.plotArea;
return [self convertPoint:[self.plotSpace plotAreaViewPointForPlotPoint:plotPoint numberOfCoordinates:2] fromLayer:thePlotArea];
}
-(CGPoint)viewPointForCoordinateDecimalNumber:(NSDecimal)coordinateDecimalNumber
{
CGPoint point = [self viewPointForOrthogonalCoordinateDecimal:self.orthogonalCoordinateDecimal axisCoordinateDecimal:coordinateDecimalNumber];
CPTConstraints *theAxisConstraints = self.axisConstraints;
if ( theAxisConstraints ) {
CGFloat lb, ub;
[self orthogonalCoordinateViewLowerBound:&lb upperBound:&ub];
CGFloat constrainedPosition = [theAxisConstraints positionForLowerBound:lb upperBound:ub]; switch ( self, coordinate ) {
case CPTCoordinateX:
point.y = constrainedPosition;
break; case CPTCoordinateY:
point.x = constrainedPosition;
break; default:
break;
}
}
if ( isnan(point.x) || isnan(point.y) ) {
NSLog( @"[CPTXYAxis viewPointForCoordinateDecimalNumber:%@] was %@", NSDecimalString(&coordinateDecimalNumber, nil), CPTStringFromPoint(point) );
if ( isnan(point.x) ) {
point.x = CPTFloat(0.0);
}
if ( isnan(point.y) ) {
point.y = CPTFloat(0.0);
}
}
return point;
}
/// @endcond
#pragma mark - #pragma mark Drawing
/// @cond
-(void)drawTicksInContext:(CGContextRef)context atLocations:(NSSet *)locations withLength:(CGFloat)length inRange:(CPTPlotRange *)labeledRange isMajor:(BOOL)major
{
CPTLineStyle *lineStyle = (major ? self.majorTickLineStyle : self.minorTickLineStyle);
if ( !lineStyle ) {
return;
}
CGFloat lineWidth = lineStyle.lineWidth;
CPTAlignPointFunction alignmentFunction = NULL;
if ( ( self.contentsScale > CPTFloat(1.0) ) && (round(lineWidth) == lineWidth) ) {
alignmentFunction = CPTAlignIntegralPointToUserSpace;
}
else {
alignmentFunction = CPTAlignPointToUserSpace;
}
[lineStyle setLineStyleInContext:context];
CGContextBeginPath(context);
for ( NSDecimalNumber *tickLocation in locations ) {
NSDecimal locationDecimal = tickLocation.decimalValue;
if ( labeledRange && ![labeledRange contains:locationDecimal] ) {
continue;
}
// Tick end points
CGPoint baseViewPoint = [self
viewPointForCoordinateDecimalNumber:locationDecimal];
CGPoint startViewPoint = baseViewPoint;
CGPoint endViewPoint = baseViewPoint;
CGFloat startFactor = CPTFloat(0.0);
CGFloat endFactor = CPTFloat(0.0);
switch ( self.tickDirection ) {
case CPTSignPositive:
endFactor = CPTFloat(1.0);
break;
case CPTSignNegative:
endFactor = CPTFloat(-1.0);
break;
case CPTSignNone:
startFactor = CPTFloat(-0.5);
endFactor = CPTFloat(0.5);
break;
}
switch ( self.coordinate ) {
case CPTCoordinateX:
startViewPoint.y += length * startFactor;
endViewPoint.y += length * endFactor;
break;
case CPTCoordinateY:
startViewPoint.x += length * startFactor;
endViewPoint.x += length * endFactor;
break;
default:
NSLog(@"Invalid coordinate in [CPTXYAxis drawTicksInContext:]");
}
startViewPoint = alignmentFunction(context, startViewPoint);
endViewPoint = alignmentFunction(context, endViewPoint);
// Add tick line
CGContextMoveToPoint(context, startViewPoint.x, startViewPoint.y);
CGContextAddLineToPoint(context, endViewPoint.x, endViewPoint.y);
}
// Stroke tick line
[lineStyle strokePathInContext:context];
}
-(void)renderAsVectorInContext:(CGContextRef)context
{
if ( self.hidden ) {
return;
}
[super renderAsVectorInContext:context];
[self relabel];
CPTPlotRange *thePlotRange = [self.plotSpace plotRangeForCoordinate:self.coordinate];
CPTMutablePlotRange *range = [thePlotRange mutableCopy];
CPTPlotRange *theVisibleRange = self.visibleRange;
if ( theVisibleRange ) {
[range intersectionPlotRange:theVisibleRange];
CPTMutablePlotRange *labeledRange = nil;
switch ( self.labelingPolicy ) {
case CPTAxisLabelingPolicyNone:
case CPTAxisLabelingPolicyLocationsProvided:
label edRange = range;
break; default:
break;
}
// Ticks
[self drawTicksInContext: context atLocations:self.minorTickLocations
withLength:self.minorTickLength inRange: lab el edRange isMajor:NO];
[self drawTicksInContext: context atLocations:self.majorTickLocations
withLength:self.majorTickLength inRange:labeledRange isMajor:YES];
// Axis Line
CPTLineStyle *theLineStyle = self.axisLineStyle;
CPTLineCap *minCap = self.axisLineCapMin;
CPTLineCap *maxCap = self.axisLineCapMax;
if ( theLineStyle || minCap || maxCap ) {
// If there is a separate axis range given then restrict the axis to that range, overriding the visible range
// given for grid lines and ticks.
CPTPlotRange *theVisibleAxisRange = self, vi sibl eAxisRange;
if ( theVisibleAxisRange ) {
range = [theVisibleAxisRange mutableCopy];
}
CPTAlignPointFunction alignmentFunction = CPTAlignPointToUserSpace;
if ( theLineStyle ) {
CGFloat lineWidth = theLineStyle.lineWidth;
if ( ( selfcontentsScale > CPTFloat(l .O) ) && (round(lineWidth) == lineWidth) ) alignmentFunction = CPTAlignlntegralPointToUserSpace;
}
CGPoint startViewPoint = alignmentFunction( context, [self viewPointForCoordinateDecimalNumber:range.location] );
CGPoint endViewPoint = alignmentFunction( context, [self viewPointForCoordinateDecimalNumber:range.end] );
[theLineStyle setLineStyleInContext:context];
CGContextBeginPath(context);
CGContextMoveToPoint(context, startViewPoint.x, startViewPoint.y);
CGContextAddLineToPoint(context, endViewPoint.x, endViewPoint.y);
[theLineStyle strokePathInContext:context];
}
CGPoint axisDirection = CGPointZero;
if ( minCap || maxCap ) {
switch ( self.coordinate ) {
case CPTCoordinateX:
axisDirection = ( range.lengthDouble >= CPTFloat(0.0) ) ? CPTPointMake(1.0, 0.0) : CPTPointMake(-1.0, 0.0);
break;
case CPTCoordinateY:
axisDirection = ( range.lengthDouble >= CPTFloat(0.0) ) ? CPTPointMake(0.0, 1.0) : CPTPointMake(0.0, -1.0);
break; default:
break;
}
}
if ( minCap ) {
NSDecimal endPoint = range.minLimit;
CGPoint viewPoint = alignmentFunction( context, [self viewPointForCoordinateDecimalNumber:endPoint] );
[minCap renderAsVectorInContext:context atPoint:viewPoint inDirection:CPTPointMake(-axisDirection.x, -axisDirection.y)];
}
if ( maxCap ) {
NSDecimal endPoint = range.maxLimit;
CGPoint viewPoint = alignmentFunction( context, [self viewPointForCoordinateDecimalNumber:endPoint] );
[maxCap renderAsVectorInContext:context atPoint:viewPoint inDirection:axisDirection];
}
}
}
/// @endcond
#pragma mark - #pragma mark Grid Lines /// @cond
-(void)drawGridLinesInContext:(CGContextRef)context isMajor:(BOOL)major
{
CPTLineStyle *lineStyle = (major ? self.majorGridLineStyle :
self.minorGridLineStyle); if ( lineStyle ) {
[super renderAsVectorInContext:context];
[self relabel];
CPTPlotSpace *thePlotSpace = self.plotSpace;
NSSet *locations = (major ? self.majorTickLocations : self.minorTickLocations);
CPTCoordinate selfCoordinate = self.coordinate;
CPTCoordinate orthogonalCoordinate = CPTOrthogonalCoordinate(selfCoordinate);
CPTMutablePlotRange *orthogonalRange = [[thePlotSpace plotRangeForCoordinate:orthogonalCoordinate] mutableCopy];
CPTPlotRange *theGridLineRange = self.gridLinesRange;
CPTMutablePlotRange *labeledRange = nil;
switch ( self.labelingPolicy ) {
case CPTAxisLabelingPolicyNone:
case CPTAxisLabelingPolicyLocationsProvided:
{
labeledRange = [[self.plotSpace plotRangeForCoordinate:self.coordinate] mutableCopy];
CPTPlotRange *theVisibleRange = self.visibleRange;
if ( theVisibleRange ) {
[labeledRange intersectionPlotRange:theVisibleRange];
}
}
break;
default:
break;
}
if ( theGridLineRange ) {
[orthogonalRange intersectionPlotRange:theGridLineRange];
}
CPTPlotArea *thePlotArea = self.plotArea;
NSDecimal startPlotPoint[2];
NSDecimal endPlotPoint[2]; startPlotPoint[orthogonalCoordinate] = orthogonalRange. location; endPlotPoint[orthogonalCoordinate] = orthogonalRange. end;
CGPoint originTransformed = [self convertPoint:self.bounds.origin fromLayer:thePlotArea];
CGFloat lineWidth = lineStyle.lineWidth;
CPTAlignPointFunction alignmentFunction = NULL;
if ( ( self.contentsScale > CPTFloat(1.0) ) && (round(lineWidth) == lineWidth) ) {
alignmentFunction = CPTAlignIntegralPointToUserSpace;
}
else {
alignmentFunction = CPTAlignPointToUserSpace;
}
CGContextBeginPath(context);
for ( NSDecimalNumber *location in locations ) {
NSDecimal locationDecimal = location.decimalValue;
if ( labeledRange && ![labeledRange contains:locationDecimal] ) {
continue;
}
startPlotPoint[selfCoordinate] = locationDecimal;
endPlotPoint[selfCoordinate] = locationDecimal;
// Start point
CGPoint startViewPoint = [thePlotSpace plotAreaViewPointForPlotPoint:startPlotPoint numberOfCoordinates:2];
startViewPoint.x += originTransformed.x;
startViewPoint.y += originTransformed.y;
// End point
CGPoint endViewPoint = [thePlotSpace plotAreaViewPointForPlotPoint:endPlotPoint numberOfCoordinates:2];
endViewPoint.x += originTransformed.x;
endViewPoint.y += originTransformed.y;
// Align to pixels
startViewPoint = alignmentFunction(context, startViewPoint);
endViewPoint = alignmentFunction(context, endViewPoint);
// Add grid line
CGContextMoveToPoint(context, startViewPoint.x, startViewPoint.y);
CGContextAddLineToPoint(context, endViewPoint.x, endViewPoint.y);
// Stroke grid lines
[lineStyle setLineStyleInContext:context];
[lineStyle strokePathInContext:context];
}
}
}
/// @endcond
#pragma mark -
#pragma mark Background Bands
/// @cond -(void)drawBackgroundBandsInContext:(CGContextRef)context
{
NSArray *bandArray = self.alternatingBandFills;
NSUInteger bandCount = bandArray.count;
if ( bandCount > 0 ) {
NSArray *locations = [self.majorTickLocations allObjects];
if ( locations.count > 0 ) {
CPTPlotSpace *thePlotSpace = selfplotSpace;
CPTCoordinate selfCoordinate = self, coordinate;
CPTMutablePlotRange *range = [[thePlotSpace
plotRangeForCoordinate: selfCoordinate] mutableCopy];
if ( range ) {
CPTPlotRange *theVisibleRange = selfvisibleRange;
if ( theVisibleRange ) {
[range intersectionPlotRange:theVisibleRange];
}
}
CPTCoordinate orthogonalCoordinate =
CPTOrthogonalCoordinate(selfCoordinate);
CPTMutablePlotRange * orthogonalRange = [[thePlotSpace plotRangeForCoordinate:orthogonalCoordinate] mutableCopy];
CPTPlotRange *theGridLineRange = selfgridLinesRange; if ( theGridLineRange ) {
[orthogonalRange intersectionPlotRange:theGridLineRange];
}
NSDecimal zero = CPTDecimalFromlnteger(O);
NSSortDescriptor *sortDescriptor = nil; if ( range ) {
if ( CPTDecimalGreaterThanOrEqualTo(range. length, zero) ) {
sortDescriptor = [[NSSortDescriptor alloc] initWithKey:nil ascending:YES];
}
else {
sortDescriptor = [[NSSortDescriptor alloc] initWithKey:nil ascending :NO];
}
}
else {
sortDescriptor = [[NSSortDescriptor alloc] initWithKey:nil ascending:YES];
}
locations = [locations sortedArrayUsingDescriptors:@[sortDescriptor]];
NSUInteger bandlndex = 0;
id null = [NSNull null];
NSDecimal lastLocation;
if ( range ) {
lastLocation = range. location;
}
else {
lastLocation = CPTDecimalNaN();
}
NSDecimal startPlotPoint[2];
NSDecimal endPlotPoint[2];
if ( orthogonalRange ) {
startPlotPoint[orthogonalCoordinate] = orthogonalRange. location;
endPlotPoint[orthogonalCoordinate] = orthogonalRange. end;
}
else {
startPlotPoint[orthogonalCoordinate] = CPTDecimalNaN();
endPlotPoint[orthogonalCoordinate] = CPTDecimalNaN();
}
for ( NSDecimalNumber *location in locations ) {
NSDecimal currentLocation = [location decimal Value];
if ( !CPTDecimalEquals(CPTDecimalSubtract(currentLocation, lastLocation), zero) ) {
CPTFill *bandFill = bandArray[bandIndex++];
bandIndex %= bandCount;
if ( bandFill != null ) {
// Start point
startPlotPoint[selfCoordinate] = currentLocation;
CGPoint startViewPoint = [thePlotSpace plotAreaViewPointForPlotPoint:startPlotPoint numberOfCoordinates:2];
// End point
endPlotPoint[selfCoordinate] = lastLocation;
CGPoint endViewPoint = [thePlotSpace plotAreaViewPointForPlotPoint:endPlotPoint numberOfCoordinates:2];
// Fill band
CGRect fillRect = CPTRectMake( MIN(startViewPoint.x, endViewPoint.x),
MIN(startViewPoint.y, endViewPoint.y),
ABS(endViewPoint.x - startViewPoint.x),
ABS(endViewPoint.y - startViewPoint.y) );
[bandFill fillRect:CPTAlignIntegralRectToUserSpace(context, fillRect) inContext:context];
}
}
lastLocation = currentLocation;
}
// Fill space between last location and the range end
NSDecimal endLocation;
if ( range ) {
endLocation = range, end;
}
else {
endLocation = CPTDecimalNaN();
}
if ( !CPTDecimalEquals(lastLocation, endLocation) ) {
CPTFill *bandFill = bandArray[bandIndex];
if ( bandFill != null ) {
// Start point
startPlotPoint[selfCoordinate] = endLocation;
CGPoint startViewPoint = [thePlotSpace plotAreaViewPointForPlotPoint:startPlotPoint numberOfCoordinates:2];
// End point
endPlotPoint[selfCoordinate] = lastLocation;
CGPoint endViewPoint = [thePlotSpace
plotAreaViewPointForPlotPoint:endPlotPoint numberOfCoordinates:2];
// Fill band
CGRect fillRect = CPTRectMake( MIN(startViewPoint.x, endViewPoint.x),
MIN(startViewPoint.y, endViewPoint.y),
ABS(endViewPoint.x - startViewPoint.x),
ABS(endViewPoint.y - startViewPoint.y) );
[bandFill fillRect:CPTAlignIntegralRectToUserSpace(context, fillRect) inContext:context];
}
}
}
}
}
-(void)drawBackgroundLimitsInContext:(CGContextRef)context
{
NSArray *limitArray = self.backgroundLimitBands;
if ( limitArray.count > 0 ) {
CPTPlotSpace *thePlotSpace = self.plotSpace;
CPTCoordinate selfCoordinate = self.coordinate;
CPTMutablePlotRange *range = [[thePlotSpace plotRangeForCoordinate:selfCoordinate] mutableCopy];
if ( range ) {
CPTPlotRange *theVisibleRange = self.visibleRange;
if ( theVisibleRange ) {
[range intersectionPlotRange:theVisibleRange];
}
}
CPTCoordinate orthogonalCoordinate = CPTOrthogonalCoordinate(selfCoordinate);
CPTMutablePlotRange *orthogonalRange = [[thePlotSpace plotRangeForCoordinate:orthogonalCoordinate] mutableCopy];
CPTPlotRange *theGridLineRange = self.gridLinesRange;
if ( theGridLineRange ) {
[orthogonalRange intersectionPlotRange:theGridLineRange];
}
NSDecimal startPlotPoint[2];
NSDecimal endPlotPoint[2];
startPlotPoint[orthogonalCoordinate] = orthogonalRange.location;
endPlotPoint[orthogonalCoordinate] = orthogonalRange.end;
for ( CPTLimitBand *band in self.backgroundLimitBands ) {
CPTFill *bandFill = band.fill;
if ( bandFill ) {
CPTMutablePlotRange *bandRange = [band.range mutableCopy];
if ( bandRange ) {
[bandRange intersectionPlotRange:range];
// Start point
startPlotPoint[selfCoordinate] = bandRange.location;
CGPoint startViewPoint = [thePlotSpace plotAreaViewPointForPlotPoint:startPlotPoint numberOfCoordinates:2];
// End point
endPlotPoint[selfCoordinate] = bandRange. end;
CGPoint endViewPoint = [thePlotSpace
plotAreaViewPointForPlotPoin endPlotPoint numberOfCoordinates:2];
// Fill band
CGRect fillRect = CPTRectMake( MIN(startViewPoint.x, endViewPoint.x),
MIN(startViewPoint.y, endViewPoint.y),
ABS(endViewPoint.x - start ViewPoint.x),
ABS(endViewPoint.y - start ViewPoint.y) );
[bandFill fillRect: CPTAlignIntegralRectToUserSpace(context, fillRect) inContext: context] ;
}
}
}
}
}
/// @endcond
#pragma mark - #pragma mark Description
/// @cond
-(NSString *)description
{
CPTPlotRange *range = [self.plotSpace plotRangeForCoordinate:self.coordinate];
CGPoint startViewPoint = [self viewPointForCoordinateDecimalNumber:range.location];
CGPoint endViewPoint = [self viewPointForCoordinateDecimalNumber:range.end];
return [NSString stringWithFormat:@"<%@ with range: %@ viewCoordinates: %@ to
%@>",
[super description],
range,
CPTStringFromPoint( start ViewPoint),
CPTStringFromPoint(endViewPoint)];
}
/// @endcond #pragma mark - #pragma mark Titles
/// @cond
// Center title in the plot range by default
-(NSDecimal)defaultTitleLocation
{
NSDecimal location;
CPTPlotSpace *thePlotSpace = self.plotSpace;
CPTCoordinate theCoordinate = self.coordinate;
CPTPlotRange *axisRange = [thePlotSpace plotRangeForCoordinate:theCoordinate]; if ( axisRange ) {
CPTScaleType scaleType = [thePlotSpace scaleTypeForCoordinate:theCoordinate]; switch ( scaleType ) {
case CPTScaleTypeLinear:
location = axisRange.midPoint;
break;
case CPTScaleTypeLog:
{
double loc = axisRange.locationDouble;
double end = axisRange.endDouble;
if ( (loc > 0.0) && (end >= 0.0) ) {
location = CPTDecimalFromDouble( pow( 10.0, ( log10(loc) + log10(end) ) / 2.0 ) );
}
else {
location = axisRange. midPoint;
}
}
break; default:
location = axisRange. midPoint;
break;
}
}
else {
location = CPTDecimalFromInteger(0);
}
return location;
}
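// Editorial note (not part of the original listing): for a log-scaled axis the default
// title location computed above is the geometric mean of the range endpoints rather
// than the arithmetic midpoint. For an axis range from 1 to 100:
double exampleTitleLocation = pow( 10.0, ( log10(1.0) + log10(100.0) ) / 2.0 ); // 10.0, vs. 50.5 for a linear axis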
/// @endcond
#pragma mark - #pragma mark Accessors
/// @cond
-(void)setAxisConstraints:(CPTConstraints *)newConstraints
{
if ( ! [axisConstraints isEqualToConstraint:newConstraints] ) {
axisConstraints = newConstraints;
[self setNeedsDisplay];
[self setNeedsLayout];
}
}
-(void)setOrthogonalCoordinateDecimal:(NSDecimal)newCoord
{
if ( NSDecimalCompare(&orthogonalCoordinateDecimal, &newCoord) != NSOrderedSame ) {
orthogonalCoordinateDecimal = newCoord;
[self setNeedsDisplay];
[self setNeedsLayout];
}
}
-(void)setCoordinate:(CPTCoordinate)newCoordinate
{
if ( self, coordinate != newCoordinate ) {
[super setCoordinate:newCoordinate];
switch ( newCoordinate ) {
case CPTCoordinateX:
switch ( selflabelAlignment ) {
case CPTAlignmentLeft:
case CPTAlignmentCenter:
case CPTAlignmentRight:
// ok~do nothing
break; default:
selflabelAlignment = CPTAlignmentCenter; break;
}
break; case CPTCoordinateY:
switch ( self.labelAlignment ) {
case CPTAlignmentTop:
case CPTAlignmentMiddle:
case CPTAlignmentBottom:
// ok~do nothing
break; default:
self.labelAlignment = CPTAlignmentMiddle;
break;
}
break; default:
[NSException raise:NSInvalidArgumentException format:@"Invalid coordinate: (unsigned long)newCoordinate];
break;
/// @endcond (¾end
#import "CPTXYAxisSet.h" #import "CPTLineStyle.h"
#import "CPTPathExtensions.h"
#import "CPTUtilities.h"
#import "CPTXYAxis.h" /**
* @brief A set of cartesian (X-Y) axes.
**/
@implementation CPTXYAxisSet
/** @property CPTXYAxis *xAxis
* @brief The x-axis.
**/
@dynamic xAxis; /** @property CPTXYAxis *yAxis
* @brief The y-axis.
**/ @dynamic yAxis;
#pragma mark -
#pragma mark Init/Dealloc
/// @name Initialization
/// @{
/** @brief Initializes a newly allocated CPTXYAxisSet object with the provided frame rectangle.
*
* This is the designated initializer. The @ref axes array
* will contain two new axes with the following properties:
*
* <table>
* <tr><td>@bold{Axis}</td><td>@link CPT Axis: : coordinate coordinate
@endlink</td><td>@link CPTAxis: :tickDirection tickDirection @endlink</td></tr>
* <tr><td>@ref
xAxis</td><td>#CPTCoordinateX</td><td>#CPTSignNegative</td></tr>
* <tr><td>@ref
y Axi s</td><td>#CPTCoordinate Y</td><td>#CPT SignNegative</td></tr>
* </table>
*
* @param newFrame The frame rectangle.
* @return The initialized CPTXYAxisSet object.
**/
-(instancetype)initWithFrame:(CGRect)newFrame
{
if ( (self = [super initWithFrame: newFrame]) ) {
CPTXYAxis *xAxis = [[CPTXYAxis alloc] initWithFrame: newFrame];
xAxis.coordinate = CPTCoordinateX;
xAxis.tickDirection = CPTSignNegative;
CPTXYAxis *yAxis = [[CPTXYAxis alloc] initWithFrame:newFrame];
yAxis.coordinate = CPTCoordinateY;
yAxis.tickDirection = CPTSignNegative;
self.axes = @[xAxis, yAxis];
}
return self;
}
/// @}
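// Editorial example (not part of the original listing): the two default axes created by
// -initWithFrame: are reachable through the xAxis/yAxis accessors defined later in this
// class, so their tick direction and crossing point can be adjusted after creation.
CPTXYAxisSet *axisSet = [[CPTXYAxisSet alloc] initWithFrame:CGRectMake(0.0, 0.0, 200.0, 100.0)];
axisSet.xAxis.tickDirection = CPTSignNone; // center ticks on the x-axis line
axisSet.yAxis.orthogonalCoordinateDecimal = CPTDecimalFromInteger(0);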
#pragma mark - #pragma mark Drawing /// @cond
-(void)renderAsVectorInContext:(CGContextRef)context
{
if ( self.hidden ) {
return;
}
CPTLineStyle *theLineStyle = self.borderLineStyle;
if ( theLineStyle ) {
[super renderAs VectorlnContext: context] ;
CALayer *superlayer = self.superlayer;
CGRect borderRect = CPTAlignRectToUserSpace( context, [self convertRect:superlayer.bounds fromLayer:superlayer] );
[theLineStyle setLineStyleInContext:context];
CGFloat radius = superlayer.cornerRadius;
if ( radius > CPTFloat(0.0) ) {
CGContextBeginPath(context);
AddRoundedRectPath(context, borderRect, radius);
[theLineStyle strokePathInContext:context];
}
else {
[theLineStyle strokeRect:borderRect inContext: context];
}
}
}
/// @endcond
#pragma mark - #pragma mark Layout
/**
* @brief Updates the layout of all sublayers. Sublayers (the axes) fill the plot area frame&rsquo;s bounds.
*
* This is where we do our custom replacement for the Mac-only layout manager and autoresizing mask.
* Subclasses should override this method to provide a different layout of their own sublayers.
**/
-(void)layoutSublayers {
// If we have a border, the default layout will work. Otherwise, the axis set layer has zero size
// and we need to calculate the correct size for the axis layers.
if ( self.borderLineStyle ) {
[super layoutSublayers];
}
else {
CALayer *plotAreaFrame = self.superlayer.superlayer;
CGRect sublayerBounds = [self convertRect:plotAreaFrame.bounds fromLayer:plotAreaFrame];
sublayerBounds.origin = CGPointZero;
CGPoint sublayerPosition = [self convertPoint:self.bounds.origin toLayer:plotAreaFrame];
sublayerPosition = CGPointMake(-sublayerPosition.x, -sublayerPosition.y);
CGRect subLayerFrame = CGRectMake(sublayerPosition.x, sublayerPosition.y, sublayerBounds.size.width, sublayerBounds.size.height);
NSSet *excludedSublayers = [self sublayersExcludedFromAutomaticLayout];
Class layerClass = [CPTLayer class];
for ( CALayer *subLayer in self.sublayers ) {
if ( [subLayer isKindOfClass:layerClass] && ![excludedSublayers containsObject:subLayer] ) {
subLayer.frame = subLayerFrame;
}
}
}
}
#pragma mark - #pragma mark Accessors
/// @cond
-(CPTXYAxis *)xAxis
{
return (CPTXYAxis *)[self axisForCoordinate:CPTCoordinateX atlndex:0];
}
-(CPTXYAxis *)yAxis
{
return (CPTXYAxis *)[self axisForCoordinate:CPTCoordinateY atlndex:0];
}
/// @endcond
@end
#import "CPTXYGraph.h"
#import "CPTXYAxis.h"
#import "CPTXYAxisSet.h"
#import "CPTXYPlotSpace.h"
/// @cond
@interface CPTXYGraph()
@property (nonatomic, readwrite, assign) CPTScaleType xScaleType;
@property (nonatomic, readwrite, assign) CPTScaleType yScaleType;
@end
/// @endcond
#pragma mark -
/**
* @brief A graph using a cartesian (X-Y) plot space.
**/
@implementation CPTXYGraph
/** @property CPTScaleType xScaleType
* @brief The scale type for the x-axis.
**/
@synthesize xScaleType;
/** @property CPTScaleType yScaleType
* @brief The scale type for the y-axis.
**/
@synthesize yScaleType;
#pragma mark -
#pragma mark Init/Dealloc
/** @brief Initializes a newly allocated CPTXYGraph object with the provided frame rectangle and scale types.
*
* This is the designated initializer.
*
* @param newFrame The frame rectangle.
* @param newXScaleType The scale type for the x-axis.
* @param newYScaleType The scale type for the y-axis.
* @return The initialized CPTXYGraph object.
**/ -(instancetype)initWithFrame:(CGRect)newFrame
xScaleType:(CPTScaleType)newXScaleType
yScaleType:(CPTScaleType)newYScaleType
{
if ( (self = [super initWithFrame:newFrame]) ) {
xScaleType = newXScaleType;
yScaleType = newYScaleType;
}
return self;
}
/// @name Initialization
/** @brief Initializes a newly allocated CPTXYGraph object with the provided frame rectangle.
*
* The initialized layer will have the following properties:
* - @link CPTXYPlotSpace: :xScaleType xScaleType @endlink = #CPTScaleTypeLinear
* - @link CPTXYPlotSpace: :yScaleType yScaleType @endlink = #CPTScaleTypeLinear
*
* @param newFrame The frame rectangle.
* @return The initialized CPTXYGraph object.
* @see @link CPTXYGraph: :initWithFrame:xScaleType:yScaleType: - initWithFrame:xScaleType:yScaleType: @endlink
**/
-(instancetype)initWithFrame:(CGRect)newFrame
{
return [self initWithFrame:newFrame xScaleType:CPTScaleTypeLinear
yScaleType:CPTScaleTypeLinear];
}
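// Editorial example (not part of the original listing): the designated initializer above
// lets the two scale types be chosen up front, while the convenience initializer defaults
// both to linear. Here the x-axis is logarithmic and the y-axis linear.
CPTXYGraph *graph = [[CPTXYGraph alloc] initWithFrame:CGRectMake(0.0, 0.0, 320.0, 240.0)
                                           xScaleType:CPTScaleTypeLog
                                           yScaleType:CPTScaleTypeLinear];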
/// @}
/// @cond
-(instancetype)initWithLayer:(id)layer
{
if ( (self = [super initWithLayer:layer]) ) {
CPTXYGraph *theLayer = (CPTXYGraph *)layer; xScaleType = theLayer->xScaleType;
yScaleType = theLayer->yScaleType;
}
return self;
} /// @endcond #pragma mark -
#pragma mark NSCoding Methods /// @cond
-(void)encodeWithCoder:(NSCoder *)coder
{
[super encodeWithCoder: coder];
[coder encodeInteger:self.xScaleType forKey:@"CPTXYGraph.xScaleType"];
[coder encodeInteger:self.yScaleType forKey:@"CPTXYGraph.yScaleType"];
}
-(instancetype)initWithCoder:(NSCoder *)coder
{
if ( (self = [super initWithCoder: coder]) ) {
xScaleType = (CPTScaleType)[coder decodeIntegerForKey:@"CPTXYGraph.xScaleType"];
yScaleType = (CPTScaleType)[coder decodeIntegerForKey:@"CPTXYGraph.yScaleType"];
}
return self;
}
/// @endcond #pragma mark -
#pragma mark Factory Methods /// @cond
-(CPTPlotSpace *)newPlotSpace
{
CPTXYPlotSpace *space = [[CPTXYPlotSpace alloc] init];
space.xScaleType = self.xScaleType;
space.yScaleType = self.yScaleType;
return space;
}
-(CPTAxisSet *)newAxisSet
{
CPTXYAxisSet *newAxisSet = [[CPTXYAxisSet alloc] initWithFrame:self.bounds];
newAxisSet.xAxis.plotSpace = self.defaultPlotSpace;
newAxisSet.yAxis.plotSpace = self.defaultPlotSpace;
return newAxisSet;
}
/// @endcond @end
#import "CPTXYPlotSpace.h"
#import "CPTAnimation.h"
#import "CPTAnimationOperation.h"
#import "CPTAnimationPeriod.h"
#import "CPTAxisSet.h"
#import "CPTExceptions.h"
#import "CPTGraph.h"
#import "CPTGraphHostingView.h"
#import "CPTMutablePlotRange.h"
#import "CPTPlot.h"
#import "CPTPlotArea.h"
#import "CPTPlotAreaFrame.h"
#import "CPTUtilities.h"
#import "NSCoderExtensions.h"
#import <tgmath.h>
/// @cond
@interface CPTXYPlotSpace()
-(CGFloat)viewCoordinateForViewLength:(NSDecimal)viewLength linearPlotRange:(CPTPlotRange *)range plotCoordinateValue:(NSDecimal)plotCoord;
-(CGFloat)viewCoordinateForViewLength:(CGFloat)viewLength linearPlotRange:(CPTPlotRange *)range doublePrecisionPlotCoordinateValue:(double)plotCoord;
-(CGFloat)viewCoordinateForViewLength:(CGFloat)viewLength logPlotRange:(CPTPlotRange *)range doublePrecisionPlotCoordinateValue:(double)plotCoord;
-(NSDecimal)plotCoordinateForViewLength:(NSDecimal)viewLength linearPlotRange:(CPTPlotRange *)range boundsLength:(NSDecimal)boundsLength;
-(double)doublePrecisionPlotCoordinateForViewLength:(CGFloat)viewLength linearPlotRange:(CPTPlotRange *)range boundsLength:(CGFloat)boundsLength;
-(double)doublePrecisionPlotCoordinateForViewLength:(CGFloat)viewLength logPlotRange:(CPTPlotRange *)range boundsLength:(CGFloat)boundsLength;
-(CPTPlotRange *)constrainRange:(CPTPlotRange *)existingRange toGlobalRange:(CPTPlotRange *)globalRange;
-(void)animateRangeForCoordinate:(CPTCoordinate)coordinate shift:(NSDecimal)shift momentumTime:(CGFloat)momentumTime speed:(CGFloat)speed acceleration:(CGFloat)acceleration;
-(CPTPlotRange *)shiftRange:(CPTPlotRange *)oldRange by:(NSDecimal)shift usingMomentum:(BOOL)momentum inGlobalRange:(CPTPlotRange *)globalRange withDisplacement:(CGFloat *)displacement;
-(CGFloat)viewCoordinateForRange:(CPTPlotRange *)range coordinate:(CPTCoordinate)coordinate direction:(BOOL)direction;
CGFloat firstPositiveRoot(CGFloat a, CGFloat b, CGFloat c);
@property (nonatomic, readwrite) BOOL isDragging;
@property (nonatomic, readwrite) CGPoint lastDragPoint;
@property (nonatomic, readwrite) CGPoint lastDisplacement;
@property (nonatomic, readwrite) NSTimelnterval lastDragTime;
@property (nonatomic, readwrite) NSTimelnterval lastDeltaTime;
@property (nonatomic, readwrite, retain) NSMutableArray *animations;
@end
/// @endcond #pragma mark -
/**
* @brief A plot space using a two-dimensional cartesian coordinate system.
*
* The @ref xRange and @ref yRange determine the mapping between data coordinates
* and the screen coordinates in the plot area. The @quote{end} of a range is
* the location plus its length. Note that the length of a plot range can be negative, so
* the end point can have a lesser value than the starting location.
*
* The global ranges constrain the values of the @ref xRange and @ref yRange.
* Whenever the global range is set (non-@nil), the corresponding plot
* range will be adjusted so that it fits in the global range. When a new
* range is set to the plot range, it will be adjusted as needed to fit
* in the global range. This is useful for constraining scrolling, for
* instance.
**/
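// Editorial example (not part of the original listing): a minimal sketch of the range
// mapping described above. With this xRange, data coordinate 0 maps to the left edge of
// the plot area and 10 to the right edge; the globalXRange then clamps any later xRange
// (for example, one produced by scrolling) so that it stays within [0, 100].
CPTXYPlotSpace *plotSpace = [[CPTXYPlotSpace alloc] init];
plotSpace.xRange = [[CPTPlotRange alloc] initWithLocation:CPTDecimalFromInteger(0) length:CPTDecimalFromInteger(10)];
plotSpace.globalXRange = [[CPTPlotRange alloc] initWithLocation:CPTDecimalFromInteger(0) length:CPTDecimalFromInteger(100)];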
@implementation CPTXYPlotSpace /** @property CPTPlotRange *xRange
* @brief The range of the x coordinate. Defaults to a range with @link
CPTPlotRange: location location @endlink zero (@num{0})
* and a @link CPTPlotRange: :length length @endlink of one (@num{ 1 }). * The @link CPTPlotRange: location location @endlink of the @ref xRange
* defines the data coordinate associated with the left edge of the plot area.
* Similarly, the @link CPTPlotRange:: end end @endlink of the @ref xRange
* defines the data coordinate associated with the right edge of the plot area.
**/
@synthesize xRange;
/** @property CPTPlotRange *yRange
* @brief The range of the y coordinate. Defaults to a range with @link
CPTPlotRange: location location @endlink zero (@num{0})
* and a @link CPTPlotRange: length length @endlink of one (@num{ 1 }).
*
* The @link CPTPlotRange: location location @endlink of the @ref yRange
* defines the data coordinate associated with the bottom edge of the plot area.
* Similarly, the @link CPTPlotRange:: end end @endlink of the @ref yRange
* defines the data coordinate associated with the top edge of the plot area.
**/
@synthesize yRange;
/** @property CPTPlotRange *globalXRange
* @brief The global range of the x coordinate to which the @ref xRange is constrained.
*
* If non-@nil, the @ref xRange and any changes to it will
* be adjusted so that it always fits within the @ref globalXRange.
* If @nil (the default), there is no constraint on x.
**/
@synthesize globalXRange;
/** @property CPTPlotRange *globalYRange
* @brief The global range of the y coordinate to which the @ref yRange is constrained.
*
* If non-@nil, the @ref yRange and any changes to it will
* be adjusted so that it always fits within the @ref globalYRange.
* If @nil (the default), there is no constraint on y.
**/
@synthesize globalYRange;
/** @property CPTScaleType xScaleType
* @brief The scale type of the x coordinate. Defaults to #CPTScaleTypeLinear.
**/
@synthesize xScaleType;
/** @property CPTScaleType yScaleType
* @brief The scale type of the y coordinate. Defaults to #CPTScaleTypeLinear.
**/
@synthesize yScaleType; /** @property BOOL allowsMomentum
* @brief If @YES, plot space scrolling in any direction slows down gradually rather than stopping abruptly. Defaults to @NO.
* */
@dynamic allowsMomentum;
/** @property BOOL allowsMomentumX
* @brief If @YES, plot space scrolling in the x-direction slows down gradually rather than stopping abruptly. Defaults to @NO.
**/
@synthesize allowsMomentumX;
/** @property BOOL allowsMomentumY
* @brief If @YES, plot space scrolling in the y-direction slows down gradually rather than stopping abruptly. Defaults to @NO.
**/
@synthesize allowsMomentumY;
/** @property CPTAnimationCurve momentumAnimationCurve
* @brief The animation curve used to stop the motion of the plot ranges when scrolling with momentum. Defaults to #CPTAnimationCurveQuadraticOut.
**/
@synthesize momentumAnimationCurve;
/** @property CPTAnimationCurve bounceAnimationCurve
* @brief The animation curve used to return the plot range back to the global range after scrolling. Defaults to #CPTAnimationCurveQuadraticOut.
**/
@synthesize bounceAnimationCurve;
/** @property CGFloat momentumAcceleration
* @brief Deceleration in pixels/second^2 for momentum scrolling. Defaults to @num{2000.0}.
**/
@synthesize momentumAcceleration;
/** @property CGFloat bounceAcceleration
* @brief Bounce-back acceleration in pixels/second^2 when scrolled past the global range. Defaults to @num{3000.0}.
**/
@synthesize bounceAcceleration;
/** @property CGFloat minimumDisplacementToDrag
* @brief The minimum distance the interaction point must move before the event is considered a drag. Defaults to @num{2.0}.
**/ @synthesize minimumDisplacementToDrag;
@dynamic isDragging;
@synthesize lastDragPoint;
@synthesize lastDisplacement;
@synthesize lastDragTime;
@synthesize lastDeltaTime;
@synthesize animations; #pragma mark -
#pragma mark Init/Dealloc
/// @name Initialization
/// @{
/** @brief Initializes a newly allocated CPTXYPlotSpace object.
*
* The initialized object will have the following properties:
* - @ref xRange = [@num{0}, @num{ 1 }]
* - @ref yRange = [@num{0}, @num{ 1 }]
* - @ref globalXRange = @nil
* - @ref globalYRange = @nil
* - @ref xScaleType = #CPTScaleTypeLinear
* - @ref yScaleType = #CPTScaleTypeLinear
* - @ref allowsMomentum = @NO
* - @ref allowsMomentumX = @NO
* - @ref allowsMomentumY = @NO
* - @ref momentumAnimationCurve = #CPTAnimationCurveQuadraticOut
* - @ref bounceAnimationCurve = #CPTAnimationCurveQuadraticOut
* - @ref momentumAcceleration = @num{2000.0}
* - @ref bounceAcceleration = @num{3000.0}
* - @ref minimumDisplacementToDrag = @num{2.0}
*
* @return The initialized object.
* */
-(instancetype)init
{
if ( (self = [super init]) ) {
xRange = [[CPTPlotRange alloc]
initWithLocation:CPTDecimalFromInteger(0) length:CPTDecimalFromInteger(l)]; yRange = [[CPTPlotRange alloc]
initWithLocation:CPTDecimalFromInteger(0) length:CPTDecimalFromInteger(l)]; globalXRange = nil;
globalYRange = nil;
xScaleType = CPTScaleTypeLinear;
yScaleType = CPTScaleTypeLinear;
lastDragPoint = CGPointZero; lastDisplacement = CGPointZero;
lastDragTime = 0.0;
lastDeltaTime = 0.0;
animations = [[NSMutableArray alloc] init]; allowsMomentumX = NO;
allowsMomentumY = NO;
momentumAnimationCurve = CPTAnimationCurveQuadraticOut;
bounceAnimationCurve = CPTAnimationCurveQuadraticOut;
momentumAcceleration = 2000.0;
bounceAcceleration = 3000.0;
minimumDisplacementToDrag = 2.0;
}
return self;
}
/// @}
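// Editorial example (not part of the original listing): enabling momentum scrolling in
// the x direction only and softening the default 2000 pt/s^2 deceleration listed above.
// The bounce settings govern how the range returns to the global range after overshooting.
CPTXYPlotSpace *momentumSpace = [[CPTXYPlotSpace alloc] init];
momentumSpace.allowsMomentumX = YES;
momentumSpace.momentumAcceleration = CPTFloat(1000.0);
momentumSpace.bounceAnimationCurve = CPTAnimationCurveQuadraticOut;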
#pragma mark - #pragma mark NSCoding Methods
/// @cond
-(void)encodeWithCoder:(NSCoder *)coder
{
[super encodeWithCoder: coder];
[coder encodeObject:self.xRange forKey:@"CPTXYPlotSpace.xRange"];
[coder encodeObject:self.yRange forKey:@"CPTXYPlotSpace.yRange"];
[coder encodeObject:self.globalXRange forKey:@"CPTXYPlotSpace.globalXRange"];
[coder encodeObject:self.globalYRange forKey:@"CPTXYPlotSpace.globalYRange"];
[coder encodeInteger:self.xScaleType forKey:@"CPTXYPlotSpace.xScaleType"];
[coder encodeInteger:self.yScaleType forKey:@"CPTXYPlotSpace.yScaleType"];
[coder encodeBool:self.allowsMomentumX forKey:@"CPTXYPlotSpace.allowsMomentumX"];
[coder encodeBool:self.allowsMomentumY forKey:@"CPTXYPlotSpace.allowsMomentumY"];
[coder encodeInt:self.momentumAnimationCurve forKey:@"CPTXYPlotSpace.momentumAnimationCurve"];
[coder encodeInt:self.bounceAnimationCurve forKey:@"CPTXYPlotSpace.bounceAnimationCurve"];
[coder encodeCGFloat:self.momentumAcceleration forKey:@"CPTXYPlotSpace.momentumAcceleration"];
[coder encodeCGFloat:self.bounceAcceleration forKey:@"CPTXYPlotSpace.bounceAcceleration"];
[coder encodeCGFloat:self.minimumDisplacementToDrag forKey:@"CPTXYPlotSpace.minimumDisplacementToDrag"];
// No need to archive these properties:
// lastDragPoint
// lastDisplacement
// lastDragTime
// lastDeltaTime
// animations
}
-(instancetype)initWithCoder:(NSCoder *)coder
{
if ( (self = [super initWithCoder: coder]) ) {
xRange = [[coder decodeObjectForKey:@"CPTXYPlotSpace.xRange"] copy];
yRange = [[coder decodeObjectForKey:@"CPTXYPlotSpace.yRange"] copy];
globalXRange = [[coder decodeObjectForKey:@"CPTXYPlotSpace.globalXRange"] copy];
globalYRange = [[coder decodeObjectForKey:@"CPTXYPlotSpace.globalYRange"] copy];
xScaleType = (CPTScaleType)[coder decodeIntegerForKey:@"CPTXYPlotSpace.xScaleType"];
yScaleType = (CPTScaleType)[coder decodeIntegerForKey:@"CPTXYPlotSpace.yScaleType"];
if ( [coder containsValueForKey:@"CPTXYPlotSpace.allowsMomentum"] ) {
self.allowsMomentum = [coder decodeBoolForKey:@"CPTXYPlotSpace.allowsMomentum"];
}
else {
allowsMomentumX = [coder decodeBoolForKey:@"CPTXYPlotSpace.allowsMomentumX"];
allowsMomentumY = [coder decodeBoolForKey:@"CPTXYPlotSpace.allowsMomentumY"];
}
momentumAnimationCurve = (CPTAnimationCurve)[coder decodeIntForKey:@"CPTXYPlotSpace.momentumAnimationCurve"];
bounceAnimationCurve = (CPTAnimationCurve)[coder decodeIntForKey:@"CPTXYPlotSpace.bounceAnimationCurve"];
momentumAcceleration = [coder decodeCGFloatForKey:@"CPTXYPlotSpace.momentumAcceleration"];
bounceAcceleration = [coder decodeCGFloatForKey:@"CPTXYPlotSpace.bounceAcceleration"];
minimumDisplacementToDrag = [coder decodeCGFloatForKey:@"CPTXYPlotSpace.minimumDisplacementToDrag"];
lastDragPoint = CGPointZero;
lastDisplacement = CGPointZero;
lastDragTime = 0.0;
lastDeltaTime = 0.0;
animations = [[NSMutableArray alloc] init];
}
return self;
}
/// @endcond
#pragma mark - #pragma mark Ranges
/// @cond
-(void)setPlotRange:(CPTPlotRange *)newRange forCoordinate:(CPTCoordinate)coordinate
{
switch ( coordinate ) {
case CPTCoordinateX:
self.xRange = newRange;
break; case CPTCoordinateY:
self.yRange = newRange;
break; default:
// invalid coordinate—do nothing
break;
}
}
-(CPTPlotRange *)plotRangeForCoordinate:(CPTCoordinate)coordinate
{
CPTPlotRange *theRange = nil; switch ( coordinate ) {
case CPTCoordinateX:
theRange = self.xRange;
break; case CPTCoordinateY:
theRange = self.yRange;
break; default:
// invalid coordinate
break; }
return theRange;
}
-(void)setScaleType:(CPTScaleType)newType forCoordinate:(CPTCoordinate)coordinate
{
switch ( coordinate ) {
case CPTCoordinateX:
self.xScaleType = newType;
break; case CPTCoordinateY:
self.yScaleType = newType;
break; default:
// invalid coordinate—do nothing
break;
}
}
-(CPTScaleType)scaleTypeForCoordinate:(CPTCoordinate)coordinate
{
CPTScaleType theScaleType = CPTScaleTypeLinear; switch ( coordinate ) {
case CPTCoordinateX:
theScaleType = self.xScaleType;
break; case CPTCoordinateY:
theScaleType = self.yScaleType;
break; default:
// invalid coordinate
break;
}
return theScaleType;
}
-(void)setXRange:(CPTPlotRange *)range
{
NSParameterAssert(range);
if ( ![range isEqualToRange:xRange] ) {
CPTPlotRange *constrainedRange;
if ( self.allowsMomentumX ) {
constrainedRange = range;
}
else {
constrainedRange = [self constrainRange:range toGlobalRange:self.globalXRange];
}
id<CPTPlotSpaceDelegate> theDelegate = self.delegate;
if ( [theDelegate respondsToSelector:@selector(plotSpace:willChangePlotRangeTo:forCoordinate:)] ) {
constrainedRange = [theDelegate plotSpace:self willChangePlotRangeTo:constrainedRange forCoordinate:CPTCoordinateX];
}
if ( ! [constrainedRange isEqualToRange:xRange] ) {
CGFloat displacement = self.lastDisplacement.x;
BOOL isScrolling = NO;
if ( xRange && constrainedRange ) {
isScrolling = !CPTDecimalEquals(constrainedRange.location, xRange.location) && CPTDecimalEquals(constrainedRange.length, xRange.length);
if ( isScrolling && ( displacement == CPTFloat(0.0) ) ) {
CPTGraph *theGraph = self.graph;
CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;
if ( plotArea ) {
NSDecimal rangeLength = constrainedRange.length;
if ( !CPTDecimalEquals( rangeLength, CPTDecimalFromInteger(0) ) ) {
NSDecimal diff = CPTDecimalDivide(CPTDecimalSubtract(constrainedRange.location, xRange.location), rangeLength);
displacement = plotArea.bounds.size.width * CPTDecimalCGFloatValue(diff);
}
}
}
}
xRange = [constrainedRange copy];
[[NSNotificationCenter defaultCenter] postNotificationName:CPTPlotSpaceCoordinateMappingDidChangeNotification
object:self
userInfo:@{ CPTPlotSpaceCoordinateKey: @(CPTCoordinateX),
CPTPlotSpaceScrollingKey: @(isScrolling),
CPTPlotSpaceDisplacementKey: @(displacement) }
];
if ( [theDelegate respondsToSelector:@selector(plotSpace:didChangePlotRangeForCoordinate:)] ) {
[theDelegate plotSpace:self didChangePlotRangeForCoordinate:CPTCoordinateX];
}
CPTGraph *theGraph = self.graph;
if ( theGraph ) {
[[NSNotificationCenter defaultCenter] postNotificationName:CPTGraphNeedsRedrawNotification object:theGraph];
}
}
}
}
-(void)setYRange:(CPTPlotRange *)range
{
NSParameterAssert(range); if ( ! [range isEqualToRange:yRange] ) {
CPTPlotRange *constrainedRange; if ( selfallowsMomentumY ) {
constrainedRange = range;
}
else {
constrainedRange = [self constrainRange:range toGlobalRange:self.globalYRange];
}
id<CPTPlotSpaceDelegate> theDelegate = self, delegate;
if ( [theDelegate
respondsToSelector:@selector(plotSpace:willChangePlotRangeTo:forCoordinate:)] ) { constrainedRange = [theDelegate plotSpace:self
willChangePlotRangeToxonstrainedRange forCoordinate:CPTCoordinateY];
} if ( ! [constrainedRange isEqualToRange:yRange] ) {
CGFloat displacement = self.lastDisplacement.y;
BOOL isScrolling = NO;
if ( yRange && constrainedRange ) {
isScrolling = !CPTDecimalEquals(constrainedRange.location, yRange.location) && CPTDecimalEquals(constrainedRange.length, yRange.length);
if ( isScrolling && ( displacement == CPTFloat(0.0) ) ) {
CPTGraph *theGraph = self.graph;
CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;
if ( plotArea ) {
NSDecimal rangeLength = constrainedRange.length;
if ( !CPTDecimalEquals( rangeLength, CPTDecimalFromInteger(0) ) ) {
NSDecimal diff = CPTDecimalDivide(CPTDecimalSubtract(constrainedRange.location, yRange.location), rangeLength);
displacement = plotArea.bounds.size.height * CPTDecimalCGFloatValue(diff);
}
}
}
}
yRange = [constrainedRange copy];
[[NSNotificationCenter defaultCenter]
postNotificationName:CPTPlotSpaceCoordinateMappingDidChangeNotification
object: self
userInfo:@{ CPTPlotSpaceCoordinateKey:
@(CPTCoordinateY),
CPTPlotSpaceScrollingKey: @(isScrolling),
CPTPlotSpaceDisplacementKey: @(displacement) }
];
if ( [theDelegate respondsToSelector:@selector(plotSpace:didChangePlotRangeForCoordinate:)] ) {
[theDelegate plotSpace:self didChangePlotRangeForCoordinate:CPTCoordinateY];
}
CPTGraph *theGraph = self.graph;
if ( theGraph ) {
[[NSNotificationCenter defaultCenter]
postNotificationName:CPTGraphNeedsRedrawNotification
object: theGraph];
}
}
}
}
-(CPTPlotRange *)constrainRange:(CPTPlotRange *)existingRange
toGlobalRange:(CPTPlotRange *)globalRange
{
if ( ! globalRange ) {
return existingRange;
}
if ( ! existingRange ) {
return nil;
}
if ( CPTDecimalGreaterThanOrEqualTo(existingRange. length, globalRange. length) ) { return [globalRange copy];
}
else {
CPTMutablePlotRange *newRange = [existingRange mutableCopy];
[newRange shiftEndToFitInRange:globalRange];
[newRange shiftLocationToFitInRange:globalRange];
return newRange;
}
}
-(void)animateRangeForCoordinate:(CPTCoordinate)coordinate shift:(NSDecimal)shift momentumTime:(CGFloat)momentumTime speed:(CGFloat)speed acceleration:(CGFloat)acceleration
{
NSMutableArray *animationArray = self.animations;
CPTAnimationOperation *op;
NSString *property = nil;
CPTPlotRange *oldRange = nil;
CPTPlotRange *globalRange = nil; switch ( coordinate ) {
case CPTCoordinateX:
property = @"xRange";
oldRange = self.xRange;
globalRange = self.globalXRange;
break; case CPTCoordinateY:
property = @"yRange";
oldRange = self.yRange;
globalRange = self.globalYRange;
break; default:
break;
}
CPTMutablePlotRange *newRange = [oldRange mutableCopy];
CGFloat bounceDelay = CPTFloat(0.0);
NSDecimal zero = CPTDecimalFromInteger(0);
BOOL hasShift = !CPTDecimalEquals(shift, zero);
if ( hasShift ) {
newRange.location = CPTDecimalAdd(newRange.location, shift);
op = [CPTAnimation animate:self
property: property
fromPlotRange:oldRange
toPlotRange:newRange
duration:momentumTime
animationCurve:self.momentumAnimationCurve
delegate:self];
[animationArray addObject:op];
bounceDelay = momentumTime;
}
if ( globalRange ) {
CPTPlotRange *constrainedRange = [self constrainRange:newRange toGlobalRange:globalRange];
if ( ![newRange isEqualToRange:constrainedRange] && ![globalRange containsRange:newRange] ) {
BOOL direction = ( CPTDecimalGreaterThan(shift, zero) && CPTDecimalGreaterThan(oldRange.length, zero) ) ||
( CPTDecimalLessThan(shift, zero) && CPTDecimalLessThan(oldRange.length, zero) );
// decelerate at the global range
if ( hasShift ) {
CGFloat brakingDelay = CPTFloat(NAN);
if ( [globalRange containsRange:oldRange] ) {
// momentum started inside the global range; coast until we hit the global range
CGFloat globalPoint = [self viewCoordinateForRange:globalRange coordinate:coordinate direction:direction];
CGFloat oldPoint = [self viewCoordinateForRange:oldRange coordinate:coordinate direction:direction];
CGFloat brakingOffset = globalPoint - oldPoint;
brakingDelay = firstPositiveRoot(acceleration, speed, brakingOffset);
if ( !isnan(brakingDelay) ) {
speed -= brakingDelay * acceleration;
// slow down quickly
while ( momentumTime > CPTFloat(0.1) ) {
acceleration *= CPTFloat(2.0);
momentumTime = speed / (CPTFloat(2.0) * acceleration);
}
CGFloat distanceTraveled = speed * momentumTime - CPTFloat(0.5) * acceleration * momentumTime * momentumTime;
CGFloat brakingLength = globalPoint - distanceTraveled;
CGPoint brakingPoint = CGPointZero;
switch ( coordinate ) {
case CPTCoordinateX:
brakingPoint = CPTPointMake(brakingLength, 0.0);
break; case CPTCoordinateY:
brakingPoint = CPTPointMake(0.0, brakingLength);
break; default:
break;
}
NSDecimal newPoint[2];
[self plotPoint:newPoint numberOfCoordinates:2 forPlotAreaViewPoint:brakingPoint];
NSDecimal brakingShift = CPTDecimalSubtract(newPoint[coordinate], direction ? globalRange.end : globalRange.location);
[newRange shiftEndToFitInRange:globalRange];
[newRange shiftLocationToFitInRange:globalRange];
newRange.location = CPTDecimalAdd(newRange.location, brakingShift);
}
}
else {
// momentum started outside the global range
brakingDelay = CPTFloat(0.0);
// slow down quickly
while ( momentumTime > CPTFloat(0.1) ) {
momentumTime *= CPTFloat(0.5);
shift = CPTDecimalDivide( shift, CPTDecimalFromInteger(2) );
}
newRange = [oldRange mutableCopy];
newRange.location = CPTDecimalAdd(newRange.location, shift);
}
if ( !isnan(brakingDelay) ) {
op = [CPTAnimation animate:self
property:property
fromPlotRange:constrainedRange
toPlotRange:newRange
duration:momentumTime
withDelay:brakingDelay
animationCurve:self.momentumAnimationCurve
delegate:self];
[animationArray addObject:op];
bounceDelay = momentumTime + brakingDelay;
}
}
// bounce back to the global range
CGFloat newPoint = [self viewCoordinateForRange:newRange coordinate:coordinate direction:!direction];
CGFloat constrainedPoint = [self viewCoordinateForRange:constrainedRange coordinate:coordinate direction:!direction];
CGFloat offset = constrainedPoint - newPoint;
CGFloat bounceTime = sqrt( ABS(offset) / self.bounceAcceleration );
op = [CPTAnimation animate:self
property:property
fromPlotRange:newRange
toPlotRange:constrainedRange
duration:bounceTime
withDelay:bounceDelay
animationCurve:self.bounceAnimationCurve
delegate:self];
[animationArray addObject:op];
}
}
}
-(CGFloat)viewCoordinateForRange:(CPTPlotRange *)range
coordinate:(CPTCoordinate)coordinate direction:(BOOL)direction
{
CPTCoordinate orthogonalCoordinate = CPTOrthogonalCoordinate(coordinate);
NSDecimal point[2];
point[coordinate] = (direction ? range.maxLimit : range.minLimit);
point[orthogonalCoordinate] = CPTDecimalFromInteger(1);
CGPoint viewPoint = [self plotAreaViewPointForPlotPoint:point numberOfCoordinates:2];
CGFloat pointCoordinate = CPTFloat(NAN);
switch ( coordinate ) {
case CPTCoordinateX:
pointCoordinate = viewPoint.x;
break; case CPTCoordinateY:
pointCoordinate = viewPoint.y;
break; default:
break;
}
return pointCoordinate;
}
// return NAN if no positive roots
CGFloat firstPositiveRoot(CGFloat a, CGFloat b, CGFloat c)
{
CGFloat root = CPTFloat(NAN);
CGFloat discriminant = sqrt(b * b - CPTFloat(4.0) * a * c);
CGFloat root1 = (-b + discriminant) / (CPTFloat(2.0) * a);
CGFloat root2 = (-b - discriminant) / (CPTFloat(2.0) * a);
if ( !isnan(root1) && !isnan(root2) ) {
if ( root1 >= CPTFloat(0.0) ) {
root = root1;
}
if ( ( root2 >= CPTFloat(0.0) ) && ( isnan(root) || (root2 < root) ) ) {
root = root2;
}
}
return root;
}
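// Editorial example (not part of the original listing): firstPositiveRoot() returns the
// smallest non-negative solution of a*t^2 + b*t + c = 0, or NAN when no such root exists.
// For t^2 - 3t + 2 = 0 the roots are 1 and 2, so the call below evaluates to 1.
CGFloat exampleRoot = firstPositiveRoot( CPTFloat(1.0), CPTFloat(-3.0), CPTFloat(2.0) ); // 1.0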
-(void)setGlobalXRange:(CPTPlotRange *)newRange
{
if ( ! [newRange isEqualToRange:globalXRange] ) {
globalXRange = [newRange copy];
self.xRange = [self constrainRange:self.xRange toGlobalRange:globalXRange];
}
}
-(void)setGlobalYRange:(CPTPlotRange *)newRange
{
if ( ! [newRange isEqualToRange:globalYRange] ) {
globalYRange = [newRange copy];
self.yRange = [self constrainRange:self.yRange toGlobalRange:globalYRange];
}
}
-(void)scaleToFitPlots:(NSArray *)plots
{
if ( plots. count == 0 ) {
return;
}
// Determine union of ranges
CPTMutablePlotRange *unionXRange = nil;
CPTMutablePlotRange *unionYRange = nil;
for ( CPTPlot *plot in plots ) {
CPTPlotRange *currentXRange = [plot plotRangeForCoordinate:CPTCoordinateX];
CPTPlotRange *currentYRange = [plot plotRangeForCoordinate:CPTCoordinateY];
if ( !unionXRange ) {
unionXRange = [currentXRange mutableCopy];
}
if ( !unionYRange ) {
unionYRange = [currentYRange mutableCopy];
}
[unionXRange unionPlotRange:currentXRange];
[unionYRange unionPlotRange:currentYRange];
}
// Set range
NSDecimal zero = CPTDecimalFromInteger(0);
if ( unionXRange ) {
if ( CPTDecimalEquals(unionXRange.length, zero) ) {
[unionXRange unionPlotRange:self.xRange];
}
self.xRange = unionXRange;
}
if ( unionYRange ) {
if ( CPTDecimalEquals(unionYRange.length, zero) ) {
[unionYRange unionPlotRange:self.yRange];
}
self.yRange = unionYRange;
}
}
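// Editorial example (not part of the original listing): rescaling the plot space so that
// the union of the data ranges of a set of plots is visible. plotSpace, scatterPlot and
// barPlot are hypothetical, previously configured objects used only for illustration.
[plotSpace scaleToFitPlots:@[scatterPlot, barPlot]];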
-(void)setXScaleType:(CPTScaleType)newScaleType
{
if ( newScaleType != xScaleType ) {
xScaleType = newScaleType;
[[NSNotificationCenter defaultCenter]
postNotificationName:CPTPlotSpaceCoordinateMappingDidChangeNotification object: self
userInfo:@{ CPTPlotSpaceCoordinateKey:
@(CPTCoordinateX) }
];
CPTGraph *theGraph = self.graph;
if ( theGraph ) {
[[NSNotificationCenter defaultCenter]
postNotificationName:CPTGraphNeedsRedrawNotification
object: theGraph];
}
}
}
-(void)setYScaleType:(CPTScaleType)newScaleType
{
if ( newScaleType != yScaleType ) {
yScaleType = newScaleType; [[NSNotificationCenter defaultCenter]
postNotificationName:CPTPlotSpaceCoordinateMappingDidChangeNotification object: self
userInfo:@{ CPTPlotSpaceCoordinateKey:
@(CPTCoordinateY) }
];
CPTGraph *theGraph = self.graph;
if ( theGraph ) {
[[NSNotificationCenter defaultCenter]
postNotificationName:CPTGraphNeedsRedrawNotification
object:theGraph];
}
}
}
/// @endcond
#pragma mark -
#pragma mark Point Conversion (private utilities) /// @cond // Linear
-(CGFloat)viewCoordinateForViewLength:(NSDecimal)viewLength linearPlotRange:(CPTPlotRange *)range plotCoordinateValue:(NSDecimal)plotCoord
{
if ( !range ) {
return CPTFloat(0.0);
}
NSDecimal factor = CPTDecimalDivide(CPTDecimalSubtract(plotCoord, range.location), range.length);
if ( NSDecimalIsNotANumber(&factor) ) {
factor = CPTDecimalFromInteger(0);
}
NSDecimal viewCoordinate = CPTDecimalMultiply(viewLength, factor);
return CPTDecimalCGFloatValue(viewCoordinate);
}
-(CGFloat)viewCoordinateForViewLength:(CGFloat)viewLength linearPlotRange:(CPTPlotRange *)range doublePrecisionPlotCoordinateValue:(double)plotCoord
{
if ( !range || (range.lengthDouble == 0.0) ) {
return CPTFloat(0.0);
}
return viewLength * (CGFloat)( (plotCoord - range.locationDouble) / range.lengthDouble );
}
-(NSDecimal)plotCoordinateForViewLength:(NSDecimal)viewLength linearPlotRange:(CPTPlotRange *)range boundsLength:(NSDecimal)boundsLength
{
const NSDecimal zero = CPTDecimalFromInteger(0);
if ( CPTDecimalEquals(boundsLength, zero) ) {
return zero;
}
NSDecimal location = range.location;
NSDecimal length = range.length;
NSDecimal coordinate;
NSDecimalDivide(&coordinate, &viewLength, &boundsLength, NSRoundPlain); NSDecimalMultiply(&coordinate, &coordinate, &length, NSRoundPlain);
NSDecimalAdd(&coordinate, &coordinate, &location, NSRoundPlain); return coordinate;
}
-(double)doublePrecisionPlotCoordinateForViewLength:(CGFloat)viewLength linearPlotRange:(CPTPlotRange *)range boundsLength:(CGFloat)boundsLength
{
if ( boundsLength == 0.0 ) {
return 0.0;
}
double coordinate = viewLength / boundsLength;
coordinate *= range.lengthDouble;
coordinate += range.locationDouble;
return coordinate;
}
// Log (only one version since there are no transcendental functions for NSDecimal)
-(CGFloat)viewCoordinateForViewLength:(CGFloat)viewLength logPlotRange:(CPTPlotRange *)range doublePrecisionPlotCoordinateValue:(double)plotCoord
{
if ( (range.minLimitDouble <= 0.0) || (range.maxLimitDouble <= 0.0) || (plotCoord <= 0.0) ) {
return CPTFloat(0.0);
}
double logLoc = log10(range.locationDouble);
double logCoord = log10(plotCoord);
double logEnd = log10(range.endDouble);
return viewLength * (CGFloat)( (logCoord - logLoc) / (logEnd - logLoc) );
}
-(double)doublePrecisionPlotCoordinateForViewLength:(CGFloat)viewLength logPlotRange:(CPTPlotRange *)range boundsLength:(CGFloat)boundsLength
{
if ( boundsLength == 0.0 ) {
return 0.0;
}
double logLoc = loglO(range.locationDouble);
double logEnd = loglO(range.endDouble); double coordinate = viewLength * (logEnd - logLoc) / boundsLength + logLoc; return pow(10.0, coordinate);
}
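For logarithmic scales the same proportion is applied to log10 of the values, so equal decades occupy equal shares of the view length. A short plain-C sketch under that assumption; the function name is illustrative.

#include <math.h>

// View coordinate proportional to the position of log10(plotCoord)
// between log10(location) and log10(end), as in the method above.
static double ViewForLogPlotCoordinate(double viewLength, double location, double end, double plotCoord)
{
    double logLoc   = log10(location);
    double logEnd   = log10(end);
    double logCoord = log10(plotCoord);
    return viewLength * (logCoord - logLoc) / (logEnd - logLoc);
}

// Example: a range from 1 to 1000 in a 300-point-wide area places 10 at x = 100
// and 100 at x = 200, one third of the width per decade.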
/// @endcond
#pragma mark -
#pragma mark Point Conversion
/// @cond
-(NSUInteger)numberOfCoordinates
{
return 2;
}
// Plot area view point for plot point

-(CGPoint)plotAreaViewPointForPlotPoint:(NSDecimal *)plotPoint numberOfCoordinates:(NSUInteger)count
{
    CGPoint viewPoint = [super plotAreaViewPointForPlotPoint:plotPoint numberOfCoordinates:count];

    CGSize layerSize;
    CPTGraph *theGraph    = self.graph;
    CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;

    if ( plotArea ) {
        layerSize = plotArea.bounds.size;
    }
    else {
        return viewPoint;
    }

    switch ( self.xScaleType ) {
        case CPTScaleTypeLinear:
        case CPTScaleTypeCategory:
            viewPoint.x = [self viewCoordinateForViewLength:plotArea.widthDecimal linearPlotRange:self.xRange plotCoordinateValue:plotPoint[CPTCoordinateX]];
            break;

        case CPTScaleTypeLog:
        {
            double x = CPTDecimalDoubleValue(plotPoint[CPTCoordinateX]);
            viewPoint.x = [self viewCoordinateForViewLength:layerSize.width logPlotRange:self.xRange doublePrecisionPlotCoordinateValue:x];
        }
        break;

        default:
            [NSException raise:CPTException format:@"Scale type not supported in CPTXYPlotSpace"];
    }

    switch ( self.yScaleType ) {
        case CPTScaleTypeLinear:
        case CPTScaleTypeCategory:
            viewPoint.y = [self viewCoordinateForViewLength:plotArea.heightDecimal linearPlotRange:self.yRange plotCoordinateValue:plotPoint[CPTCoordinateY]];
            break;

        case CPTScaleTypeLog:
        {
            double y = CPTDecimalDoubleValue(plotPoint[CPTCoordinateY]);
            viewPoint.y = [self viewCoordinateForViewLength:layerSize.height logPlotRange:self.yRange doublePrecisionPlotCoordinateValue:y];
        }
        break;

        default:
            [NSException raise:CPTException format:@"Scale type not supported in CPTXYPlotSpace"];
    }

    return viewPoint;
}
-(CGPoint)plotAreaViewPointForDoublePrecisionPlotPoint:(double *)plotPoint numberOfCoordinates:(NSUInteger)count
{
    CGPoint viewPoint = [super plotAreaViewPointForDoublePrecisionPlotPoint:plotPoint numberOfCoordinates:count];

    CGSize layerSize;
    CPTGraph *theGraph    = self.graph;
    CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;

    if ( plotArea ) {
        layerSize = plotArea.bounds.size;
    }
    else {
        return viewPoint;
    }

    switch ( self.xScaleType ) {
        case CPTScaleTypeLinear:
        case CPTScaleTypeCategory:
            viewPoint.x = [self viewCoordinateForViewLength:layerSize.width linearPlotRange:self.xRange doublePrecisionPlotCoordinateValue:plotPoint[CPTCoordinateX]];
            break;

        case CPTScaleTypeLog:
            viewPoint.x = [self viewCoordinateForViewLength:layerSize.width logPlotRange:self.xRange doublePrecisionPlotCoordinateValue:plotPoint[CPTCoordinateX]];
            break;

        default:
            [NSException raise:CPTException format:@"Scale type not supported in CPTXYPlotSpace"];
    }

    switch ( self.yScaleType ) {
        case CPTScaleTypeLinear:
        case CPTScaleTypeCategory:
            viewPoint.y = [self viewCoordinateForViewLength:layerSize.height linearPlotRange:self.yRange doublePrecisionPlotCoordinateValue:plotPoint[CPTCoordinateY]];
            break;

        case CPTScaleTypeLog:
            viewPoint.y = [self viewCoordinateForViewLength:layerSize.height logPlotRange:self.yRange doublePrecisionPlotCoordinateValue:plotPoint[CPTCoordinateY]];
            break;

        default:
            [NSException raise:CPTException format:@"Scale type not supported in CPTXYPlotSpace"];
    }

    return viewPoint;
}
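A short usage sketch of the conversion above, assuming a configured CPTXYPlotSpace and the CPTCoordinateX / CPTCoordinateY indices from the Core Plot headers; the helper name is illustrative and not part of the listing.

#import "CPTXYPlotSpace.h"

// Converts the data point (x, y) into plot-area drawing coordinates
// using the double-precision method defined above.
static CGPoint ViewPointForDataPoint(CPTXYPlotSpace *plotSpace, double x, double y)
{
    double dataPoint[2];
    dataPoint[CPTCoordinateX] = x;
    dataPoint[CPTCoordinateY] = y;
    // The result is expressed in the coordinate system of the graph's plot area layer.
    return [plotSpace plotAreaViewPointForDoublePrecisionPlotPoint:dataPoint numberOfCoordinates:2];
}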
// Plot point for view point

-(void)plotPoint:(NSDecimal *)plotPoint numberOfCoordinates:(NSUInteger)count forPlotAreaViewPoint:(CGPoint)point
{
    [super plotPoint:plotPoint numberOfCoordinates:count forPlotAreaViewPoint:point];

    CGSize boundsSize;
    CPTGraph *theGraph    = self.graph;
    CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;

    if ( plotArea ) {
        boundsSize = plotArea.bounds.size;
    }
    else {
        NSDecimal zero = CPTDecimalFromInteger(0);
        plotPoint[CPTCoordinateX] = zero;
        plotPoint[CPTCoordinateY] = zero;
        return;
    }

    switch ( self.xScaleType ) {
        case CPTScaleTypeLinear:
        case CPTScaleTypeCategory:
            plotPoint[CPTCoordinateX] = [self plotCoordinateForViewLength:CPTDecimalFromCGFloat(point.x) linearPlotRange:self.xRange boundsLength:plotArea.widthDecimal];
            break;

        case CPTScaleTypeLog:
            plotPoint[CPTCoordinateX] = CPTDecimalFromDouble([self doublePrecisionPlotCoordinateForViewLength:point.x logPlotRange:self.xRange boundsLength:boundsSize.width]);
            break;

        default:
            [NSException raise:CPTException format:@"Scale type not supported in CPTXYPlotSpace"];
    }

    switch ( self.yScaleType ) {
        case CPTScaleTypeLinear:
        case CPTScaleTypeCategory:
            plotPoint[CPTCoordinateY] = [self plotCoordinateForViewLength:CPTDecimalFromCGFloat(point.y) linearPlotRange:self.yRange boundsLength:plotArea.heightDecimal];
            break;

        case CPTScaleTypeLog:
            plotPoint[CPTCoordinateY] = CPTDecimalFromDouble([self doublePrecisionPlotCoordinateForViewLength:point.y logPlotRange:self.yRange boundsLength:boundsSize.height]);
            break;

        default:
            [NSException raise:CPTException format:@"Scale type not supported in CPTXYPlotSpace"];
    }
}
-(void)doublePrecisionPlotPoint:(double *)plotPoint numberOfCoordinates:(NSUInteger)count forPlotAreaViewPoint:(CGPoint)point
{
    [super doublePrecisionPlotPoint:plotPoint numberOfCoordinates:count forPlotAreaViewPoint:point];

    CGSize boundsSize;
    CPTGraph *theGraph    = self.graph;
    CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;

    if ( plotArea ) {
        boundsSize = plotArea.bounds.size;
    }
    else {
        plotPoint[CPTCoordinateX] = 0.0;
        plotPoint[CPTCoordinateY] = 0.0;
        return;
    }

    switch ( self.xScaleType ) {
        case CPTScaleTypeLinear:
        case CPTScaleTypeCategory:
            plotPoint[CPTCoordinateX] = [self doublePrecisionPlotCoordinateForViewLength:point.x linearPlotRange:self.xRange boundsLength:boundsSize.width];
            break;

        case CPTScaleTypeLog:
            plotPoint[CPTCoordinateX] = [self doublePrecisionPlotCoordinateForViewLength:point.x logPlotRange:self.xRange boundsLength:boundsSize.width];
            break;

        default:
            [NSException raise:CPTException format:@"Scale type not supported in CPTXYPlotSpace"];
    }

    switch ( self.yScaleType ) {
        case CPTScaleTypeLinear:
        case CPTScaleTypeCategory:
            plotPoint[CPTCoordinateY] = [self doublePrecisionPlotCoordinateForViewLength:point.y linearPlotRange:self.yRange boundsLength:boundsSize.height];
            break;

        case CPTScaleTypeLog:
            plotPoint[CPTCoordinateY] = [self doublePrecisionPlotCoordinateForViewLength:point.y logPlotRange:self.yRange boundsLength:boundsSize.height];
            break;

        default:
            [NSException raise:CPTException format:@"Scale type not supported in CPTXYPlotSpace"];
    }
}
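The inverse direction, a usage sketch assuming a configured CPTXYPlotSpace; the helper name is illustrative. It converts a point already expressed in plot-area coordinates (for example, a touch location converted with -plotAreaViewPointForEvent: below) back into data coordinates.

#import "CPTXYPlotSpace.h"

// Converts a plot-area view point back into data coordinates.
static void DataPointForViewPoint(CPTXYPlotSpace *plotSpace, CGPoint viewPoint, double *outX, double *outY)
{
    double dataPoint[2];
    [plotSpace doublePrecisionPlotPoint:dataPoint numberOfCoordinates:2 forPlotAreaViewPoint:viewPoint];
    *outX = dataPoint[CPTCoordinateX];
    *outY = dataPoint[CPTCoordinateY];
}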
// Plot area view point for event

-(CGPoint)plotAreaViewPointForEvent:(CPTNativeEvent *)event
{
    CGPoint plotAreaViewPoint = CGPointZero;

    CPTGraph *theGraph                  = self.graph;
    CPTGraphHostingView *theHostingView = theGraph.hostingView;
    CPTPlotArea *thePlotArea            = theGraph.plotAreaFrame.plotArea;

    if ( theHostingView && thePlotArea ) {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
        CGPoint interactionPoint = [[[event touchesForView:theHostingView] anyObject] locationInView:theHostingView];
        if ( theHostingView.collapsesLayers ) {
            interactionPoint.y = theHostingView.frame.size.height - interactionPoint.y;
            plotAreaViewPoint  = [theGraph convertPoint:interactionPoint toLayer:thePlotArea];
        }
        else {
            plotAreaViewPoint = [theHostingView.layer convertPoint:interactionPoint toLayer:thePlotArea];
        }
#else
        CGPoint interactionPoint = NSPointToCGPoint([theHostingView convertPoint:[event locationInWindow] fromView:nil]);
        plotAreaViewPoint = [theHostingView.layer convertPoint:interactionPoint toLayer:thePlotArea];
#endif
    }

    return plotAreaViewPoint;
}
// Plot point for event

-(void)plotPoint:(NSDecimal *)plotPoint numberOfCoordinates:(NSUInteger)count forEvent:(CPTNativeEvent *)event
{
    [self plotPoint:plotPoint numberOfCoordinates:count forPlotAreaViewPoint:[self plotAreaViewPointForEvent:event]];
}

-(void)doublePrecisionPlotPoint:(double *)plotPoint numberOfCoordinates:(NSUInteger)count forEvent:(CPTNativeEvent *)event
{
    [self doublePrecisionPlotPoint:plotPoint numberOfCoordinates:count forPlotAreaViewPoint:[self plotAreaViewPointForEvent:event]];
}
/// @endcond

#pragma mark -
#pragma mark Scaling

/// @cond

-(void)scaleBy:(CGFloat)interactionScale aboutPoint:(CGPoint)plotAreaPoint
{
    CPTGraph *theGraph    = self.graph;
    CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;

    if ( !plotArea || (interactionScale <= 1.e-6) ) {
        return;
    }
    if ( ![plotArea containsPoint:plotAreaPoint] ) {
        return;
    }

    // Ask the delegate if it is OK
    id<CPTPlotSpaceDelegate> theDelegate = self.delegate;
    BOOL shouldScale = YES;
    if ( [theDelegate respondsToSelector:@selector(plotSpace:shouldScaleBy:aboutPoint:)] ) {
        shouldScale = [theDelegate plotSpace:self shouldScaleBy:interactionScale aboutPoint:plotAreaPoint];
    }
    if ( !shouldScale ) {
        return;
    }

    // Determine point in plot coordinates
    NSDecimal const decimalScale = CPTDecimalFromCGFloat(interactionScale);
    NSDecimal plotInteractionPoint[2];

    [self plotPoint:plotInteractionPoint numberOfCoordinates:2 forPlotAreaViewPoint:plotAreaPoint];

    // Cache old ranges
    CPTPlotRange *oldRangeX = self.xRange;
    CPTPlotRange *oldRangeY = self.yRange;

    // Lengths are scaled by the pinch gesture inverse proportional
    NSDecimal newLengthX = CPTDecimalDivide(oldRangeX.length, decimalScale);
    NSDecimal newLengthY = CPTDecimalDivide(oldRangeY.length, decimalScale);

    // New locations
    NSDecimal newLocationX;
    if ( CPTDecimalGreaterThanOrEqualTo( oldRangeX.length, CPTDecimalFromInteger(0) ) ) {
        NSDecimal oldFirstLengthX = CPTDecimalSubtract(plotInteractionPoint[CPTCoordinateX], oldRangeX.minLimit); // x - minX
        NSDecimal newFirstLengthX = CPTDecimalDivide(oldFirstLengthX, decimalScale);                              // (x - minX) / scale
        newLocationX = CPTDecimalSubtract(plotInteractionPoint[CPTCoordinateX], newFirstLengthX);
    }
    else {
        NSDecimal oldSecondLengthX = CPTDecimalSubtract(oldRangeX.maxLimit, plotInteractionPoint[0]); // maxX - x
        NSDecimal newSecondLengthX = CPTDecimalDivide(oldSecondLengthX, decimalScale);                // (maxX - x) / scale
        newLocationX = CPTDecimalAdd(plotInteractionPoint[CPTCoordinateX], newSecondLengthX);
    }

    NSDecimal newLocationY;
    if ( CPTDecimalGreaterThanOrEqualTo( oldRangeY.length, CPTDecimalFromInteger(0) ) ) {
        NSDecimal oldFirstLengthY = CPTDecimalSubtract(plotInteractionPoint[CPTCoordinateY], oldRangeY.minLimit); // y - minY
        NSDecimal newFirstLengthY = CPTDecimalDivide(oldFirstLengthY, decimalScale);                              // (y - minY) / scale
        newLocationY = CPTDecimalSubtract(plotInteractionPoint[CPTCoordinateY], newFirstLengthY);
    }
    else {
        NSDecimal oldSecondLengthY = CPTDecimalSubtract(oldRangeY.maxLimit, plotInteractionPoint[1]); // maxY - y
        NSDecimal newSecondLengthY = CPTDecimalDivide(oldSecondLengthY, decimalScale);                // (maxY - y) / scale
        newLocationY = CPTDecimalAdd(plotInteractionPoint[CPTCoordinateY], newSecondLengthY);
    }

    // New ranges
    CPTPlotRange *newRangeX = [[CPTPlotRange alloc] initWithLocation:newLocationX length:newLengthX];
    CPTPlotRange *newRangeY = [[CPTPlotRange alloc] initWithLocation:newLocationY length:newLengthY];

    BOOL oldMomentum = self.allowsMomentumX;
    self.allowsMomentumX = NO;
    self.xRange          = newRangeX;
    self.allowsMomentumX = oldMomentum;

    oldMomentum = self.allowsMomentumY;
    self.allowsMomentumY = NO;
    self.yRange          = newRangeY;
    self.allowsMomentumY = oldMomentum;
}
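The zoom rule above keeps the point under the gesture at the same fractional position of the range while the range length is divided by the gesture scale. A plain-double sketch of the non-negative-length branch; the type and function names are illustrative.

typedef struct { double location; double length; } SimpleRange;

// Zoom a range about a fixed plot coordinate, as -scaleBy:aboutPoint: does with NSDecimal.
static SimpleRange ZoomRangeAboutPoint(SimpleRange range, double plotPoint, double scale)
{
    SimpleRange newRange;
    newRange.length = range.length / scale;                    // length shrinks as the pinch expands
    double firstLength = (plotPoint - range.location) / scale; // (x - minX) / scale
    newRange.location = plotPoint - firstLength;               // keep plotPoint at the same relative position
    return newRange;
}

// Example: zooming the range {0, 10} by a factor of 2 about the point 5
// yields {2.5, 5}, so the value 5 stays centered under the gesture.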
/// @endcond
#pragma mark -
#pragma mark Interaction
/// @name User Interaction
/// @{
/**
* @brief Informs the receiver that the user has
* @if MacOnly pressed the mouse button. @endif
* @if iOSOnly touched the screen. @endif
*
*
* If the receiver has a @ref delegate and the delegate handles the event,
* this method always returns @YES.
* If @ref allowsUserlnteraction is @NO
* or the graph does not have a @link CPTPlotAreaFrame: :plotArea plotArea @endlink layer,
* this method always returns @NO.
* Otherwise, if the @par{interactionPoint} is within the bounds of the
* @link CPTPlotAreaFrame: :plotArea plotArea @endlink, a drag operation starts and
* this method returns @YES.
*
* @param event The OS event.
* @param interactionPoint The coordinates of the interaction.
* @return Whether the event was handled or not.
**/
-(BOOL)pointingDeviceDownEvent:(CPTNativeEvent *)event atPoint:(CGPoint)interactionPoint
{
    self.isDragging = NO;

    BOOL handledByDelegate = [super pointingDeviceDownEvent:event atPoint:interactionPoint];
    if ( handledByDelegate ) {
        return YES;
    }

    CPTGraph *theGraph    = self.graph;
    CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;
    if ( !self.allowsUserInteraction || !plotArea ) {
        return NO;
    }

    CGPoint pointInPlotArea = [theGraph convertPoint:interactionPoint toLayer:plotArea];
    if ( [plotArea containsPoint:pointInPlotArea] ) {
        // Handle event
        self.lastDragPoint    = pointInPlotArea;
        self.lastDisplacement = CGPointZero;
        self.lastDragTime     = event.timestamp;
        self.lastDeltaTime    = 0.0;

        // Clear any previous animations
        NSMutableArray *animationArray = self.animations;
        for ( CPTAnimationOperation *op in animationArray ) {
            [[CPTAnimation sharedInstance] removeAnimationOperation:op];
        }
        [animationArray removeAllObjects];

        return YES;
    }

    return NO;
}
/**
 * @brief Informs the receiver that the user has
* @if MacOnly released the mouse button. @endif
* @if iOSOnly lifted their finger off the screen. @endif
*
*
* If the receiver has a @ref delegate and the delegate handles the event,
* this method always returns @YES.
* If @ref allowsUserlnteraction is @NO
* or the graph does not have a @link CPTPlotAreaFrame: :plotArea plotArea @endlink layer,
* this method always returns @NO.
* Otherwise, if a drag operation is in progress, it ends and
* this method returns @YES.
*
* @param event The OS event.
* @param interactionPoint The coordinates of the interaction.
* @return Whether the event was handled or not.
**/
-(BOOL)pointingDeviceUpEvent:(CPTNativeEvent *)event atPoint:(CGPoint)interactionPoint
{
    BOOL handledByDelegate = [super pointingDeviceUpEvent:event atPoint:interactionPoint];
    if ( handledByDelegate ) {
        return YES;
    }

    CPTGraph *theGraph    = self.graph;
    CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;
    if ( !self.allowsUserInteraction || !plotArea ) {
        return NO;
    }

    if ( self.isDragging ) {
        self.isDragging = NO;

        CGFloat acceleration = CPTFloat(0.0);
        CGFloat speed        = CPTFloat(0.0);
        CGFloat momentumTime = CPTFloat(0.0);

        NSDecimal shiftX = CPTDecimalFromInteger(0);
        NSDecimal shiftY = CPTDecimalFromInteger(0);

        CGFloat scaleX = CPTFloat(0.0);
        CGFloat scaleY = CPTFloat(0.0);

        if ( self.allowsMomentum ) {
            NSTimeInterval deltaT     = event.timestamp - self.lastDragTime;
            NSTimeInterval lastDeltaT = self.lastDeltaTime;

            if ( (deltaT > 0.0) && (deltaT < 0.05) && (lastDeltaT > 0.0) ) {
                CGPoint pointInPlotArea = [theGraph convertPoint:interactionPoint toLayer:plotArea];
                CGPoint displacement    = self.lastDisplacement;

                acceleration = self.momentumAcceleration;
                speed        = sqrt(displacement.x * displacement.x + displacement.y * displacement.y) / CPTFloat(lastDeltaT);
                momentumTime = speed / (CPTFloat(2.0) * acceleration);

                CGFloat distanceTraveled = speed * momentumTime - CPTFloat(0.5) * acceleration * momentumTime * momentumTime;
                distanceTraveled = MAX( distanceTraveled, CPTFloat(0.0) );

                CGFloat theta = atan2(displacement.y, displacement.x);
                scaleX = cos(theta);
                scaleY = sin(theta);

                NSDecimal lastPoint[2], newPoint[2];
                [self plotPoint:lastPoint numberOfCoordinates:2 forPlotAreaViewPoint:pointInPlotArea];
                [self plotPoint:newPoint numberOfCoordinates:2 forPlotAreaViewPoint:CGPointMake(pointInPlotArea.x + distanceTraveled * scaleX,
                                                                                                pointInPlotArea.y + distanceTraveled * scaleY)];

                if ( self.allowsMomentumX ) {
                    shiftX = CPTDecimalSubtract(lastPoint[CPTCoordinateX], newPoint[CPTCoordinateX]);
                }
                if ( self.allowsMomentumY ) {
                    shiftY = CPTDecimalSubtract(lastPoint[CPTCoordinateY], newPoint[CPTCoordinateY]);
                }
            }
        }

        // X range
        [self animateRangeForCoordinate:CPTCoordinateX
                                  shift:shiftX
                           momentumTime:momentumTime
                                  speed:speed * scaleX
                           acceleration:acceleration * scaleX];

        // Y range
        [self animateRangeForCoordinate:CPTCoordinateY
                                  shift:shiftY
                           momentumTime:momentumTime
                                  speed:speed * scaleY
                           acceleration:acceleration * scaleY];

        return YES;
    }

    return NO;
}
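The momentum branch above estimates a release speed from the last drag displacement and then coasts using a constant-deceleration model. The helper below restates only that arithmetic in isolation; names are illustrative and the method itself works in plot-area points per second.

#include <math.h>

// Coast time and distance for a given release speed and constant deceleration,
// following the same expressions used above:
//   t = v / (2a)   and   d = v*t - 0.5*a*t^2
static void MomentumForRelease(double speed, double acceleration, double *coastTime, double *distance)
{
    double t = speed / (2.0 * acceleration);
    double d = speed * t - 0.5 * acceleration * t * t;
    *coastTime = t;
    *distance  = fmax(d, 0.0); // clamp as the method does with MAX(..., 0)
}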
/**
* @brief Informs the receiver that the user has moved
* @if MacOnly the mouse with the button pressed. @endif
* @if iOSOnly their finger while touching the screen. @endif
* If the receiver has a @ref delegate and the delegate handles the event,
* this method always returns @YES.
* If @ref allowsUserlnteraction is @NO
* or the graph does not have a @link CPTPlotAreaFrame: :plotArea plotArea @endlink layer, * this method always returns @NO.
* Otherwise, if a drag operation commences or is in progress, the @ref xRange
* and @ref yRange are shifted to follow the drag and
* this method returns @YES.
*
* @param event The OS event.
* @param interactionPoint The coordinates of the interaction.
* @return Whether the event was handled or not.
**/
-(BOOL)pointingDeviceDraggedEvent:(CPTNativeEvent *)event atPoint:(CGPoint)interactionPoint
{
    BOOL handledByDelegate = [super pointingDeviceDraggedEvent:event atPoint:interactionPoint];
    if ( handledByDelegate ) {
        return YES;
    }

    CPTGraph *theGraph    = self.graph;
    CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;
    if ( !self.allowsUserInteraction || !plotArea ) {
        return NO;
    }

    CGPoint lastDraggedPoint = self.lastDragPoint;
    CGPoint pointInPlotArea  = [theGraph convertPoint:interactionPoint toLayer:plotArea];
    CGPoint displacement     = CPTPointMake(pointInPlotArea.x - lastDraggedPoint.x, pointInPlotArea.y - lastDraggedPoint.y);

    if ( !self.isDragging ) {
        // Have we started dragging, i.e., has the interactionPoint moved sufficiently to indicate a drag has started?
        CGFloat displacedBy = sqrt(displacement.x * displacement.x + displacement.y * displacement.y);
        self.isDragging = (displacedBy > self.minimumDisplacementToDrag);
    }

    if ( self.isDragging ) {
        CGPoint pointToUse = pointInPlotArea;

        id<CPTPlotSpaceDelegate> theDelegate = self.delegate;

        // Allow delegate to override
        if ( [theDelegate respondsToSelector:@selector(plotSpace:willDisplaceBy:)] ) {
            displacement = [theDelegate plotSpace:self willDisplaceBy:displacement];
            pointToUse   = CPTPointMake(lastDraggedPoint.x + displacement.x, lastDraggedPoint.y + displacement.y);
        }

        NSDecimal lastPoint[2], newPoint[2];
        [self plotPoint:lastPoint numberOfCoordinates:2 forPlotAreaViewPoint:lastDraggedPoint];
        [self plotPoint:newPoint numberOfCoordinates:2 forPlotAreaViewPoint:pointToUse];

        // X range
        NSDecimal shiftX        = CPTDecimalSubtract(lastPoint[CPTCoordinateX], newPoint[CPTCoordinateX]);
        CPTPlotRange *newRangeX = [self shiftRange:self.xRange
                                                by:shiftX
                                     usingMomentum:self.allowsMomentumX
                                     inGlobalRange:self.globalXRange
                                  withDisplacement:&displacement.x];

        // Y range
        NSDecimal shiftY        = CPTDecimalSubtract(lastPoint[CPTCoordinateY], newPoint[CPTCoordinateY]);
        CPTPlotRange *newRangeY = [self shiftRange:self.yRange
                                                by:shiftY
                                     usingMomentum:self.allowsMomentumY
                                     inGlobalRange:self.globalYRange
                                  withDisplacement:&displacement.y];

        self.lastDragPoint    = pointInPlotArea;
        self.lastDisplacement = displacement;

        NSTimeInterval currentTime = event.timestamp;
        self.lastDeltaTime = currentTime - self.lastDragTime;
        self.lastDragTime  = currentTime;

        self.xRange = newRangeX;
        self.yRange = newRangeY;

        return YES;
    }

    return NO;
}
-(CPTPlotRange *)shiftRange:(CPTPlotRange *)oldRange by:(NSDecimal)shift usingMomentum:(BOOL)momentum inGlobalRange:(CPTPlotRange *)globalRange withDisplacement:(CGFloat *)displacement
{
    CPTMutablePlotRange *newRange = [oldRange mutableCopy];

    newRange.location = CPTDecimalAdd(newRange.location, shift);

    if ( globalRange ) {
        CPTPlotRange *constrainedRange = [self constrainRange:newRange toGlobalRange:globalRange];

        if ( momentum ) {
            if ( ![newRange isEqualToRange:constrainedRange] ) {
                // reduce the shift as we get farther outside the global range
                NSDecimal rangeLength = newRange.length;

                if ( !CPTDecimalEquals( rangeLength, CPTDecimalFromInteger(0) ) ) {
                    NSDecimal diff = CPTDecimalDivide(CPTDecimalSubtract(constrainedRange.location, newRange.location), rangeLength);
                    diff = CPTDecimalMax( CPTDecimalMin( CPTDecimalMultiply( diff, CPTDecimalFromDouble(2.5) ), CPTDecimalFromInteger(1) ), CPTDecimalFromInteger(-1) );

                    newRange.location = CPTDecimalSubtract( newRange.location, CPTDecimalMultiply( shift, CPTDecimalAbs(diff) ) );

                    *displacement = *displacement * ( CPTFloat(1.0) - ABS( CPTDecimalCGFloatValue(diff) ) );
                }
            }
        }
        else {
            newRange = (CPTMutablePlotRange *)constrainedRange;
        }
    }

    return newRange;
}
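When momentum carries the shifted range outside the global range, the method above damps the shift in proportion to the overshoot, which produces the rubber-band effect during momentum scrolling. A plain-double sketch of that damping rule; the function name is illustrative.

#include <math.h>

// Pull part of the shift back in proportion to how far the shifted location
// overshoots the constrained (global) location, clamped to [-1, 1] after a 2.5x gain.
static double DampedLocation(double shiftedLocation, double constrainedLocation, double rangeLength, double shift)
{
    double diff = (constrainedLocation - shiftedLocation) / rangeLength; // fractional overshoot
    diff = fmax(fmin(diff * 2.5, 1.0), -1.0);                            // clamp to [-1, 1]
    return shiftedLocation - shift * fabs(diff);                         // take back part of the shift
}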
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
#else
/**
* @brief Informs the receiver that the user has moved the scroll wheel.
*
*
* If the receiver does not have a @ref delegate,
* this method always returns @NO. Otherwise, the
* @link CPTPlotSpaceDelegate: :plotSpace:shouldHandleScrollWheelEvent:fromPoint:toPoint: - plotSpace:shouldHandleScrollWheelEvent:fromPoint:toPoint: @endlink
* delegate method is called. If it returns @NO, this method returns @YES
* to indicate that the event has been handled and no further processing should occur. *
* @param event The OS event.
* @param fromPoint The starting coordinates of the interaction.
* @param toPoint The ending coordinates of the interaction.
* @return Whether the event was handled or not.
**/
-(BOOL)scrollWheelEvent:(CPTNativeEvent *)event fromPoint:(CGPoint)fromPoint toPoint:(CGPoint)toPoint
{
    BOOL handledByDelegate = [super scrollWheelEvent:event fromPoint:fromPoint toPoint:toPoint];
    if ( handledByDelegate ) {
        return YES;
    }

    CPTGraph *theGraph    = self.graph;
    CPTPlotArea *plotArea = theGraph.plotAreaFrame.plotArea;
    if ( !self.allowsUserInteraction || !plotArea ) {
        return NO;
    }

    CGPoint fromPointInPlotArea = [theGraph convertPoint:fromPoint toLayer:plotArea];
    CGPoint toPointInPlotArea   = [theGraph convertPoint:toPoint toLayer:plotArea];
    CGPoint displacement        = CPTPointMake(toPointInPlotArea.x - fromPointInPlotArea.x, toPointInPlotArea.y - fromPointInPlotArea.y);
    CGPoint pointToUse          = toPointInPlotArea;

    id<CPTPlotSpaceDelegate> theDelegate = self.delegate;

    // Allow delegate to override
    if ( [theDelegate respondsToSelector:@selector(plotSpace:willDisplaceBy:)] ) {
        displacement = [theDelegate plotSpace:self willDisplaceBy:displacement];
        pointToUse   = CPTPointMake(fromPointInPlotArea.x + displacement.x, fromPointInPlotArea.y + displacement.y);
    }

    NSDecimal lastPoint[2], newPoint[2];
    [self plotPoint:lastPoint numberOfCoordinates:2 forPlotAreaViewPoint:fromPointInPlotArea];
    [self plotPoint:newPoint numberOfCoordinates:2 forPlotAreaViewPoint:pointToUse];

    // X range
    NSDecimal shiftX        = CPTDecimalSubtract(lastPoint[CPTCoordinateX], newPoint[CPTCoordinateX]);
    CPTPlotRange *newRangeX = [self shiftRange:self.xRange
                                            by:shiftX
                                 usingMomentum:NO
                                 inGlobalRange:self.globalXRange
                              withDisplacement:&displacement.x];

    // Y range
    NSDecimal shiftY        = CPTDecimalSubtract(lastPoint[CPTCoordinateY], newPoint[CPTCoordinateY]);
    CPTPlotRange *newRangeY = [self shiftRange:self.yRange
                                            by:shiftY
                                 usingMomentum:NO
                                 inGlobalRange:self.globalYRange
                              withDisplacement:&displacement.y];

    self.xRange = newRangeX;
    self.yRange = newRangeY;

    return YES;
}
#endif
/**
* @brief Reset the dragging state and cancel any active animations.
**/
-(void)cancelAnimations
{
    self.isDragging = NO;

    for ( CPTAnimationOperation *op in self.animations ) {
        [[CPTAnimation sharedInstance] removeAnimationOperation:op];
    }
}

/// @}

#pragma mark -
#pragma mark Accessors

/// @cond

-(void)setAllowsMomentum:(BOOL)newMomentum
{
    self.allowsMomentumX = newMomentum;
    self.allowsMomentumY = newMomentum;
}

-(BOOL)allowsMomentum
{
    return self.allowsMomentumX || self.allowsMomentumY;
}

/// @endcond

#pragma mark -
#pragma mark Animation Delegate

/// @cond
-(void)animationDidFinish:(CPTAnimationOperation *)operation
{
[self.animations removeObjectIdenticalTo:operation];
}
/// @endcond

@end

#import "NSCoderExtensions.h"
#import "CPTUtilities.h"
#import "NSNumberExtensions.h"

void MyCGPathApplierFunc(void *info, const CGPathElement *element);

#pragma mark -

@implementation NSCoder(CPTExtensions)

#pragma mark -
#pragma mark Encoding
/** @brief Encodes a @ref CGFloat and associates it with the string @par{key} .
* @param number The number to encode.
* @param key The key to associate with the number.
**/
-(void)encodeCGFloat:(CGFloat)number forKey:(NSString *)key
{
#if CGFLOAT_IS_DOUBLE
    [self encodeDouble:number forKey:key];
#else
    [self encodeFloat:number forKey:key];
#endif
}
/** @brief Encodes a @ref CGPoint and associates it with the string @par{key}.
* @param point The point to encode.
* @param key The key to associate with the point.
**/
-(void)encodeCPTPoint:(CGPoint)point forKey:(NSString *)key
{
    NSString *newKey = [[NSString alloc] initWithFormat:@"%@.x", key];
    [self encodeCGFloat:point.x forKey:newKey];

    newKey = [[NSString alloc] initWithFormat:@"%@.y", key];
    [self encodeCGFloat:point.y forKey:newKey];
}
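A short round-trip sketch for the point coder above, assuming a keyed archive and the -decodeCPTPointForKey: method defined later in this listing; the helper name is illustrative. The point is stored under the two derived keys "origin.x" and "origin.y".

#import <Foundation/Foundation.h>
#import <CoreGraphics/CoreGraphics.h>
#import "NSCoderExtensions.h"

// Archives a CGPoint with the category above and decodes it back.
static CGPoint RoundTripPoint(void)
{
    NSMutableData *data = [NSMutableData data];
    NSKeyedArchiver *archiver = [[NSKeyedArchiver alloc] initForWritingWithMutableData:data];
    [archiver encodeCPTPoint:CGPointMake(1.5, 2.5) forKey:@"origin"];
    [archiver finishEncoding];

    NSKeyedUnarchiver *unarchiver = [[NSKeyedUnarchiver alloc] initForReadingWithData:data];
    CGPoint decoded = [unarchiver decodeCPTPointForKey:@"origin"]; // {1.5, 2.5}
    [unarchiver finishDecoding];
    return decoded;
}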
/** @brief Encodes a @ref CGSize and associates it with the string @par{key}.
* @param size The size to encode.
* @param key The key to associate with the size.
* */
-(void)encodeCPTSize:(CGSize)size forKey:(NSString *)key
{
    NSString *newKey = [[NSString alloc] initWithFormat:@"%@.width", key];
    [self encodeCGFloat:size.width forKey:newKey];

    newKey = [[NSString alloc] initWithFormat:@"%@.height", key];
    [self encodeCGFloat:size.height forKey:newKey];
}
/** @brief Encodes a @ref CGRect and associates it with the string @par{key}.
* @param rect The rectangle to encode.
* @param key The key to associate with the rectangle.
**/
-(void)encodeCPTRect:(CGRect)rect forKey:(NSString *)key
{
    NSString *newKey = [[NSString alloc] initWithFormat:@"%@.origin", key];
    [self encodeCPTPoint:rect.origin forKey:newKey];

    newKey = [[NSString alloc] initWithFormat:@"%@.size", key];
    [self encodeCPTSize:rect.size forKey:newKey];
}
/** @brief Encodes a color space and associates it with the string @par{key}.
* @param colorSpace The @ref CGColorSpaceRef to encode.
* @param key The key to associate with the color space. * @note The current implementation only works with named color spaces.
**/
-(void)encodeCGColorSpace:(CGColorSpaceRef)colorSpace forKey:(NSString *)key
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    NSLog(@"Color space encoding is not supported on iOS. Decoding will return a generic RGB color space.");
#else
    if ( colorSpace ) {
        CFDataRef iccProfile = CGColorSpaceCopyICCProfile(colorSpace);
        [self encodeObject:(__bridge NSData *)iccProfile forKey:key];
        CFRelease(iccProfile);
    }
#endif
}
/// @cond

void MyCGPathApplierFunc(void *info, const CGPathElement *element)
{
    NSMutableDictionary *elementData = [[NSMutableDictionary alloc] init];

    elementData[@"type"] = @(element->type);

    switch ( element->type ) {
        case kCGPathElementAddCurveToPoint: // 3 points
            elementData[@"point3.x"] = @(element->points[2].x);
            elementData[@"point3.y"] = @(element->points[2].y);

        case kCGPathElementAddQuadCurveToPoint: // 2 points
            elementData[@"point2.x"] = @(element->points[1].x);
            elementData[@"point2.y"] = @(element->points[1].y);

        case kCGPathElementMoveToPoint: // 1 point
        case kCGPathElementAddLineToPoint: // 1 point
            elementData[@"point1.x"] = @(element->points[0].x);
            elementData[@"point1.y"] = @(element->points[0].y);
            break;

        case kCGPathElementCloseSubpath: // 0 points
            break;
    }

    NSMutableArray *pathData = (__bridge NSMutableArray *)info;
    [pathData addObject:elementData];
}

/// @endcond
/** @brief Encodes a path and associates it with the string @par{key}.
* @param path The @ref CGPathRef to encode.
* @param key The key to associate with the path.
**/
-(void)encodeCGPath:(CGPathRef)path forKey:(NSString *)key
{
    NSMutableArray *pathData = [[NSMutableArray alloc] init];

    // walk the path and gather data for each element
    CGPathApply(path, (__bridge void *)(pathData), &MyCGPathApplierFunc);

    // encode data count
    NSUInteger dataCount = pathData.count;
    NSString *newKey     = [[NSString alloc] initWithFormat:@"%@.count", key];
    [self encodeInteger:(NSInteger)dataCount forKey:newKey];

    // encode data elements
    for ( NSUInteger i = 0; i < dataCount; i++ ) {
        NSDictionary *elementData = pathData[i];

        CGPathElementType type = (CGPathElementType)[elementData[@"type"] intValue];
        newKey = [[NSString alloc] initWithFormat:@"%@[%lu].type", key, (unsigned long)i];
        [self encodeInt:type forKey:newKey];

        CGPoint point;

        switch ( type ) {
            case kCGPathElementAddCurveToPoint: // 3 points
                point.x = [elementData[@"point3.x"] cgFloatValue];
                point.y = [elementData[@"point3.y"] cgFloatValue];
                newKey  = [[NSString alloc] initWithFormat:@"%@[%lu].point3", key, (unsigned long)i];
                [self encodeCPTPoint:point forKey:newKey];

            case kCGPathElementAddQuadCurveToPoint: // 2 points
                point.x = [elementData[@"point2.x"] cgFloatValue];
                point.y = [elementData[@"point2.y"] cgFloatValue];
                newKey  = [[NSString alloc] initWithFormat:@"%@[%lu].point2", key, (unsigned long)i];
                [self encodeCPTPoint:point forKey:newKey];

            case kCGPathElementMoveToPoint: // 1 point
            case kCGPathElementAddLineToPoint: // 1 point
                point.x = [elementData[@"point1.x"] cgFloatValue];
                point.y = [elementData[@"point1.y"] cgFloatValue];
                newKey  = [[NSString alloc] initWithFormat:@"%@[%lu].point1", key, (unsigned long)i];
                [self encodeCPTPoint:point forKey:newKey];
                break;

            case kCGPathElementCloseSubpath: // 0 points
                break;
        }
    }
}
/** @brief Encodes an image and associates it with the string @par{key}.
* @param image The @ref CGImageRef to encode.
* @param key The key to associate with the image.
**/
-(void)encodeCGImage:(CGImageRef)image forKey: (NSString *)key
{
NSString *newKey = [[NSString alloc] initWithFormat:@"%@.width", key];
[self encodeInt64:(int64_t)CGImageGetWidth(image) forKey :newKey]; newKey = [[NSString alloc] initWithFormat:@"%@. height", key];
[self encodeInt64 : (int64_t)CGImageGetHeight(image) forKey :newKey] ; newKey = [[NSString alloc] initWithFormat:@"%@.bitsPerComponent", key];
[self encodeInt64:(int64_t)CGImageGetBitsPerComponent(image) forKey: newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.bitsPerPixel", key];
[self encodeInt64:(int64_t)CGImageGetBitsPerPixel (image) forKey: newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.bytesPerRow", key];
[self encodeInt64:(int64_t)CGImageGetBytesPerRow(image) forKey: newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.colorSpace", key];
CGColorSpaceRef colorSpace = CGImageGetColorSpace(image);
[self encodeCGColorSpacexolorSpace forKey: newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.bitmapInfo", key];
const CGBitmapInfo info = CGImageGetBitmapInfo(image);
[self encodeBytes: (const void *)(&info) length: sizeof(CGBitmapInfo) forKey: newKey];
CGDataProviderRef provider = CGImageGetDataProvider(image);
CFDataRef providerData = CGDataProviderCopyData(provider);
newKey = [[NSString alloc] initWithFormat:@"%@. provider", key];
[self encodeObject:( bridge NSData *)providerData forKey: newKey]; if ( providerData ) {
CFRel ease(provi derData) ;
}
const CGFloat *decodeArray = CGImageGetDecode(image);
if ( decodeArray ) {
size t numberOfComponents =
CGColorSpaceGetNumberOfComponents(colorSpace);
newKey = [[NSString alloc] initWithFormat:@"%@. numberOfComponents", key]; [self encodeInt64:(int64_t)numberOfComponents forKey:newKey]; for ( size t i = 0; i < numberOfComponents; i++ ) {
newKey = [[NSString alloc] initWithFormat:@"%@.decode[%zu]. lower", key, i]; [self encodeCGFloat:decodeArray[i * 2] forKey:newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.decode[%zu].upper", key, i]; [self encodeCGFloat:decodeArray[i * 2 + 1] forKey: newKey];
}
}
newKey = [[NSString alloc] initWithFormat:@"%@.shouldInterpolate", key];
[self encodeBool:CGImageGetShouldInterpolate(image) forKey: newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.renderingIntent", key];
[self encodeInt:CGImageGetRenderingIntent(image) forKey: newKey];
}
/** @brief Encodes an @ref NSDecimal and associates it with the string @par{key}.
* @param number The number to encode.
* @param key The key to associate with the number.
**/
-(void)encodeDecimal:(NSDecimal)number forKey: (NSString *)key
{
[self encodeObj ect: [NSDecimalNumber decimalNumberWithDecimal :number] forKey: key];
}
#pragma mark - #pragma mark Decoding
/** @brief Decodes and returns a number that was previously encoded with
* @link NSCoder: :encodeCGFloat:forKey: -encodeCGFloa forKey: @endlink
* and associated with the string @par{key}.
* @param key The key associated with the number.
* @return The number as a @ref CGFloat.
**/
-(CGFloat)decodeCGFloatForKey:(NSString *)key
{
#if CGFLOAT_IS_DOUBLE
    return [self decodeDoubleForKey:key];
#else
    return [self decodeFloatForKey:key];
#endif
}
/** @brief Decodes and returns a point that was previously encoded with
* @link NSCoder: :encodeCPTPoint:forKey: -encodeCPTPoin forKey: @endlink
* and associated with the string @par{key}.
* @param key The key associated with the point.
* @return The point.
* */
-(CGPoint)decodeCPTPointForKey:(NSString *)key
{
CGPoint point; NSString *newKey = [[NSString alloc] initWithFormat:@"%@.x", key]; point.x = [self decodeCGFloatForKey:newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.y", key];
point.y = [self decodeCGFloatForKey:newKey]; return point;
}
/** @brief Decodes and returns a size that was previously encoded with
* @link NSCoder: :encodeCPTSize:forKey: -encodeCPTSize:forKey:@endlink
* and associated with the string @par{key}.
* @param key The key associated with the size.
* @return The size.
* */
-(CGSize)decodeCPTSizeForKey : (NS String *)key
{
CGSize size; NSString *newKey = [[NSString alloc] initWithFormat:@"%@.width", key]; size.width = [self decodeCGFloatForKey:newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.height", key];
size. height = [self decodeCGFloatForKey:newKey]; return size; }
/** @brief Decodes and returns a rectangle that was previously encoded with
* @link NSCoder: :encodeCPTRect:forKey: -encodeCPTRect:forKey:@endlink
* and associated with the string @par{key}.
* @param key The key associated with the rectangle.
* @return The rectangle.
**/
-(CGRect)decodeCPTRectForKey:(NS String *)key
{
CGRect rect;
NSString *newKey = [[NSString alloc] initWithFormat:@"%@. origin", key]; rect.origin = [self decodeCPTPointForKey:newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.size", key];
rect.size = [self decodeCPTSizeForKey:newKey]; return rect;
}
/** @brief Decodes and returns a new color space object that was previously encoded with
 * @link NSCoder::encodeCGColorSpace:forKey: -encodeCGColorSpace:forKey: @endlink
 * and associated with the string @par{key}.
 * @param key The key associated with the color space.
 * @return The new color space.
 * @note The current implementation only works with named color spaces.
 **/
-(CGColorSpaceRef)newCGColorSpaceDecodeForKey:(NSString *)key
{
    CGColorSpaceRef colorSpace = NULL;

#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    NSLog(@"Color space decoding is not supported on iOS. Using generic RGB color space.");
    colorSpace = CGColorSpaceCreateDeviceRGB();
#else
    NSData *iccProfile = [self decodeObjectForKey:key];
    if ( iccProfile ) {
        colorSpace = CGColorSpaceCreateWithICCProfile( (__bridge CFDataRef)iccProfile );
    }
    else {
        NSLog(@"Color space not available for key '%@'. Using generic RGB color space.", key);
        colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    }
#endif

    return colorSpace;
}
/** @brief Decodes and returns a new path object that was previously encoded with
 * @link NSCoder::encodeCGPath:forKey: -encodeCGPath:forKey: @endlink
 * and associated with the string @par{key}.
 * @param key The key associated with the path.
 * @return The new path.
 **/
-(CGPathRef)newCGPathDecodeForKey:(NSString *)key
{
    CGMutablePathRef newPath = CGPathCreateMutable();

    // decode count
    NSString *newKey = [[NSString alloc] initWithFormat:@"%@.count", key];
    NSUInteger count = (NSUInteger)[self decodeIntegerForKey:newKey];

    // decode elements
    for ( NSUInteger i = 0; i < count; i++ ) {
        newKey = [[NSString alloc] initWithFormat:@"%@[%lu].type", key, (unsigned long)i];
        CGPathElementType type = (CGPathElementType)[self decodeIntForKey:newKey];

        CGPoint point1 = CGPointZero;
        CGPoint point2 = CGPointZero;
        CGPoint point3 = CGPointZero;

        switch ( type ) {
            case kCGPathElementAddCurveToPoint: // 3 points
                newKey = [[NSString alloc] initWithFormat:@"%@[%lu].point3", key, (unsigned long)i];
                point3 = [self decodeCPTPointForKey:newKey];

            case kCGPathElementAddQuadCurveToPoint: // 2 points
                newKey = [[NSString alloc] initWithFormat:@"%@[%lu].point2", key, (unsigned long)i];
                point2 = [self decodeCPTPointForKey:newKey];

            case kCGPathElementMoveToPoint: // 1 point
            case kCGPathElementAddLineToPoint: // 1 point
                newKey = [[NSString alloc] initWithFormat:@"%@[%lu].point1", key, (unsigned long)i];
                point1 = [self decodeCPTPointForKey:newKey];
                break;

            case kCGPathElementCloseSubpath: // 0 points
                break;
        }

        switch ( type ) {
            case kCGPathElementMoveToPoint:
                CGPathMoveToPoint(newPath, NULL, point1.x, point1.y);
                break;

            case kCGPathElementAddLineToPoint:
                CGPathAddLineToPoint(newPath, NULL, point1.x, point1.y);
                break;

            case kCGPathElementAddQuadCurveToPoint:
                CGPathAddQuadCurveToPoint(newPath, NULL, point1.x, point1.y, point2.x, point2.y);
                break;

            case kCGPathElementAddCurveToPoint:
                CGPathAddCurveToPoint(newPath, NULL, point1.x, point1.y, point2.x, point2.y, point3.x, point3.y);
                break;

            case kCGPathElementCloseSubpath:
                CGPathCloseSubpath(newPath);
                break;
        }
    }

    return newPath;
}
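A usage sketch for the path coders, assuming the NSCoder(CPTExtensions) category in this listing: the encoder walks the path with MyCGPathApplierFunc and writes one "<key>[i].type" / point entry per element, and the decoder rebuilds an equivalent path from those entries. The helper name is illustrative and the caller owns the returned path.

#import <Foundation/Foundation.h>
#import <CoreGraphics/CoreGraphics.h>
#import "NSCoderExtensions.h"

// Archives a small path and decodes an equivalent CGPath.
static CGPathRef NewRoundTrippedPath(void)
{
    CGMutablePathRef path = CGPathCreateMutable();
    CGPathMoveToPoint(path, NULL, 0.0, 0.0);
    CGPathAddLineToPoint(path, NULL, 10.0, 10.0);
    CGPathCloseSubpath(path);

    NSMutableData *data = [NSMutableData data];
    NSKeyedArchiver *archiver = [[NSKeyedArchiver alloc] initForWritingWithMutableData:data];
    [archiver encodeCGPath:path forKey:@"path"];
    [archiver finishEncoding];
    CGPathRelease(path);

    NSKeyedUnarchiver *unarchiver = [[NSKeyedUnarchiver alloc] initForReadingWithData:data];
    CGPathRef decodedPath = [unarchiver newCGPathDecodeForKey:@"path"]; // caller releases
    [unarchiver finishDecoding];
    return decodedPath;
}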
/** @brief Decodes and returns a new image object that was previously encoded with
* @link NSCoder: :encodeCGImage:forKey: -encodeCGImage:forKey:@endlink
* and associated with the string @par{key}.
* @param key The key associated with the image.
* @return The new image.
**/
-(CGImageRef)newCGImageDecodeForKey:(NS String *)key
{
NSString *newKey = [[NSString alloc] initWithFormat:@"%@.width", key];
size t width = (size_t)[self decode!nt64ForKey : newKey]; newKey = [[NSString alloc] initWithFormat:@"%@. height", key];
size t height = (size_t)[self decodeInt64ForKey : newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.bitsPerComponent", key];
size t bitsPerComponent = (size_t)[self decodeInt64ForKey : newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.bitsPerPixel", key];
size t bitsPerPixel = (size_t)[self decodeInt64ForKey : newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.bytesPerRow", key];
size t bytesPerRow = (size_t)[self decodeInt64ForKey : newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.colorSpace", key];
CGColorSpaceRef colorSpace = [self newCGColorSpaceDecodeForKey:newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.bitmapInfo", key];
NSUInteger length;
const CGBitmapInfo *bitmapInfo = (const void *)[self decodeBytesForKey: newKey returnedLength : &length] ; newKey = [[NSString alloc] initWithFormat:@"%@. provider", key];
CGDataProviderRef provider = CGDataProviderCreateWithCFData( ( bridge
CFDataRef)[self decodeObjectForKey:newKey] ); newKey = [[NSString alloc] initWithFormat:@"%@.numberOfComponents", key]; size t numberOfComponents = (size_t)[self decodeInt64ForKey : newKey];
CGFloat *decodeArray = NULL;
if ( numberOfComponents ) {
decodeArray = malloc( numberOfComponents * 2 * sizeof(CGFloat) ); for ( size t i = 0; i < numberOfComponents; i++ ) {
newKey = [[NSString alloc] initWithFormat:@"%@.decode[%zu]. lower", key, i];
decodeArray[i * 2] = [self decodeCGFloatForKey:newKey]; newKey = [[NSString alloc]
initWithFormat:@"%@.decode[%zu].upper", key, i];
decodeArray[i * 2 + 1] = [self decodeCGFloatForKey:newKey];
}
}
newKey = [[NSString alloc] initWithFormat:@"%@.shouldInterpolate", key];
bool shouldlnterpolate = [self decodeBoolForKey:newKey]; newKey = [[NSString alloc] initWithFormat:@"%@.renderingIntent", key];
CGColorRenderinglntent intent = (CGColorRenderingIntent)[self decodelntForKey : newKey];
CGImageRef newlmage = CGImageCreate(width,
height,
bitsPerComponent,
bitsPerPixel,
bytesPerRow,
colorSpace,
*bitmapInfo,
provider,
decodeArray,
shoul dlnterpol ate,
intent);
CGColorSpaceRelease(colorSpace);
CGDataProviderRelease(provider);
if ( decodeArray ) {
free(decodeArray);
}
return newlmage;
}
/** @brief Decodes and returns a decimal number that was previously encoded with
* @link NSCoder: :encodeDecimal:forKey: -encodeDecimal:forKey:@endlink
* and associated with the string @par{key}.
* @param key The key associated with the number.
* @return The number as an @ref NSDecimal.
**/
-(NSDecimal)decodeDecimalForKey:(NSString *)key
{
    NSDecimal result;

    NSNumber *number = [self decodeObjectForKey:key];
    if ( [number respondsToSelector:@selector(decimalValue)] ) {
        result = [number decimalValue];
    }
    else {
        result = CPTDecimalNaN();
    }

    return result;
}

@end

#import "NSNumberExtensions.h"
@implementation NSNumber(CPTExtensions)

/** @brief Creates and returns an NSNumber object containing a given value, treating it as a @ref CGFloat.
 * @param number The value for the new number.
 * @return An NSNumber object containing value, treating it as a @ref CGFloat.
 **/
+(instancetype)numberWithCGFloat:(CGFloat)number
{
    return @(number);
}
/** @brief Returns the value of the receiver as a @ref CGFloat.
* @return The value of the receiver as a @ref CGFloat.
**/
-(CGFloat)cgFloatValue
{
#if CGFLOAT_IS_DOUBLE
    return [self doubleValue];
#else
    return [self floatValue];
#endif
}
/** @brief Returns an NSNumber object initialized to contain a given value, treated as a @ref CGFloat.
* @param number The value for the new number.
* @return An NSNumber object containing value, treating it as a @ref CGFloat.
**/
-(instancetype)initWithCGFloat:(CGFloat)number
{
#if CGFLOAT_IS_DOUBLE
    return [self initWithDouble:number];
#else
    return [self initWithFloat:number];
#endif
}

/** @brief Returns the value of the receiver as an NSDecimalNumber.
 * @return The value of the receiver as an NSDecimalNumber.
 **/
-(NSDecimalNumber *)decimalNumber
{
    if ( [self isMemberOfClass:[NSDecimalNumber class]] ) {
        return (NSDecimalNumber *)self;
    }
    return [NSDecimalNumber decimalNumberWithDecimal:[self decimalValue]];
}

@end
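A minimal usage sketch of the category above: it boxes and unboxes a CGFloat without the caller depending on whether CGFLOAT_IS_DOUBLE is set. The helper name is illustrative.

#import <Foundation/Foundation.h>
#import "NSNumberExtensions.h"

// Boxes a CGFloat into an NSNumber and reads it back at full native precision.
static CGFloat RoundTripCGFloat(CGFloat value)
{
    NSNumber *boxed = [NSNumber numberWithCGFloat:value];
    return [boxed cgFloatValue];
}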
#import "CPTPlotArea.h"
#import "CPTAxis.h"
#import "CPTAxisLabelGroup.h"
#import "CPTAxisSet.h"
#import "CPTFill.h"
#import "CPTGridLineGroup.h"
#import "CPTLineStyle.h"
#import "CPTPlotGroup.h"
#import "CPTUtilities.h" static const size t kCPTNumberOfLayers = 6; // number of primary layers to arrange
/// @cond
©interface CPTPlotArea() @property (nonatomic, readwrite, assign) CPTGraphLayerType *bottomUpLayerOrder; @property (nonatomic, readwrite, assign, getter = isUpdatingLayers) BOOL
updatingLayers;
@property (nonatomic, readwrite) CGPoint touchedPoint;
@property (nonatomic, readwrite) NSDecimal widthDecimal;
@property (nonatomic, readwrite) NSDecimal heightDecimal;
-(void)updateLayerOrder;
-(unsigned)indexForLayerType:(CPTGraphLayerType)layerType; @end
/// @endcond #pragma mark -
/** @brief A layer representing the actual plotting area of a graph.
*
* All plots are drawn inside this area while axes, titles, and borders may fall outside.
* The layers are arranged so that the graph elements are drawn in the following order: * -# Background fill
* -# Minor grid lines
* -# Major grid lines * -# Background border
* -# Axis lines with major and minor tick marks
* -# Plots
* -# Axis labels
* -# Axis titles
**/
@implementation CPTPlotArea
/** @property CPTGridLineGroup *minorGridLineGroup
* @brief The parent layer for all minor grid lines.
**/
@synthesize minorGridLineGroup;
/** @property CPTGridLineGroup *majorGridLineGroup
* @brief The parent layer for all major grid lines.
**/
@synthesize majorGridLineGroup;
/** @property CPTAxisSet *axisSet
* @brief The axis set.
**/
@synthesize axisSet;
/** @property CPTPlotGroup *plotGroup
* @brief The plot group.
**/
@synthesize plotGroup;
/** @property CPTAxisLabel Group *axisLabel Group
* @brief The parent layer for all axis labels.
**/
@synthesize axisLabelGroup;
/** @property CPTAxisLabel Group *axisTitleGroup
* @brief The parent layer for all axis titles.
**/
@synthesize axisTitleGroup;
/** @property NS Array *topDownLayerOrder
* @brief An array of graph layers to be drawn in an order other than the default.
*
* The array should reference the layers using the constants defined in
#CPTGraphLayerType.
* Layers should be specified in order starting from the top layer.
* Only the layers drawn out of the default order need be specified; all others will
* automatically be placed at the bottom of the view in their default order.
* * If this property is @nil, the layers will be drawn in the default order (bottom to top):
* -# Minor grid lines
* -# Major grid lines
* -# Axis lines, including the tick marks
* -# Plots
* -# Axis labels
* -# Axis titles
*
* Example usage:
* @code
* [graph setTopDownLayerOrder:@[
* @(CPTGraphLayerTypePlots),
* @(CPTGraphLayerTypeAxisLabels),
* @(CPTGraphLayerTypeMaj orGridLines)];
* @endcode
**/
@synthesize topDownLayerOrder;
/** @property CPTLineStyle *borderLineStyle
* @brief The line style for the layer border.
* If @nil, the border is not drawn.
**/
@dynamic borderLineStyle; /** ©property CPTFill *fill
* @brief The fill for the layer background.
* If @nil, the layer background is not filled.
**/
@synthesize fill;
/** @property NSDecimal widthDecimal
* @brief The width of the @ref bounds as an @ref NSDecimal value.
**/
@synthesize widthDecimal;
/** @property NSDecimal heightDecimal
* @brief The height of the @ref bounds as an @ref NSDecimal value.
**/
@synthesize heightDecimal;
// Private properties
@synthesize bottomUpLayerOrder;
@synthesize updatingLayers;
@synthesize touchedPoint;
#pragma mark -
#pragma mark Init/Dealloc /// @name Initialization
/// @{
/** @brief Initializes a newly allocated CPTPlotArea object with the provided frame rectangle.
*
* This is the designated initializer. The initialized layer will have the following properties:
* - @ref minorGridLineGroup = @nil
* - @ref majorGridLineGroup = @nil
* - @ref axisSet = @nil
* - @ref plotGroup = @nil
* - @ref axisLabelGroup = @nil
* - @ref axisTitleGroup = @nil
* - @ref fill = @nil
* - @ref topDownLayerOrder = @nil
* - @ref plotGroup = a new CPTPlotGroup with the same frame rectangle
* - @ref needsDisplayOnBoundsChange = @YES
*
* @param newFrame The frame rectangle.
* @return The initialized CPTPlotArea object.
**/
-(instancetype)initWithFrame:(CGRect)newFrame
{
if ( (self = [super initWithFrame: newFrame]) ) {
minorGridLineGroup = nil;
majorGridLineGroup = nil;
axis Set = nil;
plotGroup = nil;
axisLabelGroup = nil;
axisTitleGroup = nil;
fill = nil;
touchedPoint = CGPointMake(NAN, NAN);
topDownLayerOrder = nil;
bottomUpLayerOrder = malloc( kCPTNumberOfLayers *
sizeof(CPTGraphLayerType) );
[self updateLayerOrder];
CPTPlotGroup *newPlotGroup = [[CPTPlotGroup alloc] initWithFrame: newFrame]; selfplotGroup = newPlotGroup;
CGSize boundsSize = self.bounds.size;
widthDecimal = CPTDecimalFromCGFloat(boundsSize.width);
heightDecimal = CPTDecimalFromCGFloat(boundsSize. height); selfneedsDisplayOnBoundsChange = YES; }
return self;
}
III ®)
III @cond
-(instancetype)initWithLayer:(id)layer
{
if ( (self = [super initWithLayenlayer]) ) {
CPTPlotArea *theLayer = (CPTPlotArea *)layer; minorGridLineGroup = theLayer->minorGridLineGroup;
majorGridLineGroup = theLay er->majorGridLineGroup;
axisSet = theLayer->axisSet;
plotGroup = theLayer->plotGroup;
axisLabelGroup = theLayer->axisLabelGroup;
axisTitleGroup = theLayer->axisTitleGroup;
fill = theLayer->fill;
touchedPoint = theLayer->touchedPoint;
topDownLayerOrder = theLayer->topDownLayerOrder;
bottomUpLayerOrder = malloc( kCPTNumberOfLayers * sizeof(CPTGraphLayerType) );
memcpy( bottomUpLayerOrder, theLayer->bottomUpLayerOrder, kCPTNumberOfLayers * sizeof(CPTGraphLayerType) );
widthDecimal = theLayer->widthDecimal;
heightDecimal = theLayer->heightDecimal;
}
return self;
}
-(void)dealloc
{
free(bottomUpLayerOrder);
}
/// @endcond #pragma mark -
#pragma mark NSCoding Methods /// @cond
-(void)encodeWithCoder:(NSCoder *)coder
{
[super encodeWithCoder: coder]; [coder encodeObj ect: self.minorGridLineGroup
forKey : @ " CPTPlotArea. minorGridLineGroup " ] ;
[coder encodeObjec self.majorGridLineGroup
forKey : @ " CPTPlotArea. maj orGridLineGroup " ] ;
[coder encodeObjectself.axisSet forKey :@"CPTPlotArea.axisSet"];
[coder encodeObj ect: self. plotGroup forKey :@"CPTPlotArea.plotGroup"];
[coder encodeObjec selfaxisLabelGroup forKey :@"CPTPlotArea.axisLabelGroup"];
[coder encodeObjec selfaxisTitleGroup forKey :@"CPTPlotArea.axisTitleGroup"];
[coder encodeObj ect: selffill forKey:@"CPTPlotArea.fill"];
[coder encodeObj ect: selftopDownLayerOrder
forKey:@"CPTPlotArea.topDownLayerOrder"];
// No need to archive these properties:
// bottomUpLayerOrder
// updatingLayers
// touchedPoint
// widthDecimal
// heightDecimal
}
-(instancetype)initWithCoder:(NSCoder *)coder
{
if ( (self = [super initWithCoder: coder]) ) {
minorGridLineGroup = [coder
decodeObj ectForKey : @" CPTPlotArea.minorGridLineGroup"] ;
maj orGridLineGroup = [coder
decodeObjectForKey:@"CPTPlotArea. maj orGridLineGroup"];
axisSet = [coder decodeObjectForKey:@"CPTPlotArea.axisSet"];
plotGroup = [coder decodeObjectForKey:@"CPTPlotArea. plotGroup"];
axisLabelGroup = [coder
decodeObj ectForKey : @" CPTPlotArea. axisLabelGroup" ];
axisTitleGroup = [coder decodeObjectForKey:@"CPTPlotArea.axisTitleGroup"]; fill = [[coder decodeObjectForKey:@"CPTPlotArea.fill"] copy];
topDownLayerOrder = [coder
decodeObj ectForKey : @" CPTPlotArea.topDownLayerOrder"] ; bottomUpLayerOrder = malloc( kCPTNumberOfLayers *
sizeof(CPTGraphLayerType) );
[self updateLayerOrder]; touchedPoint = CGPointMake(NAN, NAN);
CGSize boundsSize = self.bounds.size;
widthDecimal = CPTDecimalFromCGFloat(boundsSize.width);
heightDecimal = CPTDecimalFromCGFloat(boundsSize. height);
} return self;
}
/// @endcond
#pragma mark - #pragma mark Drawing
/// @cond
-(void)renderAsVectorInContext:(CGContextRef)context
{
if ( self.hidden ) {
return;
}
[super render As Vector InC ontext: context] ;
BOOL useMask = self.masksToBounds;
self.masksToBounds = YES;
CGC ontext S aveGState(context) ;
CGPathRef maskPath = selfmaskingPath;
if ( maskPath ) {
CGContextBeginPath(context);
CGC ontext AddPath(context, maskPath);
CGContextClip(context);
}
[self. fill fillRec selfbounds inContext: context];
NSArray *theAxes = self.axisSet.axes; for ( CPTAxis *axis in theAxes ) {
[axis drawBackgroundBandsInContex context];
}
for ( CPTAxis *axis in theAxes ) {
[axi s drawB ackgroundLimitsInC ontext : context] ;
}
CGContextRestoreGState(context);
self.masksToBounds = useMask;
}
/// @endcond #pragma mark - #pragma mark Layout
/// @name Layout
/// @{
/**
* @brief Updates the layout of all sublayers. Sublayers fill the super layer&rsquo;s bounds
* except for the @ref plotGroup, which will fill the receiver&rsquo;s bounds.
*
* This is where we do our custom replacement for the Mac-only layout manager and autoresizing mask.
* Subclasses should override this method to provide a different layout of their own sublayers.
* */
-(void)layoutSublayers
{
[super layoutSublayers]; CPTAxisSet *myAxisSet = selfaxisSet;
BOOL axisSetHasB order = (myAxisSet.borderLineStyle != nil);
CALayer *superlayer = selfsuperlayer;
CGRect sublayerBounds = [self convertRec superlayer.bounds fromLayensuperlayer]; sublayerBounds. origin = CGPointZero;
CGPoint sublayerPosition = [self convertPoin selfbounds. origin toLayensuperlayer]; sublayerPosition = CPTPointMake(-sublayerPosition.x, -sublayerPosition.y);
CGRect sublayerFrame = CPTRectMake(sublayerPosition.x, sublayerPosition.y, sublayerBounds. size.width, sublayerBounds. size. height); selfminorGridLineGroup. frame = sublayerFrame;
selfmajorGridLineGroup. frame = sublayerFrame;
if ( axisSetHasB order ) {
self.axisSet.frame = sublayerFrame;
}
// make the plot group the same size as the plot area to clip the plots
CPTPlotGroup *thePlotGroup = selfplotGroup;
if ( thePlotGroup ) {
CGSize selfBoundsSize = self. bounds. size;
thePlotGroup.frame = CPTRectMake(0.0, 0.0, selffloundsSize.width,
selfBoundsSize. height);
}
// the label and title groups never have anything to draw; make them as small as possible to save memory
sublayerFrame = CPTRectMake(sublayerPosition.x, sublayerPosition.y, 0.0, 0.0);
self.axisLabelGroup. frame = sublayerFrame;
self.axisTitleGroup. frame = sublayerFrame;
if ( !axisSetHasBorder ) {
myAxisSet.frame = sublayerFrame;
[myAxisSet layoutSublayers];
}
}
-(NSSet *)sublayersExcludedFromAutomaticLayout
{
CPTGridLineGroup *minorGrid = selfminorGridLineGroup;
CPTGridLineGroup *majorGrid = selfmajorGridLineGroup;
CPTAxisSet *theAxisSet = selfaxisSet;
CPTPlotGroup *thePlotGroup = selfplotGroup;
CPTAxisLabelGroup *labels = self.axisLabelGroup;
CPTAxisLabelGroup *titles = self.axisTitleGroup; if ( minorGrid || majorGrid || theAxisSet || thePlotGroup || labels || titles ) { NSMutableSet *excludedSublayers = [[super
sublayersExcludedFromAutomaticLayout] mutableCopy] ;
if ( ! excludedSublayers ) {
excludedSublayers = [NSMutableSet set];
}
if ( minorGrid ) {
[excludedSublayers addObj ec minorGrid];
}
if ( majorGrid ) {
[excludedSublayers addObj ec majorGrid];
}
if ( theAxisSet ) {
[excluded Subl ay ers addObj ect : the Axi s S et] ;
}
if ( thePlotGroup ) {
[excludedSublayers addObj ectthePlotGroup];
}
if ( labels ) {
[excludedSublayers addObjectlabels];
}
if ( titles ) {
[excludedSublayers addObj ec titles];
}
return excludedSublayers;
}
else { return [super sublayersExcludedFromAutomaticLayout];
}
}
III ®}
#pragma mark -
#pragma mark Layer ordering
/// @cond
-(void)updateLayerOrder
{
    CPTGraphLayerType *buLayerOrder = self.bottomUpLayerOrder;

    for ( size_t i = 0; i < kCPTNumberOfLayers; i++ ) {
        *(buLayerOrder++) = (CPTGraphLayerType)i;
    }

    NSArray *tdLayerOrder = self.topDownLayerOrder;
    if ( tdLayerOrder ) {
        buLayerOrder = self.bottomUpLayerOrder;

        for ( NSUInteger layerIndex = 0; layerIndex < [tdLayerOrder count]; layerIndex++ ) {
            CPTGraphLayerType layerType = (CPTGraphLayerType)[tdLayerOrder[layerIndex] intValue];
            NSUInteger i                = kCPTNumberOfLayers - layerIndex - 1;

            while ( buLayerOrder[i] != layerType ) {
                if ( i == 0 ) {
                    break;
                }
                i--;
            }
            while ( i < kCPTNumberOfLayers - layerIndex - 1 ) {
                buLayerOrder[i] = buLayerOrder[i + 1];
                i++;
            }
            buLayerOrder[kCPTNumberOfLayers - layerIndex - 1] = layerType;
        }
    }

    // force the layer hierarchy to update
    self.updatingLayers     = YES;
    self.minorGridLineGroup = self.minorGridLineGroup;
    self.majorGridLineGroup = self.majorGridLineGroup;
    self.axisSet            = self.axisSet;
    self.plotGroup          = self.plotGroup;
    self.axisLabelGroup     = self.axisLabelGroup;
    self.axisTitleGroup     = self.axisTitleGroup;
    self.updatingLayers     = NO;
}
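A plain-C sketch of the reordering performed by -updateLayerOrder, assuming the six primary layer types are numbered 0 through 5 as with kCPTNumberOfLayers; names are illustrative. Each top-down entry is pulled out of the default bottom-up order and reinserted at the slot counted from the top of the array.

#include <stddef.h>

// Rebuilds a bottom-up layer order from an optional top-down override list.
static void ReorderLayers(unsigned bottomUp[6], const unsigned *topDown, size_t topDownCount)
{
    for ( size_t i = 0; i < 6; i++ ) {
        bottomUp[i] = (unsigned)i;                // default order, bottom to top
    }
    for ( size_t layerIndex = 0; layerIndex < topDownCount; layerIndex++ ) {
        unsigned layerType = topDown[layerIndex];
        size_t target = 6 - layerIndex - 1;       // slot counted from the top of the array
        size_t i = target;
        while ( bottomUp[i] != layerType ) {      // find the layer type at or below the target slot
            if ( i == 0 ) {
                break;
            }
            i--;
        }
        while ( i < target ) {                    // shift the intervening entries down by one
            bottomUp[i] = bottomUp[i + 1];
            i++;
        }
        bottomUp[target] = layerType;             // place the override at its top-down position
    }
}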
-(unsigned)indexForLayerType:(CPTGraphLayerType)layerType
{
CPTGraphLayerType *buLayerOrder = self.bottomUpLayerOrder; unsigned idx = 0; for ( sizej i = 0; i < kCPTNumberOfLayers; i++ ) {
if ( buLayerOrder[i] == layer Type ) {
break;
}
switch ( buLayerOrder[i] ) {
case CPTGraphLayerTypeMinorGndLines:
if ( selfminorGridLineGroup ) {
idx++;
}
break; case CPTGraphLayerTypeMaj orGridLines:
if ( selfmajorGridLineGroup ) {
idx++;
}
break; case CPTGraphLayerTypeAxisLines:
if ( selfaxisSet ) {
idx++;
}
break; case CPTGraphLayerTypePlots:
if ( self.plotGroup ) {
idx++;
}
break; case CPTGraphLayerTypeAxisLabels:
if ( self.axisLabelGroup ) {
idx++;
}
break; case CPTGraphLayerTypeAxisTitles: if ( selfaxisTitleGroup ) {
idx++;
}
break;
}
}
return idx;
}
/// @endcond #pragma mark -
#pragma mark Axis set layer management /** @brief Checks for the presence of the specified layer group and adds or removes it as needed.
* @param layer Type The layer type being updated.
**/
-(void)updateAxisSetLayersForType:(CPTGraphLayerType)layerType
{
BOOL needsLayer = NO;
CPTAxisSet *theAxisSet = selfaxisSet; for ( CPTAxis * axis in theAxisSet.axes ) {
switch ( layer Type ) {
case CPTGraphLayerTypeMinorGridLines:
if ( axis.minorGridLineStyle ) {
needsLayer = YES;
}
break; case CPTGraphLayerTypeMaj orGridLines:
if ( axis.majorGridLineStyle ) {
needsLayer = YES;
}
break; case CPTGraphLayerTypeAxisLabels:
if ( axis.axisLab els. count > 0 ) {
needsLayer = YES;
}
break; case CPTGraphLayerTypeAxisTitles:
if ( axis.axisTitle ) {
needsLayer = YES;
} break; default:
break;
}
}
if ( needsLayer ) {
[self setAxisSetLayersForType:layerType];
}
else {
switch ( layer Type ) {
case CPTGraphLayerTypeMinorGridLines:
selfminorGridLineGroup = nil;
break; case CPTGraphLayerTypeMaj orGridLines:
selfmajorGridLineGroup = nil;
break; case CPTGraphLayerTypeAxisLabels:
selfaxisLabelGroup = nil;
break; case CPTGraphLayerTypeAxisTitles:
selfaxisTitleGroup = nil;
break; default:
break;
}
}
}
/** @brief Ensures that a group layer is set for the given layer type.
* @param layer Type The layer type being updated.
**/
-(void)setAxisSetLayersForType:(CPTGraphLayerType)layerType
{
    switch ( layerType ) {
        case CPTGraphLayerTypeMinorGridLines:
            if ( !self.minorGridLineGroup ) {
                CPTGridLineGroup *newGridLineGroup = [[CPTGridLineGroup alloc] initWithFrame:self.bounds];
                self.minorGridLineGroup = newGridLineGroup;
            }
            break;

        case CPTGraphLayerTypeMajorGridLines:
            if ( !self.majorGridLineGroup ) {
                CPTGridLineGroup *newGridLineGroup = [[CPTGridLineGroup alloc] initWithFrame:self.bounds];
                self.majorGridLineGroup = newGridLineGroup;
            }
            break;

        case CPTGraphLayerTypeAxisLabels:
            if ( !self.axisLabelGroup ) {
                CPTAxisLabelGroup *newAxisLabelGroup = [[CPTAxisLabelGroup alloc] initWithFrame:self.bounds];
                self.axisLabelGroup = newAxisLabelGroup;
            }
            break;

        case CPTGraphLayerTypeAxisTitles:
            if ( !self.axisTitleGroup ) {
                CPTAxisLabelGroup *newAxisTitleGroup = [[CPTAxisLabelGroup alloc] initWithFrame:self.bounds];
                self.axisTitleGroup = newAxisTitleGroup;
            }
            break;

        default:
            break;
    }
}
/** @brief Computes the sublayer index for the given layer type and axis.
* @param axis The axis of interest.
* @param layerType The layer type being updated.
* @return The sublayer index for the given layer type.
**/
-(unsigned)sublayerIndexForAxis:(CPTAxis *)axis layerType:(CPTGraphLayerType)layerType
{
    unsigned idx = 0;

    for ( CPTAxis *currentAxis in self.axisSet.axes ) {
        if ( currentAxis == axis ) {
            break;
        }

        switch ( layerType ) {
            case CPTGraphLayerTypeMinorGridLines:
                if ( currentAxis.minorGridLineStyle ) {
                    idx++;
                }
                break;

            case CPTGraphLayerTypeMajorGridLines:
                if ( currentAxis.majorGridLineStyle ) {
                    idx++;
                }
                break;

            case CPTGraphLayerTypeAxisLabels:
                if ( currentAxis.axisLabels.count > 0 ) {
                    idx++;
                }
                break;

            case CPTGraphLayerTypeAxisTitles:
                if ( currentAxis.axisTitle ) {
                    idx++;
                }
                break;

            default:
                break;
        }
    }
return idx;
}
#pragma mark -
#pragma mark Event Handling

/// @name User Interaction
/// @{

/**
* @brief Informs the receiver that the user has
* @if MacOnly pressed the mouse button. @endif
* @if iOSOnly touched the screen. @endif
*
* If this plot area has a delegate that responds to the
* @link CPTPlotAreaDelegate::plotAreaTouchDown: -plotAreaTouchDown: @endlink and/or
* @link CPTPlotAreaDelegate::plotAreaTouchDown:withEvent: -plotAreaTouchDown:withEvent: @endlink
* methods, the delegate method will be called and this method returns @YES if the @par{interactionPoint} is within the
* plot area bounds.
*
* @param event The OS event.
* @param interactionPoint The coordinates of the interaction.
* @return Whether the event was handled or not.
**/
-(BOOL)pointingDeviceDownEvent:(CPTNativeEvent *)event
atPoint:(CGPoint)interactionPoint
{
    CPTGraph *theGraph = self.graph;

    if ( !theGraph || self.hidden ) {
        return NO;
    }

    id<CPTPlotAreaDelegate> theDelegate = self.delegate;

    if ( [theDelegate respondsToSelector:@selector(plotAreaTouchDown:)] ||
         [theDelegate respondsToSelector:@selector(plotAreaTouchDown:withEvent:)] ||
         [theDelegate respondsToSelector:@selector(plotAreaWasSelected:)] ||
         [theDelegate respondsToSelector:@selector(plotAreaWasSelected:withEvent:)] ) {
        // Inform delegate if a point was hit
        CGPoint plotAreaPoint = [theGraph convertPoint:interactionPoint toLayer:self];

        if ( CGRectContainsPoint(self.bounds, plotAreaPoint) ) {
            self.touchedPoint = plotAreaPoint;

            if ( [theDelegate respondsToSelector:@selector(plotAreaTouchDown:)] ) {
                [theDelegate plotAreaTouchDown:self];
            }
            if ( [theDelegate respondsToSelector:@selector(plotAreaTouchDown:withEvent:)] ) {
                [theDelegate plotAreaTouchDown:self withEvent:event];
            }

            return NO; // don't block other events in the responder chain
        }
    }

    return [super pointingDeviceDownEvent:event atPoint:interactionPoint];
}
/**
* @brief Informs the receiver that the user has
* @if MacOnly released the mouse button. @endif
* @if iOSOnly ended touching the screen. @endif
*
*
* If this plot area has a delegate that responds to the
* @link CPTPlotAreaDelegate::plotAreaTouchUp: -plotAreaTouchUp: @endlink,
* @link CPTPlotAreaDelegate::plotAreaTouchUp:withEvent: -plotAreaTouchUp:withEvent: @endlink,
* @link CPTPlotAreaDelegate::plotAreaWasSelected: -plotAreaWasSelected: @endlink, and/or
* @link CPTPlotAreaDelegate::plotAreaWasSelected:withEvent: -plotAreaWasSelected:withEvent: @endlink
* methods, the delegate method will be called and this method returns @YES if the @par{interactionPoint} is within the
* plot area bounds.
*
* @param event The OS event.
* @param interactionPoint The coordinates of the interaction.
* @return Whether the event was handled or not.
**/
-(BOOL)pointingDeviceUpEvent:(CPTNativeEvent *)event
atPoint:(CGPoint)interactionPoint
{
    CPTGraph *theGraph = self.graph;

    if ( !theGraph || self.hidden ) {
        return NO;
    }

    CGPoint lastPoint = self.touchedPoint;
    self.touchedPoint = CGPointMake(NAN, NAN);

    id<CPTPlotAreaDelegate> theDelegate = self.delegate;

    if ( [theDelegate respondsToSelector:@selector(plotAreaTouchUp:)] ||
         [theDelegate respondsToSelector:@selector(plotAreaTouchUp:withEvent:)] ||
         [theDelegate respondsToSelector:@selector(plotAreaWasSelected:)] ||
         [theDelegate respondsToSelector:@selector(plotAreaWasSelected:withEvent:)] ) {
        // Inform delegate if a point was hit
        CGPoint plotAreaPoint = [theGraph convertPoint:interactionPoint toLayer:self];

        if ( CGRectContainsPoint(self.bounds, plotAreaPoint) ) {
            CGVector offset = CGVectorMake(plotAreaPoint.x - lastPoint.x, plotAreaPoint.y - lastPoint.y);

            if ( (offset.dx * offset.dx + offset.dy * offset.dy) <= CPTFloat(25.0) ) {
                if ( [theDelegate respondsToSelector:@selector(plotAreaTouchUp:)] ) {
                    [theDelegate plotAreaTouchUp:self];
                }
                if ( [theDelegate respondsToSelector:@selector(plotAreaTouchUp:withEvent:)] ) {
                    [theDelegate plotAreaTouchUp:self withEvent:event];
                }
                if ( [theDelegate respondsToSelector:@selector(plotAreaWasSelected:)] ) {
                    [theDelegate plotAreaWasSelected:self];
                }
                if ( [theDelegate respondsToSelector:@selector(plotAreaWasSelected:withEvent:)] ) {
                    [theDelegate plotAreaWasSelected:self withEvent:event];
                }

                return NO; // don't block other events in the responder chain
            }
        }
    }

    return [super pointingDeviceUpEvent:event atPoint:interactionPoint];
}
/// @}

#pragma mark -
#pragma mark Accessors
/// @cond
-(CPTLineStyle *)borderLineStyle
{
return self.axisSet.borderLineStyle;
}
-(void)setBorderLineStyle:(CPTLineStyle *)newLineStyle
{
self.axisSet.borderLineStyle = newLineStyle;
}
-(void)setFill:(CPTFill *)newFill
{
if ( newFill != fill ) {
fill = [newFill copy];
[self setNeedsDisplay];
}
}

-(void)setMinorGridLineGroup:(CPTGridLineGroup *)newGridLines
{
    if ( (newGridLines != minorGridLineGroup) || self.isUpdatingLayers ) {
        [minorGridLineGroup removeFromSuperlayer];
        minorGridLineGroup = newGridLines;
        if ( minorGridLineGroup ) {
            minorGridLineGroup.plotArea = self;
            minorGridLineGroup.major    = NO;

            [self insertSublayer:minorGridLineGroup atIndex:[self indexForLayerType:CPTGraphLayerTypeMinorGridLines]];
}
[self setNeedsLayout];
}
}
-(void)setMajorGridLineGroup:(CPTGridLineGroup *)newGridLines
{
    if ( (newGridLines != majorGridLineGroup) || self.isUpdatingLayers ) {
        [majorGridLineGroup removeFromSuperlayer];
        majorGridLineGroup = newGridLines;
        if ( majorGridLineGroup ) {
            majorGridLineGroup.plotArea = self;
            majorGridLineGroup.major    = YES;

            [self insertSublayer:majorGridLineGroup atIndex:[self indexForLayerType:CPTGraphLayerTypeMajorGridLines]];
}
[self setNeedsLayout];
}
}
-(void)setAxisSet:(CPTAxisSet *)newAxisSet
{
    if ( (newAxisSet != axisSet) || self.isUpdatingLayers ) {
        [axisSet removeFromSuperlayer];
        for ( CPTAxis *axis in axisSet.axes ) {
            axis.plotArea = nil;
        }

        axisSet = newAxisSet;
        [self updateAxisSetLayersForType:CPTGraphLayerTypeMajorGridLines];
        [self updateAxisSetLayersForType:CPTGraphLayerTypeMinorGridLines];
        [self updateAxisSetLayersForType:CPTGraphLayerTypeAxisLabels];
        [self updateAxisSetLayersForType:CPTGraphLayerTypeAxisTitles];

        if ( axisSet ) {
            CPTGraph *theGraph = self.graph;

            [self insertSublayer:axisSet atIndex:[self indexForLayerType:CPTGraphLayerTypeAxisLines]];
            for ( CPTAxis *axis in axisSet.axes ) {
                axis.plotArea = self;
                axis.graph    = theGraph;
}
}
[self setNeedsLayout];
}
}
-(void)setPlotGroup:(CPTPlotGroup *)newPlotGroup
{
    if ( (newPlotGroup != plotGroup) || self.isUpdatingLayers ) {
        [plotGroup removeFromSuperlayer];
        plotGroup = newPlotGroup;
        if ( plotGroup ) {
            [self insertSublayer:plotGroup atIndex:[self indexForLayerType:CPTGraphLayerTypePlots]];
}
[self setNeedsLayout];
}
}
-(void)setAxisLabelGroup:(CPTAxisLabelGroup *)newAxisLabelGroup
{
    if ( (newAxisLabelGroup != axisLabelGroup) || self.isUpdatingLayers ) {
        [axisLabelGroup removeFromSuperlayer];
        axisLabelGroup = newAxisLabelGroup;
        if ( axisLabelGroup ) {
            [self insertSublayer:axisLabelGroup atIndex:[self indexForLayerType:CPTGraphLayerTypeAxisLabels]];
}
[self setNeedsLayout];
}
}
-(void)setAxisTitleGroup:(CPTAxisLabelGroup *)newAxisTitleGroup
{
    if ( (newAxisTitleGroup != axisTitleGroup) || self.isUpdatingLayers ) {
        [axisTitleGroup removeFromSuperlayer];
        axisTitleGroup = newAxisTitleGroup;
        if ( axisTitleGroup ) {
            [self insertSublayer:axisTitleGroup atIndex:[self indexForLayerType:CPTGraphLayerTypeAxisTitles]];
        }
        [self setNeedsLayout];
    }
}
-(void)setTopDownLayerOrder:(NSArray *)newArray
{
    if ( newArray != topDownLayerOrder ) {
        topDownLayerOrder = newArray;
[self updateLayerOrder];
}
}
-(void)setGraph:(CPTGraph *)newGraph
{
    if ( newGraph != self.graph ) {
        [super setGraph:newGraph];
        for ( CPTAxis *axis in self.axisSet.axes ) {
            axis.graph = newGraph;
        }
    }
}
-(void)setBounds:(CGRect)newBounds
{
if ( !CGRectEqualToRect(self.bounds, newBounds) ) {
        [super setBounds:newBounds];

        self.widthDecimal  = CPTDecimalFromCGFloat(newBounds.size.width);
        self.heightDecimal = CPTDecimalFromCGFloat(newBounds.size.height);
}
}
/// @endcond

@end
#import "CPTPlotAreaFrame.h"
#import "CPTAxisSet.h"
#import "CPTPlotArea.h"
#import "CPTPlotGroup.h"
/// @cond
@interface CPTPlotAreaFrame()
@property (nonatomic, readwrite, strong) CPTPlotArea *plotArea;

@end

/// @endcond

#pragma mark -
/**
* @brief A layer drawn on top of the graph layer and behind all plot elements.
* All graph elements, except for titles, legends, and other annotations
* attached directly to the graph itself are clipped to the plot area frame.
**/
@implementation CPTPlotAreaFrame
/** @property CPTPlotArea *plotArea
* @brief The plot area.
**/
@synthesize plotArea;
/** @property CPTAxisSet *axisSet
* @brief The axis set.
**/
@dynamic axisSet;
/** @property CPTPlotGroup *plotGroup
* @brief The plot group.
**/
@dynamic plotGroup;
#pragma mark -
#pragma mark Init/Dealloc

/// @name Initialization
/// @{

/** @brief Initializes a newly allocated CPTPlotAreaFrame object with the provided frame rectangle.
*
* This is the designated initializer. The initialized layer will have the following properties:
* - @ref plotArea = a new CPTPlotArea with the same frame rectangle
* - @ref masksToBorder = @YES
* - @ref needsDisplayOnBoundsChange = @YES
* @param newFrame The frame rectangle.
* @return The initialized CPTPlotAreaFrame object.
**/
-(instancetype)initWithFrame:(CGRect)newFrame
{
    if ( (self = [super initWithFrame:newFrame]) ) {
        plotArea = nil;

        CPTPlotArea *newPlotArea = [[CPTPlotArea alloc] initWithFrame:newFrame];
        self.plotArea = newPlotArea;

        self.masksToBorder              = YES;
        self.needsDisplayOnBoundsChange = YES;
}
return self;
}
/// @}

/// @cond
-(instancetype)initWithLayer:(id)layer
{
    if ( (self = [super initWithLayer:layer]) ) {
        CPTPlotAreaFrame *theLayer = (CPTPlotAreaFrame *)layer;

        plotArea = theLayer->plotArea;
}
return self;
}
/// @endcond

#pragma mark -
#pragma mark NSCoding Methods

/// @cond
-(void)encodeWithCoder:(NSCoder *)coder
{
    [super encodeWithCoder:coder];

    [coder encodeObject:self.plotArea forKey:@"CPTPlotAreaFrame.plotArea"];
}
-(instancetype)initWithCoder:(NSCoder *)coder
{
    if ( (self = [super initWithCoder:coder]) ) {
        plotArea = [coder decodeObjectForKey:@"CPTPlotAreaFrame.plotArea"];
    }
    return self;
}
/// @endcond

#pragma mark -
#pragma mark Event Handling
/// @name User Interaction
/// @{
/**
* @brief Informs the receiver that the user has
* @if MacOnly pressed the mouse button. @endif
* @if iOSOnly touched the screen. @endif
*
* @param event The OS event.
* @param interactionPoint The coordinates of the interaction.
* @return Whether the event was handled or not.
**/
-(BOOL)pointingDeviceDownEvent:(CPTNativeEvent *)event
atPoint:(CGPoint)interactionPoint
{
    if ( [self.plotArea pointingDeviceDownEvent:event atPoint:interactionPoint] ) {
        return YES;
    }
    else {
        return [super pointingDeviceDownEvent:event atPoint:interactionPoint];
}
}
/**
* @brief Informs the receiver that the user has
* @if MacOnly released the mouse button. @endif
* @if iOSOnly lifted their finger off the screen. @endif
*
* @param event The OS event.
* @param interactionPoint The coordinates of the interaction.
* @return Whether the event was handled or not.
**/
-(BOOL)pointingDeviceUpEvent:(CPTNativeEvent *)event
atPoint:(CGPoint)interactionPoint
{
    if ( [self.plotArea pointingDeviceUpEvent:event atPoint:interactionPoint] ) {
        return YES;
    }
    else {
        return [super pointingDeviceUpEvent:event atPoint:interactionPoint];
    }
}
/**
* @brief Informs the receiver that the user has moved
* @if MacOnly the mouse with the button pressed. @endif
* @if iOSOnly their finger while touching the screen. @endif
*
* @param event The OS event.
* @param interactionPoint The coordinates of the interaction.
* @return Whether the event was handled or not.
**/
-(BOOL)pointingDeviceDraggedEvent:(CPTNativeEvent *)event
atPoint:(CGPoint)interactionPoint
{
    if ( [self.plotArea pointingDeviceDraggedEvent:event atPoint:interactionPoint] ) {
        return YES;
    }
    else {
        return [super pointingDeviceDraggedEvent:event atPoint:interactionPoint];
}
}
/**
* @brief Informs the receiver that tracking of
* @if MacOnly mouse moves @endif
* @if iOSOnly touches @endif
* has been cancelled for any reason.
*
* @param event The OS event.
* @return Whether the event was handled or not.
**/
-(BOOL)pointingDeviceCancelledEvent:(CPTNativeEvent *)event
{
    if ( [self.plotArea pointingDeviceCancelledEvent:event] ) {
        return YES;
    }
    else {
        return [super pointingDeviceCancelledEvent:event];
}
}
/// @}

#pragma mark -
#pragma mark Accessors

/// @cond
-(void)setPlotArea:(CPTPlotArea *)newPlotArea
{
if ( newPlotArea != plotArea ) {
[plotArea removeFromSuperlayer];
plotArea = newPlotArea;
if ( plotArea ) {
            [self insertSublayer:plotArea atIndex:0];
            plotArea.graph = self.graph;
        }
        [self setNeedsLayout];
    }
}
-(CPTAxisSet *)axisSet
{
return self.plotArea.axisSet;
}
-(void)setAxisSet:(CPTAxisSet *)newAxisSet
{
self.plotArea.axisSet = newAxisSet;
}
-(CPTPlotGroup *)plotGroup
{
return self.plotArea.plotGroup;
}
-(void)setPlotGroup:(CPTPlotGroup *)newPlotGroup
{
self.plotArea.plotGroup = newPlotGroup;
}
-(void)setGraph:(CPTGraph *)newGraph
{
    if ( newGraph != self.graph ) {
        [super setGraph:newGraph];

        self.plotArea.graph = newGraph;
}
}
/// @endcond

@end

Claims

What is claimed is:

1. A wearable emotional feedback apparatus, comprising:
a wearable sensing mesh configured for wearing by a user;
a plurality of physiological sensors integrated into said wearable sensing mesh, wherein said sensors are configured for determining response to emotional stimuli;
a controller configured for receiving inputs from said physiological sensors and for detecting fluctuations in the emotional arousal state of the user wearing said wearable sensing mesh; and
a communication circuit configured for wirelessly communicating the detected signs to a mobile device configured for displaying the physiological information.
2. The apparatus as recited in claim 1, wherein said apparatus is configured for detecting emotional arousal states in the user wearing said wearable sensing mesh associated with autism spectrum disorder (ASD).
3. The apparatus as recited in claim 1, wherein said apparatus is configured for detecting states in the user wearing said wearable sensing mesh associated with Alzheimer's disease.
4. The apparatus as recited in claim 1, wherein said wearable sensing mesh comprises at least one wearable glove, or sock.
5. The apparatus as recited in claim 1, wherein said wearable sensing mesh comprises a wearable device retained in proximity to face, armpits, and/or crotch area of the user wearing said wearable sensing mesh.
6. The apparatus as recited in claim 1, wherein said plurality of physiological sensors are selected from a group of sensors consisting of: heartrate (HR), heart rate (HR) variability, and galvanic skin response (GSR).
7. The apparatus as recited in claim 6, wherein said heartrate (HR) and heart rate (HR) variability are detected using a pulse oximeter for obtaining a photo plethysmograph to detect the heart rate and heart rate variability of the user wearing said wearable sensing mesh.
8. The apparatus as recited in claim 7, wherein said pulse oximeter detects heartrate pulse in response to emitting light into the skin of the user of said wearable sensing mesh with reflection amplitudes being recorded and analyzed to determine heart rate (HR) and heart rate (HR) variability.
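[Editor's illustrative sketch, not claim language.] One common way a controller of the kind described in claims 7 and 8 could turn recorded reflection amplitudes into heart rate and heart rate variability is simple peak detection followed by interval statistics. The sketch below is a hedged assumption, not the claimed implementation; the function name, fixed-size interval buffer, peak threshold, and refractory window are hypothetical.

    #include <math.h>
    #include <stddef.h>

    /* Hypothetical sketch: estimate heart rate (beats per minute) and heart rate
     * variability (SDNN, in milliseconds) from pulse-oximeter reflection samples
     * taken at sampleRateHz. A peak is a sample above `threshold` that is not
     * smaller than its neighbors, separated from the previous peak by at least a
     * 0.25 s refractory window to avoid double counting. */
    static void estimateHeartMetrics(const float *ppg, size_t count, float sampleRateHz,
                                     float threshold, float *outBPM, float *outSDNNms)
    {
        double intervals[256];
        size_t nIntervals = 0;
        long   lastPeak   = -1;
        size_t refractory = (size_t)(0.25 * sampleRateHz);

        for ( size_t i = 1; i + 1 < count; i++ ) {
            int isPeak = (ppg[i] > threshold) && (ppg[i] >= ppg[i - 1]) && (ppg[i] >= ppg[i + 1]);

            if ( isPeak && ((lastPeak < 0) || (i - (size_t)lastPeak > refractory)) ) {
                if ( (lastPeak >= 0) && (nIntervals < 256) ) {
                    intervals[nIntervals++] = (i - (size_t)lastPeak) / (double)sampleRateHz; /* seconds */
                }
                lastPeak = (long)i;
            }
        }

        if ( nIntervals == 0 ) {
            *outBPM    = 0.0f;
            *outSDNNms = 0.0f;
            return;
        }

        double mean = 0.0;
        for ( size_t k = 0; k < nIntervals; k++ ) {
            mean += intervals[k];
        }
        mean /= (double)nIntervals;

        double var = 0.0;
        for ( size_t k = 0; k < nIntervals; k++ ) {
            double d = intervals[k] - mean;
            var += d * d;
        }
        var /= (double)nIntervals;

        *outBPM    = (float)(60.0 / mean);        /* mean beat interval converted to BPM */
        *outSDNNms = (float)(sqrt(var) * 1000.0); /* HRV reported as SDNN in milliseconds */
    }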
9. The apparatus as recited in claim 6, wherein said galvanic skin response (GSR) sensors comprise conductive elements spanning one or more areas of said wearable mesh so that changes in skin resistance can be measured in response to an amount of sweat produced by the user wearing said wearable sensing mesh.
10. The apparatus as recited in claim 9, wherein said controller is configured for measuring said changes in skin resistance from said galvanic skin response (GSR) sensors by applying differential voltages between said conductive elements to measure resistance.
11. The apparatus as recited in claim 10, wherein said resistance between said conductive elements is measured using a Wheatstone bridge input to a differential amplifier whose output is either directly sent to an analog input of said controller, or directed to a circuit for extracting information and passing it to said controller.
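[Editor's illustrative sketch, not claim language.] The arithmetic behind the Wheatstone-bridge measurement of claim 11 can be sketched by inverting the standard bridge relation; the resistor labels, amplifier gain, and function name below are hypothetical assumptions, and real firmware would add calibration and filtering.

    #include <math.h>

    /* Hypothetical sketch: recover skin resistance from an amplified Wheatstone
     * bridge reading. R1-R2 form the reference divider, R3 is in series with the
     * unknown skin resistance Rx, the bridge is excited by vExc, and the
     * differential output is amplified by `gain` before reaching the ADC.
     *
     *   Vbridge = vExc * ( Rx / (R3 + Rx) - R2 / (R1 + R2) )
     *   Vadc    = gain * Vbridge
     */
    static double skinResistanceFromADC(double vADC, double vExc, double gain,
                                        double r1, double r2, double r3)
    {
        double vBridge = vADC / gain;
        double ratio   = vBridge / vExc + r2 / (r1 + r2); /* equals Rx / (R3 + Rx) */

        if ( (ratio <= 0.0) || (ratio >= 1.0) ) {
            return NAN; /* reading outside the measurable range */
        }
        return r3 * ratio / (1.0 - ratio); /* Rx in ohms */
    }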
12. The apparatus as recited in claim 1, wherein said plurality of physiological sensors includes an acceleration sensor.
13. The apparatus as recited in claim 12, wherein said acceleration sensor is configured for sensing accelerations in three-dimensions.
14. The apparatus as recited in claim 1, wherein said plurality of physiological sensors includes an audio sensor or microphone input, wherein said apparatus transmits collected audio to the mobile device.
15. The apparatus as recited in claim 1, wherein said plurality of physiological sensors includes a global positioning sensor (GPS) configured for communicating location of the user wearing said wearable sensing mesh to said controller for communication to the mobile device.
16. The apparatus as recited in claim 1, wherein said plurality of physiological sensors includes a fall sensing detector configured for
communicating a suspected fall of the user wearing said wearable sensing mesh to said controller for communication to the mobile device.
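[Editor's illustrative sketch, not claim language.] A fall sensing detector of the kind recited in claims 12, 13, and 16 is often approximated by watching the three-axis acceleration magnitude for a near-free-fall dip followed shortly by an impact spike. The thresholds, window, and function below are hypothetical and not calibrated values from the disclosure.

    #include <math.h>
    #include <stddef.h>

    /* Hypothetical sketch: flag a suspected fall when a near-free-fall interval
     * (|a| well below 1 g) is followed within `windowSamples` by an impact spike
     * (|a| well above 1 g). Accelerations are expressed in units of g. */
    static int suspectedFall(const float ax[], const float ay[], const float az[],
                             size_t count, float freeFallG, float impactG, size_t windowSamples)
    {
        long freeFallIndex = -1;

        for ( size_t i = 0; i < count; i++ ) {
            float g = sqrtf(ax[i] * ax[i] + ay[i] * ay[i] + az[i] * az[i]);

            if ( g < freeFallG ) {
                freeFallIndex = (long)i; /* remember the most recent free-fall sample */
            }
            else if ( (g > impactG) && (freeFallIndex >= 0) &&
                      ((i - (size_t)freeFallIndex) <= windowSamples) ) {
                return 1; /* suspected fall: report to the controller / mobile device */
            }
        }
        return 0;
    }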
17. The apparatus as recited in claim 1, wherein said apparatus further comprises one or more output annunciators for communicating visual or audio information from the user of the mobile device to the user wearing the wearable sensing mesh.
18. The apparatus as recited in claim 1, wherein said mobile device comprises a mobile phone, touch pad, or laptop computer.
19. The apparatus as recited in claim 1, wherein said apparatus is configured for generating push notifications to alert a user of the mobile device which is receiving physiological information from said wearable sensing mesh that threshold conditions are exceeded for the user wearing said wearable sensing mesh.
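[Editor's illustrative sketch, not claim language.] On the mobile-device side, the alerting behavior of claim 19 could be realized in several ways; the Objective-C sketch below uses Apple's UserNotifications framework to raise a local alert (standing in for a server-delivered push notification) once a received reading crosses a threshold. The function name, request identifier, threshold, and message text are hypothetical, and the app is assumed to have already requested notification authorization.

    #import <UserNotifications/UserNotifications.h>

    /* Hypothetical sketch of the mobile-device alert path: when a received reading
     * exceeds the configured threshold, schedule an immediate local notification. */
    static void postThresholdAlert(double arousalScore, double threshold)
    {
        if ( arousalScore <= threshold ) {
            return; /* nothing to report */
        }

        UNMutableNotificationContent *content = [[UNMutableNotificationContent alloc] init];
        content.title = @"Arousal threshold exceeded";
        content.body  = [NSString stringWithFormat:@"Current arousal score: %.2f", arousalScore];
        content.sound = [UNNotificationSound defaultSound];

        UNNotificationRequest *request =
            [UNNotificationRequest requestWithIdentifier:@"arousal.threshold"
                                                 content:content
                                                 trigger:nil]; // nil trigger delivers immediately

        [[UNUserNotificationCenter currentNotificationCenter]
            addNotificationRequest:request
             withCompletionHandler:^(NSError *error) {
                 if ( error ) {
                     NSLog(@"Notification error: %@", error);
                 }
             }];
    }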
20. An emotional feedback apparatus for use during treatment of autism spectrum disorder (ASD), comprising:
a wearable sensing mesh configured for wearing by a user;
a plurality of physiological sensors integrated into said wearable sensing mesh, wherein said sensors are configured for determining response to emotional stimuli;
a controller configured for receiving inputs from said physiological sensors and for detecting fluctuations in the emotional arousal state of the wearer which are associated with impulsivity which could lead to self-harm by the user wearing said wearable sensing mesh having autism spectrum disorder (ASD);
a heartrate (HR) sensor, and galvanic skin response (GSR) sensor within said physiological sensors for detecting heart rate, heart rate variability, and level of perspiration of the user wearing the wearable sensing mesh;
wherein said controller is configured to correlate heartrate, heart rate variability, and level of perspiration, to determine a level of emotional arousal state;
a communication circuit configured for wirelessly communicating the emotional arousal state to a mobile device configured for displaying the
physiological information; and
wherein said apparatus is configured for generating push notifications to generate alerts, to a user of the mobile device, that threshold conditions for emotional arousal state have been exceeded.
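[Editor's illustrative sketch, not claim language.] One hedged way to read the correlation step in claim 20 is as a weighted combination of deviations from per-user baselines, with higher heart rate, suppressed heart rate variability, and higher skin conductance all pushing the score up. The class name, weights, baselines, and threshold in the Objective-C sketch below are hypothetical assumptions, not the claimed method.

    #import <Foundation/Foundation.h>

    /* Hypothetical sketch: combine normalized deviations of heart rate, heart rate
     * variability, and skin conductance into one arousal score and compare it
     * against a configurable alert threshold. */
    @interface ArousalMonitor : NSObject
    @property (nonatomic) double baselineHR;     /* bpm */
    @property (nonatomic) double baselineHRV;    /* ms (SDNN) */
    @property (nonatomic) double baselineGSR;    /* microsiemens */
    @property (nonatomic) double alertThreshold; /* e.g. 1.0 */
    -(double)arousalScoreForHR:(double)hr hrv:(double)hrv gsr:(double)gsr;
    -(BOOL)shouldNotifyForHR:(double)hr hrv:(double)hrv gsr:(double)gsr;
    @end

    @implementation ArousalMonitor

    -(double)arousalScoreForHR:(double)hr hrv:(double)hrv gsr:(double)gsr
    {
        // Elevated HR and GSR raise the score; suppressed HRV raises it as well.
        double hrTerm  = (hr - self.baselineHR) / self.baselineHR;
        double hrvTerm = (self.baselineHRV - hrv) / self.baselineHRV;
        double gsrTerm = (gsr - self.baselineGSR) / self.baselineGSR;

        return (0.4 * hrTerm) + (0.3 * hrvTerm) + (0.3 * gsrTerm);
    }

    -(BOOL)shouldNotifyForHR:(double)hr hrv:(double)hrv gsr:(double)gsr
    {
        // A YES result would be forwarded to the mobile device, which issues
        // the push notification to the caregiver (see the sketch after claim 19).
        return [self arousalScoreForHR:hr hrv:hrv gsr:gsr] > self.alertThreshold;
    }

    @end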
21. The apparatus as recited in claim 20, wherein said plurality of physiological sensors includes an audio sensor or microphone input, wherein said apparatus transmits collected audio to the mobile device.
22. The apparatus as recited in claim 20, wherein said apparatus further comprises one or more output annunciators for communicating visual or audio information from the user of the mobile device to the user wearing the wearable sensing mesh.
23. The apparatus as recited in claim 20, wherein said plurality of physiological sensors includes a fall sensing detector configured for
communicating a suspected fall of the user wearing the wearable sensing mesh to said controller for communication to the mobile device.
24. The apparatus as recited in claim 20, wherein said mobile device comprises a mobile phone, touch pad, or laptop computer.
PCT/US2017/036725 2016-06-10 2017-06-09 Wearable emotional feedback apparatus for autism spectrum disorder WO2017214490A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662348726P 2016-06-10 2016-06-10
US62/348,726 2016-06-10
US201662363157P 2016-07-15 2016-07-15
US62/363,157 2016-07-15

Publications (1)

Publication Number Publication Date
WO2017214490A1 true WO2017214490A1 (en) 2017-12-14

Family

ID=60578989

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/036725 WO2017214490A1 (en) 2016-06-10 2017-06-09 Wearable emotional feedback apparatus for autism spectrum disorder

Country Status (1)

Country Link
WO (1) WO2017214490A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120022343A1 (en) * 2001-12-13 2012-01-26 Musc Foundation For Research Development Systems and methods for detecting deception by measuring brain activity
US20080129518A1 (en) * 2006-12-05 2008-06-05 John Carlton-Foss Method and system for fall detection
US20110245633A1 (en) * 2010-03-04 2011-10-06 Neumitra LLC Devices and methods for treating psychological disorders
US20150297109A1 (en) * 2014-04-22 2015-10-22 Interaxon Inc. System and method for associating music with brain-state data
US20160104451A1 (en) * 2014-10-09 2016-04-14 Nedim T. SAHIN Method, system, and apparatus for battery life extension and peripheral expansion of a wearable data collection device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108540966A (en) * 2018-03-16 2018-09-14 飞天诚信科技股份有限公司 A kind of method and communication device of Bluetooth communication
WO2019200474A1 (en) * 2018-04-16 2019-10-24 Technologies Hop-Child, Inc. Systems and methods for the determination of arousal states, calibrated communication signals and monitoring arousal states
CN110755063A (en) * 2018-10-06 2020-02-07 江苏创越医疗科技有限公司 Low-delay electrocardiogram drawing method
RU2772185C1 (en) * 2020-12-04 2022-05-18 Федеральное государственное автономное образовательное учреждение высшего образования "Национальный исследовательский Нижегородский государственный университет им. Н.И. Лобачевского" Method for registering emotional maladaptation based on a cardiorhythmogram


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17811070

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17811070

Country of ref document: EP

Kind code of ref document: A1