For my first project I have decided to make an environmental station. This will be the third version of the station. My first version was an Arduino Uno with a DHT-22 sensor connected to my PC by USB; this was nice, but it only let me see the data while my PC was on. Next came an SD shield for the Uno, which logged the data; better, but I had to stop the station to get at the data manually, which was not perfect. That led me on to the Spark Core, where you join me.
OK, now to the present: this post aims to instruct you on how I created my WiFi environmental station.
Items needed
Spark Core - I have the uFL one and an antenna to maximise range
Breadboard
DHT-22
Male to Male wires
10K resistor - I think (it acts as a pull-up on the data line)
Google account
Wiring
Here is a diagram depicting the wiring I used. It was all plugged into the breadboard that came with my Spark Core. The resistor goes between the 3.3V pin and the data pin. I have colour coded the wires: red is 3.3V, green is data and white is ground. The picture is not the best, but I hope it helps.
Spark Core Code
The code for the station was harder than I thought it would be. I used this library - PietteTech_DHT - in the end because it worked. Credit should go to Scott Piette for his work on the library. I adapted the DHT_simple.ino example to create this code. It failed to verify if I changed the name of the code; probably something simple I was missing.
/*
 * FILE:    DHT_simple.ino
 * VERSION: 0.3
 * PURPOSE: Example that uses DHT library with two sensors
 * LICENSE: GPL v3 (http://www.gnu.org/licenses/gpl.html)
 *
 * Samples one sensor and monitors the results for long term
 * analysis. It calls DHT.acquireAndWait
 *
 * Scott Piette (Piette Technologies) scott.piette@gmail.com
 *     January 2014  Original Spark Port
 *     October 2014  Added support for DHT21/22 sensors
 *                   Improved timing, moved FP math out of ISR
 * Edited by Sam The Tinkerer
 */
#include "PietteTech_DHT/PietteTech_DHT.h"

#define DHTTYPE DHT22   // Sensor type DHT11/21/22/AM2301/AM2302
#define DHTPIN  3       // Digital pin for communications

char resultstr[64];
float data1 = 1.00;
float data2 = 1.00;
int count = 0;          // counter

// declaration
void dht_wrapper();     // must be declared before the lib initialization

// Lib instantiate
PietteTech_DHT DHT(DHTPIN, DHTTYPE, dht_wrapper);

void setup()
{
    Serial.begin(9600);
    // while (!Serial.available()) {
    //     Serial.println("Press any key to start.");
    //     delay(1000);
    // }
    Serial.println("DHT Example program using DHT.acquireAndWait");
    Serial.print("LIB version: ");
    Serial.println(DHTLIB_VERSION);
    Serial.println("---------------");
    // expose your char buffer to the Cloud API
    Spark.variable("result", &resultstr, STRING);
}

// This wrapper is in charge of calling the library's ISR callback;
// it must be defined like this for the lib to work
void dht_wrapper() {
    DHT.isrCallback();
}

void loop()
{
    // sample every 5 minutes, starting at 4 minutes past the hour
    if (Time.minute() == 4 || Time.minute() == 9 || Time.minute() == 14 || Time.minute() == 19 || Time.minute() == 24 || Time.minute() == 29 || Time.minute() == 34 || Time.minute() == 39 || Time.minute() == 44 || Time.minute() == 49 || Time.minute() == 54 || Time.minute() == 59)
    {
        int result = DHT.acquireAndWait();
        switch (result) {
            case DHTLIB_OK:
                Serial.println("OK");
                break;
            case DHTLIB_ERROR_CHECKSUM:
                Serial.println("Error\n\r\tChecksum error");
                break;
            case DHTLIB_ERROR_ISR_TIMEOUT:
                Serial.println("Error\n\r\tISR time out error");
                break;
            case DHTLIB_ERROR_RESPONSE_TIMEOUT:
                Serial.println("Error\n\r\tResponse time out error");
                break;
            case DHTLIB_ERROR_DATA_TIMEOUT:
                Serial.println("Error\n\r\tData time out error");
                break;
            case DHTLIB_ERROR_ACQUIRING:
                Serial.println("Error\n\r\tAcquiring");
                break;
            case DHTLIB_ERROR_DELTA:
                Serial.println("Error\n\r\tDelta time too small");
                break;
            case DHTLIB_ERROR_NOTSTARTED:
                Serial.println("Error\n\r\tNot started");
                break;
            default:
                Serial.println("Unknown error");
                break;
        }
        Serial.print("Humidity (%): ");
        data1 = DHT.getHumidity();   // (fixed: the stray ", 2" belonged in the println below)
        Serial.println(data1, 2);
        Serial.print("Temperature (oC): ");
        Serial.println(DHT.getCelsius(), 2);
        data2 = DHT.getCelsius();
        count = count + 1;           // to test new reading
        // format your data as JSON, don't forget to escape the double quotes
        sprintf(resultstr, "{\"data1\":%f,\"data2\":%f,\"count\":%d}", data1, data2, count);
    }
    delay(1000);
}
I have stripped the code back to make it easier to work with. data1 is humidity, data2 is temperature in Celsius, and count is there to check it is updating. Using an if statement, the code only takes a measurement at 5 minute increments starting at 4 minutes past the hour. It takes 59 or 60 readings in that minute and stores the latest in the Spark.variable. I have left some Serial printing in for simple debugging.
Logging code
All credit should go to binaryfrost for this bit; his post can be found here: https://community.spark.io/t/example-logging-and-graphing-data-from-your-spark-core-using-google/2929.
Add this to your Spark Core's code and set the int data variables to whatever you want to send to Google. To change one to a float you also have to change the corresponding %d to %f in the sprintf format string.
char resultstr[64];

void setup()
{
    pinMode(A0, INPUT);     // setup A0 as analog input
    pinMode(A1, INPUT);     // setup A1 as analog input
    // expose your char buffer to the Cloud API
    Spark.variable("result", &resultstr, STRING);
}

void loop()
{
    int data1 = analogRead(A0);     // read some data
    int data2 = analogRead(A1);     // read some other data
    // format your data as JSON, don't forget to escape the double quotes
    sprintf(resultstr, "{\"data1\":%d,\"data2\":%d}", data1, data2);
    delay(1000);                    // wait for a second
}
For the Google end it is just this simple setup:
- Create -> Spreadsheet
- Tools -> Script Editor
- Add the code below, amended accordingly
- Resources -> Current Project's Triggers
- Fill in: collectData - Time Driven - Minutes Timer - Every 5 minutes
function collectData() {
    var sheet = SpreadsheetApp.getActiveSheet();
    var response = UrlFetchApp.fetch("https://api.spark.io/v1/devices/YOUR-DEVICE-ID/result?access_token=YOUR-ACCESS-TOKEN");
    try {
        var response = JSON.parse(response.getContentText()); // parse the JSON the Core API created
        var result = unescape(response.result); // you'll need to unescape before you parse as JSON
        try {
            var p = JSON.parse(result); // parse the JSON you created
            var d = new Date(); // time stamps are always good when taking readings
            sheet.appendRow([d, p.data1, p.data2]); // append the date, data1, data2 to the sheet
        } catch(e) {
            Logger.log("Unable to do second parse");
        }
    } catch(e) {
        Logger.log("Unable to parse returned JSON");
    }
}
Once again, big thanks to binaryfrost for his hard work.
Thanks for reading my project. I hope to do more in the future, like how to control the RGB LED on the Spark Core. If you have any comments/questions please post them below.