Getting a bit closer. I still need to fix a few issues, but it's close.
The requirements for running this are:
Python 3.5
the following Python modules: requests, beautifulsoup4, lxml
Django 1.8
It can run on anything that has Python 3.5 support: Windows, Linux, Unix, macOS...
Guide:
Download and install Python 3.5 for your system:
https://www.python.org/downloads
Remember to tick the option to add Python to the system path (if running Windows).
When done, start up a console and run:
pip install requests
pip install beautifulsoup4
pip install lxml (if you are using Windows and don't have a compile environment set up, you need to download a precompiled version of lxml and install that: go to
http://www.lfd.uci.edu/~gohlke/pythonlibs/#lxml and download the version for Python 3.5, 32-bit or 64-bit to match your Python install, then install it with a command like "pip install lxml-3.5.0-cp35-none-win_amd64.whl")
pip install django==1.8.7 (should work with 1.9, but I haven't tested it yet)
That is all, now the computer can run the environment.
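If you want to double-check that everything is in place, a quick sanity-check script along these lines should do it (this is just a throwaway check, not part of the package):

# Quick sanity check that the required modules are installed (run with Python 3.5)
import sys

print("Python:", sys.version)

for name in ("requests", "bs4", "lxml", "django"):
    try:
        module = __import__(name)
        print(name, "OK", getattr(module, "__version__", ""))
    except ImportError:
        print(name, "MISSING - install it with pip")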
The package I will be uploading in a day or so should run out of the box once the above is done. Later on I will probably make an install script.
Do people still want to test it? Or has the above scared people away?
Here is a small update as well, showing the new features (still taking ideas for features, though). I am trying to figure out how much data to keep and to show. I am currently thinking between 7 and 10 days for the main graphs, and maybe 14 days for the low/mid/high graphs, along with some historical data, maybe an average per month over a year or so.
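In practice that would just boil down to a few retention settings, roughly along these lines (the names and numbers here are only placeholders, nothing is decided yet):

# Hypothetical retention settings - names and values are placeholders, nothing is decided yet
RETENTION = {
    "main_graph_days": 10,         # somewhere between 7 and 10 days for the main graphs
    "low_mid_high_days": 14,       # the low/mid/high graphs
    "monthly_average_months": 12,  # historical data, average per month over a year or so
}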
Here you see around 10 days of log data. (Remember it's zoomable, so you can zoom into the graphs to show a section in more detail.)
And zoomed in on a day:
The system records the lowest and highest readings throughout the day, calculates the median (the value the system sits at the most), and logs these. This currently runs each hour: it takes all the recordings from today and 7 days back, calculates these values, and logs them, which ends up as 12 recordings a day. These values are then added together, and I show the average for each on a graph per day (the 7-day window will probably be cut down to 3 days or so). I hope it makes sense.
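Roughly, the hourly job does something like the sketch below (get_readings_since and log_summary are just placeholders for whatever the project actually uses to fetch and store the data):

from collections import Counter
from datetime import datetime, timedelta

def summarise(readings):
    """Return (low, high, most_common) for a list of readings."""
    low = min(readings)
    high = max(readings)
    # the "median" as described above: the value the readings sit at the most
    most_common = Counter(readings).most_common(1)[0][0]
    return low, high, most_common

def hourly_job(get_readings_since, log_summary, days_back=7):
    """Meant to run on the hourly schedule described above."""
    since = datetime.now() - timedelta(days=days_back)
    readings = get_readings_since(since)
    if readings:
        log_summary(datetime.now(), *summarise(readings))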
You can remove a graph to make it easier to read the others; here I have unselected the high graphs.
And you can export...