Almost ready. I got it down to 1.5 GB zipped, 3 GB unzipped, using https://github.com/Drewsif/PiShrink — I might be able to get it smaller. I just have to sanitize it and upload it.
The image will expand the file system on first boot, then reboot. It supports ADSBexchange.com MLAT.

Features:
- adsb-config.txt in /boot
- SSH on port 3226 — user: pi, password: adsb123 (ssh pi@<ip address> -p 3226)
- Raspbian GNU/Linux 9.1 (stretch)
- Custom Grafana dashboard on port 3000 — login: adsb, password: adsb
- Prometheus metrics
- Latest dump1090
- Latest MLAT client
- OpenLayers maps: vector world map, vector roads, vector US states, US sectionals, US IFR High, US IFR Low, US IFR High Enroute, US NEXRAD, major airports
- GPS for offline time sync (not set up by default)
- Various performance fixes

Temporary Dropbox link (compressed 842.32 MB, uncompressed 3.1 GB). Unzip and burn like any other image.

Linux users: open a terminal, run lsblk to identify your SD card, then:
dd if=/location/of/image.img of=/dev/mmcblk0

Windows users: https://etcher.io/

SCREENSHOTS:
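For Linux users, the imaging workflow sketched above can be spelled out a bit more. This is a sketch, not the author's exact procedure: `/dev/mmcblk0` is only an example device (run `lsblk` and substitute your own SD card), and the image filename is an assumption. The runnable part below demonstrates the same dd-and-verify pattern against a scratch file so nothing real gets overwritten.

```shell
# 1. Find the SD card device (the SIZE column is the easiest tell):
#      lsblk
# 2. Unzip the downloaded image (filename is an assumption):
#      unzip adsb-image.zip
# 3. Write it. bs=4M speeds things up, status=progress shows a counter,
#    conv=fsync flushes everything before dd exits:
#      sudo dd if=adsb-image.img of=/dev/mmcblk0 bs=4M status=progress conv=fsync

# Safe demonstration: a scratch file stands in for the SD card.
dd if=/dev/zero of=demo.img bs=1M count=4 2>/dev/null    # fake "image"
dd if=demo.img of=demo-card.img bs=1M conv=fsync 2>/dev/null
cmp demo.img demo-card.img && echo "copy verified"
```

Double-check the `of=` device before running the real command — dd will happily overwrite the wrong disk.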
It should run on a Pi Zero, but I haven't tested the image on one. I compiled dump1090, Grafana, Prometheus, and the MLAT client from source. The most RAM I've seen everything use is 150 MB. All the statistics write to /var/run (stored in RAM) and are only kept for 4 hours — this should prevent bricking the SD card. I have a Pi Zero here, let me try it. EDIT: I tried the image on the Pi Zero. dump1090, MLAT, and feeding seem to work. The dump1090 map works. Grafana does not — it seg faults on launch. I just tried installing it from apt and it still seg faults. No dashboard at this point; I'll see what I can do to get it working.
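For anyone wanting to replicate the RAM-backed stats on their own install, here is a sketch of the two pieces involved. The mount point, tmpfs size, and flag spellings are assumptions, not taken from the image itself; note the retention flag changed between Prometheus versions.

```shell
# /etc/fstab entry: keep the metrics directory in RAM (tmpfs) so the
# SD card takes no writes. 64M is an assumed size; adjust to taste.
# (On Raspbian, /var/run is usually already a tmpfs.)
#   tmpfs  /var/run/prometheus  tmpfs  defaults,noatime,size=64m  0  0

# Cap how long Prometheus keeps data (matches the 4-hour window above):
#   Prometheus 1.x:  prometheus -storage.local.path=/var/run/prometheus -storage.local.retention=4h
#   Prometheus 2.x:  prometheus --storage.tsdb.path=/var/run/prometheus --storage.tsdb.retention=4h
```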
James, I would like to add the Grafana dashboard to my existing setup on Debian stretch (Intel NUC, amd64). Can you publish the source, ideally with a short "how to"? I don't want to reinvent the wheel.
If you're already running Linux: install Grafana and Prometheus via the appropriate package manager or from source.
http://docs.grafana.org/installation/debian/
https://github.com/adsbxchange/prometheus
https://github.com/adsbxchange/grafana

Grafana will run on port 3000. I used lighttpd to serve the static dump1090 files from dump1090-mutability. You'll also need the exporter below; it will be available on <feederIP>:9105
https://github.com/adsbxchange/dump1090-exporter

Add the scrape job to Prometheus in /etc/prometheus/prometheus.yml:

Code:
scrape_configs:
  # The job name is added as a label `job=<job_name>` to any timeseries scraped from this config.
  # metrics_path defaults to '/metrics'; scheme defaults to 'http'.
  - job_name: 'dump1090'
    scrape_interval: 60s
    scrape_timeout: 5s
    static_configs:
      - targets: ['localhost:9105']

I removed the other scrape jobs by commenting them out with # on the Pi — but that was just because all the writes go into RAM there. The dashboard JSON file is attached below. Remove the .txt extension and import it into Grafana via the UI on port 3000.
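Once the pieces above are installed, each stage of the pipeline can be checked independently. A sketch, assuming the ports from the post (9105 exporter, 9090 Prometheus, 3000 Grafana); the service-health URLs are commented out because they need the services running (and `/-/healthy` is a Prometheus 2.x endpoint). The offline part below just confirms the scrape config fragment is in place.

```shell
# With the services up, each layer can be poked directly:
#   curl -s http://localhost:9105/metrics | head    # exporter answering?
#   curl -s http://localhost:9090/-/healthy         # Prometheus 2.x health
#   curl -s http://localhost:3000/api/health        # Grafana health

# Offline sanity check: write the scrape fragment and inspect it.
# (Use /etc/prometheus/prometheus.yml on a real install; /tmp here
# so this sketch doesn't touch a live config.)
cat > /tmp/prom-test.yml <<'EOF'
scrape_configs:
  - job_name: 'dump1090'
    scrape_interval: 60s
    scrape_timeout: 5s
    static_configs:
      - targets: ['localhost:9105']
EOF
# promtool check config /tmp/prom-test.yml   # if promtool is installed
grep "job_name" /tmp/prom-test.yml
```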
James, I can't log in to Grafana — "adsb" doesn't work. What is the default password? Found it! Not so easy, but it works. Under Debian 9 it needs some extra work, but it works! Thanks again!!!
The Grafana admin login should be: admin / admin. Cool, you got it all working! Eventually I think I'm going to make it send stats to a custom ADSBexchange server so feeders can look up stats like on FA. It shouldn't be hard to have Prometheus or collectd export externally as an option. I was trying to make this image use as little bandwidth as possible.
James, I'm afraid Grafana only uses the US time and date format :-( (I can't find the time & date settings). If I'm right, then this is a perfect image for US users only :-( Anyway, good job.
Yeah, it looks like it's on the list of 'fixes'... silly Europeans! I'm 99% certain it supports the YYYY-MM-DD international format, though. https://github.com/grafana/grafana/issues/1459 But that's more for visuals — reading further, I might switch to Kibana or something custom at some point. Writing a plugin for Grafana is not exactly simple either. https://www.elastic.co/guide/en/beats/metricbeat/master/metricbeat-module-prometheus.html

I also looked at:
https://keen.github.io/dashboards/
https://github.com/Shopify/dashing
https://github.com/Freeboard/freeboard
https://github.com/getredash/redash

Also, what causes this to error out? Dates in Prometheus on port 9090 look like they are stored in yyyy/dd/mm format. Assuming you did the setup manually — did you add Prometheus as a Data Source in the Grafana backend?
Something like this? I might just need you to NOT set the scrape interval — I could have set it manually on my image. EDIT: don't set the scrape interval.
I was able to reproduce the error by manually setting the scrape interval on the data source... what an interesting issue. It seems the interval needs the 's' notation, not just an integer. So leave the scrape interval in the Grafana Data Source blank, OR set it to 60s (or whatever interval the job is set to in the Prometheus config) — if you used mine, it's 60 seconds.
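The same data source can be created over Grafana's HTTP API instead of the UI, which also makes the fix above concrete: the interval goes into `jsonData.timeInterval` as a duration string like "60s", never a bare integer. A sketch, assuming Grafana on its default port 3000 with the default admin/admin login; the actual curl is commented out since it needs a running Grafana.

```shell
# Payload for POST /api/datasources. The key detail from this thread:
# timeInterval must carry the 's' suffix ("60s"), not just "60".
PAYLOAD='{
  "name": "Prometheus",
  "type": "prometheus",
  "url": "http://localhost:9090",
  "access": "proxy",
  "jsonData": { "timeInterval": "60s" }
}'
echo "$PAYLOAD"
# With Grafana running:
#   curl -s -u admin:admin -H 'Content-Type: application/json' \
#        -X POST -d "$PAYLOAD" http://localhost:3000/api/datasources
```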
The main problem was dump1090-exporter — install it using sudo! And for mutability, change the port number from 8080 to [IP of your computer/dump1090/]. An awful lot of small issues. Maybe I'll write a tutorial, just to prevent others from making the same mistakes. Once again thanks, really good job!
My next stupid question: how do I add the vector maps to an existing setup? Mutability 1.15~dev. Asking a question sometimes gets an answer — no question, no answer.
You will need OpenLayers for the vector maps — so mutability 1.15, or a fork from after the switch away from Google Maps. I think I can fork dump1090-fa and do the same thing with it. I want to improve the dump1090 UI, but it's low on the priority list — there are much better maps, like VRS and https://github.com/Ysurac/FlightAirMap (PHP). Here is a direct zip of the html directory from my image. You might have to change where it looks for aircraft.json.

layers.js — this contains the OpenLayers objects
custom.geo.json
ne_10m_roads.json
ne_10m_airports.geojson
... etc.

Those are your vector files that are referenced in layers.js.
https://www.dropbox.com/s/8gwfzqf31pvc1eh/html.zip?dl=0

You should be able to install mutability 1.15 or an OpenLayers fork and swap in those files.
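Swapping the zipped html directory into an existing install boils down to back up, unzip over the top, and check the aircraft.json path. A sketch only — the install path varies (dump1090-mutability commonly serves from /usr/share/dump1090-mutability/html, but check your web server config), so the runnable part below demonstrates the backup-then-replace pattern on a scratch directory instead of a live install.

```shell
# Real-install version (path is an assumption; verify before running):
#   HTML=/usr/share/dump1090-mutability/html
#   sudo cp -a "$HTML" "$HTML.bak"          # 1. keep a backup
#   sudo unzip -o html.zip -d "$HTML"       # 2. drop in the custom files
#   grep -rn "aircraft.json" "$HTML"/*.js   # 3. confirm the data URL fits your setup

# Scratch-directory demonstration of the same pattern:
HTML=$(mktemp -d)/html
mkdir -p "$HTML"
echo "old" > "$HTML/layers.js"
cp -a "$HTML" "$HTML.bak"                   # backup survives the swap
echo "new" > "$HTML/layers.js"              # stands in for the unzip step
grep -q old "$HTML.bak/layers.js" && echo "backup intact"
```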