This is a project conducted by two students of the University of Applied Sciences and Arts Northwestern Switzerland. The website contains instructions on how to build a cluster with Raspberry Pi computers.
It also shows how to install and run examples of Machine Learning applications using several well-known ML libraries.
If you want to build a cluster on your own but do not know which components to buy, check out our order list and use it as a foundation for your endeavour.
Please note that this documentation goes hand in hand with the raspifarm GitHub repository, where you will find many of the scripts and tools that the pages here reference or make use of.
Do you have two or more Raspberry Pi computers and want to make a cluster out of them, but don't know where to start? We have written down in detail how we installed and configured our cluster. Feel free to get some inspiration from the setup manual we put together.
Distributed computation is fun. Learn how to prepare your cluster for it and use a few Machine Learning frameworks for your experiments.
You should always change default passwords.
In case you want to use the Raspberry Pi as a router for network traffic
Why do we do this?
Embed your Raspberry Pi into a wireless network
The Raspbian image is delivered as a minimal image; in order to use the full size of your SD card, you need to expand the filesystem
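On current Raspbian images this is usually done with raspi-config. The sketch below assumes your raspi-config offers the non-interactive interface; if it does not, use the "Expand Filesystem" entry in the interactive menu instead.

```python
import subprocess

# Assumes raspi-config's non-interactive mode is available; this is the
# same as choosing "Expand Filesystem" in the interactive menu.
subprocess.run(["sudo", "raspi-config", "nonint", "do_expand_rootfs"], check=True)

# The enlarged partition is only picked up after a reboot.
subprocess.run(["sudo", "reboot"])
```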
Installing the operating system is really not that difficult, and exciting too!
Required to log in on the slave nodes without a password
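To give a rough idea of what the page covers: you create a key pair on the master and copy the public key to every slave. The host names and the farmer user below are placeholders for your own setup; the snippet is only a sketch of the manual steps.

```python
import subprocess
from pathlib import Path

SLAVES = ["pi01", "pi02"]   # placeholder slave host names
USER = "farmer"

key = Path.home() / ".ssh" / "id_rsa"

# Generate a key pair without a passphrase, but only if none exists yet.
if not key.exists():
    subprocess.run(
        ["ssh-keygen", "-t", "rsa", "-b", "4096", "-N", "", "-f", str(key)],
        check=True,
    )

# Copy the public key to every slave so future logins need no password.
for host in SLAVES:
    subprocess.run(["ssh-copy-id", "{}@{}".format(USER, host)], check=True)
```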
In case you want to run websites on your Raspberry Pi
Some packages come in handy from time to time.
Like humans, computers want to have names too
One user has to do it all: Farmer!
What is this used for?
Lets you run code via web browser. Attention: it's awesome!
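If you want the notebook server to be reachable from your workstation rather than only from the Pi itself, it has to listen on all interfaces. The invocation below is a minimal sketch; the port and further options depend on your setup.

```python
import subprocess

# Start a Jupyter notebook server that other machines on the network can reach.
# --no-browser: do not try to open a browser on the headless Pi.
subprocess.run([
    "jupyter", "notebook",
    "--ip=0.0.0.0",
    "--no-browser",
    "--port=8888",
])
```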
What's that used for?
Important if you want to use Spark on the cluster
Orchestrate your cluster with Ansible
Take pictures with the camera
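Assuming you use the Python picamera library (the page may use a different tool), a first test shot can look roughly like this:

```python
from time import sleep

from picamera import PiCamera  # packaged on Raspbian as python3-picamera

camera = PiCamera()
camera.resolution = (1024, 768)

# Give the sensor a moment to adjust its exposure before capturing.
camera.start_preview()
sleep(2)
camera.capture("cluster_shot.jpg")
camera.stop_preview()
```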
Starting Spark and Jupyter with the cluster
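Once the cluster is up, connecting a script or notebook to it mostly means pointing a SparkSession at the master. The URL spark://master:7077 is only an assumption about how your master node is named; adjust it to your setup.

```python
from pyspark.sql import SparkSession

# The master URL is an assumption -- replace host name and port with your own.
spark = (
    SparkSession.builder
    .master("spark://master:7077")
    .appName("raspifarm-test")
    .getOrCreate()
)

# Quick sanity check: how many tasks Spark will run in parallel by default.
print(spark.sparkContext.defaultParallelism)
spark.stop()
```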
Using pyspark and working with the python-shell
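Inside the pyspark shell a SparkContext is already available as sc, so a quick smoke test can look like this (the numbers are arbitrary):

```python
# Run inside the pyspark shell, where `sc` (the SparkContext) is predefined.
rdd = sc.parallelize(range(1000))

# Square the numbers on the workers and sum the results on the driver.
total = rdd.map(lambda x: x * x).sum()
print(total)
```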
Execute playbooks
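Playbooks are normally run straight from the command line with ansible-playbook. If you prefer to trigger them from Python, a thin wrapper could look like the sketch below; the inventory and playbook file names are assumptions.

```python
import subprocess

INVENTORY = "hosts"     # placeholder inventory file
PLAYBOOK = "site.yml"   # placeholder playbook

# Equivalent to running: ansible-playbook -i hosts site.yml
result = subprocess.run(["ansible-playbook", "-i", INVENTORY, PLAYBOOK])
print("playbook finished with exit code", result.returncode)
```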
New nodes have to be 'discovered' and registered
Install Apache Spark on the cluster
DHCP and DNS in one package
Easy access on the slave nodes for the farmer user
See what your nodes are doing
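One simple approach (an assumption, not necessarily the tool described on that page) is to ask every node for its load over SSH; this relies on the passwordless login set up for the farmer user.

```python
import subprocess

NODES = ["pi01", "pi02", "pi03"]   # placeholder node names

for node in NODES:
    # Requires passwordless SSH for the farmer user (see the SSH key page).
    result = subprocess.run(
        ["ssh", "farmer@{}".format(node), "uptime"],
        stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
        universal_newlines=True,
    )
    print(node, result.stdout.strip())
```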
Because sometimes you'll need to do it
We need to know where our slaves are
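A crude way to find live nodes, assuming they all sit in a known subnet such as 192.168.1.0/24, is a simple ping sweep; the discovery scripts in the repository may well do it differently.

```python
import subprocess

SUBNET = "192.168.1."   # assumed subnet -- adjust to your network layout

alive = []
for i in range(1, 255):
    ip = SUBNET + str(i)
    # One ping with a one-second timeout; hosts that stay silent are skipped.
    reply = subprocess.run(
        ["ping", "-c", "1", "-W", "1", ip],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    if reply.returncode == 0:
        alive.append(ip)

print("responding hosts:", alive)
```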