Philip J. Basford, Steven J. Johnston, Colin S. Perkins, Tony Garnock-Jones, Fung Po Tso, Dimitrios Pezaros, Robert Mullins, Eiko Yoneki, Jeremy Singer, and Simon J. Cox (2019) Performance analysis of single board computer clusters. Future Generation Computer Systems. (doi:10.1016/j.future.2019.07.040).
Since the writing of that paper the Raspberry Pi 4 has been announced, meaning it could not be included in the main body of the text. Despite this, we wanted to see how the changes introduced in the new version affected the available performance. The Raspberry Pi 4 is available in four different memory sizes; we purchased a 2GB variant so it is directly comparable to the Odroid C2.
The Raspberry Pi 4 Model B was then benchmarked using HPL and ATLAS compiled on the device as per the instructions in a previous blog post. To make sure that the Raspberry Pi did not overheat or enter CPU-throttling mode, a metal heatsink was attached to the CPU and a 60mm fan was arranged to blow air over the surface of the Raspberry Pi. The Raspberry Pi was powered using an official Raspberry Pi 4 USB-C power supply. Whereas the tests of the SBCs compared in the paper were performed on an isolated network, the tests for the Raspberry Pi 4 were performed on the main university network, which may have introduced a performance penalty.
As can be seen in the chart, the Raspberry Pi 4 significantly outperforms the other SBCs at all problem sizes, with a maximum of 10.68 GFLOPS reached at 80% of RAM usage (swap disabled).
The initial testing of the Raspberry Pi 4 Model B shows that, as a single node, it has significantly better performance than any of the SBCs previously compared. It is hoped that, as the network connection is capable of full gigabit speeds, it will also scale better than the previous Raspberry Pi SBCs when combined into a cluster. For full details of the methodology used, and the performance of 16-node SBC clusters, please read the published paper at doi:10.1016/j.future.2019.07.040 .
The features that we wanted the PCB to have are as follows:
We have now completed the PCB and have been using one in one of our LoRaWAN gateways for a few months, and we used a second at EMF to provide LoRaWAN coverage. The designs for these boards, known as Pi-Cot (Pi Concentrator on Top), are now freely available (CC-BY-SA) from https://github.com/computenodes/pi-cot. The designs are provided as-is; if there’s something you would change, please fork and issue a pull request.
pip install plantower
The pypi module page can be found at https://pypi.org/project/plantower/
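Once installed, a quick smoke test can be run from the command line. This simply constructs the reader with its default serial port (as in the workshop code later in these posts) and prints a single sample; if your sensor is wired to a different port you will need to adjust accordingly.

python3 -c "from plantower import Plantower; print(Plantower().read())"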
Some of the slides presented at the beginning of the workshop were infographics taken from the World Health Organization (WHO). They can be found, along with other general figures about air quality in the world, at http://www.who.int/airpollution/infographics/en/. If you want to learn more about air quality, and especially about air quality in the UK, we suggest you read the report published by the Royal College of Physicians and the Royal College of Paediatrics and Child Health in 2016: Every breath we take: the lifelong impact of air pollution.
The sensors used during the workshop are low-cost sensors used to monitor fine particles (PM10 and PM2.5). We strongly encourage you to read the advice from the Department for Environment, Food & Rural Affairs (DEFRA) on the use of low-cost sensors for air pollution to gain a better understanding of their limitations. The recommendations can be found at https://uk-air.defra.gov.uk/library/aqeg/pollution-sensors.php
The links below are the suppliers we used to purchase the parts – other suppliers are available.
We used heat shrink to attach the USB-serial adapter to the sensor as a single module. The heat shrink is not available in small quantities so a Velcro pad can be used instead to both secure the USB-serial adapter and to prevent accidental shorting of pins. The equipment was mounted on a custom laser cut perspex sheet, however, feel free to create your own mounting solution.
We have written a couple of custom libraries to facilitate this workshop (which will be installed in the setup steps). If you want to dig deeper into the code it can be found on github. If you make any improvements to the code please submit a pull request.
Having purchased or acquired the hardware, the following steps have to be carried out to get to the stage where the workshop code can be run. In order for the LoRaWAN part to work you have to be in coverage of The Things Network. For the workshop we deployed 3 gateways on site. Coverage can be checked by using TTN Mapper. If you are not in coverage you can extend the network by running your own gateway. See https://www.thethingsnetwork.org/docs/gateways/ for details.
To assemble the hardware, carry out the following steps:
Our finished nodes looked like this:
Copy the latest Raspbian image onto the SD card, and then connect to the Pi and run the following script.
The code below will install the required libraries and software to run the workshop. This will take some time.
echo "dtparam=spi=on" | sudo tee -a /boot/config.txt echo "dtoverlay=spi-gpio-cs" | sudo tee -a /boot/config.txt echo "dtparam=i2c_arm=on" | sudo tee -a /boot/config.txt sudo raspi-config noint do_serial 2 #get the spi CS pin overlay wget https://github.com/computenodes/dragino/releases/download/v0.0.1/spi-gpio-cs.dtbo sudo mv spi-gpio-cs.dtbo /boot/overlays/ #install jupyter sudo apt update sudo apt-get install -y python-dev sudo -H pip install --upgrade pip sudo -H pip install jupyter sudo apt-get install -y python-seaborn python-pandas python3-pandas sudo apt-get install -y ttf-bitstream-vera sudo pip3 install jupyter sudo ipython3 kernelspec install-self jupyter notebook --generate-config jupyter notebook password # this will prompt you for a password #add jupyter to crontab echo "@reboot pi /usr/local/bin/jupyter notebook --ip=0.0.0.0 --no-browser --notebook-dir=/home/pi >> /tmp/jupyter.out 2>> /tmp/jupyter.err" | sudo tee -a /etc/cron.d/jupyter #setup the RTC echo "i2c-bcm2708" | sudo tee -a /etc/modules sudo modprobe i2c:mcp7941x echo "@reboot root echo mcp7941x 0x6f > /sys/class/i2c-adapter/i2c-1/new_device; ( sleep 2; hwclock -s ) &" | sudo tee -a /etc/cron.d/rtc #get the cayenne module wget https://github.com/FEEprojects/cayennelpp-python/releases/download/v1.0.0/simplecayennelpp-1.0.0.tar.gz #install it pip3 install simplecayennelpp-1.0.0.tar.gz #get the plantower module wget https://github.com/FEEprojects/plantower/releases/download/v0.0.2/plantower-0.0.2.tar.gz pip3 install plantower-0.0.2.tar.gz #get the dragino lib and install wget https://github.com/computenodes/dragino/releases/download/v0.0.2/dragino-0.0.2.tar.gz pip3 install dragino-0.0.2.tar.gz sudo reboot
Once the node has rebooted make sure it has the correct time (If it’s connected to a network it will do this automatically). Then run:
sudo hwclock -w
This sets the time on the hardware RTC so the Pi knows the time when it boots even without GPS or network connections.
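You can confirm the time was stored by reading it straight back from the RTC:

# read the time back from the hardware clock
sudo hwclock -r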
We use The Things Network to transfer the data from the device to Cayenne for viewing. This also requires setting up which can be done with the following steps. For more information please read The Things Network documentation.
Having done the above steps, your config file should look similar, but not identical, to the following (the devaddr, nwskey and appskey lines will be different):
#GPS configuration
gps_baud_rate = 9600
gps_serial_port = "/dev/serial0"
gps_serial_timeout = 1
gps_wait_period = 10
#LoRaWAN configuration
spreading_factor = 7
max_power = "0x0F"
output_power = "0x0E"
sync_word = 0x34
rx_crc = True
#Where to store the frame count
fcount_filename = "/home/pi/.lora_fcount"
##Valid auth modes are ABP or OTAA
##All values are hex arrays eg devaddr = 0x01, 0x02, 0x03, 0x04
auth_mode = "abp"
devaddr = 0x26, 0x01, 0x19, 0xA9
nwskey = 0x4A, 0xE5, 0x3B, 0x8E, 0x1B, 0x4C, 0x24, 0x7C, 0x07, 0xA3, 0x52, 0x38, 0x3A, 0x28, 0x1F, 0xCD
appskey = 0x02, 0x1D, 0x25, 0xF1, 0x92, 0xCE, 0x2B, 0xA2, 0x89, 0x8F, 0x29, 0x18, 0x43, 0xA9, 0x00, 0x1E
#auth_mode = "otaa"
#deveui =
#appeui =
#appkey =
In order to visualise the data you will also need an account on Cayenne https://cayenne.mydevices.com/cayenne/dashboard/start. Having created the account you need to add the device. To do this follow these steps:
Finally we are at the point where we can write the code that was used in the workshop. Connect to the Jupyter notebook; the exact address for this will depend on your network, but it may be available at http://raspberrypi:8888.
The code created is as follows:
from dragino import Dragino              # import the module required for GPS and LoRaWAN
from simplecayennelpp import CayenneLPP  # import the module required to pack the data in the format Cayenne expects
from plantower import Plantower          # import the module to talk to the PM sensor
from time import sleep                   # import sleep so we can wait between polling the sensor

sensor = Plantower()
dataList = []
for i in range(10):
    data = sensor.read()
    dataList.append(data.gr03um)
    sleep(1)
average_reading = sum(dataList) / len(dataList)

D = Dragino("/home/pi/dragino.ini")      # Set up the dragino HAT
my_position = D.get_gps()

lpp = CayenneLPP()
lpp.addAnalogInput(1, round(average_reading))
lpp.addGPS(2, my_position.latitude, my_position.longitude, my_position.altitude)
D.send_bytes(list(lpp.getBuffer()))
If you find any mistakes in these instructions or they become out of date please comment below.
1. Install the build dependencies:

sudo apt install gfortran automake
2. Download ATLAS from https://sourceforge.net/projects/math-atlas/. At the time of writing this is version 3.10.3; your version might be different.
tar xjvf atlas3.10.3.tar.bz2
3. Create a directory to build in (it’s recommended to not build in the source hierarchy), and cd into it
mkdir atlas-build
cd atlas-build/
4. Disable CPU throttling on the Pi – the process will not start if it detects throttling.
echo performance | sudo tee /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
This will stop throttling. I found it helped to have a fan blowing air over the CPU to make sure it didn’t overheat.
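You can confirm the governor change took effect before starting the build:

# should print "performance"
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor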
5. Configure and build ATLAS. These steps will take a while. Do NOT use the -j flag to parallelise the make process, as this will cause inconsistent results. Where possible it will run operations in parallel automatically.
../ATLAS/configure
make
6. Download and install MPI
cd
wget http://www.mpich.org/static/downloads/3.2/mpich-3.2.tar.gz
tar xzvf mpich-3.2.tar.gz
cd mpich-3.2
./configure
make -j 4
sudo make install
7. Download HPL, extract it and run the generic setup
cd
wget http://www.netlib.org/benchmark/hpl/hpl-2.2.tar.gz
tar xzvf hpl-2.2.tar.gz
cd hpl-2.2
cd setup
sh make_generic
cp Make.UNKNOWN ../Make.rpi
cd ..
8. Then edit Make.rpi to reflect where things are installed. In our case the following lines are edited from the default. Note that line numbers might change with future versions.
ARCH = rpi
[…]
TOPdir = $(HOME)/hpl-2.2
[…]
MPdir = /usr/local
MPinc = -I $(MPdir)/include
MPlib = /usr/local/lib/libmpich.so
[…]
LAdir = /home/pi/atlas-build
LAinc =
LAlib = $(LAdir)/lib/libf77blas.a $(LAdir)/lib/libatlas.a
9. Then compile HPL
make arch=rpi
Congratulations. You should now have a working HPL install. Let’s test it.
10. Change into the working directory and create the configuration needed to test the system. As the Pi has 4 cores you need to tell MPI to assign 4 tasks to the host. Depending on the ambient temperature you may need to add a fan to stop the Pi CPU overheating, as these tests are very demanding.
cd bin/rpi
cat << EOF > nodes-1pi
localhost
localhost
localhost
localhost
EOF
Customise the HPL.dat input file. The file below is the starting point we use
HPLinpack benchmark input file
Innovative Computing Laboratory, University of Tennessee
HPL.out      output file name (if any)
6            device out (6=stdout,7=stderr,file)
1            # of problems sizes (N)
5120         Ns
1            # of NBs
128          NBs
0            PMAP process mapping (0=Row-,1=Column-major)
1            # of process grids (P x Q)
2            Ps
2            Qs
16.0         threshold
1            # of panel fact
2            PFACTs (0=left, 1=Crout, 2=Right)
1            # of recursive stopping criterium
4            NBMINs (>= 1)
1            # of panels in recursion
2            NDIVs
1            # of recursive panel fact.
1            RFACTs (0=left, 1=Crout, 2=Right)
1            # of broadcast
1            BCASTs (0=1rg,1=1rM,2=2rg,3=2rM,4=Lng,5=LnM)
1            # of lookahead depth
1            DEPTHs (>=0)
2            SWAP (0=bin-exch,1=long,2=mix)
64           swapping threshold
0            L1 in (0=transposed,1=no-transposed) form
0            U in (0=transposed,1=no-transposed) form
1            Equilibration (0=no,1=yes)
8            memory alignment in double (> 0)
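The Ns value of 5120 is a deliberately small starting point. HPL solves an N×N system of doubles, so a common rule of thumb (and the one used for the 80%-of-RAM figures mentioned above) is to size N so the matrix fills roughly 80% of memory. A quick way to estimate it:

# N ≈ sqrt(0.8 * RAM_in_bytes / 8 bytes per double)
# e.g. for a 1 GB Raspberry Pi 3 this gives roughly N ≈ 10000
python3 -c "print(int((0.8 * 1024**3 / 8) ** 0.5))"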
Run the test
mpiexec -f nodes-1pi ./xhpl
If all goes to plan your output should look similar to
================================================================================
HPLinpack 2.2  --  High-Performance Linpack benchmark  --   February 24, 2016
Written by A. Petitet and R. Clint Whaley,  Innovative Computing Laboratory, UTK
Modified by Piotr Luszczek, Innovative Computing Laboratory, UTK
Modified by Julien Langou, University of Colorado Denver
================================================================================

An explanation of the input/output parameters follows:
T/V    : Wall time / encoded variant.
N      : The order of the coefficient matrix A.
NB     : The partitioning blocking factor.
P      : The number of process rows.
Q      : The number of process columns.
Time   : Time in seconds to solve the linear system.
Gflops : Rate of execution for solving the linear system.

The following parameter values will be used:

N      :    5120
NB     :     128
PMAP   : Row-major process mapping
P      :       2
Q      :       2
PFACT  :   Right
NBMIN  :       4
NDIV   :       2
RFACT  :   Crout
BCAST  :  1ringM
DEPTH  :       1
SWAP   : Mix (threshold = 64)
L1     : transposed form
U      : transposed form
EQUIL  : yes
ALIGN  : 8 double precision words

--------------------------------------------------------------------------------

- The matrix A is randomly generated for each test.
- The following scaled residual check will be computed:
      ||Ax-b||_oo / ( eps * ( || x ||_oo * || A ||_oo + || b ||_oo ) * N )
- The relative machine precision (eps) is taken to be       1.110223e-16
- Computational tests pass if scaled residuals are less than        16.0

================================================================================
T/V                N    NB     P     Q               Time                 Gflops
--------------------------------------------------------------------------------
WR11C2R4        5120   128     2     2              25.11              3.565e+00
HPL_pdgesv() start time Wed May 16 10:35:46 2018

HPL_pdgesv() end time   Wed May 16 10:36:11 2018

--------------------------------------------------------------------------------
||Ax-b||_oo/(eps*(||A||_oo*||x||_oo+||b||_oo)*N)=        0.2389736 ...... PASSED
================================================================================

Finished      1 tests with the following results:
              1 tests completed and passed residual checks,
              0 tests completed and failed residual checks,
              0 tests skipped because of illegal input values.
--------------------------------------------------------------------------------

End of Tests.
================================================================================
The above steps were sufficient to get ATLAS and HPL running on a Pi 3B+. Testing has shown that on a Model 3B it may crash for N values above about 6000. This appears to be a problem with the hardware of the 3B, as described in a post on the Pi forum. Following the step described in the post of adding the following line to /boot/config.txt enabled problem sizes up to and including 10240 to be executed.
over_voltage=2
Having decided upon a standalone hardware-based solution, it was a matter of choosing which hardware to use. For another project we are using Pycom devices, the LoPy in particular. This provides a LoRaWAN package in a convenient form factor which is programmed using Python. They also provide an expansion board, the Pytrack, which adds GPS connectivity to the board.
Having decided on the hardware, we could then work out which of the other peripherals on the expansion board to use. It was decided it would also be interesting to log GPS co-ordinates locally on the device, so that these could be compared with the data received at the base stations to identify gaps in coverage.
To build the hardware we used the following parts:
The battery is connected via the switch to the Pytrack PCB. This means the board can be turned on and off without having to open the case. The case still has to be opened to charge the battery, as there is currently no external USB connector. There is also currently no way of viewing the debug LED when the case is closed. The battery and PCB are secured in place using Velcro.
The easiest way for the data to be imported into TTN Mapper is to use one of their standard formats. They accept simple ASCII but recommend a more efficient binary encoding of the data. To check that you have encoded the data correctly you can add a custom payload decoder to The Things Network. This is provided in the comments of a git repo available at https://github.com/jpmeijers/RN2483-Arduino-Library/blob/master/examples/TheThingsUno-GPSshield-TTN-Mapper-binary/TheThingsUno-GPSshield-TTN-Mapper-binary.ino, but is also reproduced below.
function Decoder(bytes, port) {
    var decoded = {};

    decoded.lat = ((bytes[0]<<16)>>>0) + ((bytes[1]<<8)>>>0) + bytes[2];
    decoded.lat = (decoded.lat / 16777215.0 * 180) - 90;

    decoded.lon = ((bytes[3]<<16)>>>0) + ((bytes[4]<<8)>>>0) + bytes[5];
    decoded.lon = (decoded.lon / 16777215.0 * 360) - 180;

    var altValue = ((bytes[6]<<8)>>>0) + bytes[7];
    var sign = bytes[6] & (1 << 7);
    if(sign){
        decoded.alt = 0xFFFF0000 | altValue;
    }else{
        decoded.alt = altValue;
    }

    decoded.hdop = bytes[8] / 10.0;

    return decoded;
}
Using this decoder has two advantages: you can see that the data you are sending makes sense, and you can verify that it is in the correct format to be understood by TTN Mapper.
The software for this is available from https://github.com/computenodes/pycom-gps-logger. It is in two parts: the part described here runs on the device, and the logging side is covered later. The device part was programmed using Visual Studio Code with the Pymakr plugin. Setup instructions are available from the Pycom docs. Any device-specific details are stored in config.py; a default version is given in the repo and below. The correct details should be put in for your device.
APP_KEY = "" #Application key from the things network APP_EUI = "" #The EUI for the app JOIN_TIMEOUT = 0 #passed to the LoRaWAN join function GPS_TIMEOUT = 30 #How long to wait for a GPS reading per attempt POST_MESSAGE_SLEEP = 60 #How long to wait between messages - affects GPS sample rate when connected GPS_READ_INTERVAL = 10 #How often to read the GPS if not on the LoRaWAN network
TTN Mapper relies on getting latitude, longitude, altitude and HDOP. The default library that Pycom provide for the Pytrack only provides the latitude and the longitude, as the rest of the data is not present in the particular NMEA sentence that they are parsing. However, consulting the datasheet for the GPS module showed that the module also produces GPGGA NMEA sentences, which do contain the required information. The provided library was therefore modified to include an additional function which provides the required information. This will be tested a bit more comprehensively before submitting a pull request to add it to the core library.
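For reference, a GPGGA sentence looks something like the following (an illustrative example, not output captured from our module). The fields we care about are the fix time, latitude/longitude, number of satellites, HDOP and altitude:

# $GPGGA,<UTC time>,<lat>,<N/S>,<lon>,<E/W>,<fix quality>,<satellites>,<HDOP>,<altitude>,M,<geoid separation>,M,,*<checksum>
$GPGGA,092750.000,5321.6802,N,00630.3372,W,1,8,1.03,61.7,M,55.2,M,,*76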
Once you have given the required details to TTN Mapper as per the FAQ you should see a purple marker appear which represents your mobile device, with coverage details being added to the map periodically. An example is shown below.
As well as submitting the data to TTN Mapper, we wanted to keep a record of the positions from the device for a forthcoming project in which we will need the raw data to compare against. To do this we have a Python application which subscribes to an MQTT stream from the application and logs the data into a MongoDB store.
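The actual logger is the Python application in the repo above, but the shape of the pipeline can be sketched with stock command-line tools; the application ID, access key and database/collection names below are placeholders, and the broker details are those of The Things Network v2 at the time of writing:

# Subscribe to the TTN MQTT broker and append each uplink JSON message to MongoDB
mosquitto_sub -h eu.thethings.network -p 1883 \
    -u "my-app-id" -P "ttn-account-v2.XXXXXXXX" \
    -t "my-app-id/devices/+/up" |
while read -r msg; do
    echo "$msg" | mongoimport --db ttn_mapper --collection positions
done

A logged record ends up looking like the document below.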
{ "_id" : ObjectId("59c641d6c7a447dd52f8ba75"), "hdop" : 1, "timestamp" : "2017-09-23 11:13:26", "lon" : -1.39849, "alt" : 70, "gateways" : [ { "rssi" : -96, "snr" : -1, "gw_id" : "eui-b827ebfffee36ef8" }, { "rssi" : -102, "snr" : 9, "gw_id" : "eui-7276fffffe0102df" }, { "rssi" : -72, "snr" : 9.8, "gw_id" : "eui-b827ebfffeac4b12" }, { "rssi" : -104, "snr" : 9, "gw_id" : "eui-7276fffffe0103f0" }, { "rssi" : -106, "snr" : 7, "gw_id" : "eui-7276fffffe0103f0" }, { "rssi" : -122, "snr" : -7, "gw_id" : "eui-7276fffffe0103ec" }, { "rssi" : -99, "snr" : 8, "gw_id" : "eui-7276fffffe0103ec" } ], "lat" : 50.93727, "serial" : "70B3D5499247643E", "sf" : "SF7BW125" }
Creating a GPS position logger for the LoRaWAN network has been an interesting task, and it has been very easy to iterate and add functionality. It is a very simple project to complete and it can be interesting seeing which base-stations are able to receive your signals.
We had recently been looking at GPS receivers for another area of a project being worked on and decided to use the Adafruit Ultimate GPS. The reasons we liked the module were that it can take an external antenna and that it offers a PPS output. The external antenna is important because all our LoRaWAN base stations use metal enclosures. The PPS output means it can be used in place of the uBlox. The PPS (Pulse Per Second) output is an output pin from the GPS receiver which is used to signify the start of the second. By using a simple output rather than a character over a serial link, a much higher accuracy can be achieved, as much less hardware processing is needed.
In order to connect this to the iC880A, a standard 0.1″ header was soldered onto the GPS module. As this was just for testing, the serial port is connected to the Raspberry Pi using a standard FTDI USB-serial converter cable. For the power, ground and PPS connections, our much-abused Raspberry Pi iC880A LoRa Concentrator Gateway Shield has had some more patch wires soldered onto it, this time with sockets on the end to enable plugging into the 0.1″ header on the GPS module.
"gateway_conf": { "gateway_ID": "B827EBFFFEXXXXXX", "servers": [ { "server_address": "router.eu.thethings.network", "serv_port_up": 1700, "serv_port_down": 1700, "serv_enabled": true } ], "gps_tty_path": "/dev/ttyUSB0", "contact_email": "test@example.com", "description": "GPS-test-node", "fake_gps": false, "forward_crc_valid": true, "forward_crc_error": false, "forward_crc_disabled": true } }
The local config is very similar to that used for the uBlox module; the only change is the serial port being used.
As before, testing had to wait until the weather co-operated. When started up, the program went through the same steps as seen last time, waiting for the GPS receiver to send the correct data through before using it for a lock. It is still attempting to send beacons out; however, this time it is succeeding, although this could be because it is connected to a different concentrator board.
NOTE: [down] beacon ready to send (frequency 0 Hz)
--- Beacon payload ---
0xEE - 0xFF - 0xC0 - 0x00 - 0x00 - 0x00 - 0x00 - 0x84
0x00 - 0x00 - 0x9F - 0x91 - 0x59 - 0x38 - 0x0E - 0x1C
0x8C - 0x00 - 0x00 - 0x00 - 0x00 - 0x00 - 0x00 - 0x00
--- end of payload ---
NOTE: [down] beacon sent successfully
I am still confused as to why it is sending out beacons given it again clearly says
INFO: Beacon is disabled
Maybe I’m misunderstanding what this configuration option means.
At the moment we have just tested that the gateways are getting the time / position from the GPS receiver, we are not doing anything more advanced with it. That will come in time, and updates will be posted.
The iC880A is a popular PCB for creating a multichannel LoRaWAN base station. It is supplied with pads for the addition of a u-blox LEA-6T. As well as adding the GPS receiver module, several other components have to be added, including a power supply, USB connector and associated passives / TVS protection diodes.
However, following the instructions provided produces a system that enables communication with the GPS receiver but is unable to receive any satellite signal. This is due to a couple of problems; in order to debug these, the Hardware Integration Manual for the LEA-6T is needed.
The first stage of the debugging was to identify that the active antenna connected to the SMA was not receiving power. Whilst debugging this stage it was confirmed to be an aerial problem, as when probing for DC voltage on the antenna connector the multimeter probes were acting as an antenna, enabling a signal to be received and a lock to be obtained.
The first problem is in the power supply circuit for the active antenna. R_BIAS (R959) is the recommended 10 Ohms; however, there also appears to be R958 connected to the V_ANT pin. R958 is a 0 Ohm resistor to ground, meaning V_ANT is tied to 0V and therefore unable to power the active antenna. Upon removal of R958, it was possible to observe 3V at the RF pin, suitable for powering an LNA as part of the antenna. However, the antenna was still not receiving satellite signals.
This led to the diagnosis of a further problem: the instructions from IMST place an IC (U950) between RF_IN on the receiver and the SMA connector on the PCB. This component is an SF1186K-3, a SAW filter, which is designed to only pass the required frequency, thus improving noise rejection. This chip also blocks DC (minimum 35dB rejection), meaning the antenna does not receive power. As can be seen from the integration manual, the receiver does not require this filter, so it was removed and bypassed (visible in the bottom right of the second photo). Having completed this modification, the GPS receiver is now able to receive the satellite signal and get a position lock. This was tested using the u-center software available from u-blox, connected via USB.
Finally the GPS serial output is connected to the Pi’s onboard serial port. This can be done using jumper leads. For development we are using the Raspberry Pi iC880A LoRa Concentrator Gateway Shield; however, this doesn’t have these ports connected, so patch wires were used to connect the required pins.
Now that the GPS is able to communicate with both the Pi and the satellites, a bit of software configuration is needed. The Pi is connected to UART1 of the GPS receiver, which needs configuring. This can be done in u-center (View -> Configuration). Before leaving a page you need to click Send to actually tell the GPS unit about the change.
The first thing is to enable NMEA output on the serial port connected to the Pi. This is under the ‘PRT’ section. UART1 should be set to NMEA output at a baud rate of 9600.
The GPS module is now sending lots of NMEA messages to the Pi, most of which are ignored by the LoRa gateway. The only messages that the code parses match $G?RMC & $G?GGA. All other messages can be turned off using the ‘MSG’ section of the configuration.
Having made the changes to the configuration these need to be made persistent. Go to ‘CFG’, make sure “save current configuration” is selected and then click send. These changes should now persist.
Make sure that the serial port is enabled on the Raspberry Pi using raspi-config. Do not enable a console on the serial port.
Finally, change the local_config.json to enable the GPS receiver. The configuration used on the test node is:
"gateway_conf": { "gateway_ID": "B827EBFFFEXXXXXX", "servers": [ { "server_address": "router.eu.thethings.network", "serv_port_up": 1700, "serv_port_down": 1700, "serv_enabled": true } ], "gps_tty_path": "/dev/serial0", "contact_email": "test@example.com", "description": "GPS-test-node", "fake_gps": false, "forward_crc_valid": true, "forward_crc_error": false, "forward_crc_disabled": true } }
Due to the poor quality of GPS signal available in my office and in the (basement) electronics lab, testing had to wait until the weather was good enough to spend some time outside. This has meant waiting rather longer than hoped, but that’s the British weather for you! When suitable weather did arrive I took the gateway and a client device outside and set up on a picnic bench outside the office. On starting the poly_pkt_fwd script there was the usual output stream, with some additional lines shown below.
INFO: Validation thread activated.
WARNING: [gps] GPS out of sync, keeping previous time reference
WARNING: [gps] GPS out of sync, keeping previous time reference
INFO: [down] for server router.eu.thethings.network PULL_ACK received in 299 ms
INFO: [down] for server router.eu.thethings.network PULL_ACK received in 82 ms
INFO: [down] for server router.eu.thethings.network PULL_ACK received in 93 ms

##### 2017-08-01 17:06:10 GMT #####
### [UPSTREAM] ###
# RF packets received by concentrator: 0
# CRC_OK: 0.00%, CRC_FAIL: 0.00%, NO_CRC: 0.00%
# RF packets forwarded: 0 (0 bytes)
# PUSH_DATA datagrams sent: 0 (0 bytes)
# PUSH_DATA acknowledged: 0.00%
### [DOWNSTREAM] ###
# PULL_DATA sent: 3 (100.00% acknowledged)
# PULL_RESP(onse) datagrams received: 0 (0 bytes)
# RF packets sent to concentrator: 0 (0 bytes)
# TX errors: 0
### [GPS] ###
# Valid gps time reference (age: 1 sec)
# System GPS coordinates: latitude 50.93680, longitude -1.40584, altitude 69 m
##### END #####
As can be seen, at first start-up the GPS receiver hadn’t yet got a decent lock, so the forwarder ignored the data that it was receiving from the device and kept going with the time the device had from NTP. Shortly afterwards it can be seen that the GPS had achieved a lock and was being used as the time reference.
I am confused by one aspect of the behaviour of the system, and that is the transmission of beacons. In the global_conf file the transmission of beacons is disabled. This is confirmed by the following line in the logs.
INFO: Beacon is disabled
However, it would appear that the device still tries to send a beacon.
NOTE: [down] beacon ready to send (frequency 0 Hz)
--- Beacon payload ---
0xEE - 0xFF - 0xC0 - 0x00 - 0x00 - 0x00 - 0x00 - 0x84
0x00 - 0x80 - 0xA1 - 0x91 - 0x59 - 0x38 - 0x0E - 0x1C
0x8C - 0x00 - 0x00 - 0x00 - 0x00 - 0x00 - 0x00 - 0x00
--- end of payload ---
WARNING: [down] beacon was scheduled but failed to TX
Further testing was carried out and it was identified that the PPS output of the u-blox module was not actually connected to the PPS line used by the transceiver (and available on the header). The failure of transmission was because the expected rising edge never arrived, rather than anything being wrong with the actual transmission. On the iC880A PCB there is a test pad available by the timepulse output of the u-blox module, so it was a simple matter of linking the two together. Having done this it was tested again and it worked, giving the following output when transmitting a beacon.
NOTE: [down] beacon ready to send (frequency 0 Hz)
--- Beacon payload ---
0xEE - 0xFF - 0xC0 - 0x00 - 0x00 - 0x00 - 0x00 - 0x84
0x00 - 0x80 - 0x62 - 0x94 - 0x59 - 0x38 - 0x0E - 0x1C
0x8C - 0x00 - 0x00 - 0x00 - 0x00 - 0x00 - 0x00 - 0x00
--- end of payload ---
NOTE: [down] beacon sent successfully
Although why it is transmitting a beacon with the flag set to false is still a mystery to me.
I’m lucky in that I have experience working with 0402 components, which meant soldering on the passives was a feasible proposition, and the uBlox module itself is a reasonable size for hand soldering. However, the SAW filter was beyond my abilities (many thanks to my colleague Graeme Bragg for his help with this), so the fact it is not needed reduces the complexity of this modification considerably. Even so, it’s not something I would recommend to people without experience of hand-soldering 0402 components. This got us thinking that there must be an easier way to add GPS receivers to these iC880A concentrator boards. More to follow.
In writing this, I assume the reader has some working knowledge of using the Linux command line. I strongly recommend keeping your work inside a git repository: being able to revert back to a known-working version was a lifesaver on multiple occasions.
In a nutshell, Singularity is a container platform built on the principle of mobility of compute. It is designed to be used on HPC clusters and, unlike Docker, it does not require root access to mount an image. In addition, it can use Docker images out-of-the-box and it can pull them from the Docker Hub. For more information, see the Singularity website.
Containers are a solution to the ever-present dependency problem: how do you make sure that the user has all of the software needed to run the program you are shipping? In general terms, containers work by bundling a specific operating system, alongside other necessary software, and running the target program using them.
While Docker has become the most-used containerisation platform, Singularity is interesting for a couple of reasons:
Therefore, a computer running Singularity on top of a minimal Linux build can host virtually any software, regardless of its dependencies or even the operating system the software was built for, as long as you can make or obtain a working image.
The Yocto Project is an open-source, mostly MIT-licensed collaboration that makes it easier to create and build Linux-based embedded systems. It offers a mature, fully automated Linux build system, supporting all major embedded architectures, alongside recipes for various commonly-used software.
Unifying the world of embedded Linux is a daunting task. The Yocto Project has done a stellar job of it, but it is not user-friendly. Things can and oftentimes do go wrong. Errors in a file can create errors of an apparently unrelated origin. Thankfully, the Yocto user manual provides immense detail concerning common tasks, and the Yocto Project mailing lists are publicly available and can be of great help.
If you are new to the Yocto Project, I strongly recommend reading the entirety of the Yocto Project Quickstart and following its steps until you can successfully emulate a basic image using QEMU. By doing this, you will become familiar with some of the basic tasks, and if something goes wrong it will be significantly easier to find help online. Once this has succeeded, you will have a known working state onto which you can apply the steps within this guide.
Note: I prefer keeping the build directory and my layers outside the poky directory, in order to make changes easier to track using git submodules. This is my directory structure:
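Roughly, it looks like the tree below; the layer names shown are those added over the course of this guide, and the exact set will grow as you follow along:

yocto/
├── poky/                 (clone of the Yocto Project reference distribution)
├── meta-raspberrypi/     (BSP layer, cloned below)
├── meta-openembedded/    (dependency layers, cloned below)
├── ...                   (further layers as required)
└── build/                (build directory, kept outside poky)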
I initialise the build environment by running:
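With the layout above, this is along the lines of the following (the original post showed the exact command; the path to oe-init-build-env depends on where you cloned poky):

# Source the build environment script from poky, creating and entering ./build
source poky/oe-init-build-env build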
Contrast with the approach in the quickstart guide, where the build directory resides within poky and layers are cloned within the latter. I recommend following my approach, for consistency with the rest of this guide.
In this section I will detail the process of writing an image using software from within the Yocto project. The end product will be a fully-featured Linux build containing the Singularity containerisation engine.
Since our target platform is the Raspberry Pi, you must have the Raspberry Pi BSP in your layer directory. Obtain it by cloning the repository:
The link to the remote used in the command can be found within the webpage linked above. For reference, here is the command used:
git clone git://git.yoctoproject.org/meta-raspberrypi
Once you have cloned a layer, you need to tell the build system to use it. You can do that using:
bitbake-layers add-layer ../meta-raspberrypi/
Here it is in action:
WARNING: Running the bitbake-layers tool outside of the build directory makes it fail and print out a massive stack trace. Make sure you are within the build directory and that you reference the layers from this directory. Further information about bitbake-layers can be found in the Yocto Project documentation.
Inspecting the README.md file of meta-raspberrypi reveals that some extra layers are required: meta-oe, meta-multimedia, meta-python and meta-networking from within meta-openembedded. Clone the meta-openembedded layer using the remote in README.md and add the necessary layers using bitbake-layers add-layer. The order is important: adding meta-networking before meta-python causes an error.

Thankfully, bitbake-layers told us exactly what is wrong. The error is fixed by adding meta-python before you add meta-networking:
We can see what layers we are currently using by running bitbake-layers show-layers. At this stage, you should have the same layers as shown below; make sure you have all these layers before proceeding.
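The original listing here was a screenshot; roughly, bitbake-layers show-layers prints a table of this shape (the paths and priorities below are illustrative rather than copied from a real run):

layer                 path                                                priority
===================================================================================
meta                  /home/user/yocto/poky/meta                          5
meta-poky             /home/user/yocto/poky/meta-poky                     5
meta-yocto-bsp        /home/user/yocto/poky/meta-yocto-bsp                5
meta-raspberrypi      /home/user/yocto/meta-raspberrypi                   9
meta-oe               /home/user/yocto/meta-openembedded/meta-oe          6
meta-multimedia       /home/user/yocto/meta-openembedded/meta-multimedia  6
meta-python           /home/user/yocto/meta-openembedded/meta-python      7
meta-networking       /home/user/yocto/meta-openembedded/meta-networking  5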
The recipe for Singularity resides within meta-virtualization. Clone this layer and add it to the build system as shown above. The README.md file reveals that this layer has further dependencies: openembedded-core, meta-filesystems, oe-meta-go and meta-selinux. The README.md file also provides links to the remote repositories of these dependencies, so adding them to your build should be straightforward.
NOTE: The dependency on openembedded-core is fulfilled by the layers automatically added from within poky. You should skip cloning and adding this layer. Poky can replace openembedded-core because it is built on top of OpenEmbedded.
Once you have added the layers required by meta-virtualization, you should be left with this:
Once you are at this stage, you are ready to create your own image. Important takeaways:
In order to keep everything organised, and to make it easy to share the work, I like to keep my images inside a layer, even though you can obtain the same result by modifying local.conf. The easiest way to create a layer is by using the yocto-layer script:
yocto-layer create generates the minimally required files to create a layer, leaving it to the user to populate it with recipes.
Make sure you make the build system aware of the layer by using bitbake-layers add-layer. Remember that you must be inside the build directory to successfully run the bitbake-layers script. Additionally, give a sensible name to your layer! The OpenEmbedded build system does not support file or directory names that contain spaces.
It is conventional to store image recipes under a recipes-core/images directory inside the layer folder, so we will follow this convention. The image itself should be a .bb file. I have named mine ‘computenodes-image.bb’. Here is what the directory structure should look like:
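Something along these lines, where meta-computenodes stands in for whatever you named your layer in the previous step:

meta-computenodes/
└── recipes-core/
    └── images/
        └── computenodes-image.bb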
computenodes-image.bb must contain the following lines:
SUMMARY = "Basic image, containing Singularity" LICENSE = "MIT" # Base the image upon a mostly complete Linux build include recipes-extended/images/core-image-full-cmdline.bb # Install the Singularity containerization platform # Also install kernel modules, as detailed in the BSP that is being used IMAGE_INSTALL_append = " \ singularity \ kernel-modules \ " # Allocate ~1 extra GB to pull containers IMAGE_ROOTFS_EXTRA_SPACE = "1000000"
This recipe is based upon rpi-hwup-image from meta-raspberrypi. Lines preceded by a ‘#’ are comments. The important change is that it includes core-image-full-cmdline, as opposed to core-image-minimal. This change provides a more complete Linux environment, as opposed to a system which is not capable of doing much more than booting. We are additionally installing Singularity and allocating more SD card space for the containers.
The SUMMARY variable is meant to succinctly describe the product of the recipe. The LICENSE variable specifies the license under which the source code / recipe is licensed. It must be included inside any recipe. Here is some more general information about how OpenEmbedded handles licensing.
The include keyword includes all of the text of the mentioned recipe inside the file. BitBake has multiple ways of sharing functionality between files. Additionally, IMAGE_INSTALL must be used with care. While IMAGE_ROOTFS_EXTRA_SPACE is more or less self-explanatory, the Yocto documentation has a section on it.
To inform the build system that you want to target a Raspberry Pi, you must add the following line to build/conf/local.conf:
MACHINE = "raspberrypi3"
Now, you are finally ready to build. Run bitbake computenodes-image and, after waiting for the build to complete, you can find the end product under build/tmp/deploy/images/raspberrypi3. In my case, the image is called computenodes-image-raspberrypi3.rpi-sdimg. You can then flash the image using Etcher. If Etcher does not recognise the file as a valid OS image, change .rpi-sdimg to .img in the filename. Alternatively, you can achieve the same result with a symbolic link to computenodes-image-raspberrypi3.rpi-sdimg, provided the link’s filename ends with .img.
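For example:

# Give Etcher a filename it recognises without duplicating the image
ln -s computenodes-image-raspberrypi3.rpi-sdimg computenodes-image-raspberrypi3.img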
Obtain access to the terminal by plugging in a monitor and keyboard or, if you know the IP address of the Pi, you can SSH into it. Once you’re there, you can test Singularity:
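The terminal session from the original post was a screenshot; the commands run were along these lines (the 2.x series of Singularity shipped by meta-virtualization at the time provides a selftest subcommand):

# Confirm the binary is present and run the built-in self test
singularity --version
singularity selftest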
If the selftest succeeds, try pulling a container from the Docker hub:
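For example, pulling an Ubuntu image; the name of the resulting image file (e.g. ubuntu.img or ubuntu.simg) depends on the Singularity version:

# Pull an Ubuntu image from Docker Hub and convert it to a local Singularity image
singularity pull docker://ubuntu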
It works! Let us open a shell inside the container.
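Substituting whatever filename the pull produced:

# Open an interactive shell inside the pulled image
singularity shell ubuntu.simg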
Another example:
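A simple way to see the difference is to compare the OS release reported inside and outside the container (again, substitute the filename produced by the pull):

# The container reports Ubuntu's userland while the host is running our Yocto image
singularity exec ubuntu.simg cat /etc/os-release
cat /etc/os-release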
It is clear that the environment inside the container is different from the one outside.
We have looked at creating a basic image using the Yocto project. The process involves cloning the necessary layers and their dependencies, creating a new layer, writing the image recipe and, finally, building it. Our test image successfully booted, and the Singularity containerisation engine works without a flaw.