About 3 years ago I started a really interesting new project in collaboration with my boxing gym, Glorious Fight Gym.
I still don’t remember how it started, but I ended up developing a machine that can be used to measure a boxer’s reflexes.
If you remember this you had a great childhood, and maybe you are starting to guess what I wanted to achieve.
The device was made of 6 sensorized punch pads, each one identified by an LED of a specific color and containing an accelerometer to detect punches, plus an iOS application able to set up punching sequences.
The idea is to set up a sequence using the application; each time the boxer sees an LED turning on, they must hit the corresponding pad.
The device was not only able to detect hits on the target and compare them with the programmed sequence, but also to calculate the time between the light stimulus and the actual hit, giving a sort of reflexes estimation.
App and device communicate using Bluetooth Low Energy.
This is the first of a series of 3 articles. Part 2 here.
THE “OLD” SYSTEM
The hardware was made of:
- Adafruit Feather BLE
- 6 ADXL345 accelerometers connected via SPI
- 6 RGB LEDs driven with PWM
- 6 punch pads
The software was made of:
- C firmware for the Adafruit, written with the Arduino IDE
- iOS application
A target is basically a functional unit made of one LED of a specific color, one accelerometer and one punch pad.
Each punch pad was emptied and filled again with polystyrene foam, and the accelerometer was placed in the middle of the sheets to detect punches.
Accelerometers were connected to the microcontroller unit using SPI, with GPIO pins to enable/disable them and to read their data. The LED terminals were soldered together to give the right color, and the positive terminal and ground were connected to the GPIO pins.
The application was in charge of connecting to the microcontroller using Bluetooth Low Energy to:
- set the sequence
- set the maximum interval between each color switch
- be notified about the response for each hit or miss
Everything was displayed in a table with the resulting time taken to hit the correct target.
It was working, but not very reliably, mostly due to bugs and unoptimized code. C is not my primary language and I really had a lot of difficulties developing the firmware. I also thought that the computational power probably wasn’t enough to catch events that happen in less than 500 ms.
That seems plausible, since there are a lot of other processes going on: SPI parsing, BLE services, other services, etc. But surely a firmware developed and optimized just for that task would be enough to manage everything.
HAIL TO THE “NEW” SYSTEM
Trying to avoid some of the issues on the software side, I said to myself: wouldn’t it be wonderful if I could program the microcontroller part in Swift too?
A Raspberry Pi is a single-board computer that can run Linux, and there are several distributions available. Swift is fully compatible with Linux… hey, wait, there is also a super nice GPIO port!!!!
At the moment buying a Raspberry Pi is almost impossible due to the increase in prices, but back then I bought a nice starter pack for 100€.
On the hardware side there was also another thing that was bothering me: accelerometers.
The main reason to use them was to find a way to avoid a mechanical system that could break: when a boxer throws punches, they do it fast and hard. The price of industrial, super resistant switches was really high, so I had decided to use an accelerometer to detect the initial hit.
Of course this decision comes with some trade-offs: each accelerometer must be installed on a different structure, and each structure must be mounted to a wall or to something that can disperse the impact of a punch.
Thus, for this version I came up with the idea of using IR proximity sensors. The idea is that once the punch is basically in contact with the pad, the sensor sends a signal and I can register it as a hit.
In this way I can also avoid the use of the SPI bus, using fewer wires and only pure digital input.
The negative aspect is that IR sensors work with infrared wavelengths, and if gloves are colored with colors that absorb them (such as black) the sensor may not detect the hit.
Here is the configuration of the new system:
- 6 RGB LED
- 6 IR Proximity Sensors
- 6 Punch Pad
- Swift software on the Raspberry
- iOS Application
There are different ways to install Swift on a Raspberry Pi, and most of the time it is just a matter of which distribution is running on it. I’m not going to explain how to install a Linux distribution: this is already shown in a lot of tutorials that can be found on the internet.
On a Raspberry you can install a 32-bit or 64-bit Linux distribution. At the time of writing, the official Linux distribution for Raspberry Pi is available only in 32-bit; the 64-bit version is in beta.
While this doesn’t seem to be an issue, it actually is. The ARM port of Swift in 32-bit can be found here, but unfortunately the release is stuck at Swift 5.4, while we are at 5.5.
Thus I’ve installed the 64-bit beta version of the Raspberry Pi OS.
To install Swift on Linux there are a lot of tutorials, but thanks to the Swift-Arm community it is just a matter of launching some scripts from the terminal:
curl -s https://packagecloud.io/install/repositories/swift-arm/release/script.deb.sh | sudo bash
sudo apt install swiftlang
Now you can easily write your first “Hello world” program.
Note: on Linux, programming in Swift is raw and simple, but you must know the Swift Package Manager pretty well or you will be completely lost.
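As a minimal sketch, a “Hello world” on Linux is just a SwiftPM executable package: `swift package init --type executable` generates the layout for you (the package name HelloPunch below is only an illustration), and the whole program can live in a single `main.swift`:

```swift
// Sources/HelloPunch/main.swift
// Created with `swift package init --type executable`,
// then built and run from the package root with `swift run`.
let greeting = "Hello, world!"
print(greeting)
```

Everything else — dependencies, targets, build settings — goes into the `Package.swift` manifest, which is why knowing SwiftPM well matters so much here.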
THE MISSING IDE
On Linux you don’t have Xcode, and even if as iOS developers we complain a lot about it, Xcode takes care of a lot of things in our daily programming routine.
There are at least 2 alternatives that you can consider to replace Xcode:
Both require the installation of specific plugins to work with Swift.
Everything that you’ll see from now on was made using VSCode.
Installing VSCode is super easy (this is one way):
curl -s https://packagecloud.io/install/repositories/swift-arm/vscode/script.deb.sh | sudo bash
sudo apt install code
After you have successfully installed VSCode, you should install two plugins to simplify your life:
- Swift Language
GPIO or GPIE?
What’s a GPIO?
A general-purpose input/output (GPIO) is an uncommitted digital signal pin on an integrated circuit or electronic circuit board which may be used as an input or output, or both, and is controllable by the user at runtime.
So basically they are pins that you can configure and use. You can set one as HIGH or LOW, and you can also use a combination of them to read and write using the SPI protocol. They represent a very handy way to communicate with external devices.
On my system they are used to read values from the proximity sensors and to switch the RGB LEDs on or off.
Before using them you must enable them in the Raspberry Pi OS configuration.
Configuring and controlling them in a Swift application is super easy thanks to this library available on GitHub.
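To make the idea concrete, here is a minimal sketch of how one target’s logic could sit on top of GPIO. The `DigitalPin` protocol stands in for whatever pin type the GPIO library actually exposes (SwiftyGPIO-style libraries offer a similar `value` property), so all names here are illustrative, not the library’s real API:

```swift
import Foundation

// A minimal abstraction over a digital GPIO pin. On the real device this
// would wrap the GPIO library's pin type; as a protocol, the same logic
// can also run without hardware (e.g. with a mock pin on a desktop).
protocol DigitalPin: AnyObject {
    var value: Bool { get set }
}

// A mock pin, useful to try the logic without a Raspberry Pi.
final class MockPin: DigitalPin {
    var value: Bool = false
}

// One target = one LED pin + one IR proximity sensor pin.
struct Target {
    let led: DigitalPin
    let sensor: DigitalPin

    // Turn the LED on, then poll the sensor until a hit is detected or
    // the timeout expires. Returns the reaction time in seconds, or nil
    // on a miss. `now` is injected so tests can drive the loop with a
    // fake clock instead of busy-waiting on real time.
    func waitForHit(timeout: TimeInterval,
                    now: () -> TimeInterval = { Date().timeIntervalSince1970 }) -> TimeInterval? {
        led.value = true
        defer { led.value = false }  // LED off on both hit and miss
        let start = now()
        while now() - start < timeout {
            if sensor.value { return now() - start }
        }
        return nil
    }
}
```

On the Pi, `MockPin` would be replaced by a thin wrapper around the library’s pin objects; the polling loop stays the same.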
IS BLUETOOTH ON RASPBERRY A BLUEBERRY?
Bluetooth is a core part of this project, and it was probably the hardest thing to make work. Fortunately there is a project on GitHub called PureSwift.
PureSwift provides frameworks that bring some AppKit and UIKit functionalities to Linux, such as a pure Swift Bluetooth stack. I’ve also found another library, SwiftLinuxBLE, that, by using those frameworks, provides a super easy interface to implement Bluetooth LE functionalities.
Unfortunately the latter library was not exactly maintained, but after fixing a Swift version mismatch, mismatched configurations between libraries and some other minor issues, I was able to build and use it.
You can find and, of course, use my fork here. Please note that to make it work I forked not only the SwiftLinuxBLE framework but also some of its dependencies.
After fixing the Bluetooth library issues, I started to port the old Arduino software to Swift.
As an iOS/macOS programmer I’m used to having some features ready out of the box… easy-peasy debugging was one of them. Unfortunately, on Linux getting it to work from VSCode requires some extra steps.
That said, I must say that the user experience with VSCode is not that different from Xcode.
Debugging with lldb can be done from the terminal or from VSCode. Debugging from the terminal is just a matter of launching the executable with lldb and knowing a “bunch” of lldb commands to set breakpoints and move between them.
After building your executable with
swift build
you should find it inside the .build/debug/ directory. To open an lldb session:
lldb .build/debug/<your-executable>
An lldb session will open up and you can set a breakpoint at the file and line you prefer:
breakpoint set -f file-name.swift -l 34
Now you can run your software by pressing r and return. For more lldb commands there is this awesome cheatsheet.
What if we would rather use a graphical UI? Can we use VSCode to enable debugging? Sure! But it requires some extra steps.
At the root of our project we must create 2 files and put them into the .vscode folder:
The first one, launch.json, tells VSCode that we want to use lldb and attach it to a specific program (ours); in it we also define a pre-launch task called “Build”.
Let’s write our pre-launch task:
"command": "swift build"
The pre-launch task is just a way to tell VSCode that we want to launch a shell command that builds our software.
Now we can debug using the VSCode UI, which is a lot easier if you are not a great fan of the CLI.
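As a sketch, the two files could look like the following. The executable name PunchBLE, the task label and the "lldb" type value are illustrative and assume the CodeLLDB extension is installed:

```json
// .vscode/launch.json
{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "lldb",
            "request": "launch",
            "name": "Debug PunchBLE",
            "program": "${workspaceFolder}/.build/debug/PunchBLE",
            "preLaunchTask": "Build"
        }
    ]
}

// .vscode/tasks.json
{
    "version": "2.0.0",
    "tasks": [
        {
            "label": "Build",
            "type": "shell",
            "command": "swift build"
        }
    ]
}
```

VSCode accepts comments in its own configuration files, so the file-name comments above are only there to mark where each snippet belongs.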
THE SUDO PROBLEM
The first thing I noticed when launching the software was that to use the Bluetooth module you must run it with root permissions, and that is absolutely fine until you need to debug. On the terminal side it is pretty easy, because you just need to launch lldb as a super user; in VSCode it was a little bit harder.
The only solution I’ve found was to launch VSCode as a superuser:
pi@raspberrypi:~/Desktop/RaspberryBLE/PunchBLE sudo code . --user-data-dir='~/.vscode-root' --no-sandbox
Doing this I was able to debug the software using VSCode, but keep in mind that launching VSCode as a superuser is not recommended.
THE RUN LOOP
The first time I tried to launch the software, I saw that the internal timer I had created to turn on the LEDs wasn’t working. It was created, but I couldn’t see it firing.
This happens because the software doesn’t have a run loop where the Timer can be scheduled. To fix that, or basically the same issue in any software/script that requires long asynchronous tasks, you have 2 ways:
- create a main queue using dispatchMain()
- create a run loop using RunLoop.main.run()
I used the first to keep my software running for as long as requested.
This is something that you don’t need to do when you create a macOS or iOS app, because a valid run loop that takes care of handling inputs and interruptions is already provided by UIApplicationMain or NSApplicationMain.
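A minimal way to see the problem and its fix: on its own, a command-line Swift program exits before any Timer can fire; keeping the main run loop alive lets the timer do its work. The intervals below are just for illustration:

```swift
import Foundation

// Schedule a one-shot timer; without a running run loop the process
// would exit before the closure ever executes.
var fired = false
let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: false) { _ in
    fired = true
    print("timer fired")
}
// Keep the main run loop alive long enough for the timer to fire.
// dispatchMain() would also work, but it never returns.
RunLoop.main.run(until: Date(timeIntervalSinceNow: 0.3))
```

In the real device the timer repeats and drives the LED sequence, so the program keeps the run loop (or the main queue) alive for the whole session instead of a fixed interval.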
Using a breadboard I was able to test the hardware and the software running together; it works pretty well, as you can see from the video.
You can see the programmed series from the application and my hand acting as a punch on the proximity sensors. Some hits were missed on purpose.
WHAT’S NEXT
In the next article I’m going to show the iOS application in detail and how characteristics are exposed from the Raspberry Pi.