Metal Commando – Primal Fear

Paul Grenyer from Paul Grenyer

I'd been anticipating this release for a while and I wasn't disappointed. While it lacks the epic nature of Seven Seals and New Religion until the final (13 minute!) track, it's packed full of solid power metal songs. Unlike most albums, I found it instantly enjoyable on first listen. Other than the lack of epicness, my only complaint would be that it's rather short. I'm fully expecting this to become one of my favourite albums of 2020.


Greenback backup

Paul Grenyer from Paul Grenyer


Why

When Naked Element was still a thing, we used DigitalOcean almost exclusively for our clients' hosting. For the sorts of projects we were doing it was the most straightforward and cost-effective solution. DigitalOcean provided managed databases, but there was no facility to back them up automatically. This led us to develop a Python-based program which was triggered once a day to perform the backup, push it to AWS S3 and send a confirmation or failure email.

We used Python due to familiarity, ease of use and low installation dependencies. I’ll demonstrate this later on in the Dockerfile. S3 was used for storage as DigitalOcean did not have their equivalent, ‘Spaces’, available in their UK data centre. The closest is in Amsterdam, but our clients preferred to have their data in the UK. 

Fast forward to May 2020 and I’m working on a personal project which uses a PostgreSQL database. I tried to use a combination of AWS and Terraform for the project’s infrastructure (as this is what I am using for my day job) but it just became too much effort to bend AWS to my will and it’s also quite expensive. I decided to move back to DigitalOcean and got the equivalent setup sorted in a day. I could have taken advantage of AWS’ free tier for the database for 12 months, but AWS backup storage is not free and I wanted as much as possible with one provider and within the same virtual private network (VPC).

I was back to needing my own backup solution. The new project I am working on uses Docker to run the main service. My Droplet (that's what DigitalOcean calls its Linux server instances) setup is minimal: a non-root user, firewall configuration and a Docker install. The DigitalOcean Marketplace includes a Docker image, so most of that is done for me with a few clicks. I could have installed Python directly on the Droplet and configured a backup program to run each evening, but I'd also have had to install the right version of the PostgreSQL client, which isn't currently in the default Ubuntu repositories, so it's a little involved. As I was already using Docker it made sense to create a new Docker image to install everything and run a Python program to schedule and perform the backups. Of course, some might argue that a whole Ubuntu install and configuration in a Docker image is a bit much for one backup scheduler, but once it's done it's done, and it can easily be installed and run elsewhere as many times as needed.

There are two more decisions to note. My new backup solution uses DigitalOcean Spaces, as I'm not bothered about my data being in Amsterdam, and there are no notification emails, as I haven't implemented an email server yet. This resulted in me jumping out of bed as soon as I woke each morning to check Spaces to see if the backup had worked, rather than just checking for an email. It took two days to get it all working correctly!
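For what it's worth, the missing notification step needn't be much code once there's an email server to talk to. Here's a minimal sketch of what it might look like, assuming hypothetical SMTP_HOST, SMTP_USER and SMTP_PASSWORD environment variables in the same style as the others; this is an illustration rather than part of the current code:

import os
import smtplib
from email.message import EmailMessage

def send_notification(subject, body):
    # Build a plain text email summarising the backup outcome.
    msg = EmailMessage()
    msg['Subject'] = subject
    msg['From'] = os.environ['SMTP_USER']
    msg['To'] = os.environ['SMTP_USER']
    msg.set_content(body)

    # Connect, upgrade to TLS and send.
    with smtplib.SMTP(os.environ['SMTP_HOST'], 587) as server:
        server.starttls()
        server.login(os.environ['SMTP_USER'], os.environ['SMTP_PASSWORD'])
        server.send_message(msg)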

What

I reached for Naked Element's trusty Python backup program, affectionately named Greenback after the arch enemy of Danger Mouse (Green-back up, get it? No, me neither…), but discovered it was too specific to use as-is. It would need some work, but it would serve as a great template to start with.

It's worth noting that I am a long way from being a Python expert. I'm in the 'reasonable working knowledge with lots of help from Google' category. The first thing I needed the program to do was create the backup. At this point I was working locally, where I had the correct PostgreSQL client installed. db_backup.py:

import os
import subprocess
from datetime import datetime

db_connection_string = os.environ['DATABASE_URL']

class GreenBack:
    def backup(self):
        # Name the backup file after the current date and time.
        datestr = datetime.now().strftime("%d_%m_%Y_%H_%M_%S")
        backup_suffix = ".sql"
        backup_prefix = "backup_"

        destination = backup_prefix + datestr + backup_suffix
        backup_command = 'sh backup_command.sh ' + db_connection_string + ' ' + destination
        subprocess.check_output(backup_command.split(' '))
        return destination

I want to keep anything sensitive out of the code and out of source control, so I've brought in the connection string from an environment variable. The method constructs a filename based on the current date and time and calls an external bash script to perform the backup:

#!/bin/sh
# $1 - connection string
# $2 - destination file
pg_dump $1 > $2

and returns the backup file name. Of course, for Ubuntu I had to make the bash script executable (chmod +x backup_command.sh). Next I needed to push the backup file to Spaces, which means more environment variables:

region = ''
access_key = os.environ['SPACES_KEY']
secret_access_key = os.environ['SPACES_SECRET']
bucket_url = os.environ['SPACES_URL']
backup_folder = 'dbbackups'
bucket_name = 'findmytea'

so that the program can access Spaces, and another method:

import boto3

class GreenBack:
    ...
    def archive(self, destination):
        # Spaces implements the S3 API, so boto3's S3 client works unchanged.
        session = boto3.session.Session()
        client = session.client('s3',
                                region_name=region,
                                endpoint_url=bucket_url,
                                aws_access_key_id=access_key,
                                aws_secret_access_key=secret_access_key)

        client.upload_file(destination, bucket_name, backup_folder + '/' + destination)
        os.remove(destination)

It's worth noting that DigitalOcean implemented the Spaces API to match the AWS S3 API, so the same tools can be used. The archive method creates a session, pushes the backup file to Spaces and then deletes it from the local file system, for reasons of disk space and security. A future enhancement to Greenback would be to automatically remove old backups from Spaces after a period of time.
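Because Spaces matches the S3 API, that enhancement would only be a few lines with boto3. Here's a rough sketch of how a prune method might look, reusing the same client settings as archive; RETENTION_DAYS and the method itself are my invention, not part of the current code:

import boto3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30

class GreenBack:
    ...
    def prune(self):
        session = boto3.session.Session()
        client = session.client('s3',
                                region_name=region,
                                endpoint_url=bucket_url,
                                aws_access_key_id=access_key,
                                aws_secret_access_key=secret_access_key)

        # Delete any backup whose last-modified date is past the cutoff.
        # LastModified is timezone-aware, so the cutoff must be too.
        cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
        response = client.list_objects_v2(Bucket=bucket_name,
                                          Prefix=backup_folder + '/')
        for obj in response.get('Contents', []):
            if obj['LastModified'] < cutoff:
                client.delete_object(Bucket=bucket_name, Key=obj['Key'])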

The last thing the Python program needs to do is schedule the backups. A bit of Googling revealed an event loop which can be used to do this:

import asyncio
import logging

# Without this the logging.info messages below are not shown.
logging.basicConfig(level=logging.INFO)

class GreenBack:
    last_backup_date = ""

    def callback(self, n, loop):
        today = datetime.now().strftime("%Y-%m-%d")
        if self.last_backup_date != today:
            logging.info('Backup started')
            destination = self.backup()
            self.archive(destination)

            self.last_backup_date = today
            logging.info('Backup finished')
        # Re-schedule this method to run again in n seconds.
        loop.call_at(loop.time() + n, self.callback, n, loop)
...

event_loop = asyncio.get_event_loop()
try:
    bk = GreenBack()
    bk.callback(60, event_loop)
    event_loop.run_forever()
finally:
    logging.info('closing event loop')
    event_loop.close()

On startup, callback is executed. It checks last_backup_date against the current date and, if they don't match, runs the backup and updates last_backup_date. Whether or not a backup was performed, the callback method is then re-added to the event loop with a one minute delay. Calling event_loop.run_forever after the initial callback call means the program waits forever and the process continues.

Now that I had a Python backup program, I needed to create a Dockerfile that would be used to create a Docker image to set up the environment and start the program:

# Start from a bare-bones Ubuntu 16.04 (xenial) image.
FROM ubuntu:xenial as ubuntu-env
WORKDIR /greenback

# Install Python 3 and the tools needed below.
RUN apt update
RUN apt -y install python3 wget gnupg sysstat python3-pip

# Install Greenback's Python dependencies.
RUN pip3 install --upgrade pip
RUN pip3 install boto3 --upgrade
RUN pip3 install asyncio --upgrade

# Add the PostgreSQL apt repository and its signing key, as
# postgresql-client-12 isn't in the default Ubuntu repositories.
RUN echo 'deb http://apt.postgresql.org/pub/repos/apt/ xenial-pgdg main' > /etc/apt/sources.list.d/pgdg.list
RUN wget https://www.postgresql.org/media/keys/ACCC4CF8.asc
RUN apt-key add ACCC4CF8.asc

RUN apt update
RUN apt -y install postgresql-client-12

# Copy Greenback in and run it on container start.
COPY db_backup.py ./
COPY backup_command.sh ./

ENTRYPOINT ["python3", "db_backup.py"]

The Dockerfile starts with an Ubuntu image. This is a bare-bones but fully functioning Ubuntu operating system. The Dockerfile then installs Python, its dependencies and Greenback's dependencies. Then it installs the PostgreSQL client, including adding the necessary repository and key. Following that, it copies the required Greenback files into the image and tells it how to run Greenback.

I like to automate as much as possible, so while I did plenty of manual Docker image building, tagging and pushing to the repository during development, I also created a BitBucket Pipeline which does the same on every check-in:

image: python:3.7.3

pipelines:
  default:
    - step:
          services:
            - docker
          script:
            - IMAGE="findmytea/greenback"
            - TAG=latest
            - docker login --username $DOCKER_USERNAME --password $DOCKER_PASSWORD
            - docker build -t $IMAGE:$TAG .
            - docker push $IMAGE:$TAG

Pipelines, BitBucket's cloud-based Continuous Integration and Continuous Deployment feature, understands Python and Docker, so it was quite simple to make it log in to Docker Hub and build, tag and push the image. To enable the pipeline, all I had to do was add the bitbucket-pipelines.yml file to the root of the repository, check it in, follow the BitBucket pipeline process in the UI to enable it and then add the build environment variables so the pipeline could log in to Docker Hub. I'd already created the image repository in Docker Hub.

The Greenback image shouldn't change very often, and there isn't a straightforward way of automating the updating of running Docker images from Docker Hub, so I wrote a bash script to do it, deploy_greenback:

sudo docker pull findmytea/greenback
sudo docker kill greenback
sudo docker rm greenback
sudo docker run -d --name greenback --restart always --env-file=.env findmytea/greenback:latest
sudo docker ps
sudo docker logs -f greenback

Now, with a single command, I can fetch the latest Greenback image, stop and remove the currently running container, start a new one from the latest image, list the running containers to reassure myself the new instance is running and follow the Greenback logs. When the latest image is run, the container is named for easy identification, configured to restart when the Docker service restarts and told where to read the environment variables from. The environment variables are in a local file called .env:

DATABASE_URL=...
SPACES_KEY=...
SPACES_SECRET=...
SPACES_URL=https://ams3.digitaloceanspaces.com

And that's it! Greenback is now running in a Docker container on the application server and backs up the database to Spaces just after midnight every night.

Finally

While Greenback isn't a perfect solution, it works, it's configurable, it's a good platform for future enhancements and it should require minimal configuration to be used with other projects in the future.

Greenback is checked into a public BitBucket repository and the full code can be found here:

https://bitbucket.org/findmytea/greenback/

The Greenback Docker image is in a public repository on Docker Hub and can be pulled with Docker:

docker pull findmytea/greenback

Test Driven Terraform [Online – Video Conf] – 7pm, 2 April 2020.

Paul Grenyer from Paul Grenyer


We'll use TDD to create a Terraform module which builds a Heroku app and deploys a simple React application.

If you'd like to follow along, you'll need the following prerequisites:

  • Terraform installed
  • Go 1.14 installed
  • Heroku account - HEROKU_API_KEY added to environment variables.
  • Git installed
  • BitBucket account

This meetup will be via Zoom: https://zoom.us/j/902141920

Please RSVP here: https://www.meetup.com/Norfolk-Developers-NorDev/events/269640463/

Insomnium – Norwich

Paul Grenyer from Paul Grenyer

Something my first proper girlfriend said to me has stuck with me my entire life because I (mostly) disagree with it. She said that the best way to discover a new band was to see them live first. The reason I disagree is that I get the most pleasure from knowing the music I'm listening to live - most of the time.

I’m a member of the Bloodstock Rock Society and their Facebook page is often a place of band discussion. Lots of people there were saying how good Insomnium are, but they didn’t do a great deal for me when I listened to them on Spotify. Then it was early 2020, I hadn’t been to a gig since Shakespears Sister in November, I fancied a night out and Insomnium were playing in Norwich. So I took a chance….

From the off they were great live and I really enjoyed it. I came to the conclusion that I must like some of their older stuff, as it was the new album which hadn't done much for me. There were lots of things I liked: widdly guitars, metal riffs and blast beats. What really lets Insomnium down is the vocals. Death metal vocals, to a certain extent, are death metal vocals, but this guy sounded like he was singing a different song in a different band - it's the same on the album I tried. If the vocals were more suited to the music, as they are with Wintersun, it would be even better. I also learned that Norwich City's current star player is from the same town in Finland as the band.

The first thing I did this morning was look up which albums the setlist was from and make a list:

  • One for Sorrow
  • Across the Dark
  • Above the Weeping World
  • Shadows of the Dying Sun
  • Heart like a Grave

And then die a little inside at the prices on Amazon and eBay. I think I’ll be playing a lot of Insomnium on Spotify for the time being so I’m ready to enjoy them to the full next time.



A review: .Net Core in Action

Paul Grenyer from Paul Grenyer

.Net Core in Action
by Dustin Metzgar
ISBN-13: 978-1617294273

I still get a fair amount of flak for buying and reading technical books in the 21st Century - almost as much as I get for still buying and listening to CDs. If I was a vinyl-loving hipster, it would be different of course…. However, books like .Net Core in Action are a perfect example of why I do it. I needed to learn what .Net Core was and get a feel for it very quickly, and that is what this book allowed me to do.

I've been very sceptical of .Net development for a number of years, mostly due to how large I perceived the total cost of ownership and the startup cost to be, and the fact that you had to use Windows. While this was previously true, .Net Core is different, and .Net Core in Action made me understand that within the first few pages of the first chapter. It also got me over my prejudice towards Docker by the end of the second chapter.

The first two chapters are as you would expect: an introduction followed by various Hello World examples. Then it gets a bit weird, as the book dives into the build system next, then unit testing (actually, it's good to see this so early), and then two chapters on connecting to relational databases, writing data access layers and ORMs. There's a sensible chapter on microservices before the weirdness returns with chapters on debugging, performance profiling and internationalisation. I can kind of see how the author is trying to show the reader the way different parts of .Net Core work on different platforms (Windows, Linux, Mac), but this relatively small volume could have been more concise.




DevelopHER Overall Award 2019

Paul Grenyer from Paul Grenyer

I was honoured and delighted to be asked to judge and present the overall DevelopHER award once again this year. Everyone says choosing a winner is difficult. It may be a cliche, but that doesn’t change the fact that it is.

When the 13 category winners came across my desk I read through them all and reluctantly got it down to seven. Usually on a first pass I like to have it down to three or four and then all I need to agonise over is the order. Luckily on the second pass I was able to be ruthless and get it down to four.

To make it even more difficult, three of my four fell into three categories I am passionate about:

  • Technical excellence and diversity
  • Automated Testing
  • Practical, visual Agile

And the fourth achieved results for her organisation which just couldn’t be ignored.

So I read and reread and ordered and re-ordered. Made more tea, changed the CD and re-read and re-ordered some more. Eventually it became clear.

Technical excellence and the ability of a software engineer to turn their hand to new technologies are vital. When I started my career there were basically two main programming languages, C++ and Java. C# came along soon after, but most people fell into one camp or another and a few of us crossed over. Now there are many, many more to choose from, and lots of young engineers decide to specialise in one and are reluctant to learn and use others. This diminishes us all as an industry. So someone who likes to learn new and different technologies is a jewel in any company's crown.

The implementation of Agile methodologies in Software Development is extremely important. Software, by its very nature is complex. Only on the most trivial projects does the solution the users need look anything like what they thought they wanted at the beginning. Traditional waterfall approaches to software development do not allow for this. The client requires flexibility and we as software engineers need the flexibility to deliver what they need. Software development is a learning process for both the client and the software engineer. Agile gives us a framework for this. Unlike many of the traditional methods, Agile has the flexibility to be agile itself, giving continuous improvement.

When implementing Agile processes, the practices are often forgotten or neglected, and in many ways they are more important. Not least of these is automated testing: the practice of writing code which tests your code and running it at least on every check-in. This gives you a safety net, so that code you've already written isn't broken by new code you write. And when it is, the tests tell you: they tell you what's wrong and where it's wrong. We need more of this as an industry, and that is why I chose Rita Cristina Leitao, an automated software tester from Switch Studios, as the overall DevelopHER winner.


Shakespears Sister – Ipswich, November 2019

Paul Grenyer from Paul Grenyer

I was very surprised and excited, and then immediately disappointed, to see Shakespears Sister on the Graham Norton show. They performed Stay, their big hit (the longest a single by a female artist has spent at number one in the UK: eight weeks), but Marcella wasn't even trying to hit the high notes and it was awful. We decided to go and see them on tour anyway, as it was potentially a once in a lifetime experience before they fell out again.

The Ipswich Regent was half empty in the stalls and the circle was closed, and oddly there were quite a few security guards - apparently at the request of the band. Encouragingly, Shakespears Sister came on on time and they sounded good! As they ploughed through many of their well-known songs, new songs and a few older, more obscure songs, the vocals were strong from both Marcella and Siobhan.

The rhythm section was incredible. The drumming was tight, varied and interesting, but what really stood out was the bass. I think part of this was that the player had fantastic bass lines to play, but she also oozed talent. It's really uncommon for a bass player to need to change bass guitars between songs, but Clare Kenny swapped frequently. It's just a shame that the lead guitar player was totally unremarkable and I've no idea what the keyboard player was for.

The highlight I, and I imagine many others, had been looking forward to was Stay. It was better than on Graham Norton, but it's clear that Marcella cannot get to the highest notes and, live, she doesn't try. It was still a good performance of a fantastic song.

Would I go and see them again? Probably not, unless I was dragged.

Borknagar

Paul Grenyer from Paul Grenyer

I was pretty sure I had seen Borknagar support Cradle of Filth at the Astoria 2 in the ‘90s. It turns out that was Opeth and Extreme Noise Terror, so I don’t really remember how I got into them now.

Whatever the reason was, I really got into their 2000 album Quintessence. At the time I didn't really enjoy their previous album, The Archaic Course, much, so with the exception of the occasional relisten to Quintessence, Borknagar went by the wayside for me. That was until ICS Vortex got himself kicked out of Dimmu Borgir for allegedly poor performances, produced a really rather bland and unlistenable solo album called Storm Seeker, and then got back together properly with Borknagar. That's when things got interesting.

ICS Vortex has an incredible voice. When he joined Dimmu Borgir as bassist and second vocalist in time for Spiritual Black Dimensions, he brought a new dimension (pun intended) to an already amazing band. I’ve played Spiritual Black Dimensions to death since it came out and I think only Death Cult Armageddon is better.

ICS Vortex’s first album back with Borknagar is called Winter Thrice. Loving his voice and being bitterly disappointed with Storm Seeker I bought it desperately hoping for something more and I wasn’t disappointed. It’s an album with a cold feel and lyrical content about winter and the north. I loved it and played it constantly after release and regularly since. It’s progressive black metal which is the musical equivalent to walking through the snow early on a cold crisp morning.

This year Borknagar released a new album called True North. When I've loved an album intensely and the band brings out something new, I always feel trepidation. Machine Head never bettered Burn My Eyes; WASP never bettered The Crimson Idol. I could go on, but you get the picture. True North is another album about winter and the north, so I ought to have been on safe ground. Then again, Arch Enemy have pretty much recorded the same album since Doomsday Machine and never bettered it. They're all good though.

My first listen to True North was tense, but it didn't take long for that to dissipate. I had it on daily play for a few weeks, together with the new albums from Winterfylleth and Opeth. True North was so brilliant I thought it might be even better than Winter Thrice. So cautiously I tried Winter Thrice again, but I wasn't disappointed to find it was the slightly better album. The brilliant thing is that I now have two similar, but different enough albums I can enjoy again and again and other than Enslaved's In Times, I haven't found anything else like it.

I hope they do what Evergrey did with Hymns for the Broken, The Storm Within and The Atlantic and make it a set of three. Cross your fingers for me.

Winterfylleth

Paul Grenyer from Paul Grenyer

At school I had this friend, Jamie, and he once said to me that he always preferred having a band's live album to their studio album of the same songs. His example was Queen's Live Magic. To me, then, this was madness. It didn't have all the same songs as A Kind of Magic and the production isn't as good. Let's face it, unless it's Pink Floyd, the production of a live album is never as good as in the studio. I avoided live albums for years.

Pink Floyd’s Pulse was the first live album I really got into and then there was nothing until the teenies when live albums from Emperor, Immortal, Arch Enemy, Dimmu Borgir and Blind Guardian got me completely hooked.

The latest live album I've bought is 'The Siege Of Mercia: Live At Bloodstock 2017' by Winterfylleth. It's amazing for a number of reasons. I was at Bloodstock, watching the band, when it was recorded. It's a fantastic performance of some brilliant songs. It's got an atmospheric synth version of an old track at the end. And it's encouraged me to relisten to their studio albums and enjoy them so much more.

Winterfylleth are a Black Metal band from Manchester. Their lyrics are based around England's rich culture and heritage. Some of their album covers and songs depict the Peak District. You'd have thought a song about Mam Tor, a hill in the Peak District, might be a bit boring, but it's not! Maybe because, being black metal, you can't really hear the words, but as always the vocal style, heavy guitars and fast drums make for the perfect mix.

I currently have The Siege of Mercia, along with some similar new albums from the likes of Borknagar, on my regular daily playlist and it just gets better and better.

nor(DEV):biz Big Dinner with Roarr! Dinosaur Adventure

Paul Grenyer from Paul Grenyer



What: nor(DEV):biz Big Dinner with Roarr! Dinosaur Adventure

When: 7th October, 2019

Where: Norwich City Football Club

How much: £40.99

Book: https://nordevbiz-oct-2019.eventbrite.co.uk

Join the best Norfolk and Norwich tech companies for dinner, while enjoying good food and great company.

Roarr! Dinosaur Adventure

A desire to innovate, with continual reinvestment creating bigger and bolder attractions – this is what our guest speakers have in mammoth (or should I say dinosaur!) proportions.

Owners of the award-winning Roarr! Dinosaur Adventure in Lenwade, Martin and Adam Goymour will be sharing their aspirations to develop this thriving business both in Norfolk and further afield. Not ones to rest on their laurels, they've already rebranded and invested millions so they can appeal to a broader market.

In 2018, they won the Best Large Visitor Attraction award in the Norfolk and Suffolk Tourism Awards. With more projects ‘in the pipeline’, their hard work and enthusiasm for innovation and redevelopment are evident.

From advancing their green energy strategy by placing solar panels on their indoor play area to a fossil dig and a steampunk-inspired restaurant in the Victorian walled garden, they are delighting thousands of visitors of all ages in Norfolk’s very own Jurassic Park.

About nor(DEV):biz

The aims of nor(DEV):biz (Norfolk Developers Business) are:

  • to be the go-to group for local businesses requiring a technology solution.
  • to facilitate and increase referrals and collaboration among Norfolk’s tech businesses.
  • to help close the digital skills gap.
  • to facilitate better collaboration between technology businesses and academic institutions.
  • to have a great meal with great company.

Ticket prices include a donation to the nor(DEV): chosen charity of the year for 2019/2020.