The Kings of Leon

Paul Grenyer from Paul Grenyer

Sheffield is in the North and things, well, people, are very different in the North. They’re friendlier than elsewhere. They apologise in a friendly way when they knock into you, and several people will run after your ticket when it blows away in the wind after you’ve been through security.

Given the recent events in Manchester, security was tight at Sheffield Arena. There were plenty of police, some of them visibly armed. You weren’t allowed to take in a bag any bigger than A4, and everyone was searched before they could enter the arena foyer. Having said that, we had no problems parking (getting out of the car park was a different matter) and were through the security check in no time. Everyone there, including the security staff, was friendly! Even the armed police were posing for selfies and chatting at the end of the night.

I’m not a fan of the Kings of Leon. They’ve got a few good songs (I mean, who doesn’t like having their sex on fire?) but I find them bland, monotonous and a bit boring. Live it’s a different story. They’ve still only got one sound, but it’s much more palatable and they’re very good at it. The lead guitarist can play but is nothing special, the bass player looks like Billy Idol on a good day, and the drummer spent most of the show chewing gum and blowing bubbles, but the singer’s range and effortless delivery were incredible. He just needs to work on his interaction and eye contact with the crowd.


Also quite cool was a complete stage rearrangement about halfway through: the drum riser was moved and a third guitarist and a keyboard player were introduced, all behind a curtain, with sometimes just the frontman and sometimes all of the main band playing out front. It was fun, exciting and interesting to see.

They played most of the hits, as far as I could tell, and didn’t do an encore, which always makes things easier and means you get more music and less messing around.



Python – printing UTC dates in ISO8601 format with time zone

Andy Balaam from Andy Balaam's Blog

By default, when you make a UTC date from a Unix timestamp in Python and print it in ISO format, it has no time zone:

$ python3
>>> from datetime import datetime
>>> datetime.utcfromtimestamp(1496998804).isoformat()
'2017-06-09T09:00:04'

Whenever you talk about a datetime, I think you should always include a time zone, so I find this problematic.

The solution is to mention the timezone explicitly when you create the datetime:

$ python3
>>> from datetime import datetime, timezone
>>> datetime.fromtimestamp(1496998804, tz=timezone.utc).isoformat()
'2017-06-09T09:00:04+00:00'

Note that including the timezone explicitly works the same way when creating a datetime in other ways:

$ python3
>>> from datetime import datetime, timezone
>>> datetime(2017, 6, 9).isoformat()
'2017-06-09T00:00:00'
>>> datetime(2017, 6, 9, tzinfo=timezone.utc).isoformat()
'2017-06-09T00:00:00+00:00'
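The same rule applies when getting the current time: datetime.now() without a tz argument gives a naive value, while passing timezone.utc makes the result timezone-aware (the variable names below are my own):

```python
from datetime import datetime, timezone

naive = datetime.now()               # no timezone attached
aware = datetime.now(timezone.utc)   # timezone-aware UTC time

print(naive.isoformat())  # e.g. '2017-06-09T10:00:04.123456' - no offset
print(aware.isoformat())  # e.g. '2017-06-09T09:00:04.123456+00:00'
```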

Pride Vibes 2017: Birmingham Pride

Samathy from Stories by Samathy on Medium

Marching in Birmingham’s Parade

Pride Vibes: As a photographer for Gay Pride Pics, I see lots of Prides across the UK every year. Each Pride has a different feel. This series describes what each Pride was like and what its vibe was.

The entire series is my opinion and mine only. Take it as you will. Note that this opinion comes from a 20-something extroverted transwoman who is herself a Pride organiser.

I’m still working out what this series is going to be like. Bear with me.

Next: Coventry Pride
Previous: Exeter Pride

Birmingham Pride is the biggest Pride in the West Midlands. It’s one of the biggest in the UK too, competing directly with the likes of Manchester, London and Brighton.
It draws a totally different crowd to that of smaller, community Prides like Exeter, Coventry and Harrogate, for example.

Birmingham Pride is a very extravagant Pride, with costs running to several hundred thousand pounds.

Birmingham’s parade featured big companies.

Birmingham’s Parade is a huge affair. It features tens of organisations, from big corporations like Asda and Virgin to charities like Birmingham LGBT and Stonewall.
Birmingham’s parade also features a lot of students; each of the city’s universities was represented there.

Christians and other religious organisations were represented.

Despite the size of the parade through Birmingham, everyone who walked in it was with some kind of organisation. Everyone was there to represent someone.
Like most bigger Pride parades, at Birmingham you’re required to register your marching group and indicate roughly how many people are walking with you.
Such a practice prevents the more organic ‘marching for pride’ that you get with smaller Parades.

The result is a parade that features a lot more older people associated with their companies and organisations, which is great for seeing just how many companies support the Pride movement, but not great if you simply want to join in the march and stand up for what you believe in.

Birmingham Pride does, however, feature protesting groups.
There were several groups chanting in protest at current issues.
Stand-out groups included several religious groups, Out in the UK (LGBT+ asylum seekers) and, of course, Black Lives Matter.

It was great that Birmingham’s Pride march supports protesting groups and is clearly in favour of providing a platform for Pride as a Protest.
Birmingham Pride’s parade really felt like a good balance between Pride as a celebration and show of big-society support, and Pride as a protest march aiming to raise awareness of the issues we still face.

People used Birmingham Pride’s parade as a protest platform.

Birmingham’s large Pride parade draws people out onto the streets.
The whole parade route is lined with people who, despite not being in the parade itself, are just as involved.

The people of Birmingham and the West Midlands seem to make a decisive effort to come out and see and support the Pride Parade.
It creates a busy atmosphere, but in my opinion it’s not necessarily one of support for the causes.
I felt like a lot of the people watching were there simply for the spectacle of seeing a Pride, rather than because they wholeheartedly support the Pride movement.
Not that there weren’t lots of people who clearly supported the movement too, mind.

The Gay Village at Birmingham Pride is a rather different experience to the free Prides in the UK.

At £22 for a standard entry ticket, lots of people are deterred from entering the branded ‘Gay Village’, a name I strongly disagree with.
I noted that as soon as the parade broke away and the village opened, the type of people around suddenly changed and the vibe pivoted to something completely different.

Instead of feeling like I was on a semi-protest march for LGBT+ rights, I was now in some alcohol and nicotine fueled festival that seemed to have absolutely no purpose except to provide a reasonably cheap way of seeing lots of musical artists in one place.

For me, as soon as we went into the village area the Pride lost all sense of, well, Pride.
The people there completely changed: there were practically no young people around at all. Suddenly a huge number of straight people arrived, and I noticed a sharp decline in the number of people of colour. I even spotted a couple of hen parties (not that hen parties are bad, but it felt exploitative of the Pride movement).

That’s not to say ALL the people from the parade left. This photo of a fetish (?) man getting his face painted highly amuses me.

There were clearly people having fun, and at the start I did enjoy seeing so many people meeting their friends and having a jolly gay time.

But as the day drew on, the haze of cigarette smoke increased over the absolutely packed venue area.
I was struggling to move around, I struggled to breathe properly, and I started to feel very uncomfortable presenting as myself. Entirely not what I expected from an event branding itself as a Pride.

In general, for me, the vibe went from feeling like a reasonably inclusive Pride to feeling like a hell hole of drunk people whom I did not feel safe around.

Your mileage may vary on this! It’s totally possible that I felt uncomfortable because I was not with friends, I was not drinking, and I was not particularly enthused by any of the acts on.

Clearly, there were lots of people having a great time in Birmingham Pride’s gay village, and I don’t believe that it’s going to be a terrible experience for everyone.

Birmingham Pride does provide a fantastic set of acts on a number of stages with some great venues in which to have a fun old time with your favourite people.

They do a great job of making your £22 ticket go a long way too.

Despite my complaints about the event, I’d like to congratulate the Birmingham Pride committee and all the volunteers on pulling off the city’s largest Pride yet!

Like all Prides, it’s unique and special in its own way.

I look forward to seeing next year’s Pride.

Volunteers at Birmingham Pride.

Pride Vibes 2017: Exeter Pride

Samathy from Stories by Samathy on Medium

Young people in Exeter’s Pride Parade

Pride Vibes: As a photographer for Gay Pride Pics, I attend lots of Prides across the UK every year. Each Pride has a different feel. This series describes what each Pride was like and what its vibe was.

The entire series is my opinion and mine only. Take it as you will. Note that this opinion comes from a 20-something extroverted transwoman who is herself a Pride organiser.

I’m still working out what this series is going to be like. Bear with me.

Next: Birmingham Pride

My first Pride of the 2017 season was eagerly anticipated. The weather looked promising and the prospect of attending a Pride I hadn't yet visited was exciting.

Exeter Pride 2017 is the 9th Pride in the city, run by an experienced team.

The flag is lifted at the start of the parade

Exeter’s Pride parade starts off at a church in the centre of the city.
I arrived and was greeted by a very friendly and seemingly super well-organised set of volunteers and organisers.

Gathering for the start of the parade took place in the grounds of the church.

I was really happy to see tens of young people being hustled by the volunteers to carry the super long rainbow flag. It felt super welcoming to see so many people younger than myself happy to be carrying the flag through the city.

I noted the lack of corporate groups in the initial parade organising.
The church grounds were buzzing with excitement.

As well as the young people featuring in the parade, there were a good number of costume-wearing people too. I spotted someone in a horse mask, furries and people in drag, and the parade was led by The Centurion.

A huge rainbow flag carried by mostly young people, onlookers watch.

The parade wound through the city, attracting hundreds of onlookers, most of whom didn’t seem to have planned to see a parade that day.

The parade was a very happy affair; everyone in it was really pleased to be there.

There were no chants during the parade and very few groups taking advantage of the platform to use it for protest.
I wouldn’t say it was a ‘carnival’ parade since it was mostly people walking holding onto the flag, but it certainly was not a protest march.

The police marched in the parade, as did a group of Christian pastors.

The parade was just about the right length and generally felt like a really nice place to be. I think this was mainly because it was almost entirely young people marching for themselves and for Pride, rather than corporations and charitable organisations.

Exeter Pride was relaxed

The main event was an outdoor field affair with a single stage, and a fair amount of community and commercial stalls.

Being a completely free to attend event, Exeter Pride maintained the relaxed and welcoming feel that the parade exuded.

Mostly people found their spot on the sunny grass and stayed there all day.
The event felt very chill and very calm.

Although the bars were not fenced in, there weren’t huge numbers of uncomfortably drunk people around. People seemed content with eating the great food and didn’t need to go wild, even into the evening.

I was made really comfortable by the Chair of Exeter Pride delivering a speech on the main stage at the beginning of the event. He said, amongst other things:

‘Straight people are welcome here, but you’re our guests in our space, remember that’

as well as making sure to remind people that:

‘Although many of us have won our battles, there are still members of our community who face adversity every day.’

That really helped me to feel welcome.

Exeter Pride attracted all kinds of people

The event was attended by lots of different people. Although it was certainly awash with white faces, we saw young people, older people and disabled people attending and having a great time.

Overall, Exeter Pride managed to create a very safe-feeling, happy Pride with a great community feel and a welcoming atmosphere.

Great job, organising committee!!

Next in the Series: Birmingham Pride

Volunteers at Exeter Pride.

Visual Lint 6.0.3.278 has been released

Products, the Universe and Everything from Products, the Universe and Everything

Visual Lint 6.0.3.278 is now available. This is a recommended maintenance update for Visual Lint 6.0, and includes the following changes:
  • Renamed the Visual Studio 2017 VSIX extension to VisualLintPlugIn_vs2017.vsix.
  • Added a VSIX extension for Atmel AVR Studio 5.x and Atmel Studio 6.x/7.x as an optional component in the installer.
  • Added support for the analysis of Atmel AVR Studio 5.1 projects.
  • The "Analysis" Options page now allows more than 16 analysis threads to be selected on systems which have more than 16 logical cores.
  • Fixed a bug which could cause the displays to be incorrectly configured when the Visual Studio plug-in is started within Visual Studio 2015 or 2017.
  • Fixed a bug in the Analysis Statistics Display which affected the "Show/Hide all issues with this ID" context menu command.
  • Fixed a bug in the Display Filter Dialog which affected the "Check selected Items" context menu command.
  • Corrected the toolchain used for Atmel Studio 6.x projects from "AVR6.0" to "AS6.0". Note that this is a breaking change (but fortunately a minor one).
Download Visual Lint 6.0.3.278

Python 3 – large numbers of tasks with limited concurrency

Andy Balaam from Andy Balaam's Blog

Series: asyncio basics, large numbers in parallel, parallel HTTP requests, adding to stdlib

I am interested in running large numbers of tasks in parallel, so I need something like asyncio.as_completed, but taking an iterable instead of a list, and with a limited number of tasks running concurrently. First, let’s try to build something pretty much equivalent to asyncio.as_completed. Here is my attempt, but I’d welcome feedback from readers who know better:

import asyncio

# Note this is not a coroutine - it returns
# an iterator - but it crucially depends on
# work being done inside the coroutines it
# yields - those coroutines empty out the
# list of futures it holds, and it will not
# end until that list is empty.
def my_as_completed(coros):

    # Start all the tasks
    futures = [asyncio.ensure_future(c) for c in coros]

    # A coroutine that waits for one of the
    # futures to finish and then returns
    # its result.
    async def first_to_finish():

        # Wait forever - we could add a
        # timeout here instead.
        while True:

            # Give up control to the scheduler
            # - otherwise we will spin here
            # forever!
            await asyncio.sleep(0)

            # Return anything that has finished
            for f in futures:
                if f.done():
                    futures.remove(f)
                    return f.result()

    # Keep yielding a waiting coroutine
    # until all the futures have finished.
    while len(futures) > 0:
        yield first_to_finish()

The above can be substituted for asyncio.as_completed in the code that uses it in the first article, and it seems to work. It also makes a reasonable amount of sense to me, so it may be correct, but I’d welcome comments and corrections.

my_as_completed above accepts an iterable and returns a generator producing results, but inside it starts all tasks concurrently, and stores all the futures in a list. To handle bigger lists we will need to do better, by limiting the number of running tasks to a sensible number.

Let’s start with a test program:

import asyncio
async def mycoro(number):
    print("Starting %d" % number)
    await asyncio.sleep(1.0 / number)
    print("Finishing %d" % number)
    return str(number)

async def print_when_done(tasks):
    for res in asyncio.as_completed(tasks):
        print("Result %s" % await res)

coros = [mycoro(i) for i in range(1, 101)]

loop = asyncio.get_event_loop()
loop.run_until_complete(print_when_done(coros))
loop.close()

This uses asyncio.as_completed to run 100 tasks and, because I arranged for the asyncio.sleep call to wait longer for earlier tasks, it prints something like this:

$ time python3 python-async.py
Starting 47
Starting 93
Starting 48
...
Finishing 93
Finishing 94
Finishing 95
...
Result 93
Result 94
Result 95
...
Finishing 46
Finishing 45
Finishing 42
...
Finishing 2
Result 2
Finishing 1
Result 1

real    0m1.590s
user    0m0.600s
sys 0m0.072s

So all 100 tasks were completed in 1.5 seconds, indicating that they really were run in parallel, but all 100 were allowed to run at the same time, with no limit.

We can adjust the test program to run using our customised my_as_completed function, and pass in an iterable of coroutines instead of a list by changing the last part of the program to look like this:

async def print_when_done(tasks):
    for res in my_as_completed(tasks):
        print("Result %s" % await res)
coros = (mycoro(i) for i in range(1, 101))
loop = asyncio.get_event_loop()
loop.run_until_complete(print_when_done(coros))
loop.close()

But we get similar output to last time, with all tasks running concurrently.

To limit the number of concurrent tasks, we limit the size of the futures list, and add more as needed:

import asyncio
from itertools import islice
def limited_as_completed(coros, limit):
    futures = [
        asyncio.ensure_future(c)
        for c in islice(coros, 0, limit)
    ]
    async def first_to_finish():
        while True:
            await asyncio.sleep(0)
            for f in futures:
                if f.done():
                    futures.remove(f)
                    try:
                        newf = next(coros)
                        futures.append(
                            asyncio.ensure_future(newf))
                    except StopIteration:
                        pass
                    return f.result()
    while len(futures) > 0:
        yield first_to_finish()

We start limit tasks at first, and whenever one ends, we ask for the next coroutine in coros and set it running. This keeps the number of running tasks at or below limit until we start running out of input coroutines (when next raises StopIteration and we don’t add anything to futures), then futures starts emptying until we eventually stop yielding coroutine objects.

I thought this function might be useful to others, so I started a little repo over here and added it: asyncioplus/limited_as_completed.py. Please provide merge requests and log issues to improve it – maybe it should be part of standard Python?

When we run the same example program, but call limited_as_completed instead of the other versions:

async def print_when_done(tasks):
    for res in limited_as_completed(tasks, 10):
        print("Result %s" % await res)
coros = (mycoro(i) for i in range(1, 101))
loop = asyncio.get_event_loop()
loop.run_until_complete(print_when_done(coros))
loop.close()

We see output like this:

$ time python3 python-async.py
Starting 1
Starting 2
...
Starting 9
Starting 10
Finishing 10
Result 10
Starting 11
...
Finishing 100
Result 100
Finishing 1
Result 1

real	0m1.535s
user	0m1.436s
sys	0m0.084s

So we can see that the tasks are still running concurrently, but this time the number of concurrent tasks is limited to 10.

See also

To achieve a similar result using semaphores, see Python asyncio.semaphore in async-await function and Making 1 million requests with python-aiohttp.

It feels like limited_as_completed is more re-usable as an approach but I’d love to hear others’ thoughts on this. E.g. could/should I use a semaphore to implement limited_as_completed instead of manually holding a queue?
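For comparison, here is a rough sketch of what the semaphore approach might look like (limited_as_completed is mine; limited_gather below is a name I’ve made up for this sketch). Note the trade-off: gather consumes the whole input up front and only hands back results at the end, whereas limited_as_completed yields results as tasks finish.

```python
import asyncio

async def limited_gather(coros, limit):
    # At most `limit` coroutines hold the semaphore at once,
    # so at most `limit` task bodies are actually running.
    sem = asyncio.Semaphore(limit)

    async def wrapped(coro):
        async with sem:
            return await coro

    # gather returns results in input order
    return await asyncio.gather(*(wrapped(c) for c in coros))

async def work(n):
    await asyncio.sleep(0.01)
    return n * 2

# new_event_loop avoids relying on a pre-existing default loop
loop = asyncio.new_event_loop()
results = loop.run_until_complete(
    limited_gather((work(i) for i in range(10)), 3))
loop.close()
print(results)  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```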

Basic ideas of Python 3 asyncio concurrency

Andy Balaam from Andy Balaam's Blog

Series: asyncio basics, large numbers in parallel, parallel HTTP requests, adding to stdlib

Python 3’s asyncio module and the async and await keywords combine to allow us to do cooperative concurrent programming, where a code path voluntarily yields control to a scheduler, trusting that it will get control back when some resource has become available (or just when the scheduler feels like it). This way of programming can be very confusing, and has been popularised by Twisted in the Python world, and nodejs (among others) in other worlds.

I have been trying to get my head around the basic ideas as they surface in Python 3’s model. Below are some definitions and explanations that have been useful to me as I tried to grasp how it all works.

Futures and coroutines are both things that you can wait for.

You can make a coroutine by declaring it with async def:

import asyncio
async def mycoro(number):
    print("Starting %d" % number)
    await asyncio.sleep(1)
    print("Finishing %d" % number)
    return str(number)

Almost always, a coroutine will await something such as some blocking IO. (Above we just sleep for a second.) When we await, we actually yield control to the scheduler so it can do other work and wake us up later, when something interesting has happened.

You can make a future out of a coroutine, but often you don’t need to. Bear in mind that if you do want to make a future, you should use ensure_future, but this actually runs what you pass to it – it doesn’t just create a future:

myfuture1 = asyncio.ensure_future(mycoro(1))
# Runs mycoro!

But, to get its result, you must wait for it – it is only scheduled in the background:

# Assume mycoro is defined as above
myfuture1 = asyncio.ensure_future(mycoro(1))
# We end the program without waiting for the future to finish

So the above fails like this:

$ python3 ./python-async.py
Task was destroyed but it is pending!
task: <Task pending coro=<mycoro() running at ./python-async:10>>
sys:1: RuntimeWarning: coroutine 'mycoro' was never awaited

The right way to block waiting for a future outside of a coroutine is to ask the event loop to do it:

# Keep on assuming mycoro is defined as above for all the examples
myfuture1 = asyncio.ensure_future(mycoro(1))
loop = asyncio.get_event_loop()
loop.run_until_complete(myfuture1)
loop.close()

Now this works properly (although we’re not yet getting any benefit from being asynchronous):

$ python3 python-async.py
Starting 1
Finishing 1

To run several things concurrently, we make a future that is the combination of several other futures. asyncio can make a future like that out of coroutines using asyncio.gather:

several_futures = asyncio.gather(
    mycoro(1), mycoro(2), mycoro(3))
loop = asyncio.get_event_loop()
print(loop.run_until_complete(several_futures))
loop.close()

The three coroutines all run at the same time, so this only takes about 1 second to run, even though we are running 3 tasks, each of which takes 1 second:

$ python3 python-async.py
Starting 3
Starting 1
Starting 2
Finishing 3
Finishing 1
Finishing 2
['1', '2', '3']

asyncio.gather won’t necessarily run your coroutines in order, but it will return a list of results in the same order as its input.

Notice also that run_until_complete returns the result of the future created by gather – a list of all the results from the individual coroutines.

To do the next bit we need to know how to call a coroutine from a coroutine. As we’ve already seen, just calling a coroutine in the normal Python way doesn’t run it, but gives you back a “coroutine object”. To actually run the code, we need to wait for it. When we want to block everything until we have a result, we can use something like run_until_complete but in an async context we want to yield control to the scheduler and let it give us back control when the coroutine has finished. We do that by using await:

import asyncio
async def f2():
    print("start f2")
    await asyncio.sleep(1)
    print("stop f2")
async def f1():
    print("start f1")
    await f2()
    print("stop f1")
loop = asyncio.get_event_loop()
loop.run_until_complete(f1())
loop.close()

This prints:

$ python3 python-async.py
start f1
start f2
stop f2
stop f1

Now we know how to call a coroutine from inside a coroutine, we can continue.

We have seen that asyncio.gather takes in some futures/coroutines and returns a future that collects their results (in order).

If, instead, you want to get results as soon as they are available, you need to write a second coroutine that deals with each result by looping through the results of asyncio.as_completed and awaiting each one.

# Keep on assuming mycoro is defined as at the top
async def print_when_done(tasks):
    for res in asyncio.as_completed(tasks):
        print("Result %s" % await res)
coros = [mycoro(1), mycoro(2), mycoro(3)]
loop = asyncio.get_event_loop()
loop.run_until_complete(print_when_done(coros))
loop.close()

This prints:

$ python3 python-async.py
Starting 1
Starting 3
Starting 2
Finishing 3
Result 3
Finishing 2
Result 2
Finishing 1
Result 1

Notice that task 3 finishes first and its result is printed, even though tasks 1 and 2 are still running.

asyncio.as_completed returns an iterable sequence of futures, each of which must be awaited, so it must run inside a coroutine, which must be waited for too.

The argument to asyncio.as_completed has to be a list of coroutines or futures, not an iterable, so you can’t use it with a very large list of items that won’t fit in memory.
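To make that limitation concrete, one blunt workaround (my own sketch, not something this article proposes) is to feed asyncio.as_completed fixed-size chunks of a large iterable; this bounds memory, at the cost of stalling each chunk on its slowest task:

```python
import asyncio
from itertools import islice

async def mycoro(number):
    # A quick version of mycoro from above, without the prints
    await asyncio.sleep(0.01)
    return str(number)

async def in_chunks(coros, chunk_size):
    results = []
    it = iter(coros)
    while True:
        # Materialise only chunk_size coroutines at a time
        chunk = list(islice(it, chunk_size))
        if not chunk:
            break
        for res in asyncio.as_completed(chunk):
            results.append(await res)
    return results

loop = asyncio.new_event_loop()
results = loop.run_until_complete(
    in_chunks((mycoro(i) for i in range(25)), 10))
loop.close()
print(len(results))  # 25
```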

Side note: if we want to work with very large lists, asyncio.wait won’t help us here – it also takes a list of futures and waits for all of them to complete (like gather), or, with other arguments, for one of them to complete or one of them to fail. It then returns two sets of futures: done and not-done. Each of these must be awaited to get their results, so:

asyncio.gather

# is roughly equivalent to:

async def mygather(*args):
    ret = []
    for r in (await asyncio.wait(args))[0]:
        ret.append(await r)
    return ret

I am interested in running very large numbers of tasks with limited concurrency – see the next article for how I managed it.

Are Refactoring Tools Less Effective Overall?

Chris Oldwood from The OldWood Thing

Prior to the addition of automatic refactoring tools to modern IDEs refactoring was essentially a manual affair. You would make a code change, hit build, and then fix all the compiler errors (at least for statically typed languages). This technique is commonly known as “leaning on the compiler”. Naturally the operation could be fraught with danger if you were far too ambitious about the change, but knowing when you could lean on the compiler was part of the art of refactoring safely back then.

A Hypothesis

Having lived through both eras (manual and automatic) and paired with developers far more skilled with the automatic approach I’ve come up with a totally non-scientific hypothesis that suggests automatic refactoring tools are actually less effective than the manual approach, overall.

I guess the basis of this hypothesis pretty much hinges on what I mean by “effective”. Here I’m suggesting that automatic tools help you easily refactor to a local minimum but not to a global minimum [1]; consequently the codebase as a whole ends up in a less coherent state.

Shallow vs Deep Refactoring

The goal of an automatic refactoring tool appears to be to not break your code – it will only allow you to use it to perform a simple refactoring that can be done safely, i.e. if the tool can’t fix up all the code it can see [2] it won’t allow you to do it in the first place. The consequence of this is that the tool constantly limits you to taking very small steps. Watching someone refactor with a tool can sometimes seem tortuous, as they may need so many little refactoring steps to get the code into the desired state; you cannot make the leaps you want in one go unless you switch to manual mode.

This by itself isn’t a bad thing, after all making a safe change is clearly A Good Thing. No, where I see the problem is that by fixing up all the call sites automatically you don’t get to see the wider effects of the refactoring you’re attempting.

For example, the reason you’d choose to rename a class or method is because the existing name is no longer appropriate. This is probably because you’ve learned something new about the problem domain. However that class or method does not exist in a vacuum; it has dependencies in the guise of variable names and related types. It’s entirely likely that some of these may now be inappropriate too, however you won’t easily see them because the tool has likely hidden them from you.

Hence one of the “benefits” of the old manual refactoring approach was that as you visited each broken call site you got to reflect on your change in the context of where it’s used. This often led to further refactorings as you began to comprehend the full nature of what you had just discovered.

Blue or Red Pill?

Of course what I’ve just described could easily be interpreted as the kind of “black hole” that many, myself included, would see as an unbounded unit of work. It’s one of those nasty rabbit holes where you enter and, before you know it, you’re burrowing close to the Earth’s core and have edited nearly every file in the entire workspace.

Yes, like any change, it takes discipline to stick to the scope of the original problem. Just because you keep unearthing more and more code that no longer appears to fit the new model it does not mean you have to tackle it right now. Noticing the disparity is the first step towards fixing it.

Commit Review

It’s not entirely true that you won’t see the entire outcome of the refactoring – at the very least the impact will be visible when you review the complete change before committing. (For a fairly comprehensive list of the things I go through at the point I commit see my C Vu article “Commit Checklist”.)

This assumes of course that you do a thorough review of your commits before pushing them. However by this point, just as writing tests after the fact is considerably less attractive, so is finishing off any refactoring; perhaps even more so because the code is not broken per se, it just might not be the best way of representing the solution.

It’s all too easy to justify the reasons why it’s okay to go ahead and push the change as-is because there are more important things to do. Even if you think you’re aware of technical debt it often takes a fresh pair of eyes to see how you’re living in a codebase riddled with inconsistencies that make it hard to see its true structure. One is then never quite sure, without reviewing the commit logs, what is legacy and what is the new direction.

Blinded by Tools

Clearly this is not the fault of the tool or their vendors. What they offer now is far more favourable than not having them at all. However once again we need to be reminded that we should not be slaves to our tools but that we are the masters. This is a common theme which is regularly echoed in the software development community and something I myself tackled in the past with “Don’t Let Your Tools Pwn You”.

The Boy Scout Rule (popularised by Uncle Bob) says that we should always leave the camp site cleaner than we found it. While picking up a handful of somebody else’s rubbish and putting it in the bin might meet the goal in a literal sense, it’s no good if the site is acquiring rubbish faster than it’s being collected.

Refactoring is a technique for improving the quality of a software design in a piecewise fashion; just be careful you don’t spend so long on your hands and knees cleaning small areas that you fail to spot the resulting detritus building up around you.

 

[1] I wasn’t sure whether to say minimum or maximum, but I felt that refactoring was about lowering entropy in some way, so went with the reduction metaphor.

[2] Clearly there are limits around published APIs which it just has to ignore.