Best seller – Succeeding with OKRs in Agile

Allan Kelly from Allan Kelly Associates

I’m delighted that my new book Succeeding with OKRs in Agile went on sale at Amazon yesterday. By this morning it was the #1 best seller in Amazon’s IT Project Management category – and not doing badly in Computer Programming and Business Management & Leadership either. (Although the publisher has some power over which categories a book is listed in, it is still a black art.)

It is hard to express just how great it is to see the book in the #1 slot. While I hope it stays at #1 for a while, I expect it will drop down before long.

Print and audio versions of the book are in the works and should be released in the next few weeks, so if you would rather read a physical copy or listen, watch this space, as they say.

The book has taken a little under a year to write and a few more months to make production ready. The wonders of LeanPub mean many readers have already been enjoying early versions of the book. If you would like to read the book on iBooks or as a PDF, then LeanPub is the place to buy from.

I recorded the little video below to explain why I wrote the book.


Widely used programming languages: past, present, and future

Derek Jones from The Shape of Code

Programming languages are like pop groups in that they have followers, fans and supporters; new ones are constantly being created and some eventually become widely popular, while those that were once popular slowly fade away or mutate into something else.

Creating a language is a relatively popular activity. Science fiction and fantasy authors have been doing it since before computers existed, e.g., the Elvish language Quenya devised by Tolkien and, in the computer age, Star Trek’s Klingon. Some very good how-to books have been written on the subject.

As soon as computers became available, people started inventing programming languages.

What have been the major factors influencing the growth to widespread use of a new programming language (I’m ignoring languages that become widespread within application niches)?

Cobol and Fortran became widely used because there was widespread implementation support for them across computer manufacturers, and they did not have to compete with any existing widely used languages. Various niches had one or more languages that were widely used in that niche, e.g., Algol 60 in academia.

To become widely used during the mainframe/minicomputer age, a new language first had to be ported to the major computers of the day, which sometimes supported multiple, incompatible operating systems. No new languages became widely used in the sense of being used across computer vendors. Some new languages were widely used by developers because they were available on IBM computers; for several decades a large percentage of developers used IBM computers. Based on job adverts, RPG was widely used, but PL/1 not so much. The use of RPG declined with the decline of IBM.

The introduction of microcomputers (originally 8-bit, then 16, then 32, and finally 64-bit) opened up an opportunity for new languages to become widely used in that niche (which would eventually grow to be the primary computing platform of its day). This opportunity occurred because compiler vendors for the major languages of the day did not want to cannibalize their existing market (i.e., selling compilers for a lot more than the price of a microcomputer) by selling a much lower priced product on microcomputers.

BASIC became available on practically all microcomputers, or rather some dialect of BASIC that was incompatible with all the other dialects. The availability of BASIC on a vendor’s computer promoted sales of the hardware, and it was not worthwhile for the major vendors to create a version of BASIC that reduced portability costs; the profit was in games.

The dominance of the Microsoft/Intel partnership removed the high cost of porting to lots of platforms (by driving them out of business), but created a major new obstacle to the wide adoption of new languages: developer choice. There had always been lots of new languages floating around, but people only got to see the subset that were available on the particular hardware they targeted. Once the CPU/OS (essentially) became a monoculture, most new languages had to compete for developer attention in one ecosystem.

Pascal was in widespread use for a few years on micros (in the form of Turbo Pascal) and university computers (the source of Wirth’s ETH compiler was freely available for porting), but eventually C won developer mindshare and became the most widely used language. In the early 1990s C++ compiler sales took off, but many developers were writing C with a few C++ constructs scattered about the code (e.g., use of new, rather than malloc/free).

Next, the Internet took off, and opened up an opportunity for new languages to become dominant. This opportunity occurred because Internet related software was being made freely available, and established compiler vendors were not interested in making their products freely available.

There were people willing to invest in creating a good-enough implementation of the language they had invented, and giving it away for free. Luck, plus being in the right place at the right time, resulted in PHP and Javascript becoming widely used. Network effects prevent any other language from becoming widely used. Compatible dialects of PHP and Javascript may migrate widespread usage to quite different languages over time, e.g., Facebook’s Hack.

Java rode to popularity on the coat-tails of the Internet, and when it looked like security issues would reduce it to niche status, it became the vendor supported language for one of the major smart-phone OSs.

Next, smart-phones took off, but the availability of Open Source compilers closed the opportunity window for new languages to become dominant through lack of interest from existing compiler vendors. Smart-phone vendors wanted to quickly attract developers, which meant throwing their weight behind a language that many developers were already familiar with; Apple went with Objective-C (which evolved to Swift), Google with Java (which evolved to Kotlin, because of the Oracle lawsuit).

Where does Python fit in this grand scheme? I don’t yet have an answer; or is my world-view wrong in treating Python usage as being as widespread as that of C/C++/Java?

New programming languages continue to be implemented; I don’t see this ever stopping. Most don’t attract more users than their implementer, but a few become fashionable amongst the young, who are always looking to attach themselves to something new and shiny.

Will a new programming language ever again become widely used?

Like human languages, programming languages experience strong network effects. Widely used languages continue to be widely used because many companies depend on code written in them, and many developers who can use them can obtain jobs; what company wants to risk using a new language only to find they cannot hire staff who know it? And not many people are willing to invest in becoming fluent in a language with no immediate job prospects.

Today’s widely used programming languages succeeded in a niche that eventually grew larger than all the other computing ecosystems. The Internet and smart-phones are used by everybody on the planet; there are no bigger ecosystems to provide new languages with a possible route to widespread use. To be widely used, a language first has to become fashionable, but from now on, new programming languages that don’t evolve from (i.e., remain compatible with) current widely used languages are very unlikely to migrate from fashionable to widely used.

It has always been possible for a proficient developer to dedicate a year+ of effort to create a new language implementation. Adding the polish needed to make it production ready used to take much longer, but these days tool chains such as LLVM supply a lot of the heavy lifting. The problem for almost all language creators/implementers is community building; they are terrible at dealing with other developers.

It’s no surprise that nearly all the new languages that become fashionable originate with language creators who work for a company that happens to feel a need for a new language. Examples include:

  • Go, created by Google for internal use, attracted an outside fan base. Company languages are not new, with IBM’s PL/1 being the poster child (or is there a more modern poster child?). At the moment Go is a trendy language, and this feeds a supply of young developers willing to invest in learning it. Once the trendiness wears off, Google will start to have problems recruiting developers. The reason: being labelled as a Go developer limits job prospects when few other companies use the language. Talk to a manager who has tried to recruit developers to work on applications written in Fortran, Pascal and other once widely used languages (and even wannabe widely used languages, such as Ada),
  • Rust, a vanity project from Mozilla, which they have now abandoned. Did Rust become fashionable because it arrived at the right time to become the not-Google language? I await a PhD thesis on the topic of the rise and fall of Rust,
  • Microsoft’s C# ceased being trendy some years ago. These days I don’t have much contact with developers working in the Microsoft ecosystem, so I don’t know anything about the state of the C# job market.

Every now and again a language creator has the social skills needed to start an active community. Zig caught my attention when I read that its creator, Andrew Kelley, had quit his job to work full-time on Zig. Two and a half years later, Zig has its own track at FOSDEM’21.

Will Zig become the next fashionable language as Rust/Go popularity fades? I’m rooting for Zig because of its name: there are relatively few languages whose names start with Z, while the start of the alphabet is over-represented in language names. It would be foolish to root for a language because of a belief that it has magical properties (e.g., powerful, readable, maintainable), but the young are foolish.

Making Smolpxl work on phones and tablets

Andy Balaam from Andy Balaam's Blog

I’ve added the first features intended to make Smolpxl games work well on touch interfaces like phones and tablets:

Spring game with touch controls

I’ve added a button bar at the bottom (and moved the navigation buttons to the top).

I’m looking for feedback on this:

  • Does it work on your device?
  • Are the buttons the right size?
  • Do they look ok? If not, how could they look better?
  • For games that require arrow keys, do you need them in the normal arrow-keys layout, or is a simple row fine?

Duckmaze game with touch controls in a single row

If you’re writing a game and you want to add buttons like this, you just need to add a single line like this:

game.showControls(["MENU", "SELECT", "BUTTON1", "BUTTON2"]);

or this:

game.showControls(["MENU", "SELECT", "LEFT", "DOWN", "UP", "RIGHT"]);

and they should appear.

Found In Space – a.k.

a.k. from thus spake a.k.

Some time ago we saw how Newton's method used the derivative of a univariate scalar valued function to guide the search for an argument at which it took a specific value. A related problem is finding a vector at which a multivariate vector valued function takes one, or at least comes as close as possible to it. In particular, we should often like to fit an arbitrary parametrically defined scalar valued functional form to a set of points with possibly noisy values, much as we did using linear regression to find the best fitting weighted sum of a given set of functions, and in this post we shall see how we can generalise Newton's method to solve such problems.
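
Since the worked details sit in the linked article, here is a rough, self-contained sketch of the kind of calculation involved. It is my own illustration using a Gauss-Newton style iteration, not code from the a.k. post; the function name gaussNewtonExpFit, the model y = a*exp(b*x) and the sample data are all invented for the example.

// A rough sketch (not code from the a.k. article): fit y = a*exp(b*x) to
// noisy points with Gauss-Newton steps, solving the 2x2 normal equations
// (J^T J) * delta = J^T r directly at each step.
function gaussNewtonExpFit(xs, ys, a, b, iterations) {
  for (let it = 0; it < iterations; it++) {
    let jtj00 = 0, jtj01 = 0, jtj11 = 0, jtr0 = 0, jtr1 = 0;
    for (let i = 0; i < xs.length; i++) {
      const e = Math.exp(b * xs[i]);
      const r = ys[i] - a * e;      // residual at this point
      const j0 = e;                 // d(model)/da
      const j1 = a * xs[i] * e;     // d(model)/db
      jtj00 += j0 * j0; jtj01 += j0 * j1; jtj11 += j1 * j1;
      jtr0 += j0 * r; jtr1 += j1 * r;
    }
    const det = jtj00 * jtj11 - jtj01 * jtj01;
    if (Math.abs(det) < 1e-12) break;   // nearly singular: give up
    a += ( jtj11 * jtr0 - jtj01 * jtr1) / det;
    b += (-jtj01 * jtr0 + jtj00 * jtr1) / det;
  }
  return { a, b };
}

// Example: recover a ≈ 2, b ≈ 0.5 from lightly perturbed samples, starting
// from a guess that is reasonably close (plain Gauss-Newton is local).
const xs = [0, 0.5, 1, 1.5, 2, 2.5, 3];
const ys = xs.map((x, i) => 2 * Math.exp(0.5 * x) + (i % 2 ? 0.01 : -0.01));
console.log(gaussNewtonExpFit(xs, ys, 1.5, 0.4, 20));

The article proper deals with the general multivariate case; the sketch above only shows the shape of the iteration: accumulate the Jacobian products, solve the small linear system, nudge the parameters, and repeat.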

Visual Lint 7.0.11.332 has been released

Products, the Universe and Everything from Products, the Universe and Everything

This is a recommended maintenance update for Visual Lint 7.0. The following changes are included:

  • Updated the values of _MSC_VER and _MSC_FULL_VER in the PC-lint Plus compiler indirect file co-rb-vs2017.lnt to reflect those in the latest Visual Studio 2017 update (VS2017 v15.9.31).

  • Updated the values of _MSC_VER and _MSC_FULL_VER in the PC-lint Plus compiler indirect file co-rb-vs2019.lnt to reflect those in the latest Visual Studio 2019 update (VS2019 v16.8.4).

  • The project variables $(CEVER), $(ARCHFAM) and $(_ARCHFAM_) are now automatically defined when analysing Visual Studio 2008 projects for the WEBMAINT_SDK (ARMV4I) platform.

Download Visual Lint 7.0.11.332

Setting up enchant for use with flyspell-mode on macOS

Timo Geusch from The Lone C++ Coder's Blog

I have a few more loose ends to tidy up before switching to the static version of the blog. One of the important tasks was to make sure I had a spell checker available. Back in the dim and distant past I had set up flyspell-mode with hunspell, but I wanted to check if there was something better available these days. Enter enchant, which acts as a front end to multiple different spell checkers. I like that Emacs has included support for enchant since version 26, plus one of the backends enchant supports is AppleSpell. In other words, when running on macOS, flyspell can make use of the OS’s built-in spell checker and dictionaries.

Instructions on how to actually set up enchant on macOS are a bit thin on the ground, so I decided to put together a quick write-up.

Technical Debt: Engineers, you are not alone

Allan Kelly from Allan Kelly Associates

I don’t read many books about software or technology these days; I tend to read outside the domain: economics, business and management – which, after all, is much of what I do in the technology world these days.

Recently I’ve been reading Winning Now, Winning Later by David Cote and find it really interesting. He hardly mentions software and never mentions agile, but he is giving me a new perspective on technical issues, particularly technical debt (or technical liabilities, as I prefer to call them). He talks about issues which have similar characteristics to tech debt but are completely different – legal issues, for example. He sees these issues as conflicts between short-term thinking and long-term thinking.

Cote’s argument is that short-term actions should support, not conflict with, long-term goals. I agree. It might not be easy, but if actions in the here and now conflict with longer-term goals then the chances of reaching those goals are diminished.

Cote is writing about his time as CEO of Honeywell – a US industrial conglomerate, if you don’t know. Unusually, Cote is honest about many of the dirty problems the company faced when he took over – a lot of business books gloss over such problems or talk about “challenges” or “opportunities”.

For example, Cote describes how Honeywell managers were chasing numbers and targets every quarter. They had no time for long-term improvements because they were so busy “making the numbers”. One of his managers cut down a forest to sell as timber in order to make the end-of-quarter numbers. Salespeople would give products away to new customers or offer large discounts at the end of the quarter. However, customers knew this would happen, so they delayed orders until they were sweetened.

Making the short-term numbers meant the company undercut itself and so lost revenue the next quarter. Management time was spent finding accounting tricks to “make the numbers” rather than improving the business. And since targets ratcheted up, the next quarter was more difficult and required more diversions.

Other examples included legal cases Honeywell was fighting: spending time and money on lawyers, and building up ill will with customers, politicians and local people. This in turn made it more difficult to get support when the company needed it.

I read these examples, and others, and I hear an engineer saying “Technical debt.” That is exactly what it is.

A software engineer who does a dirty job on a code change because they feel under pressure stores up problems for themselves and for the future engineers who need to make the next change – which is exactly like a factory that dumps waste into a lake as a quick fix and then needs to clean up the lake later.

Actually, economists have a term for this: externalities. These are costs which are forced onto another party, e.g. the factory saves money on waste disposal but the local government has to pay to clean the lake. I’ve long thought a lot of “technical debt” could be considered an externality because it pushes the cost onto someone else.

Today it is probably harder than ever to escape these costs – in code, in law, in financing – because there are more and more people out there looking for these things. Environmentalists look at waste in lakes, society expects companies to pay if they pollute, and courts make companies pay. Smart investors will look closely at a firm’s accounts and discount the firm, or short it, if they see dubious practices.

This is Cote’s argument: in the short term it might save or generate money to fight legal cases (deny, deny, deny), sell off forests, discount sales and such, but in the longer term – and the longer term might just be weeks – it will cost. And when it costs, it will damage growth.

Doesn’t that sound just like technical debt/liabilities?

Naturally it is hard to see a company that chases numbers, pollutes and fights all legal claims caring about the quality of code. Engineers will have a hard job fighting for technical excellence there.

Cote argues, and I agree with him, that it doesn’t have to be this way. Acting responsibly and thinking about tomorrow – whether that is pollution, sales, accounting or code quality – will make it easier to grow later. Just because it is difficult to act in a manner that meets today’s needs and makes the world better for tomorrow does not allow us to ignore it: all of us need to think harder and find a solution that doesn’t mortgage tomorrow.

And sometimes the right answer is to accept the slow path, take it on the chin, and pay the cost you’ve been avoiding. For Cote that meant settling legal cases and accepting some costs; for software teams that means doing the refactoring, rewriting a module or just saying no to more changes.

As I’ve said before: in software the long term comes along very soon.

And as I’ve blogged before, there is no such thing as quick and dirty, only dirty and slow.

We might talk about debt/liabilities but really we are talking about short-term v. long-term, a pay-day loan v. an investment. Engineers have an unfortunate habit of talking about technical debt as a binary good v. evil debate with no other options.

Finding these less obvious paths which satisfy both the short term and the long term is hard(er), but it also offers the opportunity for greater, and longer-term, improvement, something which is itself a competitive advantage.


