- Fixed a bug which could cause Visual C++ 2010-2017 project (.vcxproj) files which have configuration names containing brackets to be loaded incorrectly.
- Fixed a race condition which could cause errors or a crash while loading MSBuild projects.
- Added to the installer modified versions of several PC-lint 9.0 indirect files which are not supplied with PC-lint Plus 1.0.
- Added additional PC-lint Plus suppression directives to the indirect file lib-rb-win32.lnt supplied within the installer.
Passing functions to functions is becoming increasingly prevalent in C++. With common advice being to prefer algorithms to loops, new library features like
std::visit, lambdas being incrementally beefed up, and C++ functional programming talks consistently being given at conferences, it’s something that almost all C++ programmers will need to do at some point. Unfortunately, passing overload sets or function templates to functions is not very well supported by the language. In this post I’ll discuss a few solutions and show how C++ still has a way to go in supporting this well.
We have some generic operation called
foo. We want a way of specifying this function which fulfils two key usability requirements.
1. It should be callable directly, without manually specifying template arguments.
2. It should be passable to a higher-order function, again without manually specifying template arguments.
A simple first choice would be to make it a function template:
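A minimal sketch of this (the body is a stand-in, since all we know is that foo is some generic operation):

```cpp
// The function-template approach; the body here is a stand-in.
template <class T>
T foo(T t) { return t + t; }
```

Requirement 1 holds: `foo(21)` deduces `T = int` with no fuss.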
This fulfils the first requirement, but not the second:
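Concretely, the name foo refers to a template, not a value, so there is nothing for a higher-order function’s parameter to deduce. A sketch (stand-in body again):

```cpp
#include <algorithm>
#include <vector>

template <class T>
T foo(T t) { return t + t; } // stand-in body

std::vector<int> doubled(std::vector<int> v) {
    // std::transform(v.begin(), v.end(), v.begin(), foo);   // error: which foo?
    std::transform(v.begin(), v.end(), v.begin(), foo<int>); // works, but manual
    return v;
}
```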
That’s no good.
A second option is to write
foo as a function object with a call operator template:
We are now required to create an instance of this type whenever we want to use the function, which is okay for passing to other functions, but not great if we want to call it directly:
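Something like this (with a stand-in body): an instance is an ordinary value, so it passes to algorithms happily, but every direct call means conjuring up a temporary:

```cpp
#include <algorithm>
#include <vector>

// foo as a function object type; the call operator body is a stand-in.
struct foo_t {
    template <class T>
    T operator()(T t) const { return t + t; }
};

std::vector<int> doubled(std::vector<int> v) {
    // Passing an instance is fine: it is just a value.
    std::transform(v.begin(), v.end(), v.begin(), foo_t{});
    return v;
}

// A direct call needs a temporary instance at each call site:
inline int call_directly(int x) { return foo_t{}(x); }
```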
We have similar problems when we have multiple overloads, even when we’re not using templates:
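For example (stand-in bodies): an overload set has no single type either, so the compiler cannot pick an overload without help:

```cpp
#include <algorithm>
#include <vector>

// A plain overload set, no templates involved:
inline int foo(int x) { return x + x; }
inline double foo(double x) { return x + x; }

std::vector<int> doubled(std::vector<int> v) {
    // std::transform(v.begin(), v.end(), v.begin(), foo); // error: ambiguous
    std::transform(v.begin(), v.end(), v.begin(),
                   static_cast<int (*)(int)>(foo)); // manual disambiguation
    return v;
}
```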
We’re going to need a different solution.
As an intermediate step, we could use the normal function template approach, but wrap it in a lambda whenever we want to pass it to another function:
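Along these lines (stand-in body):

```cpp
#include <algorithm>
#include <vector>

template <class T>
T foo(T t) { return t + t; } // stand-in body

std::vector<int> doubled(std::vector<int> v) {
    // Wrap the call in a generic lambda at the call site:
    std::transform(v.begin(), v.end(), v.begin(),
                   [](auto x) { return foo(x); });
    return v;
}
```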
That’s not great. It’ll work in some contexts where we don’t know what template arguments to supply, but it’s not yet suitable for all cases. One improvement would be to add perfect forwarding:
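Forwarding the arguments instead of taking them by value looks something like:

```cpp
#include <utility>

template <class T>
T foo(T t) { return t + t; } // stand-in body

// Perfectly forward whatever arguments arrive:
inline auto lifted = [](auto&&... xs) {
    return foo(std::forward<decltype(xs)>(xs)...);
};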
But wait, we want to be SFINAE friendly, so we’ll add a trailing return type:
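The trailing return type repeats the call expression, turning an invalid call into a deduction failure rather than a hard error inside the lambda body:

```cpp
#include <utility>

template <class T>
T foo(T t) { return t + t; } // stand-in body

// SFINAE-friendly: the return type is deduced from the same call expression.
inline auto lifted = [](auto&&... xs)
    -> decltype(foo(std::forward<decltype(xs)>(xs)...)) {
    return foo(std::forward<decltype(xs)>(xs)...);
};
```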
Okay, it’s getting pretty crazy and expert-only at this point. And we’re not even done! Some contexts will also care about whether the call is noexcept:
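Propagating noexcept means writing the call expression a third time; a sketch (stand-in body):

```cpp
#include <utility>

template <class T>
T foo(T t) { return t + t; } // stand-in body

// The call expression now appears three times: once for noexcept,
// once for the return type, and once for the body.
inline auto lifted = [](auto&&... xs)
    noexcept(noexcept(foo(std::forward<decltype(xs)>(xs)...)))
    -> decltype(foo(std::forward<decltype(xs)>(xs)...)) {
    return foo(std::forward<decltype(xs)>(xs)...);
};
```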
So the solution is to write this every time we want to pass an overloaded function to another function. That’s probably a good way to make your code reviewer cry.
The above is functionally equivalent to the triplicated monstrosity in the example before. Even better, if P0834: Lifting overload sets into objects were accepted, the language itself would lift the overload set into a single function object which we can pass around. Unfortunately, all of those proposals have been rejected. Maybe they can be revived at some point, but for now we need to make do with other solutions. One such solution is to approximate foo with a macro (I know, I know).
Now our higher-order function call becomes:
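A sketch of such a macro and its use (stand-in overloads for foo): it wraps a name in a generic lambda, forwarding the arguments and propagating both the return type and noexcept.

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// Wrap a (possibly overloaded) name in a forwarding lambda:
#define LIFT(f)                                             \
    [](auto&&... xs) noexcept(                              \
        noexcept(f(std::forward<decltype(xs)>(xs)...)))     \
        -> decltype(f(std::forward<decltype(xs)>(xs)...)) { \
        return f(std::forward<decltype(xs)>(xs)...);        \
    }

inline int foo(int x) { return x + x; }       // stand-in overloads
inline double foo(double x) { return x + x; }

std::vector<int> doubled(std::vector<int> v) {
    // The whole overload set travels as a single object:
    std::transform(v.begin(), v.end(), v.begin(), LIFT(foo));
    return v;
}
```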
Okay, so there’s a macro in there, but it’s not too bad (you know we’re in trouble when I start trying to justify the use of macros for this kind of thing). So
LIFT is at least some solution.
Making function objects work for us
You might recall from a number of examples ago that the problem with using function object types was the need to construct an instance whenever we needed to call the function. What if we make a global instance of the function object?
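Something like this (stand-in body), declaring the object in a header and defining it in exactly one translation unit:

```cpp
// foo.h
struct foo_t {
    template <class T>
    T operator()(T t) const { return t + t; } // stand-in body
};
extern const foo_t foo; // declared in the header...

// foo.cpp -- ...defined in exactly one translation unit
const foo_t foo{};
```

Now `foo(42)` works directly and `foo` can be passed around as a value.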
This works if you’re able to have a single translation unit with the definition of the global object. If you’re writing a header-only library then you don’t have that luxury, so you need to do something different.
This might look innocent, but it can lead to One-Definition Rule (ODR) violations: since foo is declared static, each translation unit (TU) will get its own definition of the variable. However, also_sad will instantiate oh_no, which will get a different definition of &foo in each TU. This is undefined behaviour.
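A sketch of the problematic header (the names oh_no and also_sad follow the discussion above; the bodies are stand-ins):

```cpp
// header.hpp
struct foo_t {
    template <class T>
    T operator()(T t) const { return t + t; }
};
static constexpr foo_t foo{}; // static: a distinct foo per TU

template <class T>
const foo_t* oh_no(T) { return &foo; } // &foo differs across TUs

inline const foo_t* also_sad() { return oh_no(42); }
// also_sad must be identical in every TU that includes this header,
// but each TU's instantiation of oh_no refers to its own foo: ODR violation.
```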
In C++17 the solution is simple:
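Namely, an inline variable (stand-in body again):

```cpp
struct foo_t {
    template <class T>
    T operator()(T t) const { return t + t; } // stand-in body
};

// C++17: an inline variable may be defined in every TU that
// includes the header, yet they all name the same entity.
inline constexpr foo_t foo{};
```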
inline allows the variable to be multiply-defined, and the linker will throw away all but one of the definitions.
An advantage of the function object approach is that function objects designed carefully make for much better customisation points than the traditional techniques used in the standard library. See Eric Niebler’s blog post and standards paper for more information.
A disadvantage is that now we need to write all of the functions we want to use this way as function objects, which is not great at the best of times, and even worse if we want to use external libraries. One possible solution would be to combine the two techniques we’ve already seen:
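A sketch of that combination, with lib standing in for an external library we cannot change and LIFT as above:

```cpp
#include <utility>

#define LIFT(f)                                             \
    [](auto&&... xs) noexcept(                              \
        noexcept(f(std::forward<decltype(xs)>(xs)...)))     \
        -> decltype(f(std::forward<decltype(xs)>(xs)...)) { \
        return f(std::forward<decltype(xs)>(xs)...);        \
    }

// A stand-in for an external library's overload set:
namespace lib {
    inline int foo(int x) { return x + x; }
    inline double foo(double x) { return x + x; }
}

// Lift the overload set once, into an inline variable in a header:
namespace lift {
    inline const auto foo = LIFT(lib::foo);
}
```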
Now we can use
lift::foo instead of
lib::foo and it’ll fit the requirements I laid out at the start of the post. Unfortunately, I think it’s possible to hit ODR violations with this due to possible differences in the closure types across TUs. I’m not sure what the best workaround for this is, so input is appreciated.
I’ve given you a few solutions to the problem I showed at the start, so what’s my conclusion? C++ still has a way to go to support this paradigm of programming, and teaching these ideas is a nightmare. If a beginner or even intermediate programmer asks how to pass overloaded functions around – something which sounds like it should be fairly easy – it’s a real shame that the best answers I can come up with are “Copy this macro which you have no chance of understanding”, or “Make function objects, but make sure you do it this way for reasons which I can’t explain unless you understand the subtleties of ODR”. I feel like the language could be doing more to support these use cases.
Maybe for some people “Do it this way and don’t ask why” is an okay answer, but that’s not very satisfactory to me. Maybe I lack imagination and there’s a better way to do this with what’s already available in the language. Send me your suggestions or heckles on Twitter @TartanLlama.
Thanks to Michael Maier for the motivation to write this post; Jayesh Badwaik, Ben Craig, Michał Dominiak and Kévin Boissonneault for discussion on ODR violations; and Eric Niebler, Barry Revzin, Louis Dionne, and Michał Dominiak (again) for their work on the libraries and standards papers I referenced.
While reading some software related books/reports/articles written during the 1950s, I suddenly realized that the word ‘software’ was not being used. This set me off looking for the earliest use of various computer terms.
My search process consisted of using pdfgrep on my collection of pdfs of documents from the 1950s and 60s, and looking in the index of the few old computer books I still have.
Software: The Oxford English Dictionary (OED) cites an article by John Tukey published in the American Mathematical Monthly during 1958 as the first published use of software: “The ‘software’ comprising … interpretive routines, compilers, and other aspects of automative programming are at least as important to the modern electronic calculator as its ‘hardware’.”
I have a copy of the second edition of “An Introduction to Automatic Computers” by Ned Chapin, published in 1963, which does a great job of defining the various kinds of software. Earlier editions were published in 1955 and 1957. Did these earlier editions also contain various definitions of software? I cannot find any reasonably priced copies on the second-hand book market. Do any readers have a copy?
Software engineering: The OED cites a 1966 “letter to the ACM membership” by Anthony A. Oettinger, then ACM President: “We must recognize ourselves … as members of an engineering profession, be it hardware engineering or software engineering.”
The June 1965 issue of COMPUTERS and AUTOMATION, in its Roster of organizations in the computer field, has the list of services offered by Abacus Information Management Co.: “systems software engineering”, and by Halbrecht Associates, Inc.: “software engineering”. This pushes the first use of software engineering back by a year.
Source code: The OED cites a 1965 issue of Communications of the ACM: “The PUFFT source language listing provides a cross reference between the source code and the object code.”
The December 1959 Proceedings of the EASTERN JOINT COMPUTER CONFERENCE contains the article: “SIMCOM – The Simulator Compiler” by Thomas G. Sanborn. On page 140 we have: “The compiler uses this convention to aid in distinguishing between SIMCOM statements and SCAT instructions which may be included in the source code.”
Running pdfgrep over the archive of documents on bitsavers would probably throw up all manner of early uses of software-related terms.
A quick explanation of how const and constexpr work on pointers in C++
So I was checking that my knowledge was correct when working on a Firefox bug.
I made a quick C++ file with all the examples I know of how to use const and constexpr on pointers.
As one can see, it’s pretty confusing!
Because there are several places in a statement where you can put ‘const’, it can be complicated to work out which part of your statement the ‘const’ refers to.
Generally, it’s best to read from right to left to work it out, i.e.:
static const char * const hello;
Would read like:
hello (is a) const pointer (to) const char
But, that takes a bit of practice.
C++’s constexpr brings another new dimension to the problem too!
It behaves like const in the sense that it makes the pointer itself a constant pointer (it says nothing about what is pointed to).
But because it occurs at the start of your statement (rather than after the ‘*’) it’s not immediately obvious.
Here’s my list of all the ways you can use const and constexpr on pointers and how they behave.
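The original file isn’t reproduced here; a sketch along those lines, reading each declaration right to left:

```cpp
char buf[] = "hello"; // a mutable character array to point at

const char* p1 = buf;        // pointer (reassignable) to const char
char const* p2 = buf;        // same as p1: 'const char' == 'char const'
char* const p3 = buf;        // const pointer to mutable char
const char* const p4 = buf;  // const pointer to const char

constexpr const char* p5 = buf; // const pointer to const char
constexpr char* p6 = buf;       // const pointer to mutable char:
                                // constexpr makes the pointer const,
                                // not the pointee
```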
PDFs are the format of choice in academia, but extracting the information they contain is annoyingly hard.
I’ve just started working on my degree’s final project. An academic project requires lots of research, which means reading lots of papers.
Papers are normally available in one form only, PDF.
While PDF is a format so ubiquitous nowadays that one can guarantee being able to display it as the writer(s) intended, it’s not a nice format, as I found out as soon as I needed to do something with it.
During the course of my research, I’ve been using PDF’s highlight annotations to highlight parts of a paper that’re particularly interesting.
I wanted to be able to retrieve the highlighted text at a later date so I didn’t have to open the paper again to find the parts I found interesting when I read it the first time.
You’d think that exporting annotations on text would be something that all PDF readers which support annotations (most of them do) would be capable of. I mean, surely it’s easy enough, even if there aren’t that many reasons why you’d want to do it.
Alas, none that I found running on Linux had this feature, so I delved into trying to write something to do what I needed.
I based my project on a tool I found in a StackOverflow answer to a question similar to mine.
The Python code in the answer utilises poppler-qt4 to export annotated text from a PDF. Unfortunately, the code is Python2 and the python poppler-qt4 package wouldn't install properly on my system anyway, even after installing the poppler-qt4 package.
Neither did Python’s poppler-qt5 bindings.
Convinced I could do a better job than a Python 2 script which depended on a package last updated in 2015, I translated the answer into the equivalent in C++.
I started with trying to use poppler-cpp, the C++ bindings for poppler where one has objects and namespaces, and none of the guff associated with GUI frameworks that I wouldn't need here. However, to my dismay, poppler-cpp doesn't support annotations at all. For whatever reason, annotation support only works with the bindings to a GUI framework, like GLib or Qt.
So instead I used poppler-glib (i.e. GLib from the GNOME project), purely because I use GNOME, so wouldn't have to install anything extra.
Now, the PDF format is really odd. Annotations seem to be an afterthought, tacked on to the format later.
Specifically highlighting is weird, because a highlight annotation has no connection to the document’s text.
As such, poppler’s poppler_annot_get_contents(PopplerAnnot *), which should return the annotation’s contents, returns nothing.
Instead, to get the text associated with a highlight annotation, one has to get the coordinates of the highlight annotation (a PopplerRectangle) and then utilise the function poppler_page_get_text_for_area(PopplerPage*, PopplerRectangle*) which returns the text in a defined area.
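A sketch of that flow with poppler-glib, callable from C++ (error handling omitted; the y-axis flip reflects my understanding that annotation areas use PDF's bottom-left origin while text areas are addressed from the top-left, so treat it as an assumption to verify):

```cpp
#include <poppler.h>

// Print the text under each highlight annotation on a page.
void print_highlights(PopplerPage* page) {
    double width = 0, height = 0;
    poppler_page_get_size(page, &width, &height);

    GList* mappings = poppler_page_get_annot_mapping(page);
    for (GList* l = mappings; l != nullptr; l = l->next) {
        auto* m = static_cast<PopplerAnnotMapping*>(l->data);
        if (poppler_annot_get_annot_type(m->annot) != POPPLER_ANNOT_HIGHLIGHT)
            continue;

        // The highlight has no link to the text; ask for whatever
        // text lies inside its rectangle instead.
        PopplerRectangle area = m->area;
        double y1 = height - area.y2; // flip to top-left origin
        double y2 = height - area.y1;
        area.y1 = y1;
        area.y2 = y2;

        char* text = poppler_page_get_text_for_area(page, &area);
        g_print("%s\n", text ? text : "");
        g_free(text);
    }
    poppler_page_free_annot_mapping(mappings);
}
```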
What an entirely baffling way to go about implementing highlighting. Attaching it as purely a visual element, rather than actually marking up the text.
Even more baffling is the fact that although my application works, it only mostly works.
Sometimes I get the full text highlighted, other times it chops off characters, and sometimes it adds things that’re nowhere near the highlighted text at all!
This is a problem I’m yet to solve, and I might never solve, because it’s ridiculous and the tool mostly does what I needed anyway.
In conclusion: the PDF format is weird, and I wrote a thing.
If you use it, let me know how it goes!
I have been reading two very different computer books written for a general readership: Giant Brains or Machines that Think by Edmund Berkeley, published in 1949 (with a retrospective chapter added in 1961), and LET ERMA DO IT, published in 1956.
Berkeley marvels at a computer performing 5,000 additions per second; performing all the calculations in a week that previously required 500 human computers (i.e., people using mechanical adding machines) working 40 hours per week. His mind staggers at the “calculating circuits being developed” that can perform 100,000 additions a second; “A mechanical brain that can do 10,000 additions a second can very easily finish almost all its work at once.”
The chapter discussing the future, “Machines that think, and what they might do for men”, sees Berkeley struggling for non-mathematical applications; a common problem with all new inventions. An automatic translator and an automatic stenographer (typists who transcribe dictation) are listed. There is also a chapter on social control, which is just as applicable today.
The ‘ERMA’ book paints a very rosy picture of the future with computer automation removing the drudgery that so many jobs require; it is so upbeat. A year later the USSR launched Sputnik and things suddenly looked a lot less rosy.
Over Christmas I was thinking, reflecting, drinking…
Once upon a time I was asked by a manager to teach his team Agile so the team could become Agile. It went downhill from there…
I turned up at the client’s offices to find a room of about 10 people. The manager wasn’t there – shame, he should be in the room to have the conversations with the team. In fact half the developers were missing. This company didn’t allow contractors to attend training sessions.
For agile introduction courses I always try and have a whole team, complete with decision makers, in the room. If you are addressing a specialist topic (say user stories or Cucumber) then it’s OK to have only the people the topic affects in the room. But when I am talking about teams and processes, well, I want everyone there!
We did a round of introductions and I learned that the manager, and other managers from the company, had been on a Scrum Master course and instructed the team to be Agile. Actually, the company had decided to be Agile and sent all the managers on Scrum Master courses.
So the omens were bad and then one of the developers said something to the effect:
“I don’t think Agile can help us. We have lots of work to do, we don’t have enough time, we are already struggling, there is masses of technical debt and we can’t cut quality any further. We need more time to do our work not less.”
What scum am I? – I pretend to be all nice but underneath I allow myself to be used as a tool to inflict agile pain on others. No wonder devs hate Agile.
My name is Allan and I provide Agile training and consulting services.
I am guilty of training teams in how to do Agile software development.
I am guilty of offering advice to individuals and teams in a directive format.
I have been employed by managers who want to make their teams agile against the will of the team members.
I have absented myself from teams for weeks, even months and failed to provide deep day-in-day-out coaching.
In my defence I plead mitigating circumstances.
One size does not fit all. The Agile Industrial Complex* has come up with one approach (training, certification and enforcement) and the Agile Hippies another (no-pressure, non-directive, content-free coaching).
I don’t fit into either group. Doing things differently can be lonely … still, I’ve had my successes.
I happen to believe that training team members in “Agile” can be effective. I believe training can help by:
- Providing time for individuals to learn
- Sharing the wisdom of one with others
- Providing the opportunity for teams to learn together and create a shared understanding
- Providing rehearsal space for teams to practice what they are doing, or hope to do
- Providing a starting-point – a kick-off or a Kaikaku event – for a reset or change
- and some other reasons which probably don’t come to mind right now
Yes, when I deliver training I’m teaching people to do something, but that is the least important thing. When I stand up at the start of a training session I imagine myself as a market stall holder. On my market stall are a set of tools and techniques which those in the room might like to buy: stand-up meetings, planning meetings, stories, velocity, and so on. My job is to both explain these tools and inspire my audience to try. I have a few hours to do that.
As much as I hate to say it, part of my job at this point is Sales. I have to sell Agile. In part I do that by painting a picture of how great the world might be with Agile. I like to think I also give the audience some tools for moving towards that world.
At the end of the time individuals get to decide which, if any, of the tools I’ve set out they want to use. Sometimes these are individual decisions, and sometimes individuals may not pick up any tools for months or years.
On other occasions – when I have time – I let the audience decide what they want to do. Mentally I see myself handing the floor over to the audience to decide what they want to do. In reality this is a team based exercise where the teams decide which tools they want to adopt.
If a team wants to say “No thank you” then so be it.
In my experience teams adopting Agile benefit greatly from having ongoing advice on how they are working. Managers benefit from understanding the team, understanding how their own role changes, and understanding how the organization needs to change over time.
Plus: you cannot cram everything a team needs to know into a few hours of training, and it would be wrong to do so. You don’t want to overload people at the start. There are many things that are better talked about when people have had some experience.
Actually, I tend to believe that there are some parts of Agile which people can only learn first hand. They are – almost – incomprehensible, or unbelievable, until one has experience. That is one of the reasons I think managers have trouble grasping agile in full: they are too far removed from the work to experience it first hand.
You see, I believe everyone engages in their own sense making, everyone learns to make sense and meaning in the world themselves. In so much as I have a named educational style it is constructivist. But my philosophy isn’t completely joined up and has some holes, I’m still learning myself.
When I do training I want to give people experiences that help them learn. And that continues into the workplace after the training.
So I also offer coaching, consulting, advice, call it what you will.
But I don’t like being with the team too much. I prefer to drop in. I believe that people, teams, need space to create their own understanding. If I was there they wouldn’t get that space, they wouldn’t have those experiences, and possibly they wouldn’t take responsibility for their own changes.
One of my fears about having a “Scrum Master” type figure attached to a team is that that person becomes the embodiment of the change. Do people really take responsibility and ownership if there is someone else there to do it?
I prefer to drop in occasionally. Talk to individuals, teams, talk about how things are going. Talk about their experience. Further their sense making process. Do some additional exercises if it helps. Run a retrospective.
And then I disappear. Leave things with them. Let them own it.
Where technical skills are concerned – principally TDD – it is a little different, because that is a skill that needs to be learned by practice. I don’t tend to do that myself, so I usually involve one of my associates and they are sometimes embedded with a team for a longer period.
Similarly, I do sometimes become embedded in an organization. I can be there for several days a week for many weeks on end. That usually occurs when the organization is larger, or when the problems are bigger. Even then I want to leave as much control with the teams as I can.
On the one hand I’m a very bad person: I accept unwilling participants on my training courses and then don’t provide the day-to-day coaching that many advocate.
On the other hand: what I do works, I’ve seen it work. Sometimes one can benefit from being challenged, sometimes one needs to open one’s mind to new ideas.
If I’m guilty of anything I’m guilty of having a recipe which works differently.
And that team I spoke of to start with?
On day two some people did not return: that was a win. They had worked out that it was not for them and they had taken control. That to me is a success.
Most people did return and at the end, the one who had told me Agile could do nothing for them saw that Agile offered hope. That hope was principally an approach to quality which was diametrically opposite to what he initially thought it was going to be and was probably, although I can’t be sure, the opposite of what his manager thought Agile meant.
It is entirely possible that had his manager been in the room to hear my quality message I’d have been thrown out there and then. And it’s just possible I might have given him food for thought.
But I will never know. I never heard from them again. Which was a shame, I’d love to know how the story ended. But that is something else: I don’t want to force anyone to work with me, I don’t lock people in. That causes me commercial headaches, and sometimes I see people who stop taking the medicine before they are fully recovered, but that’s what happens when you allow people to exercise free will.
*Tongue in cheek, before you flame me: I’ve exaggerated and pandered to stereotypes for effect and humour.
Read more? Subscribe to my newsletter – free updates on blog posts, insights, events and offers.