Sunday, 29 April 2012

Thoughts from the final day of ACCU

The final day keynote was a series of lightning keynotes (apparently a tradition that started a few years before, when the Icelandic volcano left a speaker stranded, and has stuck ever since).

The opening keynote was by a Lisper, Didier Verna, and was about language obesity.  The central argument was a familiar one from a Lisp expert, but I still found it pretty compelling!  Natural languages have evolved from a few roots by adding new words, but their grammatical structure has remained mostly static (when was the last time a new grammatical structure was added to your language?).  In contrast, the rationale for most new language design tends to involve taking the best bits from previous languages and adding a sprinkle of new grammatical elements to simplify things.  For example, both C# and Java have evolved to have for-each loops with new syntax.  C++ is perhaps the best example of language bloat - C++ 11 is probably the most complicated language I have ever seen, with very unclear semantics.

Lisp doesn't suffer from this problem (it suffers from syntactic anorexia!), and the reason it doesn't is homoiconicity.  This is the property that code is data (and vice versa).  For example, in Lisp (+ 1 1) is both an expression that applies the + function to 1 and 1, and a list of three atoms.  This property gives Lisp the ability to add new syntax and structures without the need for a new language.  CLOS is perhaps the best example of this, adding an object system to the base language without changing the language specification whatsoever.  The loop macro is another good example (though I'm not a fan; I don't want to learn a big DSL for dealing with collections).  Someone will probably argue that Boost Spirit is a great example of how you can use template metaprogramming to do the same thing in C++, but that's a Turing tarpit style argument!

Lisp will be around forever in one form or another simply because of this property.  I'm not sure whether the same can be said of curly-brace languages.  What horrific new structures will be needed in a decade when we're programming on mega-core machines?  I found the argument compelling, and perhaps in light of the previous keynote on the Requiem for C there's a chance that a new homoiconic language will emerge to deal with our multi-core future.

Charles Bailey talked about the "Important Art of Thinking".  Despite the fact that software should always be about thinking, we sometimes find ourselves developing on auto-pilot and thus end up in a mess.  It's always easier to start tapping on the keyboard than to think about the problem (working in Haskell inverts this for me!).

He mentioned the Dreyfus model of skill acquisition and tried to relate this to programming by peeling back the layers of abstraction on std::endl.

  • Novice - std::endl ends a line with a newline
  • Advanced Beginner - std::endl is a manipulator
  • Competent - std::endl can be viewed as a function operating on the stream
  • Proficient - std::endl is defined as a function template
  • Expert - std::endl is redundant when connected to a terminal because it will (by POSIX standards) guarantee at least line-buffering.  Just use '\n' already!

I do like the new C++ lambda syntax.  Being able to type [](){}() and have it mean nothing is an achievement by anyone's standards!  In summary, think before coding...

Next up was Emily Bache talking about "Geek Feminism", more specifically programmer geek feminism.  This was defined as the ability for women to influence the programming community (on merit - no-one was asking for a free pass!).  As was evident at the conference, there aren't a lot of women in technology.  In the grand scheme of things, there aren't a lot of software engineers at all, so missing out on 50% of the population seems like a big problem!  Imposter syndrome is something I'm familiar with (especially in my first programming gig, having come out of a research background).  It's the feeling that you think you are crap despite external evidence that you are good.  There was anecdotal evidence that women are more prone to suffer from this than men.  There was also a reference to the Wisdom of Crowds, which (rather obviously) states that gathering opinions from a bigger range of backgrounds leads to better results.  Software's gender imbalance stops us taking full advantage of the wisdom of crowds.

Finally, there was a great presentation from Jurgen Appelo on finding your motivation.  Jurgen presented CHAMPFROGS, an acronym for the various things that can motivate developers.

  • Curiosity
  • Honour
  • Acceptance
  • Mastery
  • Power
  • Freedom
  • Relatedness
  • Order
  • Goal
  • Status

The Moving Motivators game sounds like something useful to try to see if your needs and desires are aligned with the work you are actually doing.

Next I went to Usable APIs in Practice by Giovanni Asproni.  This tried to fill the gap between the principles that we have (Single Responsibility, Principle of Least Astonishment, Don't Repeat Yourself) and the code that we write.  APIs are the biggest asset (or liability) that a company has.  A bad API (even an internal one) limits your ability to change and can act as a productivity drain.  The following principles were presented.

  • Code from the user's perspective - One example of how to do this is not just to test your API, but also to test a program written using your API.
  • Naming - keep it simple, don't be cute, and use one word per concept
  • Control to the caller - don't restrict options.  Previously I've worked somewhere where all the core container functions used their own locks.  This was a terrible decision because the caller didn't have control!
  • Explicit Context - don't use globals to hide context from the user, pass it in at construction time
  • Error Reporting - one of the harder ones.  Use layers (since an error at one layer is not necessarily an error at the next).
  • Logging - this gave some good discussion on why Logger logger = LoggerFactory.GetLogger() is a bad thing!

I chose to leave it there at ACCU.  I really enjoyed most of the talks.  The key themes this year were multi-core is coming, TDD is good and functional is fun.  I suspect these have been messages from the expert community for quite some time, but hopefully the wider software engineering community will start to realize this!

Friday, 27 April 2012

ACCU 2012 - day three

The opening keynote today was "Requiem for C" by (howling mad?) Uncle Bob.  First, the easy dead language beginning with C: COBOL.  It may still run 40% of the code in the world, but no new projects are being written in it and no new developers are being trained in it.  This is an easy one to agree on!

More contentious is that this is the fate that awaits C.  Bob was at pains to point out that he isn't gunning for C, and gave us a recap of his programming career from punchcards on the PDP-8 through to his first usage of C in the late 1970s.  The presentation style was fun, though there was a little controversy.

The first nail in C's coffin was C++.  It gave us strong typing, modularity and the wonders of templates, but importantly it still gave developers the chance to see the metal.  If you worked hard enough, you could still reason about what was going on in the underlying machine.  You can still see the metal, though it's a little fuzzy.

Next Java showed us that we don't need to see the raw metal, we can have an idealised machine running on a machine.  You don't need to see the raw metal at all, have a look at the JVM instead!  Looking back, it does seem that the world had gone bonkers at this point.  Instead of running on a machine, we'll run on a software simulation of a machine, running on a machine.  Eh?

Now the final nail (and a familiar theme for the conference): the multi-core revolution.  We need radically new views of the hardware to take full advantage of it.

I agreed with most of the keynote.  C is never going to fully go away, but it's going to be needed by a smaller and smaller percentage of people.  The presentation style was fun, but a little intense when you are nursing a hangover!

Next I went to Kevlin Henney's talk on "Functional Programming you already know".  There's been a lot of hype generally about functional programming, but we've been practicing these concepts for years anyway (I think you could replace functional programming with any of the patterns and practices that have been hyped and still be on safe ground, e.g. dependency injection).

There were some examples of first-class functions from the C language (e.g. qsort), function composition (UNIX pipes) and declarative programming (SQL).  There were some great sound bites too:

code generation - do the wrong thing faster!
lock is the anti-fast pattern

There was lots of historical content too.  It's always surprising to read notes from the 1970s and find that they still sound fresh today.  For example, Alan Kay's explanation of what object orientation really means.

After this I went to a C++ talk about generic libraries by Andrew Sutton.  Firstly I found that he has the perfect job, researching how to make the perfect data structure and algorithms library.  Unfortunately he has to use C++!  The Origin libraries are the playground for this experimentation.

I learnt a lot of new things about C++ concepts.  A concept is both the syntactic meaning of a type (e.g. the operations it supports) and the semantic meaning (the behaviour of those operations).  I was very interested in the axiom part of concepts, which provides a series of relationships that any implementation of the concept must obey.  For example, you can specify that the equality concept should be transitive, reflexive and symmetric.  I will be interested to see how verifiability evolves in C++ as it certainly doesn't seem like a natural extension for a language as wart-ridden as C++.

Then, as a C# / Java / Haskell programmer, it was clear that my next session should be the C++ 11 pub quiz.  This was a well run session about just how complicated C++ is.  Even with compiler writers and language specification contributors in the room, no one managed to understand all the code.  Compiler bugs were found, and changes in behaviour across different compiler versions were also observed.  C++ might still be close to the metal, but I don't think there is a person on earth with the necessary mental acuity to see through the complexity!

ACCU day two thoughts

This is my second attempt to write these notes.  Turns out that writing in an online editor whilst connected to hotel wi-fi is a very bad idea!

Today's keynote was given by Phil Nash, and was entitled "The Congruent Programmer".  The message seemed to be that we live our lives according to a set of beliefs, values and principles.  If these are aligned then we are fine; otherwise we find ourselves in a state of cognitive dissonance.  One example I have of this is that I believe in TDD (at least to some degree!), but I very rarely find myself practicing it!

Next I went to a talk on "Go, D, C++ and the multicore revolution" by Russell Winder.  The aims of this talk were to convince me that shared memory multi-threading is not appropriate for application development (I don't need much convincing on this point!) and also that the new threading-related features of C++ 11 may have saved it from the dustbin.

The multicore revolution is already here.  I'm writing this on my dual core iPad and chances are you (hi mum!) are reading it on a multicore machine of some description.  Since the multicore revolution, software has started to lag behind hardware.  As software engineers, we're the people holding things up!

The distinction was made between concurrency (a design tool) and parallelism (a tool for improving performance).  Three models of controlling concurrency were then presented.  None of them are new ideas, but they are well understood tools that are only now beginning to be recognised as important.

The actor model controls concurrency by having multiple independent processes (important to note that this process need not be a separate OS process, it's just a lump of code with an independent address space) communicating via asynchronous messages.  Erlang is an example of a language that has direct support for the actor model.

The data flow model is a spreadsheet model where the computation reacts to changes in its inputs.  This has a role in big data as a counterpart to the map/reduce approach.  By running a data flow net as a continuous query that updates based only on the diff, it's possible to get more immediate results.  Lucid is an old, but interesting, example of a data flow language.

Finally, Communicating Sequential Processes (CSP) was presented.  This is a formal model of concurrency where processes communicate via synchronous messages.  Go reinvented CSP and called it go-routines.

Next Russell presented a simple example that calculated Pi.  The example was kept consistent as the languages moved to higher level abstractions, from C and pthreads to Go and D, with C++ and MPI in between.  The obvious point was that higher level abstractions make things easier to understand.  All in all, I enjoyed this talk!

Henrik Berglund ran a quick 45 minute session on "Real Teams".  I have to admit I initially chose this one purely because the door was open, so it was a break from the unbelievably hot rooms!  Luckily for me this turned out to be an excellent choice.  The introduction talked about the curious case of the astronauts that went on strike.  Despite being highly trained, military disciplined and very intelligent, a bunch of astronauts went on strike in space simply because the working conditions were so hard.  Building great teams is hard!

Getting the conditions right for a team is very important and Henrik outlined some of the things that he had seen with successful teams.
  • People need influence over their working conditions.  This can be achieved by using self managing teams that have clear authority.  Delegation poker might be one tool to try out for this.
  • Compelling direction.  A vision, a product, a sprint goal.  One of the more interesting points was that goals should be challenging and it was suggested that a 50% failure rate was optimum.  I definitely want to try this when I get back to work!
  • Clear boundaries.  Understand who is on the team.
  • Nobody succeeds unless everyone succeeds.  This avoids the done vs done done dilemma.  Problems should be given to teams not people.
  • Trustworthy feedback based on results.  No proxies between users and the teams.  The role of the product manager is to be the vision guy, not to shield customers from engineers.
  • Need to depend on each others skills.  It's OK to specialize and build up independent skills (got to be careful of the bus number!)
  • Be able to give negative feedback if needed.  This needs trust on the team and this takes time to build up.
  • Team building requires effort.  Understanding the needs and motivations of your colleagues is an important part of working together.

I enjoyed this talk a surprising amount and there are definitely some ideas I can take back to work.

After this talk I caught the tail-end of "When only C will do".  C is supported everywhere and has a standard application binary interface (ABI).  This ubiquity makes it an easy choice.  C++ doesn't have a standard ABI, and embedded devices may choose to drop some features (e.g. runtime type information or exceptions), which makes life harder.  The conclusion seemed to be that most of the time you can use C++, but you may need to expose a C interface to be compatible with the rest of the world.

The afternoon finished with a talk on "Refactoring to Functional".  I don't think I was the target audience for this presentation.  The talk introduced some foundational functional concepts, but through the medium of Google's Guava library for Java (curiously, it's missing a fold/reduce operator).  This has some of the worst syntax I have ever seen.  The noise-to-useful ratio is stupidly high and it just made me so grateful for C#.  Yuck!

They chose this style because it made the core logic simpler (and presumably eventually their brains learned to unsee the syntactic abominations).  They did try Scala, but found that you had to think at both the Scala and Java layers.  I think using the functional style in a language without first-class functions is a mistake.  It's like trying to write C in an object-oriented style.  You can do it, but you're constantly fighting it, and that's not good.  Maybe Java 8 will resolve these problems in a few decades.

Wednesday, 25 April 2012

ACCU 2012 - Day One

I'm lucky enough to be at ACCU 2012 this week, and I thought I'd try and write up my notes after each day so that I have more of a chance to digest the information being thrown at me!

The opening keynote today was entitled "Project Patterns: From Adrenalin Junkies to Template Zombies" by Tim Lister of PeopleWare fame. Tim was keen to point out that this wasn't about patterns in the strong Patterns sense, but more about some of the habits from teams he'd worked with.

The first pattern mentioned was "The Safety Valve". Successful teams often have a release mechanism, be it making popcorn, riding a pink tricycle (!) or playing foosball. This pattern rang true. In good projects I've been involved in I can always identify a safety valve. The converse is also true: when working on shitty projects it's mostly just been heads down, stress up, writing code until it's done.

The second pattern was mañana. This describes the problem with long deadlines. A lack of urgency means apathy. If something is outside of my mañana window, then I don't really care about it. As an engineer, my mañana window is about a sprint in length. If it's not needed for this sprint, then it's off my radar. In contrast, management types need to have a much longer mañana period.

Next up was "Lessons unlearnt", which made the controversial point that the lessons learnt in a retrospective rarely trigger change in the organisation; they only cause change in those that encountered the problems in the first place. This was backed up by the observation that long running software companies are no better at making software than those with little history.

"Project Sluts" or managers that just can't say no was another pattern that struck a chord. I suspect everyone has experienced this one!

Finally was the "Dead Fish Pattern": a project where everyone knows it is doomed from day one, but nobody says anything. I've direct experience of this one (it being the major reason why I left a previous job). We all knew the project was doomed, but the politics of the situation meant this could never be voiced, and the team just hunkered down into a marine corps death march mentality (see Death March by Edward Yourdon, a great book!). There was a great quote about the heroics of developers keeping such projects running with "code defibrillators".

I really enjoyed the talk, and I think I was successfully not-so-subliminally convinced to buy the book!

Next up was the Code Simplicity talk (I was actively trying to avoid the Rube Goldberg contraption that is C++ 11). It was a good talk using examples from the Qt framework to demonstrate complicated code.

Make it simple. Make it memorable. Make it inviting to look at. Make it fun to read

The biggest takeaway for me was looking at code from the perspective of consumers. There's no point designing a clever, simple API for yourself if you foist all the complexity onto the users of the API (for example, requiring implementation of a huge interface). I also learnt a new useless acronym, TATUC (throw away that useless code). Someone from the audience also gave a memorable composition over inheritance quote: "if you marry you can get divorced, but parents are forever".

The speaker also tried to encourage the boy scout rule. Since bad code is easy to spot, you should take every opportunity you can to tidy it up. This is much too idealistic for me; tidying code is always a risk (most of the time it won't have tests) and it's all too easy to turn a messy but working code base into a prettier, but subtly wrong, one.

Bit of lunch and then onto Parallel Architectures. This turned out to be an interesting review of parallel architectures from the hardware side, and then the software side. In the beginning there was the Z80, and life was simple. Then there was pipelining.

Pipelining is a leaky abstraction. Instructions can be reordered by the compiler/hardware and this can break your code in subtle, non-obvious ways. The best example of this is probably double-checked locking. Cache coherency is another problematic abstraction. If you write something to cache and that data needs to be visible to another core, the cache line must be flushed out, and this causes a huge performance problem.

Multithreading is just one damn thing, after, before and during another (Alexandrescu)

On the software side of things, there was recognition that both mutex and atomic based approaches aren't suitable long term. Both are non-composable, and (apparently) there's evidence that mutex based approaches simply won't scale to large number of cores. Transactional memory was mentioned, though it's slow. Importantly though, transactional memory is composable and is simple to reason about. The Intel Haswell processor will provide hardware support, so it'll be interesting to see whether this provides an alternative to the actor models that seem to be in vogue at the moment. I'm looking forward to seeing how things change with concurrency over the next few years (I hope for a revolution, but I bet we just find more intricate ways of just about making things work).

Finally, I went to "Objections to TDD and their refutations". This covered the many excuses that developers come up with for not doing TDD, ranging from the cop-out (my manager wouldn't let me) through to the delusional (I don't make mistakes). My feelings on this are mixed. Code without tests is just a lump of syntactically valid code, but not much more. Tests give substance. I'm not so sure whether unit tests are the best way to do this. There was a quote on Twitter today which stated that unit tests provide existential quantification whereas types provide universal quantification. I dig that!

TDD also seems to be used inappropriately for hard problems. In this case, it feels like simulated annealing; it seeks out a local solution to the problem rather than finding a global optimum. My favourite example of this is Sudoku. Compare Ron Jeffries' epic adventures with Peter Norvig's application of brain power.

Looking forward to day two.