The second conference day definitely had a different feel from the first. I'm not sure whether it's being away from home, the time difference, or a string of very big days, but I'm feeling pretty tired.
The PyCon committee had asked the morning's keynote speaker, David Beazley, to speak on something "diabolical" - a reputation he has earned with great talks at the past few Python events on "programming the Superboard" and "the challenges of the Python GIL". I think he picked the perfect topic, talking about the PyPy (Python-in-Python) project, which has the potential to have an enormous impact on the performance of all code and infrastructure written in Python.
The talk struck a chord with me: I had attended the PyPy talk earlier in the week, and within the last week had wrestled with some of the same very challenging process of getting PyPy to build. PyPy is so exciting, yet it seems to require an enormous amount of effort just to get started, even to add small features.
Straight after the keynote I raced over to the exhibitors' hall, because I'd managed to finish the Google challenges last night, which meant I earned my pins and a T-shirt with the slogan "Python: Programming the way Guido Intended" on the back.
The talk schedule worked out a little differently today, in particular, I had the flexibility to go to talks that were just plain cool rather than related to work:
- The Journey to Give Every Scientist a Supercomputer
- Python for Makers
- Pragmatic Unicode, or How do I stop the pain?
- Python and HDF5 - Fast Storage for Large Data
- Militarizing Your Backyard with Python: Computer Vision and the Squirrel Hordes
- Using fabric to standardize the development process
One of the big standouts of the day was Ned Batchelder's talk on Pragmatic Unicode, which set out a consistent approach to dealing with Unicode throughout your programs. It also seemed far more pragmatic than Joel Spolsky's essay "The Absolute Minimum Every Programmer Should Know About Unicode".
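The heart of that consistent approach is to keep bytes at the edges of your program and text everywhere in the middle: decode on input, work with unicode internally, encode on output. Here's a minimal Python sketch of that idea (the function names, data, and choice of UTF-8 are my own illustration, not from the talk):

```python
# A minimal sketch of keeping bytes at the boundaries and
# text (unicode) everywhere inside the program.

def read_names(raw_bytes):
    # Input boundary: decode bytes to text as soon as they arrive.
    # The encoding is an assumption here -- in real code you must know it.
    return raw_bytes.decode("utf-8").splitlines()

def greet(name):
    # The middle of the program works purely with text.
    return "Hello, " + name + "!"

def write_greetings(names):
    # Output boundary: encode back to bytes only on the way out.
    return "\n".join(greet(n) for n in names).encode("utf-8")

raw = "Åsa\nJosé".encode("utf-8")   # pretend this arrived from a file or socket
out = write_greetings(read_names(raw))
print(out.decode("utf-8"))
```

The payoff of this discipline is that encoding errors can only occur at the two boundaries, where you actually know what encoding is in play, instead of surfacing deep inside unrelated logic.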
I was impressed by the "Militarizing Your Backyard with Python: Computer Vision and the Squirrel Hordes" talk, both for using Python at every level from the highest to the lowest, and because the speaker managed to get very impressive machine learning results for squirrel classification from what I assumed was a very small and coarse set of features.
The other big thing that happened was that, in response to the fabric talk, one of the delegates came off the floor and gave a really succinct answer on the roles of provisioning tools like Chef and Puppet versus fabric. It seems to boil down to this: there is some cross-over, but fabric is best suited to remote management (running ad-hoc commands on your machines), while Puppet and Chef are best suited to keeping those machines in a consistent state.
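To make the fabric side of that split concrete, here is a sketch of a fabfile using the Fabric 1.x API; the host names, paths, and commands are hypothetical, and running it would of course require fabric installed plus SSH access to real hosts:

```python
# fabfile.py -- a hypothetical sketch using the Fabric 1.x API.
# Fabric runs ad-hoc commands over SSH; it does not try to converge
# machines to a declared state the way Puppet or Chef do.
from fabric.api import env, run, sudo

env.hosts = ["web1.example.com", "web2.example.com"]  # hypothetical hosts

def deploy():
    """Push the latest code and restart the app on each host."""
    run("git -C /srv/myapp pull")       # hypothetical path and command
    sudo("service myapp restart")
```

You would invoke this with `fab deploy`, and fabric would walk the host list executing each command in turn. A Puppet or Chef setup, by contrast, would declare the desired end state (package installed, service running) and let the agent reconcile each machine against it.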