Tuesday, January 29, 2013

Waiting, fingers crossed

Today I received a letter from the University.  As I suspected, my thesis examination was delayed because my examiners hadn't yet been selected/confirmed.  As it's a conflict of interest for me to know who my examiners are, I don't know much about the process of selecting/confirming them, so I can't speak to the reason for the delay.

The letter I received today explained that the examiners had now been confirmed and my (dead tree) thesis had been sent to them.  This was resolved a lot quicker than I expected, so I should hear the outcome within a couple of months.  All I can do is keep waiting.  I'm expecting the whole process from submission to graduation to take about 12 months.

Fingers crossed.

Wednesday, January 16, 2013

Happy


I submitted my Ph.D. thesis on the 20th of December 2012.

Friday, December 28, 2012

History of Mercury

I've been converting the Mercury CVS repository to Git.  Here is the final report by cvs2git with some interesting statistics.
Note that cvs2git sometimes calls itself cvs2svn.

cvs2svn Statistics:
------------------
Total CVS Files:              8224
Total CVS Revisions:        101243
Total CVS Branches:          84055
Total CVS Tags:            1956649
Total Unique Tags:             661
Total Unique Branches:          65
CVS Repos Size in KB:       465766
Total SVN Commits:           19206
First Revision Date:    Thu Oct 14 09:57:46 1993
Last Revision Date:     Mon Dec 24 08:37:31 2012
------------------
Timings (seconds):
------------------
 625   pass1    CollectRevsPass
   1   pass2    CleanMetadataPass
   0   pass3    CollateSymbolsPass
  85   pass4    FilterSymbolsPass
   0   pass5    SortRevisionSummaryPass
   2   pass6    SortSymbolSummaryPass
  80   pass7    InitializeChangesetsPass
  16   pass8    BreakRevisionChangesetCyclesPass
  16   pass9    RevisionTopologicalSortPass
  45   pass10   BreakSymbolChangesetCyclesPass
  67   pass11   BreakAllChangesetCyclesPass
  62   pass12   TopologicalSortPass
  53   pass13   CreateRevsPass
   3   pass14   SortSymbolsPass
   3   pass15   IndexSymbolsPass
  89   pass16   OutputPass
1149   total

Then I imported it into Git with git fast-import:
git-fast-import statistics:
---------------------------------------------------------------------
Alloc'd objects:     155000
Total objects:       151932 (     78357 duplicates                  )
      blobs  :        81245 (     15332 duplicates      70383 deltas)
      trees  :        51602 (     63025 duplicates      44099 deltas)
      commits:        19085 (         0 duplicates          0 deltas)
      tags   :            0 (         0 duplicates          0 deltas)
Total branches:         728 (       238 loads     )
      marks:     1073741824 (    115662 unique    )
      atoms:           6930
Memory total:         11001 KiB
       pools:          3735 KiB
     objects:          7265 KiB
---------------------------------------------------------------------
pack_report: getpagesize()            =       4096
pack_report: core.packedGitWindowSize = 1073741824
pack_report: core.packedGitLimit      = 8589934592
pack_report: pack_used_ctr            =      10379
pack_report: pack_mmap_calls          =        944
pack_report: pack_open_windows        =          1 /          1
pack_report: pack_mapped              =  144052290 /  144052290
---------------------------------------------------------------------
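
For the record, the conversion itself is a two-step pipeline: cvs2git reads the CVS repository and writes a pair of git fast-import streams, and git fast-import loads them into a fresh repository.  Below is a minimal sketch of that pipeline in Python, driving the tools via subprocess; the repository path, output file names and "mercury.git" are placeholders rather than the exact ones I used.

import subprocess

# Hypothetical paths; substitute the real CVS repository and work area.
CVS_REPO = "/path/to/mercury-cvs"
BLOB_FILE = "git-blob.dat"
DUMP_FILE = "git-dump.dat"

# Step 1: cvs2git analyses the CVS repository (passes 1-16 above) and writes
# two git fast-import streams: one with the file contents (blobs) and one
# with the commits, branches and tags.
subprocess.run(
    ["cvs2git",
     "--blobfile=" + BLOB_FILE,
     "--dumpfile=" + DUMP_FILE,
     CVS_REPO],
    check=True)

# Step 2: create an empty repository and feed both streams to
# git fast-import, blobs first so the commits can refer to them.
# (This buffers both streams in memory, which is fine for a sketch.)
subprocess.run(["git", "init", "--bare", "mercury.git"], check=True)
with open(BLOB_FILE, "rb") as blobs, open(DUMP_FILE, "rb") as dump:
    subprocess.run(
        ["git", "fast-import"],
        cwd="mercury.git",
        input=blobs.read() + dump.read(),
        check=True)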

Thursday, September 27, 2012

No photos yet.

Currently in Venice and loving it. No photos yet as our camera is not
my phone and therefore can't get onto the wifi in the hotel's restaurant.
Everything is good; so far the highlight for me is Venice. Rome was
nice too.

Friday, September 14, 2012

Interruptions

Lately I've been writing, writing and correcting.  I'm at 79K words and about to crack 150 pages.  It's tedious, and so I have not felt like doing any other form of writing, such as this blog.  Therefore, for this entry, I will mainly refer to others' work.

The other day, The Joel Test came up in conversation, so I looked it up.  The infamous Joel Spolsky wrote about 12 checks for a software development team.  I've always been interested in interruptions and how they affect one's productivity (such as stopping to write a blog article), so items 2 and 8 of the Joel Test caught my attention.  I've long known that a 30-second interruption can cost you 15 minutes of productivity; Joel, however, cites this as the minimum loss.  He also links this phenomenon to working memory.

Working memory is the set of things you're thinking of right now.  If you want a computer analogy, think of the registers in a CPU (RAM being short-term memory and disk being long-term memory).  Most people can hold roughly 7 plus-or-minus 2 things in working memory (if I remember correctly).  When you're interrupted, you flush everything out of your working memory so that you can help your colleague with his spelling; even though it only takes you 30 seconds to remember how to spell something, you then spend the next 15 minutes trying to get back into the zone.  Had your colleague used a dictionary it would have cost him a little more time, but it would not have cost you that 15-minute minimum.

I was talking with a friend about this; he suggested that the problem is not just working memory, but that it also applies to any creative work.  He said that humans seem to need 8 minutes of rest before they can be creative.  However, we do not have a citation for this claim, and my friend isn't sure if he recalls it correctly.  We did find another interesting article on the topic of thinking time and reflection.

On a related note, I found another article from Joel on the similar topic of 'context switching'.  Here he discusses how switching between multiple tasks can also hurt efficiency.

Monday, May 21, 2012

I still exist!

So I've been busy. Since my last post I attended Multicore World 2012
in Wellington. It was good to go back there. We spent Easter with my
parents and I gave my completion seminar.

Now I'm spending most of my time writing. It is pretty tedious but I'm
making progress. I'm due to submit in about 6 weeks, and I'm concerned
about how much I can get done before then, as there is still more
programming to do and experiments to run.

Tuesday, January 31, 2012

POPL Invited talks

POPL was the main conference running during the week. It ran from Wednesday to
Friday with 2 streams.

There were three invited talks; the first was an award, a speech and an interview.
I forget the name of the award, but its purpose was to recognize a paper 40
years after its publication, based on that paper's impact over those 40 years.
The recipient was Sir Charles Antony Richard Hoare for his paper An axiomatic
basis for computer programming. I'm sad to say that I wasn't aware of Hoare's
work before POPL; now I'm very interested to read a few of his papers. The
interview referred to several of his papers during his career, and if I'm not
mistaken the interviewer was one of his students.

The second invited talk was given by Jennifer Rexford; this is probably the
invited talk that spoke the most to me. Jennifer had seen a problem in network
administration and operations. The problem is that each router has its own
configuration, which is fine if the network has a single router. But
once there are two or more routers the configurations have to agree about
certain things, and the software that coordinates large networks doesn't
always do a good job, let alone ensure that the configurations are consistent.
Jennifer wrote a domain-specific language to solve this problem. I like this
because it's a problem that many people have, and the solution is
straightforward (at least on the surface). And by providing such a solution
her work has a large impact.

The third invited talk was also very good. J Strother Moore demonstrated a
theorem prover. Moore is the same Moore as in the Boyer-Moore string search
algorithm, and a very influential researcher. Theorem provers are increasing in
popularity. I felt like the only person in the room who hadn't used one (let
alone written one). So, for me it was good to see this used and demonstrated
in front of my eyes. The anecdotes in this talk were also very interesting;
for instance, AMD used Moore's theorem prover to check the soundness of
floating point division in the K5, after Intel were dealing with the famous
floating point division bug in the newly released Pentium.
(AMD have continued to use the theorem prover).

I still have more notes to write up.