Week Beginning 4th April 2016

This was my third four-day week in a row as I’ve taken Friday off (and will be off all next week too). It was another mostly Mapping Metaphor week, with a few other bits of work thrown in as well. Last week I managed to get the ‘Metaphoric’ app submitted to both the Apple App Store and the Google Play Store, which was very satisfying. Over the weekend the Android app was approved for sale and appeared on the Google Play Store (the project doesn’t have its official launch until later this month but you can try the app out here: https://play.google.com/store/apps/details?id=com.gla.metaphoric&hl=en_GB). On Thursday the app was approved for sale on the Apple App Store as well. Sometimes the Apple approval process can take a while to complete but thankfully it was rather swift this time. The iOS version of the app can be found here: https://itunes.apple.com/gb/app/metaphoric/id1095336949?mt=8.

With the first version of the apps completed I spent most of this week working on other Mapping Metaphor tasks. The biggest of these was working with the Teaching Materials that will appear on the ‘Metaphoric’ website. I created an Excel template to allow information about the materials to be more easily managed and I then created a page for the materials and some JavaScript that will allow the lists of materials to be filtered by type and topic. I also updated the web version of the app (i.e. the ‘Metaphoric’ website) to include the ‘top bar’ found on the other two Mapping Metaphor sites so that when the site goes live users will be able to navigate between our sites.

I also spent about a day trying to make the loading of the data into the visualisation a bit quicker, especially the top-level visualisation. I’m afraid I have been unable to speed up the process, but I have at least got to the bottom of why it’s taking so long. Basically, when you press on a category in the top-level visualisation the code has to go through each of the grey lines in order to work out which should be yellow. It has to take the ID of the pressed category and see if it appears as either the source or the target for each line. There are 37 categories, each of which may be connected to any of the remaining 36, either as source or target, which means the code has to run 2,664 times (37 × 36 × 2) to find all of the potential matches each time a category is pressed. I’ve been trying to figure out how I might cache this data so it doesn’t have to be processed each time, but unfortunately this would mean picking apart the visualisation code and doing some major reworking, as the whole thing is based on the concept of processing data on a node-by-node basis. I spent a few hours today trying to do this but I’m afraid it would likely take me a long time to rework it (if I could get it working again at all).
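To give a flavour of the sort of cache I was considering, here is a small sketch. The data shape (connections stored as simple source/target ID pairs) is an assumption on my part rather than the visualisation’s actual structure: the idea is to build a lookup table keyed by category ID once, so that a press only has to touch the lines actually connected to that category rather than scanning all of them.

```javascript
// Sketch of a connection cache (hypothetical data shape: each line is a
// {source, target} pair of category IDs). Built once, it lets a category
// press look up only its own connections instead of scanning every line.
function buildConnectionIndex(lines) {
  const index = new Map();
  for (const line of lines) {
    for (const id of [line.source, line.target]) {
      if (!index.has(id)) index.set(id, []);
      index.get(id).push(line);
    }
  }
  return index;
}

// Tiny made-up example with three categories and two connections:
const lines = [
  { source: '1A', target: '2B' },
  { source: '2B', target: '3C' }
];
const connectionIndex = buildConnectionIndex(lines);
console.log(connectionIndex.get('2B').length); // 2: both lines touch 2B
```

In practice the visualisation processes its data node by node, which is why slotting a table like this in would mean the major reworking described above.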

However, there is a small silver lining: I’ve figured out how to get a ‘loading’ message to appear when a user presses on a category and then stay on screen until the last node has been processed. Although this sounds like a simple thing, I have spent many fruitless hours over the past few weeks trying to get such a message to appear, due to a combination of the node update code operating asynchronously (so the main function can’t tell when the nodes have finished updating) and the processing swamping the processor (locking up the interface and blocking the appearance of any ‘loading’ message). But now when you press a category on the top-level visualisation a ‘loading’ message is displayed, which I think will be a great help.
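In outline, the kind of approach that makes this work is to display the message and then yield to the event loop (e.g. with a zero-delay setTimeout) before starting the heavy synchronous work, so the browser has a chance to repaint. A minimal sketch, with placeholder functions standing in for the real DOM updates and node processing:

```javascript
// Minimal sketch: show the 'loading' message, then defer the heavy synchronous
// work with setTimeout(..., 0) so the browser can repaint before the processor
// is swamped. show/process/hide are placeholders for the real code.
function runWithLoadingMessage(show, process, hide) {
  show();             // e.g. make the 'loading' div visible
  setTimeout(() => {  // yield to the event loop so the message can paint
    process();        // the long node-update loop
    hide();           // remove the message once the last node is done
  }, 0);
}

// Demonstration with plain functions instead of DOM updates:
const calls = [];
runWithLoadingMessage(
  () => calls.push('show'),
  () => calls.push('process'),
  () => calls.push('hide')
);
// Immediately after the call only 'show' has run; 'process' and 'hide'
// run on the next turn of the event loop, after the repaint.
```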

I also spent some time this week on The People’s Voice project. I started working on the CSV Upload script last week and finished it this week. The ‘import CSV’ page now displays an area where you can drag and drop CSV files (or click in it to open the ‘attach file’ box). I also updated the CSV Template and the Guidelines, and provided links to these files from this page. The template was missing a field for recording page numbers in the publication so I added that on the end. The guidelines now include information about the publication types and also a warning about the publication date column. Excel seems to want to reformat the ‘yyyy-mm-dd’ dates as ‘dd/mm/yyyy’, which then causes those rows to fail to upload. I’ve added an explanation of how you can stop Excel from doing this.
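A simple guard on the upload side helps catch these reformatted dates before they cause failures. This is an illustrative check rather than the project’s actual code (the example dates are made up):

```javascript
// Illustrative check (not the project's actual code): accept only the
// 'yyyy-mm-dd' format required by the CSV template, so dates that Excel
// has reformatted as 'dd/mm/yyyy' are flagged before upload.
function isValidPublicationDate(value) {
  return /^\d{4}-\d{2}-\d{2}$/.test(value);
}

console.log(isValidPublicationDate('1832-05-14')); // true
console.log(isValidPublicationDate('14/05/1832')); // false: Excel-mangled
```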

I also noticed a small problem with the pseudonyms: I was converting characters like ampersands into their HTML equivalents before splitting the data up into individual names based on a semi-colon. Unfortunately the HTML entity for an ampersand is ‘&amp;’, which itself ends in a semi-colon, so my script interpreted that semi-colon as the division between two names. I’ve updated things so that the text is split by semi-colon before the HTML conversion is done, which should solve the problem.
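The ordering issue is easy to demonstrate in a few lines (the encoding function and example names here are illustrative, not the project’s actual code):

```javascript
// Minimal entity encoder, just enough to show the bug.
function encodeEntities(text) {
  return text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

// Buggy order: encoding first turns '&' into '&amp;', whose trailing
// semi-colon then gets treated as a name separator.
function splitNamesBuggy(text) {
  return encodeEntities(text).split(';').map(s => s.trim());
}

// Fixed order: split on semi-colons first, then encode each name.
function splitNamesFixed(text) {
  return text.split(';').map(s => encodeEntities(s.trim()));
}

console.log(splitNamesBuggy('Smith & Jones; A Patriot'));
// ['Smith &amp', 'Jones', 'A Patriot'] — wrongly split into three names
console.log(splitNamesFixed('Smith & Jones; A Patriot'));
// ['Smith &amp; Jones', 'A Patriot'] — the two intended names
```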

I spent a further bit of time on the Scots Thesaurus project. Magda had encountered an issue with the search facility not working for some words that had apostrophes. It turned out that this was being caused by slashes being added to the lexeme data whenever a category was updated, with these slashes then preventing the search from working properly. Thankfully it was a relatively easy thing to fix once identified.
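For illustration, here is a sketch of that failure mode. I’m assuming addslashes-style backslash escaping here; the exact escaping layer and the example lexeme are made up:

```javascript
// Hypothetical sketch of the bug: an addslashes-style escape turns
// "fisherman's" into "fisherman\'s" on every save, so an exact-match
// search for the unescaped lexeme no longer finds it.
function stripSlashes(text) {
  return text.replace(/\\(['"\\])/g, '$1');
}

const stored = "fisherman\\'s"; // what ended up in the database
const query = "fisherman's";    // what the user searches for

console.log(stored === query);               // false: the search fails
console.log(stripSlashes(stored) === query); // true: slashes removed, match found
```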

My final project of the week was the REELS project. I set up a handy short URL for the project (www.gla.ac.uk/reels) and also addressed a number of issues that Eila had spotted when using the content management system. This included adding buttons to navigate straight to the next and previous place-name record when editing a record, and fixing a few bugs such as buttons not working. There are also some problems relating to accuracy when entering four-figure grid references: the latitude and longitude values that then get generated are sometimes very far from where they should be. As the conversion is handled by third-party code that I’m just making use of, I’m not sure I can really fix this, but as the CMS has options to manually override the latitude and longitude values I’ve suggested that when the map point appears to be off the project staff can quite easily find a better value and enter it manually instead. There are a few further tweaks to the CMS that I still need to make (e.g. adding filters and pagination to the ‘browse place-names’ page) but I’ll have to do these after I’m back from my holidays. I will be on holiday all of next week and will return to work on Monday the 18th.