Week Beginning 21st December 2015

Here’s a brief ‘Yule blog’ before I head off for Christmas. I only worked on Monday and Tuesday this week and during these two days I worked on three projects: Burns, the Medical Humanities Network and Metaphor in the Curriculum. Over the past few weeks I’ve been working with Pauline to restructure the Burns website on a test server I have running in my office (just an old desktop PC). On Monday Pauline made the final updates to the content that she needed to do and after that I replaced the old version of the site with the new one. This is structured in a much more sensible way and has a more interesting front page. I also had to make a few changes to the site myself before we went live, such as replacing the older interactive map that just had dots on it with the newer version that has visible labels as well. I now appear on the ‘project team’ page too, which is nice. The website can be found here: http://burnsc21.glasgow.ac.uk/ and the interactive map here: http://burnsc21.glasgow.ac.uk/highland-tour-interactive/.

For the Medical Humanities Network I had to fix a couple of bugs, change some site text and incorporate the ‘Wellcome Trust’ logo. Not particularly taxing tasks, but good to get cleared out of the way before the holidays. The website should hopefully be going live in January, all being well.

For Metaphor in the Curriculum I created a new prototype version of the quiz interface based on the very helpful feedback from the testing session a couple of weeks ago. The changes in this version include:

  1. The ‘Home’ icon in the top left of the exercise page is now slightly more prominent, plus the ‘MetaphorIC’ text now links back to the homepage too.
  2. I’ve removed the ‘Check answer’ and ‘restart’ buttons.
  3. Clicking on a possible answer, or dragging and dropping for this quiz type, now automatically evaluates the answer, which streamlines things.
  4. For the non-drag and drop quiz type the background colour of the option you click on now changes – green if correct, red if incorrect – and a white tick or cross is also displayed (helpful for colour-blind people).
  5. The user’s first answer for each question is the one that is reflected in the overall score.  If you select the wrong answer and then select the right one this will still count as incorrect when you view the quiz summary page.
  6. On the quiz summary page the ‘restart’ button has been relabelled ‘Try again’ and the stored quiz answers are cleared when the user returns to the quiz.  The same thing happens if the user returns to the list of quizzes.
  7. ‘Next question’ and ‘Previous question’ buttons now just say ‘Next’ and ‘Previous’ to cut down on the amount of space they take up.


There’s one possible area of confusion, and that is that users can go back to previously answered quiz questions.  If they return to a question that they got right first time then the correct answer is pre-selected.  But if they got the question wrong, or they got it wrong first time and then chose the right answer, then no answer is pre-selected.  We’ll need to consider whether this is too confusing. One possible option would be to remove the ‘Previous’ button entirely.  We could also disable the ‘Next’ button until an answer has been given. No doubt the rest of the team will discuss this in January and I’ll update things further after that.
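The ‘first answer counts’ rule described above boils down to only ever recording a question’s first attempt. Here’s a minimal sketch of that logic in Python (the real quiz does this in JavaScript using HTML5 sessionStorage, and the function names here are my own, purely for illustration):

```python
# Sketch of the "first answer counts" scoring rule. In the real quiz
# the store is the browser's sessionStorage; a plain dict stands in
# for it here. All names are illustrative, not the actual code.

def record_answer(store, question_id, answer, correct_answer):
    """Record an answer, keeping only the first attempt per question."""
    if question_id not in store:  # later attempts are ignored for scoring
        store[question_id] = (answer == correct_answer)
    return store[question_id]

def final_score(store):
    """Number of questions answered correctly on the first attempt."""
    return sum(1 for first_try_correct in store.values() if first_try_correct)

store = {}
record_answer(store, "q1", "b", "b")   # right first time
record_answer(store, "q2", "a", "c")   # wrong first time...
record_answer(store, "q2", "c", "c")   # ...later correction doesn't count
print(final_score(store))              # 1
```

The pre-selection behaviour on returning to a question then falls out of the same store: a question is only pre-selected if its stored first attempt was correct.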

So, that’s all from me for 2015. It’s been a busy, enjoyable and hugely rewarding year and here’s hoping this continues into 2016!

Week Beginning 14th December 2015

So, here we are in the last full working week before the Christmas holidays. It’s certainly sneaked up quickly this year. I was sort of expecting work to be calming down in the run-up to Christmas but somehow the opposite has happened and there has been a lot going on this week, although I can’t really go into too much detail about at least some of it. On Monday I had a meeting with Gerry Carruthers and Catriona MacDonald about the People’s Voice project, which will be starting in January and for which I will be creating an online resource and giving advice on TEI markup and the like. We had a useful meeting where we discussed some possible technical approaches to the issues the project will be tackling and discussed the sorts of materials that the project will be transcribing. We arranged a time for a training session on Oxygen, TEI and XML in January, so I’ll need to ensure I get some materials ready for this. A lot of Monday and Tuesday was spent going through the documentation for the new Burns bid that Gerry is putting together and preparing feedback on this. Gerry is hoping to get the bid submitted soon so fingers crossed that it will be a success.

I spent a fair amount of time this week setting things up to allow me to access the ScotGrid computing resource in order to process the Hansard data for the Samuels project. This included getting my Grid certificate from John Watt and then running through quite a few steps that were required in order to get me SSH access to the Grid. Thankfully Gareth Roy had sent me some useful documentation that I followed and the process all went pretty smoothly. I have now managed to run a test script on the Grid, and in the new year I will hopefully be able to set up some scripts to process chunks of the massive text file that I need to work with. On Wednesday I met with Chris McGlashan and Mike Black from Arts IT Support to discuss the possibility of me getting at least 300Gb of server space for a database in which to store all of the data I hope to extract. Unfortunately they are not currently able to offer this space, as the only servers that are available host live sites and they fear that having a Grid-based process inserting data into them might be too much load for the servers. 300Gb of data is a lot – it’s probably more than all the other Arts hosted databases put together, so I can appreciate why they are reluctant to get involved. I’ll just need to see what we can do about this once I manage to get in touch with Marc Alexander. I believe there were funds in the project budget for server costs, but I’ll need to speak to Marc to make sure.

Also this week I helped Carole Hough out with some issues she’s been having with the CogTop website and Twitter, and spoke further with Pauline about the restructuring of the Burns website. She is now hoping to have this done next Monday so hopefully we can still launch the new version before Christmas. I also spent some time finishing off the final outstanding items on my Medical Humanities Network ‘to do’ list. This included allowing project members to be associated with teaching materials and updating the system so that the different types of data (projects, people, teaching materials, collections, keywords) can be ‘deleted’ by admin users as well as just being ‘deactivated’. Note that ‘deleted’ records do still exist in the underlying database so I can always retrieve these if needs be.
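The ‘deactivate’ versus ‘delete’ distinction is essentially a soft-delete pattern: an admin ‘delete’ flips a status flag rather than issuing a SQL DELETE, so the row survives and can be restored. A rough sketch of the idea (the table and column names are invented for illustration, not the actual Medical Humanities Network schema):

```python
import sqlite3

# Soft-delete sketch: rows are flagged rather than removed, so an
# admin 'delete' is reversible. The schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE projects (
    id INTEGER PRIMARY KEY,
    title TEXT,
    status TEXT DEFAULT 'active'  -- 'active', 'deactivated' or 'deleted'
)""")
conn.execute("INSERT INTO projects (title) VALUES ('Example project')")

# An admin 'delete' just updates the flag...
conn.execute("UPDATE projects SET status = 'deleted' WHERE id = 1")

# ...so public-facing queries filter on status, but the row is still
# in the database and can be retrieved if needs be.
visible = conn.execute(
    "SELECT COUNT(*) FROM projects WHERE status = 'active'").fetchone()[0]
total = conn.execute("SELECT COUNT(*) FROM projects").fetchone()[0]
print(visible, total)  # 0 1
```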

I was also involved in a lot of App based stuff this week. Some people in MVLS have been trying to get a paid app published via the University account for some time now, but there have been many hurdles on the way, such as the need for the University to approve the paid app contract, the filling in of tax forms and bank account details, the creation of custom EULAs and a seemingly endless stream of other tasks. I’ve been working with various people across the University to try and get this process completed, and this has taken quite a bit of time this week. We’re almost there now and I really hope that everything will be ready next week. However, even if it is, the App Store will not be accepting app submissions over the Christmas holidays anyway, so things are going to be delayed a little longer at least. I was also involved in a lengthy email discussion with Fraser Rowan about app development in the University. There is something of a push for app development and approval to be more formally arranged in the University, which I think is a good thing. There are lots of things that need to be considered relating to this, but I can’t really go into any detail about them here at this stage.

I will be working on Monday and Tuesday next week and then that is me off until the New Year.

Week Beginning 7th December 2015

It was a week of many projects this week, mostly working on smallish tasks that still managed to take up some time. I was involved in an email discussion this week with some of the University’s data centre people, who would like to see more Arts projects using some of the spare capacity on the ScotGrid infrastructure. This seemed pretty encouraging for the ongoing Hansard work and it culminated in a meeting on Friday with Gareth Roy, who works with the Grid for Physics. This was a very useful meeting, during which I talked through our requirements for data extraction and showed Gareth my existing scripts. Gareth gave some really helpful advice on how to tackle the extraction, such as splitting the file up into 5Mb chunks before processing and getting nodes on the Grid to tackle these chunks one at a time. At this stage we still need to see whether Arts Support will be able to provide us with the database space we require (at least 300Gb) and allow external servers (with specified IP addresses) to insert data. I’m going to meet with Chris next week to discuss this matter. At this stage things are definitely looking encouraging and hopefully some time early in the new year we’ll actually have all of the frequency data extracted.
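Gareth’s suggestion of splitting the big Hansard file into 5Mb chunks before handing them to Grid nodes could be done with a short script along these lines (a sketch of the approach only; the actual job submission to the Grid is a separate matter, and the function name is my own):

```python
import os

def split_file(path, out_dir, chunk_bytes=5 * 1024 * 1024):
    """Split a large text file into roughly chunk_bytes pieces,
    breaking on line boundaries so no record straddles two chunks."""
    os.makedirs(out_dir, exist_ok=True)
    part, size, out = 0, 0, None
    with open(path, "r", encoding="utf-8") as src:
        for line in src:
            if out is None or size >= chunk_bytes:
                if out:
                    out.close()
                out = open(os.path.join(out_dir, f"chunk_{part:05d}.txt"),
                           "w", encoding="utf-8")
                part, size = part + 1, 0
            out.write(line)
            size += len(line.encode("utf-8"))
    if out:
        out.close()
    return part  # number of chunk files written

# Each chunk can then be processed independently by a Grid node,
# with the results inserted into the database afterwards.
```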

For the Metaphor in the Curriculum project we had a little Christmas lunch out for the team on Tuesday, which was nice. On Friday Ellen and Rachael had organised a testing session for undergraduates to test out the prototype quiz that we have created, and I met with them afterwards to discuss how it went. The feedback they received was very positive and no-one encountered any problems with the interface. A few useful suggestions were made – for example that only the first answer given should be registered for the overall score, and that questions should be checked as soon as an answer is selected rather than having a separate ‘check answer’ button. I’ll create a new version of the prototype with these suggestions in place.

Hannah Tweed contacted me this week with some further suggestions for the Medical Humanities Network website, including adding facilities to allow non-admin users to upload keywords and some tweaks to the site text. I still need to implement some of the other requests she made, such as associating members with teaching materials. I should be able to get this done before Christmas, though.

Magda also contacted me about updating the Scots Thesaurus search facility to allow variants of words to be searched for. Many words have multiple forms divided with a slash, or alternative spellings given in brackets, for example ‘swing(e)’. Other forms were split with hyphens or included apostrophes, and Magda wanted to be able to search for these with or without the hyphens or apostrophes. I created a script that generated such variant forms and stored them in a ‘search terms’ database table, much in the same way as I had done for the Historical Thesaurus of English. I then updated the search facilities so that they check the contents of this new table, and I also updated the WordPress plugin so that whenever words are added, edited or deleted the search variants are updated to reflect this. Magda tested everything out and all seems to be working well.
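The variant generation could work something along these lines (my own sketch of the approach, not the actual plugin code – the real rules for Scots Thesaurus word forms are rather more involved):

```python
import re

def search_variants(headword):
    """Generate searchable variants for a headword: expand slashed
    alternatives, optional bracketed letters, and forms with and
    without hyphens or apostrophes. A sketch of the approach only."""
    variants = set()
    # Slash-separated alternatives: each form becomes a variant
    for form in headword.split("/"):
        form = form.strip()
        # Bracketed optional letters: 'swing(e)' -> 'swing' and 'swinge'
        if "(" in form:
            variants.add(re.sub(r"\(.*?\)", "", form))
            variants.add(form.replace("(", "").replace(")", ""))
        else:
            variants.add(form)
    # Hyphens and apostrophes: store forms with and without them
    for v in list(variants):
        if "-" in v:
            variants.add(v.replace("-", ""))
            variants.add(v.replace("-", " "))
        if "'" in v:
            variants.add(v.replace("'", ""))
    return sorted(variants)

print(search_variants("swing(e)"))  # ['swing', 'swinge']
```

Each generated variant would then be inserted into the ‘search terms’ table alongside the ID of its headword, so the search query only ever has to do a straightforward lookup.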

For the SCOSYA project Gary sent me the first real questionnaire to test out the upload system with. My error checking scripts picked up a couple of problems with the contents (a typo in the codes, plus some other codes that hadn’t been entered into my database yet) but after these were addressed the upload went very smoothly. I also completed work on the facilities for editing and deleting uploaded data.
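The sort of error checking described – validating the codes in an uploaded questionnaire against those already defined in the database – can be sketched like this (the column name and codes are invented for illustration; the real questionnaire template has many more fields):

```python
import csv
import io

def check_codes(csv_text, known_codes):
    """Return (row_number, code) pairs for any code in an uploaded
    questionnaire that isn't already in the database. The 'code'
    column name is illustrative only."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    for row_num, row in enumerate(reader, start=2):  # row 1 is the header
        code = (row.get("code") or "").strip()
        if code not in known_codes:
            problems.append((row_num, code))
    return problems

# Codes that have been entered into the database (hypothetical values)
known = {"A1", "A2", "B1"}
upload = "code,rating\nA1,5\nA3,2\nB1,4\n"
print(check_codes(upload, known))  # [(3, 'A3')] - 'A3' isn't recognised
```

If the returned list is empty the upload proceeds; otherwise the problem rows can be reported back to the user for correction, which is essentially what happened with Gary’s first real questionnaire.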

During the week there were times when the majority of internet access was cut off due to some issues with JANET. Unfortunately this had a bit of an impact on the work I could do as I do kind of need internet access to do pretty much everything I’m involved with. However, I made use of the time with some tasks I’d been meaning to tackle for a while. I installed Windows 10 on my MacBook and then reinstalled all of the software I use. I also copied all of my app development stuff from my MacBook onto my desktop computer in preparation for creating the Metaphor in the Curriculum app and also for creating new Android versions of the STELLA apps that still don’t have Android versions available.

I also spent some time this week getting up to speed on the use of Oxygen, XML and TEI in preparation for the ‘People’s Voice’ project that starts in January. I also went through all of the bid documentation for this project and began to consider how the other technical parts of the project might fit together. I have a meeting with Gerry and Catriona next week where we will talk about this further.

Week Beginning 30th November 2015

My time this week was mostly spent on the same three projects as last week. On Monday I met with Pauline Mackay to run through the updates I’d made to my ‘test version’ of the Burns website based on her document of suggested changes. All of the updates have come together remarkably quickly and easily (so far!) and at the meeting we just confirmed what still needed to be done (mostly by Pauline) and when it would be done (ideally we’ll be launching the new version of the website early next week). There are a few further tweaks I’ll need to make, but other than replacing the live version of the site with the new version my work is pretty much done.

For ‘Metaphor in the Curriculum’ we had a further project meeting on Tuesday, where we talked about the mock-up metaphor quizzes that I’d previously produced. Everyone seems very happy with how these are turning out, and Ellen and Rachael showed them to some school children whilst they were visiting a school recently and the feedback was positive, which is encouraging. After the meeting Ellen sent me some updated and extended quiz questions and I set to work on creating a more extensive prototype based on these questions. Ellen and Rachael are hopefully going to be able to test this version out on some undergraduates next week, so it was important that I got this new version completed. This version now ‘tracks’ the user’s answers during a session using HTML5 sessionStorage. This allows us to give the user a final score at the end of the quiz and also allows the user to return to previously answered questions and look at the results again. It took a fair amount of time to get these (and other) updates in place, but I think the quiz is looking pretty good now and once we have further quizzes it should be possible to just plug them into the structure I’ve set up.

Most of the remainder of my week was spent on the content management system for the SCOSYA project. Last week I’d created a nice little drag and drop feature that enables a logged in user to upload CSV files. This week I needed to extend this so that the data contained within the CSV files could be extracted and added into the relevant tables (if it passed a series of validation checks, of course). During the course of developing the upload script I spotted a few possible shortcomings with the way the questionnaire template was structured and Gary and I had a few chats about this, which resulted in a third version of the template being created. Hopefully this version will be the final one. As the data will all be plotted on maps, storing location data for the questionnaires is pretty important. The questionnaire includes the postcode of the interviewee, and also the postcodes of the fieldworker and the interviewer. I found a very handy site called http://uk-postcodes.com/ that gives data for a place based on a postcode that is passed to it. In addition to providing a web form the site also has an API that can spit out data in the JSON format – for example here is the data for the University’s postcode: http://uk-postcodes.com/postcode/G128QQ.json

This data includes latitude and longitude values for the postcode, which will be vital for pinning questionnaire results on a map, and I managed to get my upload script to connect to the postcode API, check the supplied postcode and return the lat/long values for insertion into our database. It seems to work very well. It does mean that we’re dependent on a third party API for our data to be uploaded successfully and I’ll just have to keep an eye on how this works out, but if the API proves to be reliable it will really help with the data management process.
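Extracting the lat/long from the API’s JSON response boils down to a couple of dictionary lookups. A sketch of the idea, with a made-up response structure – I haven’t reproduced the exact field names uk-postcodes.com uses, so treat the ‘geo’ structure below as an assumption:

```python
import json

def extract_lat_lng(response_text):
    """Pull latitude/longitude out of a postcode API's JSON response.
    The 'geo'/'lat'/'lng' structure here is illustrative; check the
    actual response from the API for the real field names."""
    data = json.loads(response_text)
    geo = data.get("geo", {})
    lat, lng = geo.get("lat"), geo.get("lng")
    if lat is None or lng is None:
        return None  # fail gracefully if the API changes or is down
    return float(lat), float(lng)

# In the upload script the text would come from an HTTP request to
# http://uk-postcodes.com/postcode/<POSTCODE>.json
sample = '{"postcode": "G12 8QQ", "geo": {"lat": 55.872, "lng": -4.288}}'
print(extract_lat_lng(sample))  # (55.872, -4.288)
```

Returning None rather than raising an error is one way of handling the third-party dependency: the upload can then flag the questionnaire as needing a manual lat/long rather than failing outright.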

There was a major powercut at the University on Wednesday that knocked out all power to one of the buildings where our servers are located, including the server I’m using for SCOSYA, and this cut down on the amount of time I could spend on the project. Despite this, by the end of the week I had managed to complete the upload script, a page for browsing uploaded data and a page for viewing a complete record. Next week I’ll create facilities to allow uploaded data to be edited or deleted, and after that I’ll need to meet with Gary again to discuss what other features are required of the CMS at this stage.

The powercut also took down the DSL website, amongst others (this blog included), so I spent some time on Wednesday evening and Thursday morning ensuring everything was back online again. I also spent a bit of time this week on the Scots Thesaurus project. Magda was having problems uploading new lexemes to a particularly large category, even though new lexemes could still be added with no problems to a smaller category. This was a very odd error that seemed to be caused by the number of lexemes in a category. After a bit of investigation I figured out what was causing the problem. The ‘edit category’ page is an absolutely ginormous form, made even larger because it’s within WordPress, which adds still more form elements to the page. PHP has a default limit of 1000 form elements in a POST form (the ‘max_input_vars’ setting) and rather astoundingly the ‘edit’ page for the category in question had more than 1000 elements. With this figured out I asked Chris to update the PHP configuration to increase the limit, and that solved the problem. Magda has also been working on updated word forms and I need to create a new ‘search term’ table that allows words with multiple variants to be properly searched. I’ll need to try and find the time to do this next week.