Week Beginning 30th November 2015

My time this week was mostly spent on the same three projects as last week. On Monday I met with Pauline Mackay to run through the updates I’d made to my ‘test version’ of the Burns website based on her document of suggested changes. All of the updates have come together remarkably quickly and easily (so far!), and at the meeting we just confirmed what still needed to be done (mostly by Pauline) and when it would be done (ideally we’ll be launching the new version of the website early next week). There are a few further tweaks I’ll need to make, but other than replacing the live version of the site with the new version my work is pretty much done.

For ‘Metaphor in the Curriculum’ we had a further project meeting on Tuesday, where we talked about the mock-up metaphor quizzes that I’d previously produced. Everyone seems very happy with how these are turning out, and Ellen and Rachael showed them to some school children during a recent school visit and the feedback was positive, which is encouraging. After the meeting Ellen sent me some updated and extended quiz questions and I set to work on creating a more extensive prototype based on these questions. Ellen and Rachael are hopefully going to be able to test this version out on some undergraduates next week, so it was important that I got this new version completed. This version now ‘tracks’ the user’s answers during a session using HTML5 sessionStorage. This allows us to give the user a final score at the end of the quiz and also allows the user to return to previously answered questions and look at the results again. It took a fair amount of time to get these (and other) updates in place, but I think the quiz is looking pretty good now, and once we have further quizzes it should be possible to just plug them into the structure I’ve set up.
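To give a flavour of the session tracking, here is a minimal sketch of how answers might be recorded and tallied. The question ids and stored fields are hypothetical, not the quiz’s actual data; and because the browser’s `sessionStorage` isn’t available outside a browser, a plain-object shim with the same `getItem`/`setItem` interface stands in for it so the sketch runs anywhere.

```javascript
// Shim with the same getItem/setItem interface as the browser's
// sessionStorage -- in the real quiz, window.sessionStorage would be
// used directly.
const store = {
  data: {},
  getItem(key) { return key in this.data ? this.data[key] : null; },
  setItem(key, value) { this.data[key] = String(value); }
};

// Record the user's answer to a question (ids are made up here).
function recordAnswer(questionId, answer, correct) {
  store.setItem('quiz-' + questionId, JSON.stringify({ answer, correct }));
}

// Retrieve a previously answered question so the user can revisit it.
function getAnswer(questionId) {
  const saved = store.getItem('quiz-' + questionId);
  return saved ? JSON.parse(saved) : null;
}

// Tally the final score across a list of question ids.
function finalScore(questionIds) {
  return questionIds.filter(id => {
    const saved = getAnswer(id);
    return saved && saved.correct;
  }).length;
}

recordAnswer('q1', 'metaphor', true);
recordAnswer('q2', 'simile', false);
console.log(finalScore(['q1', 'q2'])); // 1
```

Because sessionStorage only holds strings, each answer is serialised with `JSON.stringify` on the way in and parsed on the way out; the data then lasts for the browser session and vanishes when the tab is closed, which suits a quiz nicely.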

Most of the remainder of my week was spent on the content management system for the SCOSYA project. Last week I’d created a nice little drag-and-drop feature that enables a logged-in user to upload CSV files. This week I needed to extend this so that the data contained within the CSV files could be extracted and added into the relevant tables (if it passed a series of validation checks, of course). While developing the upload script I spotted a few possible shortcomings with the way the questionnaire template was structured, and Gary and I had a few chats about this, which resulted in a third version of the template being created. Hopefully this version will be the final one. As the data will all be plotted on maps, storing location data for the questionnaires is pretty important. The questionnaire includes the postcode of the interviewee, and also the postcodes of the fieldworker and the interviewer. I found a very handy site called http://uk-postcodes.com/ that gives data for a place based on a postcode that is passed to it. In addition to providing a web form the site also has an API that can spit out data in the JSON format – for example here is the data for the University’s postcode: http://uk-postcodes.com/postcode/G128QQ.json
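The validation checks mentioned above might look something like the sketch below. The column names and rules are purely illustrative (they aren’t taken from the actual SCOSYA questionnaire template), and the postcode pattern is a deliberately rough, permissive one rather than the full official grammar.

```javascript
// Sketch of per-row validation an upload script might run before
// inserting CSV data. Column names and rules are illustrative only,
// not the project's actual template.
function validateRow(row) {
  const errors = [];
  // A rough UK postcode pattern -- permissive, for illustration only.
  const postcodePattern = /^[A-Z]{1,2}[0-9][A-Z0-9]? ?[0-9][A-Z]{2}$/i;
  if (!row.postcode || !postcodePattern.test(row.postcode.trim())) {
    errors.push('invalid postcode: ' + row.postcode);
  }
  if (!row.interviewee) {
    errors.push('missing interviewee');
  }
  return errors; // an empty array means the row passed
}

console.log(validateRow({ postcode: 'G12 8QQ', interviewee: 'A01' })); // []
console.log(validateRow({ postcode: 'nope', interviewee: '' }));
```

Collecting all of a row’s errors rather than stopping at the first one means the user can be shown everything wrong with their file in a single pass, which saves repeated upload attempts.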

This data includes latitude and longitude values for the postcode, which will be vital for pinning questionnaire results on a map, and I managed to get my upload script to connect to the postcode API, check the supplied postcode and return the lat/long values for insertion into our database. It seems to work very well. It does mean that we’re dependent on a third party API for our data to be uploaded successfully and I’ll just have to keep an eye on how this works out, but if the API proves to be reliable it will really help with the data management process.
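Pulling the coordinates out of the API response is a small job once the JSON is in hand. I can’t vouch for the exact shape of the uk-postcodes.com response here, so the sample below simply assumes a `geo` object holding `lat` and `lng` fields, with made-up coordinate values; the defensive check guards against the postcode not being found or the API changing shape.

```javascript
// Illustrative only: this assumes the API returns a JSON body with a
// "geo" object holding lat/lng fields -- the real uk-postcodes.com
// response may differ, and the coordinates below are made up.
const sampleResponse = JSON.stringify({
  postcode: 'G12 8QQ',
  geo: { lat: 55.872, lng: -4.288 }
});

// Extract the coordinates for insertion into the database, returning
// null if the response doesn't contain what we expect.
function extractLatLng(body) {
  const data = JSON.parse(body);
  if (!data.geo ||
      typeof data.geo.lat !== 'number' ||
      typeof data.geo.lng !== 'number') {
    return null; // postcode not found, or the API changed shape
  }
  return { lat: data.geo.lat, lng: data.geo.lng };
}

console.log(extractLatLng(sampleResponse)); // { lat: 55.872, lng: -4.288 }
console.log(extractLatLng('{}')); // null
```

Returning `null` on a malformed response (rather than throwing) lets the upload script flag the affected row for manual attention while the rest of the file goes through.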

There was a major power cut at the university on Wednesday that knocked out all power to one of the buildings where our servers are located, including the server I’m using for SCOSYA. This cut down on the amount of time I could spend on the project, but despite this, by the end of the week I had managed to complete the upload script, a page for browsing uploaded data and a page for viewing a complete record. Next week I’ll create facilities to allow uploaded data to be edited or deleted, and after that I’ll need to meet with Gary again to discuss what other features are required of the CMS at this stage.

The power cut also took down the DSL website, amongst others (this blog included), so I spent some time on Wednesday evening and Thursday morning ensuring everything was back online again. I also spent a bit of time this week on the Scots Thesaurus project. Magda was having problems uploading new lexemes to a particularly large category, even though new lexemes could still be added with no problems to a smaller category. This was a very odd error that seemed to be caused by the number of lexemes in a category. After a bit of investigation I figured out what was causing the problem. The ‘edit category’ page is an absolutely ginormous form, made even larger because it sits within WordPress, which adds plenty of form elements of its own. By default PHP limits a POST request to 1000 input variables, and rather astoundingly the ‘edit’ page for the category in question had more than 1000 form elements. With this figured out I asked Chris to update the PHP configuration to raise the limit, and that solved the problem. Magda has also been working on updated word forms and I need to create a new ‘search term’ table that allows words with multiple variants to be properly searched. I’ll need to try and find the time to do this next week.
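For anyone who runs into the same symptom: the cap in question is PHP’s max_input_vars setting, which defaults to 1000. Raising it is a one-line change in php.ini (the value below is just an example, not what we used):

```ini
; php.ini -- raise the cap on input variables per request.
; The default is 1000; any form with more elements than this will
; have the excess silently dropped, which is what broke the uploads.
max_input_vars = 3000
```

The nastiest part of this bug is that PHP doesn’t raise an error when the limit is exceeded (beyond a warning in the logs), so the form appears to submit fine while data quietly goes missing.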