Week Beginning 18th January 2016

Two new projects that I will be involved with over the coming months and years started up this week. The first was the People's Voice project for Catriona MacDonald and Gerry Carruthers. I will be developing a database of poems and establishing a means of enabling the team to transcribe poems using the TEI guidelines. This is a good opportunity for me to learn more about text encoding as, although I've been involved in some text encoding projects before, I've never had sole responsibility for such aspects. Since starting back after Christmas I've been getting up to speed with TEI and the Oxygen text editing tool, and this week I met with the team for a two-hour introductory session on transcription using TEI and Oxygen. I spent quite a bit of time before the session preparing a worksheet for the team and getting my head around the workings of Oxygen, and the workshop went very well. It was the first time some of the people had ever written any XML and everyone did very well. It will obviously take a bit of practice for them to be able to transcribe poems rapidly, but hopefully the worksheet I prepared, together with the template files I'd made, will allow them to get started. There is still lots to do to get the project up and running and we will be meeting again in the next few weeks to get started on the database and the website, but so far things are progressing well.

The second new project I was involved with this week was Carole Hough’s REELS project (Recovering the Earliest English Language in Scotland). This project will be analysing the placenames of Berwickshire and I’ll be developing a content management system to enable the team to record all of the data. We had a project meeting this week where we went over the project timetable and discussed how and when certain tasks would start up. It was a useful meeting and a good opportunity to meet the rest of the team and we have now arranged a further, more technical meeting for next week where we will think in more detail about the requirements for the database and the CMS and things like that. I also put a request in for a subdomain for the project website and got some shared drive space set up for the project too.

As well as these two projects beginning, another project I’ve been involved with launched this week. Over the past few months I’ve been developing the technical infrastructure for a Medical Humanities Network website for Megan Coyer. This project had its official launch on Friday evening, and all went very well. The project can be accessed here: http://medical-humanities.glasgow.ac.uk/

In addition to these projects I also spent a bit of time trying to figure out what was preventing the Curious Travellers WordPress installation from connecting to external services. Using my test server I managed to fix one of the issues (the instance will now connect to the WordPress RSS feeds) but other services such as Akismet and the searching of new plugins fail to connect. The strange thing is if I copy both the database and the files for the site onto my test server external connections work, which would suggest that the problem is with the server configuration. I’ve spoken to Chris about this and he has done some investigation but as far as he can tell there is nothing at server level that is any different to other server setups. It’s all very odd and we may have to consider moving the site to a different server to see if this fixes the problem.

I also spent some time working with the Grid, preparing the Hansard data for processing there. Gareth Roy from Physics has been helping me with this and he'd sent me some instructions on how to submit a test job to the Grid before Christmas. This week I managed to successfully submit my test script, process it and extract the output, which is encouraging. Gareth thought that splitting my 10Gb text file into small chunks for processing would make the most sense so I wrote a little script that split the file up, with 5000 lines per file. This resulted in about 1200 files with sizes varying from 5Mb to 16Mb, which should hopefully be relatively easy to load and process. I now have to figure out how to write a shell script that will load a file, process it and export SQL statements to another text file. I've never written a shell script that does anywhere near as much as this before, so it's going to take a bit of time to get the hang of things.
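
By way of illustration, the splitting step could be sketched in Python along these lines (this is a simplified sketch, not my actual script, and the file names here are made up):

```python
def split_file(source, lines_per_file=5000, prefix="chunk"):
    """Split a large text file into numbered chunk files of
    lines_per_file lines each, so the Grid nodes can process
    them independently. Returns the number of chunks written."""
    out, count, index = None, 0, 0
    with open(source, encoding="utf-8") as src:
        for line in src:
            # Start a new chunk file every lines_per_file lines
            if count % lines_per_file == 0:
                if out:
                    out.close()
                out = open(f"{prefix}_{index:04d}.txt", "w", encoding="utf-8")
                index += 1
            out.write(line)
            count += 1
    if out:
        out.close()
    return index
```

Reading line by line rather than loading the whole 10Gb file into memory is the important part; the chunk size is then just a trade-off between the number of Grid jobs and the memory each one needs.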

My final project of the week was Metaphor in the Curriculum. We had another project meeting this week and as a result of this I added a new feature to the Mapping Metaphor website and did some further work on our prototype app. The new feature is a ‘Metaphor of the Day’ page that does what you’d expect it to: displaying a different metaphorical connection each day. You can view the feature here: http://mappingmetaphor.arts.gla.ac.uk/metaphor-of-the-day/

For the prototype I updated the structure so that the homepage of the app is now a list of links to subsections rather than just displaying the list of quizzes. This list has now been moved to a subsection. The structure is now in place to be able to add other subsections to the prototype, such as the ability to browse the metaphors and access the visualisation. As the app will need to be used without an internet connection I’m going to have to ensure that all of the data can be accessed by the app locally. For the main website the data is stored in a MySQL database and there are a bunch of AJAX calls to some PHP scripts that then generate the required HTML fragments or JSON data that the website’s Javascript then uses. For the app everything is going to have to be handled in Javascript, with the full dataset made available as JSON files. With this in mind I created some scripts that generate the JSON files I will require. In the next few weeks I will then have to figure out how to update the code to work directly with these files.
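
The export step itself is straightforward; a minimal sketch of the idea (the field names here are invented for illustration, not the actual Mapping Metaphor schema) might look like this:

```python
import json

def export_json(rows, outfile):
    """Write a list of row dicts (as fetched from the MySQL database)
    to a JSON file that the app's Javascript can load locally,
    with no server or AJAX call required."""
    with open(outfile, "w", encoding="utf-8") as f:
        json.dump(rows, f, ensure_ascii=False, indent=2)

# Hypothetical rows standing in for a metaphorical-connections query
connections = [
    {"cat1": "1A01", "cat2": "2B03", "strength": "strong"},
]
export_json(connections, "connections.json")
```

The app's Javascript then filters and joins these files in the browser, doing the work the PHP scripts currently do on the server.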

Week Beginning 11th January 2016

The Medical Humanities Network ‘soft launched’ on Friday this week so I had quite a bit of last minute tweaking and adding of features to manage before this happened. This included updating the structure to allow people and collections to be associated with each other, fixing a number of bugs, ensuring ‘deleted’ content could no longer be accessed through the site (it was previously available for test purposes), adding a new ‘contact’ section and adding a feature that ensures people agree that they have the rights to upload images. It’s all looking pretty good and as far as I’m aware it’s going to be officially launched in a week’s time.

I also received the final pieces of information I required this week to allow paid apps to be published through the Apple App Store and the Google Play Store. This is something that has been dragging on for a while and it is really good to get it out of the way. It's actually something that's required by people outside of Critical Studies, and getting it sorted took quite a bit of effort, so it's especially pleasing to have it done.

I also took ownership of the Pennant project's technical stuff this week. This is a temporary arrangement until a new developer is found, but in the meantime I noticed some problems with the project's WordPress installation. There is some sort of issue that is stopping WordPress connecting to external servers. It's being blocked somehow and as this is affecting things such as the Akismet anti-spam plugin I thought I'd better try and investigate. I had thought it was some kind of server setting, but I installed the site on my test server and it gave the same errors, even though another WordPress site I had on the server worked fine. I tried a variety of approaches, such as updating the version of WordPress, replacing the data with data from a different instance and deactivating each plugin in turn, and I eventually figured out that it's something within the WordPress options table that's causing the problem. If I replace this with a default version the connection works. However, this table contains a lot of information about valid WordPress plugins so I'll have to carefully go through it to identify what has caused the problem. I'm fairly certain it's one of the plugins that has somehow managed to block external connections. I'll need to continue with this next week.

I met with Gary Thoms this week to discuss the technical aspects of the SCOSYA project. The .ac.uk domain name has still not come through yet so I contacted Derek Higgins, who is the person who deals with JANET, to ask him what’s going on. Happily he said JANET have now approved the domain and are awaiting payment, so we should be able to get a project website set up in the next few weeks at least. In the meantime I set Gary’s laptop up so that it could access the test version of the site I developed last year. This now means that he can use the content management system to upload and edit survey data and things like that.

I also tried to help Fraser Dallachy out with a problem he was encountering when using the command line version of the SAMUELS tagger. When running the script on his laptop he was just getting memory errors. I updated the command he was running and this at least got the script to start off, but unfortunately it got stuck loading the HT data and didn’t proceed any further. Fraser spoke to Scott at Lancaster about this and he thought it was a memory issue – apparently the script requires a minimum of 2Gb of RAM. Fraser’s laptop had 4Gb of RAM so I wasn’t convinced this was the problem, but we agreed to try running it on my new desktop PC (with 16Gb of RAM) to see what would happen. Surprisingly, the script ran successfully, so it would appear that 4Gb of RAM is insufficient. I say it ran successfully, which it did with a test file that only included ‘the cat sat on the mat’. Unfortunately, no matter what we did by way of changing the input file, the output resolutely produced the output for ‘the cat sat on the mat’! It was most infuriating and after trying everything I could think of I’m afraid I was stumped. Fraser is going to speak to Scott again to see what the problem might be.

I spent the rest of the week getting up to speed with TEI and Oxygen for the People's Voice project. Although I have a bit of experience with TEI and XML technologies I have never been responsible for these aspects on any project I've been involved with before, and to become the resident expert is going to take some time. Thankfully I found some very handy online tutorials (http://tei.it.ox.ac.uk/Talks/2009-04-galway/) aimed at complete beginners, which I found to be a very useful starting point, despite being a few years old now. With a sample poem from the project in hand and Oxygen opened on my computer I managed to make some pretty good progress with transcribing and figuring out how to cope with the variety of content that needed to be marked up in some way. The thing about TEI is there are often several different ways something could be encoded and it's difficult to know which is 'right', or perhaps most suitable. Thankfully I had Graeme Cannon around to offer advice, which was hugely helpful as Graeme has been using these technologies for a long time now and knows them all inside out. By the end of the week I had familiarised myself with Oxygen as a tool, had created a RelaxNG schema using the TEI P5 Roma tool, had created a stylesheet for the Author View, had transcribed the poem to a standard that Graeme was happy with and had begun work preparing the workshop I'm going to be leading for the project team next Wednesday.
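
To give a flavour of what the transcription involves (this is just my own minimal sketch, not the project's actual schema or markup conventions), a poem stanza in TEI might be encoded something like this:

```xml
<text>
  <body>
    <head>Title of the poem</head>
    <lg type="stanza">
      <l>First line of the stanza</l>
      <l>Second line of the stanza</l>
    </lg>
  </body>
</text>
```

Even in a fragment this small there are choices to be made, e.g. whether stanzas are `lg` elements or whether line numbering should be explicit, which is exactly the sort of thing the schema and the worksheet need to pin down for the team.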

Week Beginning 4th January 2016

I returned to work on Wednesday this week, after a very enjoyable but seemingly all too brief Christmas holiday. I spent a bit of time on Wednesday sorting through my emails, replying to things and checking that there was nothing lurking in my inbox from last year that I'd failed to deal with. I also spent a few hours helping Simon Taylor out with some problems a user of his Fife Placenames website had encountered. I had previously helped Simon to extract the placename data from Word files and created scripts that would split these up and generate a relational database for the resulting data, but it would appear that in a few cases the splitting script hadn't quite worked properly and some places were being appended to the entries for the previous place. I think I've managed to get this all sorted now, though.

I also spent a fair amount of my time this week on the Medical Humanities Network website and database. The project is going to be launching this month and there have been a number of further tweaks and updates that Megan and Hannah have requested, such as allowing Twitter feeds to be associated with collections, changing the way teaching materials are organised and updating the list of icons for projects. Everything is coming together quite nicely, though.

Gavin Miller also contacted me about his SciFiMedHums project. I’d previously created a WordPress plugin that allows him and his RA to create and manage bibliographical data and now he is hoping to get the public involved in submitting data too. We had an email discussion where I described a few possible ways in which this could be handled and I’ll just need to wait and see how Gavin wants to proceed with this.

I also worked on the Metaphor in the Curriculum project, greatly expanding the amount of content that is currently available in the prototype app that I’ve created. Previously we had one quiz available (metaphor and war) but before Christmas Rachael had uploaded some new quizzes to the project’s shared drive (in Word format) so I created interactive versions of these. There are now four quizzes available, each with their own nice background image. Things are shaping up quite nicely, I think.

So, a bit of a short report this week. Next week I’m going to have lots to do, such as starting work on the People’s Voice project, continuing with the SCOSYA project and working with the Grid to try and get the Hansard data processed.

Week Beginning 21st December 2015

Here's a brief 'Yule blog' before I head off for Christmas. I only worked on Monday and Tuesday this week and during these two days I worked on three projects: Burns, the Medical Humanities Network and Metaphor in the Curriculum. Over the past few weeks I've been working with Pauline to restructure the Burns website on a test server I have running in my office (just an old desktop PC). On Monday Pauline made the final updates to the content that she needed to do and after that I replaced the old version of the site with the new one. This is structured in a much more sensible way and has a more interesting front page. I also had to make a few changes to the site myself before we went live, such as replacing the older interactive map that just had dots on it with the newer version that had visible labels as well. I also now appear listed on the 'project team' page too, which is nice. The website can be found here: http://burnsc21.glasgow.ac.uk/ and the interactive map here: http://burnsc21.glasgow.ac.uk/highland-tour-interactive/.

For the Medical Humanities Network I had to fix a couple of bugs, change some site text and incorporate the 'Wellcome Trust' logo. Not particularly taxing tasks, but good to get cleared out of the way before the holidays. The website should hopefully be going live in January, all being well.

For Metaphor in the Curriculum I created a new prototype version of the quiz interface based on the very helpful feedback from the testing session a couple of weeks ago. The changes in this version include:

  1. The ‘Home’ icon in the top left of the exercise page is now slightly more prominent, plus the ‘MetaphorIC’ text also links back to the homepage too.
  2. I've removed the 'Check answer' and 'restart' buttons.
  3. Clicking on a possible answer, or dragging and dropping for this quiz type, now automatically evaluates the answer, which streamlines things.
  4. For the non-drag and drop quiz type the background colour of the option you click on now changes – green if correct, red if incorrect and a white tick or cross is also displayed (helpful for colour blind people).
  5. The user’s first answer for each question is the one that is reflected in the overall score.  If you select the wrong answer and then select the right one this will still count as incorrect when you view the quiz summary page.
  6. On the quiz summary page the ‘restart’ button has been relabelled ‘Try again’ and the stored quiz answers are cleared when the user returns to the quiz.  The same thing happens if the user returns to the list of quizzes.
  7. ‘Next question’ and ‘Previous question’ buttons now just say ‘Next’ and ‘Previous’ to cut down on the amount of space they take up.


There's one possible area of confusion, and that is that users can go back to previously answered quiz questions.  If they return to a question that they got right first time then the correct answer is pre-selected.  But if they got the question wrong, or they got it wrong first time and then chose the right answer, then no answer is pre-selected.  We'll need to consider whether this is too confusing. One possible option would be to remove the 'previous' button entirely.  We could also disable the 'Next' button until an answer has been given. No doubt the rest of the team will discuss this in January and I'll update things further after that.
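
The 'first answer counts' rule is simple but easy to get subtly wrong; sketched out (in Python rather than the app's actual Javascript, purely for illustration), the scoring logic amounts to something like this:

```python
class Quiz:
    """Record only the first answer given to each question for scoring;
    later attempts give feedback but never change the score."""

    def __init__(self, correct_answers):
        self.correct = correct_answers  # question id -> correct option
        self.first_answers = {}         # question id -> first option chosen

    def answer(self, question, option):
        # setdefault keeps the first recorded answer and ignores retries
        self.first_answers.setdefault(question, option)
        return option == self.correct[question]

    def score(self):
        return sum(1 for q, a in self.first_answers.items()
                   if self.correct.get(q) == a)
```

So a user who picks the wrong option, sees the red highlight, and then picks the right one still scores zero for that question on the summary page, which is the behaviour suggested at the testing session.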

So, that’s all from me for 2015. It’s been a busy, enjoyable and hugely rewarding year and here’s hoping this continues into 2016!

Week Beginning 14th December 2015

So, here we are in the last full working week before the Christmas holidays. It’s certainly sneaked up quickly this year. I was sort of expecting work to be calming down in the run-up to Christmas but somehow the opposite has happened and there has been a lot going on this week, although I can’t really go into too much detail about at least some of it. On Monday I had a meeting with Gerry Carruthers and Catriona MacDonald about the People’s Voice project, which will be starting in January and for which I will be creating an online resource and giving advice on TEI markup and the like. We had a useful meeting where we discussed some possible technical approaches to the issues the project will be tackling and discussed the sorts of materials that the project will be transcribing. We arranged a time for a training session on Oxygen, TEI and XML in January, so I’ll need to ensure I get some materials ready for this. A lot of Monday and Tuesday was spent going through the documentation for the new Burns bid that Gerry is putting together and preparing feedback on this. Gerry is hoping to get the bid submitted soon so fingers crossed that it will be a success.

I spent a fair amount of time this week setting things up to allow me to access the ScotGrid computing resource in order to process the Hansard data for the Samuels project. This included getting my Grid certificate from John Watt and then running through quite a few steps that were required in order to get me SSH access to the Grid. Thankfully Gareth Roy had sent me some useful documentation that I followed and the process all went pretty smoothly. I have now managed to run a test script on the Grid, and in the new year I will hopefully be able to set up some scripts to process chunks of the massive text file that I need to work with. On Wednesday I met with Chris McGlashan and Mike Black from Arts IT Support to discuss the possibility of me getting at least 300Gb of server space for a database in which to store all of the data I hope to extract. Unfortunately they are not currently able to offer this space as the only servers that are available host live sites and they fear that having a Grid-based process inserting data into one of them might be too much load for the server. 300Gb of data is a lot – it's probably more than all the other Arts hosted databases put together, so I can appreciate why they are reluctant to get involved. I'll just need to see what we can do about this once I manage to get in touch with Marc Alexander. I believe there were funds in the project budget for server costs, but I'll need to speak to Marc to make sure.

Also this week I helped Carole Hough out with some issues she’s been having with the CogTop website and Twitter, and spoke further with Pauline about the restructuring of the Burns website. She is now hoping to have this done next Monday so hopefully we can still launch the new version before Christmas. I also spent some time finishing off the final outstanding items on my Medical Humanities Network ‘to do’ list. This included allowing project members to be associated with teaching materials and updating the system so that the different types of data (projects, people, teaching materials, collections, keywords) can be ‘deleted’ by admin users as well as just being ‘deactivated’. Note that ‘deleted’ records do still exist in the underlying database so I can always retrieve these if needs be.

I was also involved in a lot of app-based stuff this week. Some people in MVLS have been trying to get a paid app published via the University account for some time now, but there have been many hurdles on the way, such as the need for the University to approve the paid app contract, the filling in of tax forms and bank account details, the creation of custom EULAs and a seemingly endless stream of other tasks that need to be completed. I've been working with various people across the University to try and get this process completed, and this has taken quite a bit of time this week. We're almost there now and I really hope that everything will be ready next week. However, even if it is, the App Store will not be accepting app submissions over the Christmas holidays anyway so things are going to be delayed a little longer at least. I was also involved in a lengthy email discussion with Fraser Rowan about app development in the University. There is something of a push for app development and approval to be more formally arranged in the University, which I think is a good thing. There are lots of things that need to be considered relating to this, but I can't really go into any detail about them here at this stage.

I will be working on Monday and Tuesday next week and then that is me off until the New Year.

Week Beginning 7th December 2015

It was a week of many projects this week, mostly working on smallish tasks that still managed to take up some time. I was involved in an email discussion this week with some of the University's data centre people, who would like to see more Arts projects using some of the spare capacity on the ScotGrid infrastructure. This seemed pretty encouraging for the ongoing Hansard work and it culminated in a meeting on Friday with Gareth Roy, who works with the Grid for Physics. This was a very useful meeting, during which I talked through our requirements for data extraction and showed Gareth my existing scripts. Gareth gave some really helpful advice on how to tackle the extraction, such as splitting the file up into 5Mb chunks before processing and getting nodes on the Grid to tackle these chunks one at a time. At this stage we still need to see whether Arts Support will be able to provide us with the database space we require (at least 300Gb) and allow external servers (with specified IP addresses) to insert data. I'm going to meet with Chris next week to discuss this matter. At this stage things are definitely looking encouraging and hopefully some time early in the new year we'll actually have all of the frequency data extracted.

For the Metaphor in the Curriculum project we had a little Christmas lunch out for the team on Tuesday, which was nice. On Friday Ellen and Rachael had organised a testing session for undergraduates to test out the prototype quiz that we have created, and I met with them afterwards to discuss how it went. The feedback they received was very positive and no-one encountered any problems with the interface. A few useful suggestions were made – for example that only the first answer given should be registered for the overall score, and that questions should be checked as soon as an answer is selected rather than having a separate 'check answer' button. I'll create a new version of the prototype with these suggestions in place.

Hannah Tweed contacted me this week with some further suggestions for the Medical Humanities Network website, including adding facilities to allow non-admin users to upload keywords and some tweaks to the site text. I still need to implement some of the other requests she made, such as associating members with teaching materials. I should be able to get this done before Christmas, though.

Magda also contacted me about updating the Scots Thesaurus search facility to allow variants of words to be searched for. Many words have multiple forms divided with a slash, or alternative spellings laid out with brackets, for example 'swing(e)'. Other forms were split with hyphens or included apostrophes, and Magda wanted to be able to search for these with or without the hyphens. I created a script that generated such variant forms and stored them in a 'search terms' database table, much in the same way as I had done for the Historical Thesaurus of English. I then updated the search facilities so that they checked the contents of this new table and I also updated the WordPress plugin so that whenever words are added, edited or deleted the search variants are updated to reflect this. Magda tested everything out and all seems to be working well.
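
The variant generation is essentially mechanical string manipulation; a simplified sketch of the idea (in Python for illustration, and covering only bracketed letters and hyphen/apostrophe stripping, not the full set of rules the real script handles) would be:

```python
import re

def search_variants(form):
    """Generate search variants for a dictionary head form.

    'swing(e)'  -> ['swing', 'swinge']
    'peat-moss' -> ['peat-moss', 'peatmoss']
    """
    variants = set()
    m = re.match(r"^(.*)\((.+?)\)(.*)$", form)
    if m:
        before, optional, after = m.groups()
        variants.add(before + after)             # without the optional letters
        variants.add(before + optional + after)  # with the optional letters
    else:
        variants.add(form)
    # also store hyphen- and apostrophe-free versions of each variant
    for v in list(variants):
        variants.add(v.replace("-", "").replace("'", ""))
    return sorted(variants)
```

Each generated variant is then stored in the 'search terms' table pointing back at the original entry, so a search for any spelling resolves to the same record.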

For the SCOSYA project Gary sent me the first real questionnaire to test out the upload system with. My error checking scripts picked up a couple of problems with the contents (a typo in the codes, plus some other codes that hadn’t been entered into my database yet) but after these were addressed the upload went very smoothly. I also completed work on the facilities for editing and deleting uploaded data.

During the week there were times when the majority of internet access was cut off due to some issues with JANET. Unfortunately this had a bit of an impact on the work I could do as I do kind of need internet access to do pretty much everything I’m involved with. However, I made use of the time with some tasks I’d been meaning to tackle for a while. I installed Windows 10 on my MacBook and then reinstalled all of the software I use. I also copied all of my app development stuff from my MacBook onto my desktop computer in preparation for creating the Metaphor in the Curriculum app and also for creating new Android versions of the STELLA apps that still don’t have Android versions available.

I also spent some time this week getting up to speed on the use of Oxygen, XML and TEI in preparation for the ‘People’s Voice’ project that starts in January. I also went through all of the bid documentation for this project and began to consider how the other technical parts of the project might fit together. I have a meeting with Gerry and Catriona next week where we will talk about this further.

Week Beginning 23rd November 2015

These weeks seem to be zipping by at an alarming rate! I split most of my time this week between three projects and tackled a few bits and bobs for other projects along the way too. First up was Metaphor in the Curriculum. Last week I created a fully functioning mockup of a metaphor quiz and I'd created three basic interface designs. This week I created a fourth design that is an adaptation of the third design, but incorporates some fairly significant changes. The biggest change is the introduction of a background image – a stock image from the very handy free resource http://www.freeimages.com/. The use of a background image really brightens up the interface and some transparency features on some of the interface elements helps to make the interface look appealing without making it difficult to read the actual content. I also reworked the 'MetaphorIC' header text so that the 'IC' is in a different, more cursive font and added a 'home' button to the header. I think it's coming together quite nicely. We have another project meeting next week so I'll probably have a better idea about where to focus next on this project after that.

My next project was the Burns project. Last month Pauline sent round a document listing some fairly major changes to the project website – restructuring sections, changing navigation, layout and page content etc. I set up a test version of the live site and set about implementing all of the changes that I could make without further input from project people. After getting it all working pretty well I contacted Pauline and we arranged to meet on Monday next week to go through everything and (hopefully) make all of the changes live.

The third project I worked on this week was the SCOSYA project and this took up the bulk of my time. Last week Gary had sent me a template of the spreadsheet that the project fieldworkers will fill in and email to Gary. Gary will then need to upload these spreadsheets to an online database through a content management system that I need to create. This week I began working on the database structure and the content management system. The project also wants the usual sort of project website and blog, so first of all I set up WordPress on the project’s domain. I toyed with the idea of making the content management system a WordPress ‘plugin’, but as I want the eventual front-end to be non-Wordpress I decided against this. I also looked into using Drupal for the content management system as Drupal is a tool I feel I ought to learn more about. However, the content management system is going to be very straightforward – just file upload plus data browse, edit and delete and using Drupal or other such tools seemed like overkill to me. I was also reluctant to use a system such as Drupal because they seem to change so rapidly. SCOSYA is a 5 year project (I think!) and my worry is that by the end of the project the version of Drupal that I use would have been superseded, no longer supported and seen as a bad thing to have running on a server. So I decided just to create the CMS myself.

I decided that rather than write all of the user authentication and management stuff myself I would tie this in with the WordPress system that I’d set up to power the project website and blog. After a bit of research I figured out that it is remarkably easy for non-Wordpress scripts to access the WordPress authentication methods so I set up the CMS to use these, following the instructions I found here: http://skookum.com/blog/using-wordpress-as-a-user-and-authentication-database. With this in place SCOSYA staff can manage their user accounts via WordPress and use the same details to access the CMS, which seems very neat.

Gary will be uploading batches of CSV files and we met again this week to discuss some questions I came up with whilst thinking about the upload script. We tweaked the template a little and I created a database structure that will be able to store the data in a usable format. I also looked into how best to handle the batch upload of the data. I quite like WordPress’s media upload facility, whereby users can just drag and drop files into the browser window and these are then processed. After a bit of Googling I found a nice little Javascript library that allows similar functionality: Dropzone.js (http://www.dropzonejs.com/). This works really nicely as it can process each ‘dropped’ file as a separate Ajax request in the background and then display content from the server based on the output of the server-side upload script. It also features client-side error checking (e.g. file type checking), can display thumbnails and shows file sizes, which are all nice features too (although of course you can’t rely solely on client-side error checking and must implement lots of checking on the server side too).
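
Because the client-side checks can always be bypassed, the server-side script has to repeat them. A rough sketch of the kinds of checks involved (in Python for illustration rather than the actual PHP, and with invented limits and names):

```python
ALLOWED_EXTENSIONS = {".csv"}
MAX_BYTES = 5 * 1024 * 1024  # hypothetical 5Mb upload limit

def validate_upload(filename, data):
    """Server-side checks mirroring the client-side ones: never trust
    the browser alone. Returns a list of error messages (empty = ok)."""
    errors = []
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext not in ALLOWED_EXTENSIONS:
        errors.append("File type " + (ext or "(none)") + " not allowed")
    if len(data) > MAX_BYTES:
        errors.append("File too large")
    if not data.strip():
        errors.append("File is empty")
    return errors
```

In the real system each Ajax request from Dropzone.js hits a script doing checks like these, and whatever messages come back are what gets written into the scrolling upload log.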

By the end of the week I had created an upload script that allows you to drag and drop multiple files into the upload pane, checks these files on both the client and server side, and builds up a log of uploads dynamically in a scrolling section beneath the pane as each file is processed. I still need to do quite a lot of work on the server-side script in order to extract the actual data from the uploaded files and insert it into the relevant tables, but I feel I have made very good progress with the system so far. I’ll continue with this next week.
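The remaining server-side work is essentially parsing each uploaded CSV into rows and turning those into database inserts. A stripped-down sketch of the parsing step (the column names are made up for illustration; the real template’s fields will differ, and a proper parser would also handle quoted fields containing commas):

```javascript
// Hypothetical CSV-to-records step for the upload script.
// Assumes a simple comma-separated file with a header row and
// no quoted fields containing commas.
function csvToRecords(text) {
  const lines = text.trim().split(/\r?\n/);
  const headers = lines[0].split(',').map(h => h.trim());
  return lines.slice(1).map(line => {
    const cells = line.split(',').map(c => c.trim());
    const record = {};
    headers.forEach((h, i) => {
      record[h] = cells[i] !== undefined ? cells[i] : '';
    });
    return record;
  });
}

// Example with invented fieldwork columns:
const records = csvToRecords('location,speaker,rating\nEyemouth,A1,4');
// records[0] is { location: 'Eyemouth', speaker: 'A1', rating: '4' }
```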

Other than these three main projects I was involved with a few others. I fixed a few bugs that had crept into the SciFiMedHums bibliographic search facility when I updated its functionality last week. I slightly tweaked the Medical Humanities Network system to give users feedback if they try to log in with incorrect details (previously no feedback was given at all). I also contacted John Watt at NeSC to see whether he might be able to help with the extraction of the Hansard data, and he suggested I try to do this on the University’s High Performance Compute Cluster. I’ll need to speak with the HPCC people to see how I might be able to use their facilities for the task that needs to be performed.

Week Beginning 16th November 2015

I met with Gary Thoms this week to discuss the content management system I’m going to build for the SCOSYA project. Gary has prepared a spreadsheet template that the fieldworkers will use when conducting their interviews and we talked through how it is structured. After that I began to think about the database structure that will be used to store the spreadsheet uploads. I also had a meeting with the Burns people to discuss the new bid they are putting together, which is getting very close to completion now. I also talked to Pauline about the restructuring of the Burns online resource and we agreed that I would begin work on this at a temporary URL before moving everything across to replace the existing site. I’ll need to start on this in the next week or so. I also updated the search functionality of the SciFiMedHums bibliography system to enable users to search for multiple themes and mediums (once the system goes live). I made a few tweaks to the Medical Humanities Network website too, mainly adding in the site text and helping out with some video uploads. Finally, I made a couple of small tweaks to the new Thesaurus of Old English content management system and set up some user accounts for people.

My major focus of the week was the Metaphor in the Curriculum project. At our last project meeting Ellen had given me some sample questions to show how the metaphor quizzes that will feature in the apps and the website will be structured. I spent a lot of this week creating working digital prototypes of these, using a similar approach to the one I’d taken for the interactive STELLA apps I’d previously produced: there is a question, some possible answers, a ‘check answer’ button, a ‘restart’ button and next and previous question buttons (where applicable). The question content itself is pulled in from a JSON file and there are three question types, although really types 1 and 3 are handled in the same way in the current version. Types 1 and 3 (questions 1-3 and 6-7) present possible answers; the user can click on one and then press the ‘check answer’ button. A tick is placed beside their answer if it was correct, a cross if incorrect. No other feedback is currently offered and there is no tracking of right / wrong answers (this is something that might be changed).
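The tick-or-cross checking for types 1 and 3 amounts to comparing the selected option against an answer stored with the question in the JSON. A minimal sketch of the idea (the JSON structure and question text shown here are invented for illustration, not the project’s actual format):

```javascript
// Hypothetical question object of the kind pulled in from the JSON file.
const question = {
  type: 1,
  text: 'Which sentence uses a metaphor?',
  answers: ['He ran quickly', 'Time is money'],
  correct: 1 // index of the right answer within the answers array
};

// Returns true (tick) or false (cross) for the chosen answer index.
function checkAnswer(q, chosenIndex) {
  return chosenIndex === q.correct;
}
```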

Question type 2 (questions 4-5) allows the user to ‘drag and drop’ an answer into the dotted box in the question.  Once an answer is dropped into place the user can press the ‘check answer’ button and a tick or cross will be displayed beside the sentence and also beside the item they dragged. I’ve tested this drag and drop functionality out on iOS and Android devices and it works fine, so hopefully we’ll be able to include such functionality in the final version of the quizzes.

The prototypes I’d created focused entirely on the functionality of the quiz itself and not on the overall design of the interface, and once I’d completed the prototypes I set to work on some possible interface designs. So far I have created three, but I need to work on these some more before I let anyone see them. Having said that, I think the technical side of the project is currently on schedule, which is encouraging.

On Friday there was a corpus linguistics workshop at the University, organised by Wendy. I had been invited to be part of the panel for a roundtable discussion session at the end of the event, so I spent some time this week preparing for this. Unfortunately my son was ill on Friday and I had to leave work to pick him up from school, which meant I had to miss the event, which was a real shame.

Week Beginning 26th October 2015

I returned to a more normal working week this week, after having spent the previous one at a conference and the one before that on holiday. I probably spent about a day catching up with emails, submitting my expenses claim and writing last week’s rather extensive conference report / blog post. I also decided it was about time that I gathered all of my outstanding tasks together into one long ‘to do’ list, as I seem to have a lot going on at the moment. The list currently has 47 items on it, split across more than 12 different projects, not including other projects that will be starting up in the next month or two. It’s good to have everything written down in one place so I don’t forget anything. I also had some AHRC review duties to perform this week, which took up some further time.

With these tasks out of the way I could get stuck into working on some of my outstanding projects again. I met with Hannah Tweed on Tuesday to go through the Medical Humanities Network website with her. She has begun to populate the content management system with projects and people and had encountered a few bugs and areas of confusion, so we went through the system and I made a note of things that needed fixing. These were all thankfully small issues and all easily fixable, such as suppressing the display of fields when the information isn’t available, and it was good to get things working properly. I also returned to the SciFiMedHums bibliographical database. I updated the layout of the ‘associated information’ section of the ‘view item’ page to make it look nicer and I created the ‘advanced search’ form, which enables users to search for things like themes, mediums, dates, people and places. I also reworked the search results page to add in pagination, with results currently getting split over multiple pages when more than 10 items are returned. I’ve pretty much finished all I can do on this project now until I get some feedback from Gavin. I also helped Zanne to get some videos reformatted and uploaded to the Academic Publishing website, which will probably be my final task for this project.
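The pagination itself is simple offset arithmetic: with 10 results per page, page n shows results 10(n−1) through 10n−1. A sketch of that slicing logic (the function and its names are illustrative, not the site’s actual code):

```javascript
// Hypothetical pagination helper: split results into pages of a fixed size.
function paginate(results, page, perPage = 10) {
  const totalPages = Math.max(1, Math.ceil(results.length / perPage));
  // Clamp the requested page into the valid range.
  const current = Math.min(Math.max(1, page), totalPages);
  const start = (current - 1) * perPage;
  return {
    items: results.slice(start, start + perPage),
    page: current,
    totalPages
  };
}
```

Clamping the page number matters in practice, since users can and do edit page numbers in the URL by hand.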

Wendy contacted me this week to say that she’d spotted some slightly odd behaviour with the Scots Corpus website. The advanced search was saying that there were 1317 documents in the system, but a search returning all of them was saying that it matched 99.92% of the corpus, and the regular search stated that there were 1316 documents. We figured out that this was being caused by a request we’d had earlier this year to remove a document from the corpus. I had figured out a way to delete it, but evidently there was some data somewhere that hadn’t been successfully updated. I managed to track this down: it turned out that the number of documents and the total number of words were being stored statically in a database table, and the advanced search was referencing this. Having discovered this I updated the static table and everything was sorted. Wendy also asked me about further updates to the Corpus that she would like to see in place before a new edition of a book goes to the printers in January. We agreed that it would be good to rework the advanced search criteria selection, as the options are just too confusing as they stand. There is also a slight issue with the concordance ordering that I need to get sorted too.
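The mismatch makes sense arithmetically: a search matching all 1316 real documents, divided by the stale stored total of 1317, comes out at just under 100%:

```javascript
// The stale static count explains the odd 99.92% figure.
const storedTotal = 1317; // static value left behind in the database table
const actualDocs = 1316;  // real count after the document was removed

const percentage = (actualDocs / storedTotal) * 100;
// percentage.toFixed(2) === '99.92'
```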

At the conference last week Marc, Fraser and I met with Terttu Nevalainen and Matti Rissanen to discuss Glasgow hosting the Helsinki Corpus, which is currently only available on CD. This week I spent some time looking through the source code and getting a bit of server space set aside for hosting the resource. The scripts that power the corpus are Python based and I’ve not had a massive amount of experience with Python, but looking through the source code it all seemed fairly easy to understand. I managed to get the necessary scripts and the data (mostly XML and some plain text) uploaded to the server and the scripts executing. The only change I have so far made to the code is to remove the ‘Exit’ tab as this is no longer applicable. We will need to update some of the site text and also add in a ‘hosted by Glasgow’ link somewhere. The corpus all seems to work online in the same way as it does on the CD now, which is great. The only problem is the speed of the search facilities. The search is very slow, and can take up to 30 seconds to run. Without delving into the code I can’t say why this is the case, but I would suspect it is because the script has to run through every XML file in the system each time the search runs. There doesn’t appear to be any caching or indexing of the data (e.g. using an XML database) and I would imagine that without using such facilities we won’t be able to do much to improve the speed. The test site isn’t publicly accessible yet as I need to speak to Marc about it before we take things further.
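If the slowness really is down to scanning every XML file on each search, a pre-built index would be the usual remedy. A toy sketch of the idea (the corpus scripts themselves are Python; this JavaScript version, with an invented mini-corpus, is only to illustrate the approach):

```javascript
// Toy inverted index: map each word to the documents containing it,
// built once up front, so a search becomes a lookup rather than a full scan.
function buildIndex(docs) {
  const index = new Map();
  for (const [id, text] of Object.entries(docs)) {
    for (const word of text.toLowerCase().split(/\W+/).filter(Boolean)) {
      if (!index.has(word)) index.set(word, new Set());
      index.get(word).add(id);
    }
  }
  return index;
}

function search(index, word) {
  return [...(index.get(word.toLowerCase()) || [])];
}

// Invented two-document corpus for illustration:
const index = buildIndex({
  d1: 'the king rode forth',
  d2: 'the queen spoke'
});
// search(index, 'the') → ['d1', 'd2']; search(index, 'king') → ['d1']
```

In practice an XML database or an off-the-shelf indexing tool would do this job properly, but the principle is the same: pay the cost of the scan once, not on every search.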

Week Beginning 28th September 2015

This week was a return to something like normality after the somewhat hectic time I had in the run-up to the launch of the Scots Thesaurus website last week. I spent a bit of further time on the Scots Thesaurus project, making some tweaks to things that were noticed last week and adding in some functionality that I didn’t have time to implement before the launch. This included differentiating between regular SND entries and supplemental entries in the ‘source’ links and updating the advanced search functionality to enable users to limit their search by source. I also spent the best part of a day working on the Technical Plan for the Burns people, submitting a first draft and a long list of questions to them on Monday. Gerry and Pauline got back to me with some replies by the end of the week and I’ll be writing a second version of the plan next week.

I also had a few meetings this week. I met with Luca Guariento from Music to help him with a JavaScript problem he was having. He’s using the Diva.js library as an image viewer, but he wanted to be able to pass specific image page IDs to the library via links in a search results page. We got something worked out using hashes to pass IDs between pages, but encountered some problems with the initialisation of Diva.js. Thankfully Luca has been in contact with the Diva developers and they told him how he could hook into the Diva initialisation event, and all is working nicely now. I must remember to use Diva on a future project as it’s a very nice interface. I also met with Iain Edmonds, who is developing some quizzes for the Sounds of the City project. He wanted some advice on linking to specific parts of a web page and I gave him some pointers.
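The hash trick is just encoding the target page ID in the URL fragment on the results page and reading it back when the viewer page loads. A sketch of the two halves (the function names and `#page=` format are my own invention, and the Diva-specific wiring isn’t shown):

```javascript
// Hypothetical helpers for passing a page ID between pages via the URL hash.
// The results page links to e.g. viewer.html#page=12, and the viewer page
// reads the fragment back before initialising the image viewer.

function buildViewerLink(baseUrl, pageId) {
  return baseUrl + '#page=' + encodeURIComponent(pageId);
}

function parsePageFromHash(hash) {
  const match = /^#page=(.+)$/.exec(hash);
  return match ? decodeURIComponent(match[1]) : null;
}
```

The advantage of the fragment over a query string is that following such a link within the same page doesn’t trigger a reload, which suits single-page viewers like Diva.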

On Friday we had a team meeting for the Metaphor in the Curriculum project. We spent a couple of hours going over the intended outputs of the project and getting some more concrete ideas about how they might be structured and interconnected, and also about timescales for development. It’s looking like I will be creating some mockups of possible exercise interfaces in early November, based on content that Ellen is going to send to me this month. I will then start to develop the app and the website in December, with testing and refinement in January or thereabouts.

I also spent some time this week working on the Medical Humanities Network website for Megan Coyer. I have now completed the keywords page and the ‘add and edit keywords’ facilities, and I’ve added in options to add and edit organisations and units. I think that means all the development work is now complete! I’ll still need to add in the site text when it has been prepared and to remove the ‘log in’ pop-up when the site is ready to go live, but other than that my work on this project is done.

Continuing on a Medical Humanities theme, I spent a few hours this week working on some of the front-end features for the SciFiMedHums website, specifically features that will allow users to browse the bibliographical items by things like year and theme. There’s still a lot to implement but it’s coming along quite nicely. I also helped Alison Wiggins out with a new website she wants to set up. It’s another WordPress-based site and the bare-bones site is now up and running, ready for her to work with when she has the time available.

On Friday afternoon I received my new desktop PC for my office and I spent quite a bit of the afternoon getting it set up, installing software, copying files across from my old PC and things like that. It’s going to be so good to have a PC that doesn’t crash if you tell it to open Excel in the afternoons!