
Month: December 2014
Week Beginning 15th December 2014
This week was my last week before my Christmas holidays and it was rather hectic. It was mostly split between Mapping Metaphor and Burns. For Burns, Pauline had sent me the content for two of the three timeline streams that the timeline will feature. The timeline is something that has been bubbling away for a long time now and it was great to finally get some real content to play with. I’m using the freely available timeglider library (http://timeglider.com/widget/?p=intro) and have set it up to have three individual streams (‘Life’, ‘Publications’ and ‘Prose’), each of which can be turned on and off. The default timeglider functionality allows you to do this from a settings menu in the footer but this is rather hidden and I wanted some nice big buttons above the timeline too. This required rather a lot of furtling about with the timeglider source code to extend the default functionality. The public methods of the widget didn’t allow timelines to be turned on and off by a call from beyond the timeglider code, but I extended the methods to allow for such behaviour. I now have three nice buttons that you can use to toggle the timelines (because having all three on at once is rather cluttered). The timeline should launch early next month and I’ll post a link once it’s available. Also for Burns I was asked to create a new section of the site and set up the necessary files to enable a new song to be played. I also set up the basics of a new website for a project Pauline has recently started work on (Bawdry in Scottish Chapbooks) and gave feedback on the content of the map pop-ups that Pauline is working on. The new maps of the tours should also go live early next week.
For Mapping Metaphor I provided some further input to the PI response for the follow-on funding project and then spent the rest of the week working on three rather major tasks:
1. Implementing the renumbering of the MM Categories. As there are now more than 26 level 2 categories we can’t use letters to represent them. We previously decided to add the level 1 category as a number at the start – so A01 becomes 1A01. This has meant changing the structure of the database and implementing some fairly major changes to the front-end as well. It took quite a long time to get this sorted but the new category numbers should now be in place throughout the site and all would appear to be working fine.
2. I created scripts for Ellen and Flora that have stripped out the duplicate rows from Flora’s database and have automatically applied consolidated metaphor strength codes to a lot of the data. The ‘remove duplicates’ script has whittled the rows down from 36,741 to 18,975. The ‘consolidated strength’ script has identified 12,090 rows that didn’t already have a consolidated strength and has supplied such data for 9418 rows, leaving 2672 for manual inspection.
3. I’ve begun the process of replacing connections from the front-end to the database. The front end was previously using the dedicated mysql functions to connect to and query the database, but these functions are deprecated in the most recent version of PHP, meaning that in a few years they will likely no longer be supported. The alternative approach is to use PHP Data Objects (PDO), which should be future-proof and also means that switching to a different database (if ever required) would involve changing a single line of code rather than reworking the entire system. Unfortunately, to implement PDO I do need to change the entire system right now – every single database query needs to be replaced and tested. I had hoped to complete the migration today but it’s taken longer than I expected and there are currently some parts of the site that are broken (e.g. the table view).
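The two clean-up scripts in item 2 boil down to a pair of passes over the data. Here is a minimal sketch in Python rather than the PHP actually used; the field names and the "apply the consolidated strength where the two coders agree" rule are my assumptions for illustration, not necessarily the project's actual logic:

```python
# Sketch of the two clean-up passes over Flora's data. Rows are
# dicts; 'cat1'/'cat2', 'coder1'/'coder2' and the agreement rule
# are illustrative assumptions, not the project's real schema.

def remove_duplicates(rows, key=('cat1', 'cat2')):
    """Keep only the first row seen for each category pair."""
    seen, kept = set(), []
    for row in rows:
        k = tuple(row[f] for f in key)
        if k not in seen:
            seen.add(k)
            kept.append(row)
    return kept

def consolidate(rows):
    """Fill in a consolidated strength where the two coders agree;
    return the rows left over for manual inspection."""
    manual = []
    for row in rows:
        if row.get('strength') is None:
            if row['coder1'] == row['coder2']:
                row['strength'] = row['coder1']
            else:
                manual.append(row)
    return manual
```

Run over the real tables, passes like these produced the reductions described above (36,741 rows down to 18,975, and 9,418 strengths supplied automatically).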
Once I’ve completed the migration to PDO we’ll need to do some pretty extensive testing of the site. Changing the data query method combined with the category number update is basically ripping out and rebuilding the heart of the system and it’s likely that some bugs will have crept in. However, fixing any that are identified shouldn’t be too tricky and I think the website is still very much on schedule.
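For anyone wondering why the PDO migration is worth this pain: the driver-specific part of the code shrinks to a single connection line, and every query uses placeholders rather than hand-built SQL strings. PDO itself is PHP, but the pattern exists in most languages; here it is sketched with Python's built-in sqlite3 module purely as an illustration (the table and values are made up for the example):

```python
# The PDO idea, sketched with Python's DB-API: the driver-specific
# part is the single connect() line, and queries use placeholders,
# so swapping databases means changing one line, not every query.
import sqlite3

conn = sqlite3.connect(':memory:')  # the one line a DB swap touches
conn.execute('CREATE TABLE category (id TEXT, label TEXT)')
conn.execute('INSERT INTO category VALUES (?, ?)', ('1A01', 'The Earth'))

row = conn.execute('SELECT label FROM category WHERE id = ?',
                   ('1A01',)).fetchone()
print(row[0])  # → 'The Earth'
```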
Also this week I completed the online training courses for recruitment and selection, meaning I should be able to attend the workshop in January and then be able to be on the panel for interviews. I also gave Carole some feedback for her Leverhulme bid.
That’s all for this week, and indeed this year. If anyone is reading this I hope you have a wonderful Christmas!
Week Beginning 8th December 2014
This week was another busy one, and I didn’t manage to get stuck into any Mapping Metaphor development tasks at all, due to other things I had to do. On Monday I prepared for and attended a Mapping Metaphor project meeting and, as always, it was good to catch up with the team and to let people know what I’ve recently been working on. I also spent some time reading through and commenting on a bid Carole is developing based on an earlier place-name project that unfortunately didn’t get funded. Here’s hoping this one is more successful.
On Tuesday I had a meeting with Jeremy and it was great to have a chat with him again about the work I’ve been doing. Jeremy is my line manager and this was actually only my second meeting with him this year, and he suggested that it might be a good idea for line manager duties to be transferred to someone else who can be more closely involved in what I do. I think this is a good idea. I am very good at just getting on with things; I like to think I am doing a good job and keeping everyone happy, and I don’t need much in the way of hands-on managerial support, but it probably does make sense to have a line manager who is more available.
I had to spend a fair bit of time this week preparing for David Shuttleton’s Medical Humanities event on Thursday – David had asked me to speak about Digital Humanities at Glasgow. Even though it was only a 15-minute talk it still took quite some time to prepare the materials and get everything ready. The talk on Thursday went well and I attended the whole day of the event. It was very interesting to hear about some of the other projects that are underway in the University, such as the digitisation of medical records that is being managed at the library, the Cullen project and others.
During the week Fraser Rowan emailed me to let me know that my ‘Knowledge Exchange’ blog and video clip had been posted. The blog post is available at http://www.keblog.arts.gla.ac.uk/2014/12/11/50-years-digital-college-arts/ and the video can be viewed at https://www.youtube.com/watch?v=YMhTrlSmm4k&feature=youtu.be.
I had a new task to do for the SAMUELS project this week as well. The people at Lancaster needed me to renumber a large section of the Historical Thesaurus (in a temporary table – not the real thing!) so that the primary keys would match up with category numbers as had been assigned by the OED people back when they got their hands on the data. It took a few hours to get this together, as initially there was some confusion as to exactly what was required. In the end I had to import one of the earlier versions of the HT from CSV and I worked with that to get them what they needed.
Flora also contacted me with some problems she’s been having with duplicate rows in the Mapping Metaphor Access database. She’s tried to remove them but hasn’t been able to find a method of doing so using Access. I said I would try to look at the data and see if there was any way to filter out the duplicates using MySQL and PHP. I started to look into this on Friday but didn’t quite have the time. I’ll need to return to this next week. Ellen also sent me the new numbering scheme for the project and I’ll have to implement this next week too. Wendy had received the reviews of the Mapping Metaphor follow-on funding proposal this week and I also spent a bit of time reading through these and providing technical responses where needed.
On Friday we had the Mapping Metaphor Christmas lunch, which was very nice. I took Friday afternoon off to do a bit of shopping. Pauline emailed me on Friday afternoon with a bunch of things to do for the Burns project so I’ll have to sort all those out next week too.
Week Beginning 1st December 2014
I was involved with quite a number of different projects and tasks this week. On Monday I spent some time writing a technical overview of the ‘Medical Humanities Network’ website and content management system for Megan Coyer, following our meeting last week. I also met with Gavin Miller to discuss a project he is putting together. This will likely involve me writing a WordPress plugin for the first time, which I’m quite interested in doing.
I also had a catch-up meeting with Susan Rennie regarding the Historical Thesaurus of Scots project. Susan seemed pretty happy with the HTE / DSL search tool I had previously created (although the BaseX database on the Testbed server had fallen over between me emailing the URL to the tool and our meeting). It was a useful meeting and I now have a clearer idea of what additional development tasks are required of me for the project. In terms of the tool, Susan would like me to be able to somehow automate the process of selecting DSL results for inclusion in the thesaurus, including selecting categories or creating new ones. This is potentially going to be rather problematic as the DSL XML items don’t actually have unique identifiers of any sort, or even an easily identifiable tag around the entry’s headword and part of speech. Susan is going to speak to the SLD people about how they got around this problem for the DSL website, as it is an issue that must certainly have cropped up.
Regarding the front-end for the project, Susan would like this to go live sooner rather than later, to enable members of the public to suggest words and upload images and sound clips. I am going to develop an interface based on the mock-up visualisations I previously created that will allow users to select a category and to then post content (once they are signed in). The current sticking point with this is that we haven’t yet decided on a structure for the thesaurus. I’ve currently been using the structure of the Historical Thesaurus of English, but Susan would like to use something a bit simpler and more akin to the existing paper version of the Thesaurus of Scots. So there are lots of technical tasks to do for the project, but for the time being some further decisions need to be made before I can really continue with the majority of them.
I was asked by Fraser Rowan to write a blog post and record a video clip about ‘digital’ in the College of Arts this week, so I spent a bit of time preparing for these. We recorded the video on Friday and all went pretty well, even though the top of my head has been chopped off and I look even more balding than I really am!
I attended a College of Arts developers meeting this week, which was a useful opportunity to talk to some of the other developers and find out a bit more about what they’re doing and the technologies they are using. Matthew Barr talked briefly about developing WordPress plugins, which was very useful to hear about as I will hopefully be doing this myself soon. I also gave a talk about Web App development, talking about my experiences with Apache Cordova and submitting Apps to the App store. People seemed pretty interested in what I had to say.
I spent most of the rest of the week continuing with Mapping Metaphor duties. On Tuesday I attended a meeting with Wendy, Ellen, Flora and Marc to discuss stages 4, 5 and 6 of the project (dealing with metaphor directionality, sample lexemes and dates, and the renumbering of the categories). It was a very useful meeting, at which we decided on a new numbering scheme for the categories. Previously we used letters A-Z to represent a level 2 category and then a two-digit number to represent the level 3 categories (e.g. A01). This isn’t going to work due to the level 2 categories being split up further, beyond 26. What we’ve decided to do is add an extra number at the start of the IDs to represent the level 1 category (External, Mental, Social worlds). So A01 becomes 1A01. This should work quite nicely as it makes it clearer which level 1 category a level 2 category belongs to. I am going to have to rework a lot of the code so it works with the new IDs, plus the database structure will have to be updated too, but it’s a task that needs doing and it’s good to have established exactly how it’s going to be done.
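The conversion itself is mechanical enough to sketch in a few lines. This is an illustration in Python rather than the project's actual PHP, and the letter-to-world mapping shown is purely hypothetical – the real assignment of level 2 letters to the three worlds lives in the project database:

```python
# Sketch of the category renumbering: an old letter-based ID like
# 'A01' gains its level 1 world number as a prefix, becoming '1A01'.
# This mapping is hypothetical; the real one is in the database.
LEVEL1_OF = {'A': 1, 'B': 1, 'C': 2, 'D': 3}

def renumber(old_id):
    """Convert an old ID ('A01') to the new scheme ('1A01')."""
    return f"{LEVEL1_OF[old_id[0]]}{old_id}"

print(renumber('A01'))  # → '1A01'
```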
In terms of actual development work, the big task I ticked off my ‘to do’ list this week was handling the metaphor cards plus the ‘card view’ of the data. The pop-ups now reflect the chosen card layout throughout, including having a ‘start era’ timeline running across them. I’ve also completed the ‘card view’ of the data as accessible from the ‘Change view’ option both at aggregate and drilldown levels. It is now possible to view any of the data at any level as cards, tables or visualisations. Timeline view is still to come.
Week Beginning 24th November 2014
I spent a day or so this week continuing to work on the Scots Thesaurus project. Last week I started to create a tool that will allow a researcher to search the Historical Thesaurus of English for a word or phrase, then select words from a list that gets returned and to then automatically search the contents of the Dictionary of the Scots Language for these terms. I completed a first version of the tool this week. In order to get the tool working I needed to get the ‘BaseX’ XML database installed on a server. The Arts Support people didn’t want to install this on a production server so they set it up for me on the Arts Testbed server instead, which is fine as it meant I had a deeper level of access to the database than I would otherwise have got from another server. Using the BaseX command-line tools I managed to create a new database for the DSL XML data and to then import this data. A PHP client is available for BaseX, allowing PHP scripts to connect to the database much in the same way as they would connect to a MySQL database, and I created a test script to see how this would work, based on the FLWOR query I had experimented with last week. This worked fine and returned a set of XML formatted results.
The HTE part of the tool that I had developed last week only allowed the user to select or deselect terms before they were passed to the DSL query, but I realised this was a little too limiting – some of the HT words have letters in brackets or with dashes and may need to be tweaked, plus the user might want to search for additional words or word forms. For these reasons I adapted the tool to present the selected words in an editable text area before they are passed to the DSL query. Now the user can edit and augment the list as they see fit.
The BaseX database on the server currently seems to be rather slow and at least slightly flaky. During my experimentation it crashed a few times, and one of these times it somehow managed to lose the DSL database entirely and I had to recreate it. It’s probably just as well it’s not located on a production server. Having said that, I have managed to get the queries working, thus connecting up the data sources of the HTE and the DSL. For example, a user can find all of the words that are used to mean ‘golf’ in the HTE, edit the list of words and then at the click of a button search the text of the DSL (excluding citations as Susan requested) for these terms, bringing back the entry XML of each entry where the term is found. I’ve ‘borrowed’ the XSLT file I created for the DSL website to format the returned entries and the search terms are highlighted in these entries to make things easier to navigate. It’s working pretty well, although I’m not sure how useful it will prove to be. I’ll be meeting with Susan next week to discuss this.
I also spent a little time this week updating the Digital Humanities Network website. Previously it still had the look and feel of the old University website but I’ve now updated all of the pages to bring it into line with the current University website. I think it looks a lot better. I also had a further meeting with Megan Coyer this week, who is hoping to get a small grant to develop a ‘Medical Humanities Network’ based on the DH Network but with some tweaks. It was a good meeting and I think we now know exactly what is required from the website and Content Management System and how much time it will take to get the resource up and running, if we get the funding.
I spent most of the rest of the week working on the Mapping Metaphor website. Ellen had sent me some text for various pages of the site so I added this. I also continued to work through my ‘to do’ list. I finished off the outstanding tasks relating to the ‘tabular view’ of the data, for example adding in table headings, colour coding the categories that are listed and also extending the tabular view to the ‘aggregate’ level, enabling the user to see a list of all of the level 2 categories (e.g. ‘The Earth’) and see the number of connections each of these categories has to the other categories. I also added in links to the ‘drill down’ view, allowing the user to open a category while remaining in the tabular view. After completing this I turned my attention to the ‘Browse’ facility. This previously just showed metaphor connections of any strength, but I have now added a strength selector. The browse page also previously only showed the number of metaphorical connections each category has, rather than showing the number of categories within each higher level category. I’ve updated this as well now. I also had a request to allow users to view a category’s keywords from the browse page so I’ve added this facility too, using the same ‘drop-down’ mechanism that I’d previously used for the search results page. The final update I made to the browse page was to ensure that links to categories now lead to the new tabular view of the data rather than to the old ‘category’ page which is now obsolete.
After this I began working on the metaphor cards, updating the design of the cards that appear when a connecting line is clicked on in the diagram to reflect the design that was chosen at the Colloquium. I’m almost finished with this but still need to work on the ‘Start Era’ timeline and the aggregate card box. After that I’ll get the ‘card view’ of the data working.
On Friday afternoon I attended the launch event for a new ‘Video Games and Learning’ journal that my old HATII colleague Matthew Barr has been putting together. It’s called ‘Press Start’ and can be found here: http://press-start.gla.ac.uk/index.php/press-start. The launch event was excellent, with a very interesting and thought-provoking lecture by Dr Esther McCallum-Stewart of the University of Surrey.