Week Beginning 24th November 2014

I spent a day or so this week continuing to work on the Scots Thesaurus project. Last week I started to create a tool that will allow a researcher to search the Historical Thesaurus of English for a word or phrase, select words from the list that gets returned and then automatically search the contents of the Dictionary of the Scots Language for these terms. I completed a first version of the tool this week. In order to get it working I needed to get the ‘BaseX’ XML database installed on a server. The Arts Support people didn’t want to install this on a production server so they set it up for me on the Arts Testbed server instead, which is fine as it meant I had a deeper level of access to the database than I would otherwise have got. Using the BaseX command-line tools I created a new database for the DSL XML data and then imported the data. A PHP client is available for BaseX, allowing PHP scripts to connect to the database in much the same way as they would connect to a MySQL database, and I created a test script to see how this would work, based on the FLWOR query I had experimented with last week. This worked fine and returned a set of XML-formatted results.
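For anyone interested, the connection pattern looks roughly like the following. This is only a minimal sketch using the Session class from the standard BaseXClient.php that ships with BaseX; the host, credentials, database name (‘dsl’) and the ‘entry’ element name are placeholders rather than the actual setup.

```php
<?php
// Minimal sketch: connect to BaseX from PHP and run a FLWOR query
// against the imported DSL data. Connection details, the database
// name and the element names are illustrative only.
include("BaseXClient.php");

try {
    // The BaseX server listens on port 1984 by default
    $session = new Session("localhost", 1984, "admin", "admin");

    // A simple FLWOR query: return every entry whose text contains 'golf'
    $xquery = 'for $entry in db:open("dsl")//entry ' .
              'where contains(lower-case(string($entry)), "golf") ' .
              'return $entry';

    $query = $session->query($xquery);
    while ($query->more()) {
        echo $query->next() . "\n";  // each result is a chunk of entry XML
    }
    $query->close();
    $session->close();
} catch (Exception $e) {
    echo "BaseX error: " . $e->getMessage();
}
```

From the PHP side it really does feel much like issuing a query against MySQL and then looping over the rows of results.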

The HTE part of the tool that I had developed last week only allowed the user to select or deselect terms before they were passed to the DSL query, but I realised this was a little too limiting: some of the HT words have letters in brackets or dashes and may need to be tweaked, plus the user might want to search for additional words or word forms. For these reasons I adapted the tool to present the selected words in an editable text area before they are passed to the DSL query. Now the user can edit and augment the list as they see fit.
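In practice this just means writing the ticked words into a text area on an intermediate form rather than posting them straight to the DSL query. Something along these lines, although this is a sketch only and the field names and the ‘dsl-search.php’ target are made up for illustration:

```php
<?php
// Sketch: pre-fill an editable textarea with the HT words the user ticked,
// so the list can be tweaked or added to before it is sent to the DSL query.
// The 'words' parameter and the form action are placeholders.
$selected = isset($_POST['words']) ? (array) $_POST['words'] : array();
$selected = array_map('trim', $selected);
?>
<form action="dsl-search.php" method="post">
    <textarea name="terms" rows="10" cols="40"><?php
        echo htmlspecialchars(implode("\n", $selected));
    ?></textarea>
    <br>
    <input type="submit" value="Search the DSL">
</form>
```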

The BaseX database on the server currently seems to be rather slow and at least slightly flaky. During my experimentation it crashed a few times, and on one occasion it somehow managed to lose the DSL database entirely and I had to recreate it. It’s probably just as well it’s not located on a production server. Having said that, I have managed to get the queries working, thus connecting up the HTE and DSL data sources. For example, a user can find all of the words that are used to mean ‘golf’ in the HTE, edit the list of words and then, at the click of a button, search the text of the DSL (excluding citations, as Susan requested) for these terms, bringing back the XML of each entry in which a term is found. I’ve ‘borrowed’ the XSLT file I created for the DSL website to format the returned entries, and the search terms are highlighted in these entries to make things easier to navigate. It’s working pretty well, although I’m not sure how useful it will prove to be. I’ll be meeting with Susan next week to discuss this.
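The query itself follows the same FLWOR pattern as before, with the citation elements filtered out before the entry text is tested. Again this is only a sketch: the ‘entry’ and ‘cit’ element names are my shorthand for the DSL markup rather than the real schema, and the term list here is hard-coded rather than taken from the editable text area.

```php
<?php
// Sketch of the kind of query used to search entry text while ignoring
// citations. Element names (<entry>, <cit>) and the database name are
// assumptions, not the real DSL schema.
include("BaseXClient.php");

$terms = array("golf", "gowf");  // terms pasted in from the edited HT list

// Build one 'contains' test per term against the entry text minus citations
$conditions = array();
foreach ($terms as $t) {
    $safe = strtolower(str_replace('"', '', $t));
    $conditions[] = 'contains(lower-case(string-join(' .
                    '$entry//text()[not(ancestor::cit)], " ")), "' . $safe . '")';
}
$xquery = 'for $entry in db:open("dsl")//entry where ' .
          implode(' or ', $conditions) . ' return $entry';

$session = new Session("localhost", 1984, "admin", "admin");
$query = $session->query($xquery);
$results = array();
while ($query->more()) {
    $results[] = $query->next();  // entry XML, ready to pass through the XSLT
}
$query->close();
$session->close();
```

Each returned entry can then be run through the borrowed XSLT (via something like PHP’s XSLTProcessor) with the search terms wrapped in a highlight before output.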

I also spent a little time this week updating the Digital Humanities Network website. Previously it still had the look and feel of the old University website, but I’ve now updated all of the pages to bring it into line with the current University website. I think it looks a lot better. I also had a further meeting with Megan Coyer this week, who is hoping to get a small grant to develop a ‘Medical Humanities Network’ based on the DH Network but with some tweaks. It was a good meeting and I think we now know exactly what is required from the website and Content Management System, and how much time it will take to get the resource up and running if we get the funding.

I spent most of the rest of the week working on the Mapping Metaphor website. Ellen had sent me some text for various pages of the site so I added this. I also continued to work through my ‘to do’ list. I finished off the outstanding tasks relating to the ‘tabular view’ of the data, for example adding in table headings, colour-coding the categories that are listed and extending the tabular view to the ‘aggregate’ level, enabling the user to see a list of all of the level 2 categories (e.g. ‘The Earth’) together with the number of connections each of these categories has to the others. I also added in links to the ‘drill down’ view, allowing the user to open a category while remaining in the tabular view. After completing this I turned my attention to the ‘Browse’ facility. This previously just showed metaphor connections of any strength, but I have now added a strength selector. The browse page also previously only showed the number of metaphorical connections each category has, rather than the number of categories within each higher-level category; I’ve updated this as well now. I also had a request to allow users to view a category’s keywords from the browse page, so I’ve added this facility too, using the same ‘drop-down’ mechanism that I’d previously used for the search results page. The final update I made to the browse page was to ensure that links to categories now lead to the new tabular view of the data rather than to the old ‘category’ page, which is now obsolete.
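For what it’s worth, the strength selector is essentially just an extra condition on the query behind the browse page. The snippet below is purely illustrative; the table and column names are invented for the example and won’t match the actual Mapping Metaphor database.

```php
<?php
// Illustrative only: filter the browse-page connection counts by metaphor
// strength. Table and column names are invented for this example.
$strength = isset($_GET['strength']) ? $_GET['strength'] : 'all';

$sql = 'SELECT category_id, COUNT(*) AS connections FROM metaphor_connections';
if ($strength === 'strong' || $strength === 'weak') {
    $sql .= ' WHERE strength = :strength';
}
$sql .= ' GROUP BY category_id';

$db = new PDO('mysql:host=localhost;dbname=metaphor', 'user', 'pass');
$stmt = $db->prepare($sql);
if ($strength === 'strong' || $strength === 'weak') {
    $stmt->bindValue(':strength', $strength);
}
$stmt->execute();
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo $row['category_id'] . ': ' . $row['connections'] . " connections\n";
}
```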

After this I began working on the metaphor cards, updating the design of the cards that appear when a connecting line is clicked on in the diagram to reflect the design that was chosen at the Colloquium. I’m almost finished with this but still need to work on the ‘Start Era’ timeline and the aggregate card box. After that I’ll get the ‘card view’ of the data working.

On Friday afternoon I attended the launch event for a new ‘Video Games and Learning’ journal that my old HATII colleague Matthew Barr has been putting together. It’s called ‘Press Start’ and can be found here: http://press-start.gla.ac.uk/index.php/press-start. The launch event was excellent, with a very interesting and thought-provoking lecture by Dr Esther McCallum-Stewart of the University of Surrey.