Week Beginning 2nd May 2016

It was another four-day week this week due to the May Day holiday on Monday.  I spent most of Tuesday finishing off a first version of the Technical Plan for the proposal Murray Pittock is putting together and sent it off to him for comment on Wednesday morning.  There will no doubt be some refinements to make to it before it can be integrated with the rest of the bid documentation, but I think overall the technical side of the potential project has been given sufficient consideration.  On Wednesday I heard from Christine Ferguson that a proposal she had submitted, which I had given technical advice on, had been awarded funding, which is great news.  I’m going to meet with her next week to discuss the technical requirements in more detail.

Also on Wednesday we had a College of Arts developers meeting.  This consisted of Matt Barr, Graeme Cannon, Neil McDermott and me, so it was not a huge gathering, but it was very useful to catch up with other developers and discuss some of the important issues for developers in the College.

I also spent some time on Wednesday investigating a bug in the Mapping Metaphor tabular view.  Wendy had noticed that ordering the table by category two was not working as it should.  It looks like I introduced this bug when I allowed the table to be properly ordered by the ‘direction’ column a few weeks ago.  For the table to be ordered by the direction column I needed to look at the full HTML of the data in the columns to get some info from within the ‘image’ tag of the arrow.  But for some reason the reordering wasn’t working properly for the other columns when the full HTML rather than the plain text was used.  I’ve updated things so that the full HTML is used only when the ‘direction’ column is clicked on and the plain text is used for all other columns.  I’ve fixed this in the main site and the Metaphoric site.  I’m afraid the problem also exists in the App, but I’ll wait to fix it until we have the next batch of data to add later this month.
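For anyone curious, the gist of the fix is picking the sort key differently per column: only the direction column needs to peek inside the cell’s HTML, while everything else sorts on the plain text. This is just an illustrative sketch, not the site’s actual code – the column names and the data attribute on the arrow image are invented:

```javascript
// Sketch: choose a per-column sort key for the tabular view.
// Column names and the data-direction attribute are hypothetical.
function sortKey(cellHtml, column) {
  if (column === 'direction') {
    // Only the direction column inspects the full HTML, pulling a value
    // out of the arrow image, e.g. <img src="arrow.png" data-direction="2">
    const match = cellHtml.match(/data-direction="(\d+)"/);
    return match ? match[1] : '';
  }
  // Every other column sorts on the plain text with tags stripped,
  // which avoids the HTML markup interfering with alphabetical order
  return cellHtml.replace(/<[^>]*>/g, '').trim();
}
```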

On Thursday I contacted Gareth Roy about the possibility of the Hansard data being hosted on one of the servers managed by Physics and Astronomy.  Gareth had previously been hugely helpful in giving advice on how to use the ScotGrid infrastructure to extract all of the Hansard data and to convert it into millions of SQL insert statements, but without a server to host the data I couldn’t proceed any further.  In March Gareth and I attended the same ‘big data’ meeting, and afterwards he suggested that there might be the possibility of getting the data hosted on one of the servers he has access to.  Now that Metaphoric is out of the way I have a bit of time to return to the Hansard data and consider how it can be used, and I’m going to meet with Gareth next week to consider the options.  In the meantime I tried to access the test server that Chris had set up for me last year, on which I had a subset of the Hansard data running through a graph-based front end that I’d created.  Unfortunately when I tried to connect to it nothing happened.  As the box is physically in my office (it’s just an old desktop PC set up as a server) I tried manually turning it off and on again, but it just made a worrying series of low-pitched beeps and then nothing happened.  I plugged a monitor and keyboard into it and there was no display; the caps lock key wouldn’t even light up.  I took out the hard drive and put it in my old desktop PC, which made the same worrying beeps and then did nothing when I started it up.  I then attached the drive as the secondary hard drive in my old PC and, although the PC booted into Windows successfully, the drive wasn’t found.  So I guess there’s been some sort of hard drive failure!  Thankfully I have all the code I was working with on my desktop PC anyway.  However, I’ve realised that the version of the database I have on my desktop PC is not the most recent version.  I’ll be able to reconstruct the database from the original data I have, but it’s a shame I’ve lost the working version.  It’s completely my own fault for not backing things up.  Chris is going to see if he can access the hard drive too, but I’m not holding out much hope.

On Friday I had a meeting with Alison Wiggins to discuss a proposal she is putting together that will involve crowdsourcing.  I did a bit of research into the Zooniverse Scribe tool (http://scribeproject.github.io/), which she was keen to use.  I also tried installing the tool on OS X but despite following their detailed installation instructions all I ended up with was a bunch of errors.  Nevertheless, it was useful to learn about the tool and to try out some online examples of it as well.  I think it has potential, if we can only get it working.  The meeting with Alison went well and we discussed the technical issues relating to her project and how crowdsourcing might fit in with it.  She is still at the planning stages at the moment and we’ll need to see if the project will feature a technical aspect or if it will involve some kind of event discussing issues relating to crowdsourcing instead.  I think either might work pretty well.

Also this week I received an external hard drive in the mail from the Royal College of Physicians in Edinburgh, who want access to the raw TIFF images that were produced for the Cullen project.  Project PI David Shuttleton had previously agreed to this so I started the process off.  Copying half a terabyte of images from a network drive to an external hard drive takes rather a long time, but by Friday morning I had completed the process and had mailed the hard drive off.  Hopefully it will arrive in one piece.

 

Week Beginning 22nd February 2016

I divided my time this week primarily between three projects: REELS, The People’s Voice and the Mapping Metaphor follow-on project. For REELS I continued with the content management system. After completing the place-name element management systems last week I decided this week to begin to tackle the bigger issue of management scripts for place-names themselves. This included migrating parish details into the database from a spreadsheet that Eila had previously sent me and migrating the classification codes from the Fife place-name database. I began work on the script that will process the addition of a new place-name record, creating the form that project staff will fill in, including facilities to add any number of map sheet records.

I initially included facilities to associate place-name elements with this ‘add’ form, which proved to be rather complicated.  A place-name may have any number of elements and these might already exist in our element database.  I created an ‘autocomplete’ facility whereby a user starts to type an element and the system queries the database and brings back a list of possible matching items.  This was complicated by the fact that elements have different languages, and the list that’s returned should be different depending on what language has been selected.  There are also many fields that the user needs to complete for each element, more so if the element doesn’t already exist in the database.  I began to realise that including all of this in one single form would be too overwhelming for users and decided instead to split the creation and management of place-names across multiple forms.  The ‘Add’ page would allow the user to create the ‘core’ record, which wouldn’t include place-name elements and historical forms.  These materials will instead be associated with the place-name via the ‘browse place-names’ table, with separate pages specifically for elements and historical forms.  Hopefully this set-up will be straightforward to use.
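To give a flavour of how the language filtering works, here’s a much simplified sketch of the lookup behind the autocomplete. The element data and field names are invented for illustration – in the real system the query runs against the project database on the server:

```javascript
// Invented sample of the element database for illustration
const elements = [
  { form: 'baile', language: 'Gaelic' },
  { form: 'barr',  language: 'Gaelic' },
  { form: 'burn',  language: 'Scots'  },
];

// Return element forms in the selected language that start with the typed text,
// mimicking what the autocomplete endpoint would send back
function suggest(typed, language) {
  const prefix = typed.toLowerCase();
  return elements
    .filter(e => e.language === language && e.form.startsWith(prefix))
    .map(e => e.form);
}
```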

After reaching this decision I shelved the work I’d done on associating place-name elements and instead set to work on completing the ‘core’ place-name data upload form.  This led me onto another interesting task.  The project will be recording grid references for places, and I had previously worked out that it would be possible for a script to automatically generate the latitude and longitude from the grid reference, which in turn would allow for altitude to be retrieved from Google Maps.  I used a handy PHP-based library available here: http://www.jstott.me.uk/phpcoord/ to generate the latitude and longitude from the grid reference and then I integrated a Google Map in order to get the altitude (or elevation as Google calls it) based on instructions found here: https://developers.google.com/maps/documentation/javascript/elevation.  By the end of the week I had managed to get this working, other than actually storing the altitude data, which I should hopefully be able to get sorted next week.
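For the record, the first step a library like phpcoord performs is expanding an OS grid reference such as ‘NT 2737 7314’ into full metre-based easting and northing values, from which latitude and longitude (and then the elevation lookup) can be derived. Here’s a rough sketch of that step using the OS grid letter scheme; it’s simplified for illustration and skips validation and squares outside the usual GB range:

```javascript
// Expand an OS grid reference (e.g. "NT 2737 7314") into full
// easting/northing values in metres.
function gridRefToEastNorth(ref) {
  const [letters, e, n] = ref.split(' ');
  // Letter index with 'I' skipped, as in the OS grid letter scheme
  const idx = ch => {
    const i = ch.charCodeAt(0) - 65;           // A = 0
    return i > 7 ? i - 1 : i;                  // letters after 'I' shift down
  };
  const l1 = idx(letters[0]), l2 = idx(letters[1]);
  // Work out which 100km square the letter pair denotes
  const e100km = ((l1 - 2) % 5) * 5 + (l2 % 5);
  const n100km = (19 - Math.floor(l1 / 5) * 5) - Math.floor(l2 / 5);
  // Pad shorter references (e.g. 4-figure) up to metre resolution
  const scale = 10 ** (5 - e.length);
  return {
    easting:  e100km * 100000 + Number(e) * scale,
    northing: n100km * 100000 + Number(n) * scale,
  };
}
```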

For The People’s Voice project I had an email conversation with the RA Michael Shaw about the structure of the database. Michael had met with Catriona to discuss the documentation I had previously created relating to the database and the CSV template form. Michael had sent me some feedback and this week I created a second version of the database specification, the template form and the accompanying guidelines based on this feedback. I think we’re pretty much in agreement now on how to proceed and next week I hope to start on the content management system for the project.

For Metaphor in the Curriculum I continued with my work to port all of the visualisation views from relying on server-side data and processing to a fully client-side model instead. Last week I had completed the visualisation view and had begun on the tabular view. This week I managed to complete the tabular view, the card view and also the timeline view. Although that sentence was very quick to read, actually getting all of this done took some considerable time and effort, but it is great to get it all sorted, especially as I had some doubts earlier on as to whether it would even be possible. I still need to work on the interface, which I haven’t spent much time adapting for the App yet. I also managed to complete the textual ‘browse’ feature this week as well, using jQuery Mobile’s collapsible lists to produce an interface that I think works pretty well. I still haven’t tackled the search facilities yet, which is something I hope to start on next week.

In addition to this I attended a meeting with the Burns people, who are working towards publishing a new section on the website about song performance. We discussed where the section should go, how it should function and how the materials will be published. It was good to catch up with the team again. I also had a chat with David Shuttleton about making some updates to the Cullen online resource, which I am now responsible for. I spent a bit of time going through the systems and documentation and getting a feel for how it all fits together. I also made a couple of small tweaks to the Medical Humanities Network website to ensure that people who sign up have some connection to the University.

Week Beginning 8th December 2014

This week was another busy one, and I didn’t manage to get stuck into any Mapping Metaphor development tasks at all, due to other things I had to do.  On Monday I prepared for and attended a Mapping Metaphor project meeting and, as always, it was good to catch up with the team and to let people know what I’ve recently been working on.  I also spent some time reading through and commenting on a bid Carole is developing based on an earlier place-name project that unfortunately didn’t get funded.  Here’s hoping this one is more successful.

On Tuesday I had a meeting with Jeremy and it was great to have a chat with him again about the work I’ve been doing.  Jeremy is my line manager, and as this was actually only my second meeting with him this year, he suggested that it might be a good idea for line manager duties to be transferred to someone else who can be more closely involved in what I do.  I think this is a good idea.  I am very good at just getting on with things; I like to think I am doing a good job and keeping everyone happy, and I don’t need much in the way of hands-on managerial support, but it probably does make sense to have a line manager who is more available.

I had to spend a fair bit of time this week preparing for David Shuttleton’s Medical Humanities event on Thursday – David had asked me to speak about Digital Humanities at Glasgow.  Even though it was only a 15 minute talk it still took quite some time to prepare the materials and get everything ready.  The talk on Thursday went well and I attended the whole day of the event.  It was very interesting to hear about some of the other projects that are underway in the University, such as the digitisation of medical records that is being managed at the library, the Cullen project and others.

During the week Fraser Rowan emailed me to let me know that my ‘Knowledge Exchange’ blog and video clip had been posted.  The blog post is available at http://www.keblog.arts.gla.ac.uk/2014/12/11/50-years-digital-college-arts/ and the video can be viewed at https://www.youtube.com/watch?v=YMhTrlSmm4k&feature=youtu.be.

I had a new task to do for the SAMUELS project this week as well.  The people at Lancaster needed me to renumber a large section of the Historical Thesaurus (in a temporary table – not the real thing!) so that the primary keys would match up with category numbers as had been assigned by the OED people back when they got their hands on the data.  It took a few hours to get this together, as initially there was some confusion as to exactly what was required.  In the end I had to import one of the earlier versions of the HT from CSV and I worked with that to get them what they needed.
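The renumbering itself boils down to generating a batch of UPDATE statements from the old-to-new mapping. Here’s a rough sketch with invented table and column names (in practice you’d want to renumber via an offset or a fresh column first, to avoid primary key collisions while the updates run):

```javascript
// Hypothetical sketch: given a mapping from HT category ids to the
// OED-assigned numbers, emit UPDATE statements against a temporary
// copy of the table. Table and column names are invented.
function renumberStatements(mapping) {
  return Object.entries(mapping).map(([htId, oedId]) =>
    `UPDATE ht_category_temp SET id = ${oedId} WHERE id = ${htId};`
  );
}
```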

Flora also contacted me with some problems she’s been having with duplicate rows in the Mapping Metaphor Access database.  She’s tried to remove them but hasn’t been able to find a method of doing so using Access.  I said I would try to look at the data and see if there was any way to filter out the duplicates using MySQL and PHP.  I started to look into this on Friday but didn’t quite have the time.  I’ll need to return to this next week.  Ellen also sent me the new numbering scheme for the project and I’ll have to implement this next week too.  Wendy had received the reviews of the Mapping Metaphor follow-on funding proposal this week and I also spent a bit of time reading through these and providing technical responses where needed.
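The basic approach I have in mind is simple enough: once the rows are out of Access and reachable via MySQL and PHP, keep only the first occurrence of each identifying combination of fields. A sketch of the filtering logic, with field names invented for illustration:

```javascript
// Keep the first row seen for each combination of identifying fields,
// dropping any later duplicates. Field names are hypothetical.
function dedupe(rows) {
  const seen = new Set();
  return rows.filter(row => {
    const key = `${row.category1}|${row.category2}|${row.strength}`;
    if (seen.has(key)) return false;  // duplicate - drop it
    seen.add(key);
    return true;                      // first occurrence - keep it
  });
}
```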

On Friday we had the Mapping Metaphor Christmas lunch, which was very nice.  I took Friday afternoon off to do a bit of shopping.  Pauline emailed me on Friday afternoon with a bunch of things to do for the Burns project so I’ll have to sort all those out next week too.

Week Beginning 11th February 2013

I had an afternoon of meetings on Friday so it’s another Monday morning blog post from me.  It was another busy week for me, more so because my son was ill and I had to take Tuesday off as holiday to look after him.  This meant trying to squeeze into four days what I had hoped to tackle in five, which led to me spending a bit less time than I would otherwise have liked on the STELLA app development this week.  I did manage to spend a few hours continuing to migrate the Grammar book to HTML5 but there are still a couple of sections to do.  I’m currently at the beginning of Section 8.

I did have a very useful meeting with Christian Kay regarding the ARIES app on Monday, however.  Christian has been experiencing some rather odd behaviour with some of the ARIES exercises in the web browser on her office PC and I offered to pop over and investigate.  It all centres on the most complicated exercise of all – the dreaded ‘Test yourself’ exercise in the ‘Further Punctuation’ section (see how it works for you here: http://www.arts.gla.ac.uk/STELLA/briantest/aries/further-punctuation-6-test-yourself.html). In stage 2 of the exercise, clicking on words fails to capitalise them, while in stage 3 adding an apostrophe also makes ‘undefined’ appear in addition to the apostrophe.  Of course these problems are only occurring in Internet Explorer, but very strangely I am unable to replicate them in IE9 on Windows 7, IE9 on Windows Vista or IE8 on Windows XP!  Christian is using IE8 on Windows 7, and it looks like I may have to commandeer her computer to try and fix the issue.  As I am unable to replicate it on the three Windows machines I have access to, it’s not really possible to fix the issue any other way.

Christian also noted that clicking quickly multiple times to get apostrophes or other punctuation to appear was causing the text to highlight, which is a bit disconcerting.  I’ve implemented a fix for this that blocks the default ‘double click to highlight’ functionality for the exercise text.  It’s considered bad practice to do such a thing (jQuery UI used to provide a handy function that did this very easily but they removed it – see http://api.jqueryui.com/disableSelection/ ) but in the context of the ARIES exercise its use is justifiable.
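The replacement for jQuery UI’s removed disableSelection is essentially to cancel the browser’s default behaviour whenever a mousedown is part of a double or triple click. A minimal sketch of the handler logic is below; in the page itself this would be attached with addEventListener (or jQuery’s `.on()`) alongside a CSS user-select rule, and the function name is my own:

```javascript
// Suppress text selection caused by rapid repeated clicks on the
// exercise text. event.detail counts consecutive clicks, so a value
// of 2 or more means a double or triple click.
function blockMultiClickSelection(event) {
  if (event.detail > 1) {
    event.preventDefault();  // stop the browser selecting the text
    return true;             // selection suppressed
  }
  return false;              // ordinary single click - leave it alone
}
```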

Christian also wants some further updates to the ‘Test Yourself’ exercise, specifically the option to show where the correct answers are in Stage 2, so there is going to be some further ARIES work in the next week or so.  This kind of testing is absolutely vital when developing apps where the logic is handled on the client side.  Different browsers implement Javascript slightly differently and I’ll need to ensure that all the major operating systems, browsers and hardware are tested.  Marc suggested getting the English Language Society to help with the testing, which I think will be a very useful thing to do, once we get to that stage.

I also spent a little bit of time this week reworking the layout for the ICOS2014 conference website, although there is still some work to do with this.  I’ve been experimenting with responsive web design, whereby the interface automatically updates to be more suitable on smaller screens (e.g. mobile devices).  This is currently a big thing in interface design so it’s good for me to get a bit of experience with the concepts.

Following on from my meeting with Susan Rennie last week I created a three page technical specification document for the project that she is hoping to get funding for.  This should hopefully include sufficient detail for the bid she is putting together and gives us a decent amount of information about how the technology used for the project will operate.  Susan has also sent me some sample data and I will begin working with this to get some further, more concrete ideas for the project.

I also began work on the technical materials for the bid for the follow-on project for Bess of Hardwick.  This is my first experience with the AHRC’s ‘Technical Plan’, which replaced the previous ‘Technical Appendix’ towards the end of last year.  In addition to the supporting materials found on the AHRC’s website, I’m also using the Digital Curation Centre’s Data Management Planning Tool (https://dmponline.dcc.ac.uk/) which provides additional technical guidance tailored to many different funding applications, including the AHRC.

On Thursday I had a meeting with the Burns people about the choice of timeline software for the Burns Timeline that I will be putting together for them.  In last week’s post I listed a few of the pieces of timeline software that I had been looking at as possibilities and at the meeting we went through the features the project requires.  More than six categories are required, and the ability to search is a must, so the rather nice-looking VeriteCo Timeline was ruled out.  It was also decided that integration with WordPress would not be a good thing as they don’t want the Timeline to be too tightly coupled with the WordPress infrastructure, thus enabling it to have an independent existence in future if required.  We decided that Timeglider would be a good solution to investigate further and the team is going to put together a sample of about 20 entries over two categories in the next couple of weeks so I can see how Timeglider may work.  I think it’s going to work really well.

On Friday I met with Mark Herraghty to discuss some possibilities for further work for him and also for follow-on funding for Cullen.  After that I met with Marc Alexander to discuss the bid we’re going to put together for the Chancellors’ fund to get someone to work on migrating the STELLA corpora to the Open Corpus Workbench.  We also had a brief chat about the required thesaurus work and the STELLA apps.  Following this meeting I had a conference call with Marc, Jeffrey Robinson and Karen Jacobs at Colorado University about Jeffrey’s Wordsworth project.  It was a really useful call and Jeffrey and Karen are going to create a ‘wishlist’ of interactive audio-visual ideas for the project, on which I will then give technical input in preparation for a face-to-face meeting in May.

Week Beginning 24th September 2012

This week involved slightly fewer meetings than previous weeks, and for the first time since I started I managed to tackle some redevelopment tasks.  I managed to get access to some of the STELLA resources this week; access to some of the websites (but not all, yet) and access to the desktop-based applications.  Access to the latter was made possible by Arts Support bringing me one of the old STELLA lab machines for my desk.  I haven’t tried any of the applications yet but I have started to go through some of the server-based resources.

As a ‘quick win’ I tackled the issue of the ‘Basics of English Metre’ website (http://www.arts.gla.ac.uk/STELLA/Metre/MetreHome.html) only working in Internet Explorer.  In other browsers none of the coloured text was working, and more importantly none of the Flash-based exercises were displaying.  The first problem was solved by removing non-standard comment tags from the CSS file, while the Flash no-show was fixed by adding an additional tag (<embed>) within the <object> tag.  The <embed> tag was never part of the HTML4 standard but has been incorporated into HTML5.  The Metre website isn’t actually HTML5 so use of the tag still isn’t strictly valid, but it does at least mean the Flash exercises are now working in Firefox and Chrome.  It should be noted that this is a temporary fix and eventually the whole website will be overhauled as there is still much about it that needs modernising.
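For reference, the nested fallback looks roughly like this – the file name and dimensions here are invented for illustration:

```html
<!-- Browsers that ignore the <object> parameters fall through to <embed> -->
<object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000"
        width="550" height="400">
  <param name="movie" value="metre-exercise.swf">
  <embed src="metre-exercise.swf" width="550" height="400"
         type="application/x-shockwave-flash">
</object>
```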

Also this week I began to think about how to systematically implement an overhaul of the STELLA resources for both PC and Mobile devices.  I have begun the creation of mock-ups of both versions for one of the STELLA sites (ARIES – Assisted Revision in English Style) and should hopefully be in a position to show these to people next week for comment.  The redeveloped versions of the resources will be fully client-side, using nothing more than HTML5, CSS3 and Javascript.  This should be sufficient for all of the STELLA resources, but it will mean migrating functionality from a very broad array of current systems including Flash, Applets, JSP, Moodle, PHP, ancient desktop based applications and others.  It’s going to take a long time to migrate every resource, and the task will need to be fitted in around other priorities within the School, so it’s unlikely that anything will be completed in the near future.

Also this week I met with Flora Edmonds to discuss the work she has done on a variety of projects such as the Historical Thesaurus, the Thesaurus of Old English and Mapping Metaphor.  It was really useful to see the databases for these systems and to learn more about how they work.  Flora is in the process of migrating some sites to a new server and will potentially be updating the interfaces to the thesaurus websites.  I offered to give Flora some ideas for enhancements to the user interfaces and to help out with PHP based issues if any crop up.  I also gave some advice to Alison Wiggins about the possibility of making a mobile version of the Bess of Hardwick site.

I had a meeting with Marc and Wendy about legacy corpus issues and I learnt a lot more about the system used by the corpora and some of the compatibility issues that have been encountered.  I will spend some time next week identifying problems with the interface when using IE and Chrome and will then aim to address these problems.  I also read through a lot of documentation relating to the corpus websites and attended this week’s Course 20 lecture, which was handily about corpora.

My final meeting of the week was with Mark Herraghty of the Cullen project.  This took up most of Friday morning and was very useful.  Mark is at the stage where he is investigating different possible avenues for working with the XML encoded letters for searching and display on the public website.  I was able to talk him through a number of previous projects I have been involved with that took quite different approaches to this problem, for example the French (http://www.emblems.arts.gla.ac.uk/french/ ) and Italian (http://italianemblems.arts.gla.ac.uk/ ) Emblems sites and the House of Fraser Archive (http://www.housefraserarchive.ac.uk/).

I also attended Graeme Caie’s retirement do on Thursday, which brought back very fond memories of being taught Old English as an undergraduate by Graeme.

 

Week Beginning 10th September 2012

It was Freshers’ Week at the University this week, so University Avenue was mobbed, with loud music blaring from vans and free noodles on offer to passers-by.  It’s definitely a sign of getting middle-aged when you can walk through a throng of people handing out advertising bumf to students and not one leaflet is pushed in your direction.

My second week as Digital Humanities Research Officer was spent meeting people in the School, emailing more people and gathering further information on the wide range of current and legacy projects that have a digital component in the School.

A big thing this week was the first meeting of the SCS Digital Resources Owners Group, which took place on Tuesday.  This was a really useful meeting, bringing together people involved in projects with a digital component from across the School so we know what’s going on, what the current priorities for development are and what’s on the horizon.  The meeting really helped to define what tasks I should and shouldn’t be tackling.  For example, it was agreed that updating the School’s web pages would not be my responsibility.  Instead this was considered an administrative task and ideally the School will get someone else to manage such updates, although I will help in identifying problems with the pages.

At the meeting it was agreed that my current priorities should be working with the Cullen project, fixing the issues that exist when using the Corpora sites with Internet Explorer and looking into updating the STELLA teaching resources to use HTML5 so that web and app versions can be released.

Also this week I had very useful meetings with Marc Alexander and Jane Stuart-Smith, which gave me further insight into the projects associated with STELLA and GULP respectively.  There was also a post-work beer on Monday evening for the prospective tenants of the top floor of 13 University Gardens, and it was great to meet such people as Jenny Bann, Tamara Rathcke, Brian Jose and Ellen Bramwell for the first time.

Speaking about the top floor of 13 University Gardens, the building work there has still not been completed so as of yet I’m still based next door in HATII.  There was some talk of it maybe taking until October before the big move can take place so we’ll just need to see how that goes.  Thankfully my HATII colleagues are managing to cope with my continued presence with good grace.

Last week I started compiling a spreadsheet of projects within the School that have a digital component.  I’m documenting each project’s name, its type (e.g. if it’s a STELLA resource, a Thesaurus resource etc), associated URLs, a note as to whether there is an associated digital resource available, a project description, the primary contact for the project, whether I have spoken to the contact about development / redevelopment of any associated digital resource, the funder, the status (i.e. Active or Inactive), the priority for redevelopment and some further notes.  I’ve managed to complete the list this week, with a total of 70 projects listed.  I will send the document to the participants of the DROG to see if they know of any further important projects I’ve missed off.

Also this week I attended the Cullen project’s induction workshop for new transcribers, which was a really useful couple of hours.  I found out lots about the editing and transcription process and Mark gave a demonstration of the Cullen database system as well, which was really useful to see.  It seems like the project is really going well and all the groundwork is in place and very well established.

Next week I have meetings arranged with the Mapping Metaphor project and some further meetings with SCS staff to discuss projects past and present.  I’m also hoping to be able to get started on the modernisation of one of the STELLA teaching packages, which should be an enjoyable task.

Week Beginning 3rd September 2012

This was my first week as Digital Humanities Research Officer for the School of Critical Studies at the University of Glasgow.  As mentioned in the ‘About’ page, I’m writing this blog primarily as an ‘aide memoire’ for me so I can keep track of what I’ve been up to each week, and also so that other people within the School can see what I’ve been doing, if they’re interested.  It’s not really intended for anyone else, although if other people feel like reading that’s absolutely fine with me.

This week was mainly spent getting up to speed with the projects and people within the School and getting myself all set up. I’ve emailed a number of people who are involved with projects that have a digital component in the School and I’ve arranged to have meetings next week with Marc Alexander of STELLA and Jane Stuart-Smith of GULP.  I haven’t heard back from other people I emailed yet, but I think some people are away before the start of term.  I’ll chase these folks up next week if I still haven’t heard anything.

Also this week I set up this blog, which I aim to add an entry to every Friday, and also a wiki that will hopefully be used to document my involvement with projects and can be a place where project materials can be shared with others.  I’ve also tackled a number of other administrative tasks this week, such as creating a suitable email signature, clearing my inbox of all pre-SCS emails, getting myself added to the School mailing lists and writing a biography for the SCS staff page. At the time of writing the biography is not yet online – searching for me leads to a ‘not found’ page as my previous School of Humanities entry has now been removed.  Hopefully that will be fixed soon!

Apart from the above I’ve been going through all of the current and legacy projects associated with the School and adding details of them to a spreadsheet that I hope will eventually be used to plan redevelopment of some of the older and more outdated digital resources.  Jeremy has arranged for a ‘Digital Resources Owners Group’ meeting to take place next Tuesday and I hope to be able to talk about this list there.

I also had a couple of meetings last week, before I had officially started working for the School.  One was with Jeremy, which provided some very handy information about my role and the projects and people in the School.  Jeremy also introduced me to Wendy Anderson and we talked a little about the Mapping Metaphor project.

I also met with David Shuttleton and Mark Herraghty of the Cullen project and learnt a lot more about this project.  Mark showed me the staff interface for managing transcriptions and associated content, which was very interesting.  It is likely that I will be spending quite a bit of time engaged with this project, possibly working on TEI / XSLT issues and next week I will get back in touch with Mark for further information about this.

I’m currently still in my old office in HATII as building work is still being carried out at 13 University Gardens next door.  Hopefully next week I’ll be able to move into my new office, but we’ll just have to see.