I spent quite a bit of time this week continuing to work on the systems for the Place-names of Mull and Ulva project. The first thing I did was to figure out how WordPress sets the language code. It has a function called ‘get_locale()’ that brings back the code (e.g. ‘en’ or ‘gd’). Once I knew this I could update the site’s footer to display a different logo and text depending on the language the page is in. So now if the page is in English the regular UoG logo and English text crediting the map and photo are displayed, whereas if the page is in Gaelic the Gaelic UoG logo and credit text are displayed. I think this is working rather well.
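The branching itself is very simple. Here's a minimal sketch (written in JavaScript purely for illustration; the site does this in PHP in the theme's footer template using get_locale(), and the file names here are hypothetical):

```javascript
// Pick the footer assets based on the locale code.
// get_locale() in WordPress returns codes such as 'en_GB' or 'gd'.
// The file names below are illustrative, not the real ones.
function footerAssets(locale) {
  if (locale.toLowerCase().startsWith('gd')) {
    return { logo: 'uog-logo-gaelic.png', credit: 'Gaelic map and photo credit text' };
  }
  return { logo: 'uog-logo-english.png', credit: 'English map and photo credit text' };
}
```

In the PHP footer template the equivalent check just wraps the two logo/credit blocks in an if/else on the result of get_locale().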
I managed to get all of the new Gaelic fields added into the CMS and fully tested by Thursday, and asked Alasdair to test things out. I also had a discussion with Rachel Opitz in Archaeology about incorporating LIDAR data into the maps and started to look at how to incorporate data from the GB1900 project for the parishes we are covering. GB1900 (http://www.gb1900.org/) was a crowdsourced project to transcribe every place-name that appears on OS maps from 1888-1914, which resulted in more than 2.5 million transcriptions. The dataset is available to download as a massive CSV file (more than 600MB). It includes place-names for the three parishes on Mull and Ulva and Alasdair wanted to populate the CMS with this data as a starting point. On Friday I started to investigate how to access the information. Extracting the data manually from such a large CSV file wasn’t feasible, so instead I created a MySQL database and wrote a little PHP script that iterated through each line of the CSV and added it to the database. I left this running over the weekend and will continue to work with it next week.
Also this week I continued to add new project records to the new Digital Humanities at Glasgow site. I only have about 30 more sites to add now, and I think it’s shaping up to be a really great resource that we will hopefully be able to launch in the next month or so.
I also spent a bit of further time on the SCOSYA project. I’d asked the university’s research data management people whether they had any advice on how we could share our audio recording data with other researchers around the world. The dataset we have is about 117GB, and originally we’d planned to use the University’s file transfer system to share the files. However, this can only handle files that are up to 20GB in size, which meant splitting things up. And it turned out to take an awfully long time to upload the files, a process we would have to repeat each time the data was requested. The RDM people suggested we use the University’s OneDrive system instead. This is part of Office365 and gives each member of staff 1TB of space, and it’s possible to share uploaded files with others. I tried this out and the upload process was very swift. It was also possible to share the files with users based on their email addresses, and to set expiration dates and passwords for file access. It looks like this new method is going to be much better for the project and for any researchers who want to access our data. We also set up a record about the dataset in the Enlighten Research Data repository (http://researchdata.gla.ac.uk/951/), which should help people find the data.
Also for SCOSYA we ran into some difficulties with Google’s reCAPTCHA service, which we were using to protect the contact forms on our site from spam submissions. There was an issue with version 3 of Google’s reCAPTCHA system when integrated with the contact form plugin. It works fine if Google thinks you’re not a spammer, but if you somehow fail its checks it doesn’t give you the option of proving you’re a real person; it just blocks the submission of the form. I haven’t been able to find a solution for this using v3, but thankfully there is a plugin that allows the contact form plugin to revert back to using reCAPTCHA v2 (the ‘I am not a robot’ tickbox). I got this working and have applied it to both the contact form and the spoken corpus form. It works both for me as someone Google seems to trust, and when using IE via remote desktop, where Google makes me select features in images before the form submits.
Also this week I met with Marc and Fraser to discuss further developments for the Historical Thesaurus. We’re going to look at implementing the new way of storing and managing dates that I originally mapped out last summer and so we met on Friday to discuss some of the implications of this. I’m hoping to find some time next week to start looking into this.
We received the reviews for the Iona place-name project this week and I spent some time during the week and over the weekend going through the reviews, responding to any technical matters that were raised and helping Thomas Clancy with the overall response, which needed to be submitted the following Monday. I also spoke to Ronnie Young about the Burns Paper Database, which we may now be able to make publicly available, and made some updates to the NME digital ode site for Bryony Randall.
With work completed on the interactive map for the Regional Romanticism project last week, I turned instead this week to some other outstanding items on my ‘to do’ list. I spent quite a bit of time continuing to work on the new version of the Digital Humanities Network site, going through all of the existing projects and deciding which to keep. I am keeping all projects that still have a functioning website that is in some way connected to digital humanities, and for each of these I then need to generate new icons, banner images and screenshots, associate developers and expand upon the available descriptive text. So far I have set up enhanced records for 48 projects, meaning I’m over halfway to completion. It is a time-consuming process but I believe it is worth it so we have a place to showcase these valuable assets.
I spent at least half the week working on the website for the Place-names of Mull and Ulva, a new project that has started up in Celtic and Gaelic for which I am adapting the place-names system I originally created for the Berwickshire Place-names project. The project PI is Alasdair Whyte and I met with him on Thursday to discuss developments. Before that I worked on the interface for the site, as Alasdair had sent me some images he wanted me to use and had decided which font he wanted the site to use. I made a nice header image using a photograph of Mull that Alasdair sent, blended with an image of a historical map from NLS. I also added the bottom part of the image to the footer, and added in the required logos and image credits. I also got the multilingual side of things, which I’d started working on last week, working properly, added in the required fonts, changed the way the site menus were displayed and did some other tweaking and refinement of the original theme. Below is an example of how things currently look:
I also made some updates to the content management system for the site. As of yet I haven’t added in full multilingual support, but I have added in the additional fields that Alasdair had requested. This includes a new facility to upload captioned images that can be associated with a place-name record, a new dedicated field for ‘translation’ and another new field for specifying which island a place-name is on. I also started to look into how a Lidar map layer might be added to the public maps interface, although this is going to need some further work.
Also this week I spoke to Craig Lamont about the Burns Scotland website, which needs some updating. I looked over the site and gave him some ideas as to what could possibly be done with it. I also helped out Rob Maslen with an issue relating to his ‘Fantasy’ blog.
On Friday afternoon I met with Matt Sangster and Katie Halsey to discuss their Books and Borrowers project. This is a major AHRC project that I helped write the proposal for. We heard before Christmas that the project has been funded, which is excellent news, so we met this week to discuss our next steps. The project doesn’t actually start until June, but I’m going to try and get some of the technical aspects in place before then in order to allow the project’s RAs to get started straight away. It’s all very exciting and hopefully it will be a great project to work on when the time comes.
This was my first week back after a two-week break for Christmas. On Monday I replied to a few emails, looked at the stats for SCOSYA (which had another couple of sizeable peaks over the holidays) and upgraded all of the WordPress sites that I manage to the latest version of both WordPress and the various plugins that I use. I did encounter a strange issue with the Jetpack plugin, which caused scary error messages about the site being broken during installation, but thankfully this was just a minor issue caused by the plugin having changed its file locations. I also arranged for a new subdomain to be set up for the new Mull and Ulva place-names project I’m setting up the technical infrastructure for. Once this had been set up later in the week I set up an initial WordPress site for the project, migrated the content management system I’d made for the Berwickshire place-names project across and configured it for the new project. The new project needs to be entirely bilingual, with all content available in both English and Gaelic, so I spent some time experimenting with WordPress plugins to find a solution that would allow a bilingual site with a URL structure that would also work for the non-WordPress pages where the place-name data will be presented. I used a plugin called Polylang, which provides a language switcher widget and facilities for every page to have multiple versions in different languages. It provides the option to do all of this without changing the URL structure (e.g. no /EN/ for English in the URL), which is what I needed for my non-WordPress pages to work. It took a bit of tweaking to get the plugin working properly, as for some reason the Gaelic version of the blog post page didn’t work to start with and there were some issues with setting up a different menu for the Gaelic site, but I managed to get it all working as I’d hoped. I then started looking into how to make the CMS multilingual.
This is going to be rather tricky as both content and labels need to be stored in both English and Gaelic, with cross-references from one to the other. I contacted the project PI Alasdair Whyte with some questions about this and I’ll continue with the development once I hear back from him.
Other than dealing with some minor managerial issues with SCOSYA and responding to a few other emails I spent the rest of the week developing the interactive map feature for Gerry McKeever’s Regional Romanticism project. Gerry wanted a map where sections of a novel (Paul Jones by Allan Cunningham) that relate to specific geographical areas can be associated with these locations, and a trail from one location to another throughout the novel can be visualised. Before Christmas I’d experimented with the Storymap tool that I’d previously used for the Romantic National Song Network project (e.g. https://rnsn.glasgow.ac.uk/english-scots-and-irishmen/) only using a geographical map rather than an image. I’d come up with a working version using the 40 entries Gerry had compiled, as the following screenshot demonstrates:
However, this version was not really going to work as Gerry wanted people to be able to control the zoom level, which Storymap doesn’t allow out of the box. Plus we wanted the layout to be very different, to be able to categorise entries and to handle the joining lines differently. For these reasons I decided to create my own interface, based on the interactive stories I’d recently made for the SCOSYA project, for example: https://scotssyntaxatlas.ac.uk/atlas/?j=y#6.25/57.929/-4.448/0/0/0/1
We’d been given approval by Chris Fleet of NLS Maps to use one of their georeferenced historical maps (Arrowsmith, 1807), so I created an initial version using this map as an overlay and a freely available tileset called ‘Thunderforest Pioneer’ as a full basemap. Here’s a screenshot of this version:
In this version the map markers were just Leaflet’s default blue markers with tooltips when you hover over them and there were no connecting lines or arrows. The story pane in the top right of the map displays the slide content and an opacity slider that allows you to see the base map through the historical map. You can navigate between slides using the ‘Next’ and ‘Previous’ buttons. In this version the zoom transitions between slides needed refinement and there was no highlighting of the marker that the slide corresponds to. There is also no differentiation between marker types (‘real’, ‘fictional’ etc) and book volumes.
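The overlay-plus-slider arrangement is straightforward in Leaflet: the historical map is just a second tile layer stacked on the basemap, and the slider drives its setOpacity() method. A minimal sketch (the tile URLs are placeholders, not the real NLS or Thunderforest endpoints):

```javascript
// Map a 0-100 slider value to a Leaflet layer opacity (0.0-1.0),
// clamping out-of-range input.
function sliderToOpacity(value) {
  return Math.min(100, Math.max(0, value)) / 100;
}

// Browser-side usage with Leaflet (URLs and element IDs illustrative):
//
// const base = L.tileLayer('https://{s}.tile.example.com/pioneer/{z}/{x}/{y}.png');
// const historical = L.tileLayer('https://nls-tiles.example.com/{z}/{x}/{y}.png');
// const map = L.map('map', { layers: [base, historical] });
//
// document.getElementById('opacity').addEventListener('input', (e) => {
//   historical.setOpacity(sliderToOpacity(e.target.value));
// });
```

Because the basemap sits underneath, fading the overlay towards zero gradually reveals the modern map at the same position and zoom.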
I also realised that there were some limitations with the Arrowsmith map. Firstly, it is not possible to zoom into it any further than the zoom level shown in the above screenshot. The NLS simply doesn’t provide tiles for any greater zoom level, so although I can allow users to zoom in further the overlay then disappears. At the maximum zoom level available for the Arrowsmith map there’s not much distance between many of the markers so the connecting lines / arrows are possibly not going to work very well. Also, the section of the Arrowsmith map where all the action happens is unfortunately not at all accurate. If you make the overlay more transparent the area around Sandyhills is really very wrong, with markers currently identified as ‘shoreline’ massively inland, for example. I also realised that what happens when someone clicks on a marker needs further thought as many locations are associated with multiple entries so it can’t just be a case of ‘click on a marker and view a specific entry’, even though in the current map there is a marker for each entry, even when this means multiple markers appearing at the same point on the map.
After some discussions with Gerry we decided to try an alternative historical map, the OS six-inch map from 1843-1882. We also decided that entries should be numbered and locations should have a one-to-many relationship to entries, i.e. a location is stored separately from entries in the JSON data and any number of entries may connect to a specific location by referencing the location’s ID. This meant I had to manually reorganise the data, but this was both necessary and worth it.
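The reorganised structure looks something like the following (field names and sample values are illustrative, not the project's real data), along with the lookup that lets a marker click find every entry for its location:

```javascript
// Locations are stored once; entries reference them by locationId.
const locations = [
  { id: 1, name: 'The Mermaid Bay' },
  { id: 2, name: 'Ford' }
];
const entries = [
  { id: 10, volume: 1, locationId: 1, text: 'First entry at the bay...' },
  { id: 11, volume: 2, locationId: 1, text: 'A later return to the bay...' },
  { id: 12, volume: 1, locationId: 2, text: 'A single entry at Ford...' }
];

// Build a lookup from location ID to its entries, so a marker click
// can decide whether to list several entries or open a single one.
function entriesByLocation(entries) {
  const byLoc = new Map();
  for (const e of entries) {
    if (!byLoc.has(e.locationId)) byLoc.set(e.locationId, []);
    byLoc.get(e.locationId).push(e);
  }
  return byLoc;
}
```

Storing locations once also means one marker per place on the map, rather than several markers stacked at the same point as in the earlier version.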
With the new data and the new historical map ready to use I set about creating some further prototypes, using different Leaflet plugins to get nicely formatted lines and arrows. One version featured straight grey lines between locations with arrows all the way along the line to show direction. Where a location is connected in both directions there are arrows in both directions. The line colour, number of arrows and arrow style can be changed. Below is a screenshot of this version:
I created another version that had curved, dashed lines, which I think look nicer than the straight lines. Unfortunately it’s not possible to add arrows to these lines, at least not without spending a long time redeveloping the plugin, and even then it might not work. I also added in different icons here to denote location type (book for fictional, globe for real, question mark for approximate). Icons also have different colours, as the following screenshot demonstrates:
After discussions with Gerry I decided to use the version with curved lines, and created a further version that included a full-width panel for the introduction, entry IDs appearing in the slides and also information about the location type and the volume the entry appears in. For the final version I created I replaced the map markers with circular markers featuring the icons (retaining the colours). This is because the mapping library Leaflet doesn’t include a method of associating IDs with markers, which makes doing things to markers programmatically rather tricky (e.g. highlighting a marker when a new slide loads or triggering the loading of data when a marker is selected). Thankfully it is possible to pass IDs when using HTML-based markers, which is what the new version features. When you click on a map marker now, one of two things happens. If the location is associated with multiple entries (e.g. The Mermaid Bay) then the story pane lists all of the entries, featuring their volume and number, the first 100 characters of the entry and a ‘view’ button that when pressed displays the entry. If the location is associated with a single entry (e.g. Ford) then the entry loads immediately. Map markers are now highlighted with a yellow border either when they’re clicked on or when navigating through the entries, and the line joining the previous slide to the current slide now turns yellow when navigating between slides, as the following screenshot demonstrates:
With all this in place my work on the interactive map is now complete until Gerry does some further work on the data, which will probably be in a month or so.
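For anyone curious about the HTML-based marker approach mentioned above, here is a rough sketch. The click-handling decision is a pure function; the Leaflet part uses L.divIcon, whose html option lets an ID travel with the marker markup (class names, icon markup and helper names here are illustrative):

```javascript
// Decide what a marker click should do: open a single entry directly,
// or return a list of previews for the story pane to display.
function onMarkerClick(locEntries) {
  if (locEntries.length === 1) {
    return { action: 'open', entry: locEntries[0] };
  }
  return {
    action: 'list',
    items: locEntries.map(e => ({
      id: e.id,
      volume: e.volume,
      preview: e.text.slice(0, 100) // first 100 characters of the entry
    }))
  };
}

// Browser-side usage with Leaflet (markup and helpers illustrative):
//
// const icon = L.divIcon({
//   className: 'place-marker',
//   html: '<div class="marker-circle" data-loc-id="' + loc.id + '">' +
//         '<i class="icon-book"></i></div>'
// });
// L.marker([loc.lat, loc.lng], { icon }).addTo(map)
//   .on('click', () => showResult(onMarkerClick(lookup.get(loc.id))));
```

Because the ID is embedded in the marker's own HTML, highlighting the marker for the current slide is then just a matter of selecting the element by its data attribute and toggling a CSS class.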