This week the Scots Syntax Atlas project was officially launched, and it’s now available here: https://scotssyntaxatlas.ac.uk/ for all to use. We actually made the website live on Friday last week so by the time of the official launch on Tuesday this week there wasn’t much left for me to do. I spent a bit of time on Monday embedding the various videos of the atlas within the ‘Video Tour’ page and I updated the data download form. I also ensured that North Queensferry appeared in our grouping for Fife, as it had been omitted and wasn’t appearing as part of any group. I also created a ‘how to cite’ page in addition to the information that is already embedded in the Atlas.
The Atlas uses MapBox for its base maps, a commercial service that allows you to apply your own styles to maps. For SCOSYA we wanted a very minimalistic map, and this service enabled us to create such an effect. MapBox allows up to 200,000 map tile loads for free each month, but we figured this might not be sufficient for the launch period so arranged to apply some credit to the account to cover the extra users that the launch period might attract. The launch itself went pretty well, with some radio interviews and some pieces in newspapers such as the Scotsman. We had several thousand unique users to the site on the day it launched and more than 150,000 map tile loads during the day, so the extra credit is definitely going to be used up. It’s great that people are using the resource and it appears to be getting some very positive feedback, which is excellent.
I spent some of the remainder of the week going through my outstanding ‘to do’ items for the place-names of Kirkcudbrightshire project. This is another project that is getting close to completion and there were a few things relating to the website that I needed to tweak before we do so, namely:
I completely removed all references to the Berwickshire place-names project from the site (the system was based on the one I created for Berwickshire and some references to that project remained throughout the existing pages). I also updated all of the examples in the new site’s API to display results for the KCB data. Thomas and Gilbert didn’t want to use the icons that I’d created for the classification codes for Berwickshire, so I replaced them with simpler coloured circular markers instead. I also added in the parish boundaries for the KCB parishes. I’d forgotten how I’d done this for Berwickshire, but thankfully I’d documented the process. There is an API through which the geoJSON shapes for the parishes can be grabbed: http://sedsh127.sedsh.gov.uk/arcgis/rest/services/ScotGov/AreaManagement/MapServer/1/query. I entered the name of the parish into the ‘text’ field, selected ‘polygon’ for ‘geometry type’ and ‘geoJSON’ for the ‘format’, and this gave me exactly what I needed. I also needed the coordinates for where each parish acronym should appear, and I grabbed these via an NLS map that displays all of the parish boundaries (https://maps.nls.uk/geo/boundaries/#zoom=10.671666666666667&lat=55.8481&lon=-2.5155&point=0,0), finding the centre of a parish by positioning my cursor over the appropriate position and noting the latitude and longitude values in the bottom right corner of the map (the order of these needed to be reversed for use in Leaflet).
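For anyone wanting to replicate the process above, here is a minimal sketch of how the query URL could be built and the coordinate order flipped for Leaflet. The parameter names follow the standard ArcGIS REST ‘query’ operation rather than anything confirmed for this particular endpoint, so treat them as assumptions, and the parish name used is just an example.

```python
from urllib.parse import urlencode

# The ScotGov boundaries endpoint mentioned above.
BASE = ("http://sedsh127.sedsh.gov.uk/arcgis/rest/services/"
        "ScotGov/AreaManagement/MapServer/1/query")

def parish_query_url(parish_name):
    """Build a query URL that asks for a parish boundary as geoJSON.

    Parameter names are assumptions based on the generic ArcGIS REST
    'query' operation ('text', 'geometryType', 'f').
    """
    params = {
        "text": parish_name,                    # parish name to search for
        "geometryType": "esriGeometryPolygon",  # 'polygon' geometry type
        "f": "geojson",                         # response format
    }
    return BASE + "?" + urlencode(params)

def to_leaflet(coords):
    """geoJSON stores positions as [lon, lat]; Leaflet expects [lat, lon]."""
    return [[lat, lon] for lon, lat in coords]

print(parish_query_url("Kirkcudbright"))
print(to_leaflet([[-2.5155, 55.8481]]))  # → [[55.8481, -2.5155]]
```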
I also updated the ‘cite’ text and rearranged the place-name record page to ensure that most data appears above the map and that the extended elements view (including element certainty) appears. I also changed the default position and zoom level of the results map so that all data (apart from a couple of outliers) are visible by default, and rearranged the advanced search page, including fixing the ‘parish’ part of the search. I also added the ‘download data for print’ facility to the CMS.
Also this week I met with Alasdair Whyte from Celtic and Gaelic to discuss his place-names of Mull project. I’m going to be adapting my place-names system for his project and we discussed some of the further updates to the system that would be required for his project. The largest of these will be making the entire site multilingual. This is going to be a big job as every aspect of the site will need to be available in both Gaelic and English, including search boxes, site text, multilingual place-names, sources etc. I’ll probably get started on this in the new year.
Also this week I fixed a couple of further issues regarding certificates for Stuart Gillespie’s NRECT site and set up a conference website that is vaguely connected to Critical Studies (https://spheres-of-singing.gla.ac.uk/).
I spent the rest of the week continuing with the redevelopment of the Digital Humanities Network site. I completed all of the development of this (adding in a variety of browse options for projects) and then started migrating projects over to the new system. This involved creating new icons, screenshots and banner images for each project, checking over the existing data and replacing it where necessary and ensuring all staff details and links are correct. By the end of the week I’d migrated 25 projects over, but there are still about 75 to look over. I think the new site is looking pretty great and is an excellent showcase of the DH projects that have been set up at Glasgow. It will be excellent once the new site is ready to go live and can replace the existing outdated site.
This is the last week of work for me before the Christmas holidays so there will be no further posts until the New Year. If anyone happens to be reading this I wish you a very merry Christmas.
After the disruption of the recent strike, this was my first full week back at work, and it was mostly spent making final updates to the SCOSYA website ahead of next Tuesday’s official launch. We decided to make the full resource publicly available on Friday as a soft launch, so if you would like to try the public and linguist atlases, look at the interactive stories, view data in tables or even connect to the API you can now do it all at https://scotssyntaxatlas.ac.uk/. I’m really pleased with how it’s all looking and functioning, and my only real worry is that we get lots of users and a big bill from MapBox for supplying the base tiles for the map. We’ll see what happens about that next week. But for now I’ll go into a bit more detail about the tasks I completed this week.
We had a team meeting on Tuesday morning where we finalised the menu structure for the site, the arrangement of the pages, the wording of various sections and such matters. Frankie MacLeod, the project RA, had produced some walkthrough videos for the atlas that are very nicely done, and we looked at those and discussed some minor tweaks and how they might be integrated with the site. We did wonder about embedding a video in the homepage, but decided against it, as E reckons the ‘pivot to video’ was based on assumptions about what users want that have since proved to be false. I personally never bother with videos so I kind of agree with this, but I know from previous experience with other projects such as Mapping Metaphor that other users do really appreciate videos, so I’m sure they will be very useful.
We still had the online consent form for accessing the data to finalise so we discussed the elements that would need to be included and how it would operate. Later in the week I implemented the form, which will allow researchers to request all of the audio files, or ones for specific areas. We also decided to remove the apostrophe from “Linguists’ Atlas” and to just call it “Linguist Atlas” throughout the site as the dangling apostrophe looks a bit messy. Hopefully our linguist users won’t object too much! I also added in a little feature to allow images of places to appear in the pop-ups for the ‘How do people speak in…’ map. These can now be managed via the project’s Content Management System and I think having the photos in the map really helps to make it feel more alive.
During the week I arranged with Raymond Brasas of Arts IT Support to migrate the site to a new server in preparation for the launch, as the current server hosts many different sites, is low on storage and is occasionally a bit flaky. The migration went smoothly and after some testing we updated the DNS records and the version of the site on the new server went live on Thursday. On Friday I went live with all of the updates, which is when I made changes to the menu structure and did things such as update all internal URLs in the site so that any links to the ‘beta’ version of the atlases now point to the final versions. This includes things such as the ‘cite this page’ links. I also updated the API to remove the warning asking people to not use it until the launch as we’re now ready to go. There are still a few tweaks to make next week, but pretty much all my work on the project is now complete.
In addition to SCOSYA I worked on a few other projects this week, and also attended the English Language & Linguistics Christmas lunch on Tuesday afternoon. I had a chat with Thomas Clancy about the outstanding tasks I still need to do for the Place-names of Kirkcudbrightshire project. I’m hoping to find the time to get all these done next week. I made a few updates to Stuart Gillespie’s Newly Recovered English Classical Translations annexe (https://nrect.gla.ac.uk/) site, namely adding in a translation of Sappho’s Ode and fixing some issues with security certificates. I also arranged for Carole Hough’s CogTop site to have its domain renewed for a year, as it was due to expire this month, and had a chat with Alasdair Whyte from Celtic and Gaelic about his new place-names of Mull project. I’m going to be adapting the system I created for the Berwickshire Place-names project for Alasdair, which will include adding in multilingual support and other changes. I’m meeting with him next week to discuss the details. I also managed to dig out the Scots School Dictionary Google Developer account details for Ann Ferguson at DSL, as she didn’t have a record of these and needed to look at the stats.
Other than the above, I continued to work on a new version of the Digital Humanities Network site. I completed work on the new Content Management System, created a few new records (well, mostly overhauling some of the old data), and started work on the front-end. This included creating the page where project details will be viewed and starting work on the browse facilities that will let people access the projects. There’s still some work to be done but I’m pretty happy with how the new site is coming on. It’s certainly a massive improvement on the old, outdated resource. Below is a screenshot of one of the new pages:
Next week we will officially launch the SCOSYA resource and I will hopefully have enough time to tie up a few loose ends before finishing for the Christmas holidays.
I participated in the UCU strike action for all of last week and the first three days of this week, meaning I only worked on Thursday and Friday this week. I spent some of Thursday going through emails that had accumulated and tackled a few items on my ‘to do’ list. I managed to fix a couple of old websites that had lost a bit of functionality due to connecting to a remote server that had stopped accepting connections. These were the two ‘Emblems’ websites that I created about 15 years ago (http://emblems.arts.gla.ac.uk/french/ and http://emblems.arts.gla.ac.uk/alciato/), and the emblems they contain are categorised using the Iconclass classification system for art and iconography. The Iconclass terms applied to each emblem, and all associated Iconclass search functionality, are stored on a server in the Netherlands, with the server at Glasgow connecting to this in order to execute an Iconclass search and display any matching results. Unfortunately the configuration of this remote server had changed and no requests from Glasgow were getting through. Thankfully Etienne Posthumus, who helped set up the system all those years ago and is still looking after the service in the Netherlands, was able to suggest an alternative means of connecting, and with the update in place the sites were restored to their original level of functionality.
I also did a bit of work for the DSL. Firstly I continued an ongoing discussion with Ann Fergusson about updates to the data. Whilst working on a new order for the ‘browse’ feature I had noticed a small selection of entries that didn’t have any data in their ‘headword’ column, despite having headwords in the XML entries. Ann had investigated this and suggested it might be caused by non-alphanumeric characters in the headword, but having looked into it myself I don’t think this tells the whole story. It’s a very strange situation. The headwords are only missing in the data I processed from the XML files from the work-in-progress server (i.e. the V3 API) – they’re present in the data from the original API. Apostrophes can cause issues when inserting data into a database, but having looked through my script I can confirm that it uses an insert method that processes apostrophes successfully. Indeed, there are some 439 DOST and SND entries containing apostrophes that have been inserted without problems. The script also successfully inserted the headwords for entries such as ‘Pedlar’s Drouth’ (an entry with a blank headword) into the separate ‘forms’ table during upload, yet didn’t populate the headword field with the same data. And there’s no reason why other special characters or punctuation shouldn’t have been inserted; some entries that are missing headwords don’t contain any special characters or punctuation at all, such as ‘GEORGEMAS FAIR or MARKET’. I didn’t manage to figure out why the headwords for these entries were missing, but I added them to the database, and I think I’ll just need to watch out for these entries when I process the new data when it’s ready.
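To illustrate why the apostrophes were ruled out as the cause: a parameterised insert passes apostrophes through untouched, with no escaping needed in the script itself. Here is a minimal sketch using SQLite (the real system, table and column names are different; these are purely illustrative).

```python
import sqlite3

# Hypothetical table standing in for the real entries table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (headword TEXT)")

# Headwords taken from the examples discussed above.
headwords = ["Pedlar's Drouth", "GEORGEMAS FAIR or MARKET"]

# The '?' placeholder means the database driver handles quoting,
# so an embedded apostrophe cannot break the insert.
conn.executemany("INSERT INTO entries (headword) VALUES (?)",
                 [(h,) for h in headwords])

for (h,) in conn.execute("SELECT headword FROM entries ORDER BY headword"):
    print(h)
```

Since both headwords round-trip cleanly here, a script using placeholders like this would have no trouble with ‘Pedlar’s Drouth’, which is consistent with the blank-headword problem lying elsewhere.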
My second DSL task was getting some information to Rhona about the Scots School Dictionary app. I sent her a copy of all of the sounds files contained within the app, wrote a query to list all entries that contained sound files and a tally of the number of sound files each of these has, and gave her some information about how the current version of the app stores and uses its sound files.
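The tally described above boils down to a GROUP BY / COUNT query. Here is a rough sketch against a hypothetical schema (the real database structure and identifiers are not shown in this post), using SQLite for illustration.

```python
import sqlite3

# Hypothetical linking table between dictionary entries and sound files.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entry_sounds (entry_id TEXT, sound_file TEXT)")
conn.executemany("INSERT INTO entry_sounds VALUES (?, ?)",
                 [("entry1", "a.mp3"), ("entry1", "b.mp3"),
                  ("entry2", "c.mp3")])

# List each entry that has sound files, with a per-entry tally.
rows = conn.execute(
    "SELECT entry_id, COUNT(*) AS n_sounds "
    "FROM entry_sounds GROUP BY entry_id ORDER BY entry_id"
).fetchall()
print(rows)  # → [('entry1', 2), ('entry2', 1)]
```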
I also responded to Alasdair Whyte from Celtic and Gaelic who has a research fellowship to explore the place-names of Mull and is wanting to make use of the place-name system I initially created for the Berwickshire place-names project. Hopefully we’ll be able to arrange for this to happen.
On Friday I met with Gerry McKeever to discuss the interactive map I’m going to create for his ‘Regional Romanticism’ project. Gerry had sent me some sample data, consisting of about 40 entries with latitudes, longitudes, titles and text. I created an initial interactive map based on this data using the StoryMap service (https://storymap.knightlab.com/). I showed this to Gerry, but he thought it wasn’t quite flexible enough, as the user is not able to control the zoom level of the map, plus he wanted greater freedom to style the connecting lines as well – adding in directional arrows, for example. We decided that we would see if the maps people at NLS would let us use one of their geocoded historical maps as a base map, and that I would then create my own bespoke interface based on the ‘stories’ I’d created for the SCOSYA project. Gerry contacted the NLS people and hopefully I’ll be able to proceed with things once I have the maps.
I spent the rest of Friday continuing to rework the Digital Humanities Network website, working on a new content management system, completing work on the new underlying database, migrating data over and creating facilities to create a new project record. There’s still a lot to be done here, not just from a technical point of view but also deciding what projects should continue to be featured, and I’ll continue to work on this over the next few weeks as time allows.