I split my time this week mostly between DSL and SCOSYA, with the bulk of it spent making further updates to the SCOSYA public atlas. These were based on feedback from a meeting I had last week with Jennifer and E, and on a document arising from another project meeting (which I hadn't been invited to) where the interface was discussed. Updates included changing the 'story' view so that the rating data is displayed as points only, rather than as both areas and points. It would appear that the team is going off the idea of using areas, which is a shame in some ways, as a lot of work went into developing the area display, but in other ways it is for the better: the GeoJSON data for each area is rather large, and it makes downloading the rating data take quite some time over a slow internet connection. The areas are still available in the 'examples' view, and so are still included in the API, but it's possible they might be dropped too in future. At the team's request I also removed the option to display point and area data together on the map. I also updated the rating colours with slightly tweaked ones that E had sent me, although I still think it is far too difficult to differentiate the rating levels, especially levels 1 and 2, and 4 and 5, which are really hard to tell apart, and I'm not entirely pleased with how things look from the point of view of wanting a usable, easy-to-understand visualisation. On a purely aesthetic level the shades look nice, but personally I don't think that's good enough.
I also removed the group selection entirely from the public atlas at the request of the team, which again I was happy to get rid of as it was only half implemented and still needed a lot of work. I will still need to ensure that this feature works in the experts interface once I move on to creating that, though. Other changes included ensuring that the left-hand panel resizes to fit the screen height. This sort of worked before, but there were some instances where the panel wasn't resizing; I think I've caught all of these now. Also, when a story is selected the left-hand panel now slides away, and the first story slide features a 'choose another story' button that makes it appear again. The story panel now scrolls if its content is longer than the panel, and it resizes every time a slide loads, so it should adjust to screen dimensions a bit better. Hopefully with these changes the story view is a bit more usable on mobile devices.
Further updates to the atlas this week included introducing fractional zoom levels: it's now possible to zoom in and out in increments of 0.25 rather than whole levels. This works for scroll-zooming, pinch-zooming on touchscreens and the '+/-' buttons. This more granular approach makes it a lot easier to display just the data you're interested in, and reduces the problem some people had of accidentally zooming in and very quickly ending up at a very high zoom level in the middle of the sea. It also means it's possible to fit all of Scotland, including Shetland, on screen at once, as the following map demonstrates:
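For anyone curious about the implementation side: if, as I'd guess, the atlas sits on top of Leaflet, fractional zoom is just a couple of map options. A minimal sketch (the element ID and coordinates below are placeholders, not taken from the actual atlas code):

```javascript
// A minimal sketch, assuming a Leaflet-based map; the element ID and
// coordinates are placeholders rather than the real atlas values.
var map = L.map('atlas-map', {
    zoomSnap: 0.25,  // snap every zoom operation to 0.25 increments
    zoomDelta: 0.25  // '+/-' buttons and keyboard zoom move by 0.25
}).setView([57.5, -4.2], 6.25); // fractional zoom levels are now valid

// Scroll-wheel and pinch zooming also respect zoomSnap, so a single
// wheel notch no longer jumps a whole integer zoom level.
```

The key point is that `zoomSnap` governs what zoom levels are allowed at all, while `zoomDelta` governs how far each button press moves, so both need setting to get the 0.25 behaviour everywhere.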
It also demonstrates how difficult it currently is to differentiate between the rating level colours. You can also see that the 'examples' tab has now been renamed 'Who says what', and that there is another new tab labelled 'Community Voices'. This latter tab is the feature previously called the 'listening atlas', which is going to present sound clips and transcriptions for all the questionnaire locations. I spent about a day working on this feature. When the section is expanded, any locations that have Community Voices data are displayed. Locations currently appear as green circles, the green taken from the logo to differentiate these markers from the other marker types. Clicking on a marker opens a pop-up containing links to the sound files and the transcriptions. I've used the full HTML5 audio player here because the clips are longer and people may wish to jump to particular points in the recordings. Transcriptions are hidden by default and slide down when you click on the link to open them. The supplied transcription text was just plain text, so I've had to add some formatting to it: I've made the transcriptions into tables with a different shade behind the speaker column. The screenshot below shows how the feature currently works:
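The transcription formatting boils down to wrapping the plain text in a table with a dedicated speaker column. Here's a rough JavaScript sketch of the idea, assuming (hypothetically) that each line of the supplied text takes the form 'Speaker: utterance'; the real input format and the class names are not the actual project code:

```javascript
// Hypothetical sketch: turn plain-text transcription lines of the form
// "Speaker: utterance" into an HTML table with a speaker column.
// The input format and class names are assumptions for illustration.
function transcriptionToTable(text) {
  const rows = text.split('\n')
    .filter(function (line) { return line.trim() !== ''; })
    .map(function (line) {
      const idx = line.indexOf(':');
      const speaker = idx > -1 ? line.slice(0, idx).trim() : '';
      const utterance = idx > -1 ? line.slice(idx + 1).trim() : line.trim();
      // the 'speaker' class carries the different background shade
      return '<tr><td class="speaker">' + speaker + '</td><td>' +
             utterance + '</td></tr>';
    });
  return '<table class="transcription">' + rows.join('') + '</table>';
}
```

The shading behind the speaker column then just becomes a CSS rule on the `speaker` class rather than anything baked into the markup.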
The Community Voices data is stored in the database, and I've updated the API to provide access to it. There are two new endpoints: one lists all locations that have data, and one outputs all the data for a given location. I've also updated the CMS to give the team facilities to upload this information to the database. Within the CMS there is now a 'Browse community voices' menu item, and for each location there are columns noting whether young and old soundfiles and transcriptions are present. Pressing the 'Edit' button for a location lets you supply new information or edit the existing information. Note that you don't have to supply all the information for a location at once: you could provide soundfiles first and transcriptions later, or provide both for 'young' and supply 'old' later. As soon as there is information for 'young' or 'old' for a location it is automatically added to the 'Community Voices' map in the front-end.
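To give a feel for how the front-end decides what to plot, here's an illustrative sketch; the field names and sample records below are invented for the example, not the real API output:

```javascript
// Illustrative only: field names and records are assumptions, not the
// real SCOSYA API. The locations endpoint returns something like:
const locations = [
  { id: 1, name: 'Location A', young: true,  old: false },
  { id: 2, name: 'Location B', young: false, old: false },
  { id: 3, name: 'Location C', young: true,  old: true }
];

// A location gets a green marker on the Community Voices map as soon
// as it has either 'young' or 'old' data, mirroring the automatic
// appearance described above.
function locationsToPlot(locs) {
  return locs.filter(function (l) { return l.young || l.old; });
}
```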
I also spent some time going through the document produced after the team meeting I hadn't been invited to, responding to some suggestions that I didn't think were a good idea (e.g. disabling zooming by scroll-wheel or pinch gesture and forcing people to use the '+/-' buttons), discussing options for the larger changes that had been proposed (e.g. amalgamating 'stories' and 'examples' rather than having them as separate tabs), and implementing the requests that didn't raise any other issues (e.g. removing the 'feature' text from the story slide display and ensuring the story pane doesn't overlap the '+/-' buttons).
For the DSL I continued processing the data and preparing it for ingest into Solr. There were a few issues with the data, caused by unescaped characters (e.g. '<' or '&') appearing within the XML files. I updated my script to run the output through htmlspecialchars, and when I regenerated the data and passed it on to Raymond he successfully managed to add it to Solr, for both the 'V2' and the 'V3' data. I did unfortunately notice that some rows in the 'V2' data seem to be missing from my script's output, and I'm not sure why. For example, in the Solr browser 'snd7232' and 'snds4956' only have one search field, from an earlier upload, and these IDs are not found in my output file even though the rows are present in the database the script connects to. I'll need to investigate this once I get back to working with the data.
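The fix itself is just standard entity escaping. My script is PHP and uses htmlspecialchars, but the same idea in JavaScript, for illustration, looks like this:

```javascript
// Escape the five characters that can break XML when left raw.
// Equivalent in spirit to PHP's htmlspecialchars with ENT_QUOTES.
function escapeXml(str) {
  return str
    .replace(/&/g, '&amp;')   // must run first, or we double-escape
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#039;');
}
```

The ordering matters: ampersands have to be escaped before anything else, otherwise the '&' introduced by the other replacements would itself get escaped.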
Also for the DSL, I engaged in an email discussion with the DSL's new IT people and UoG IT people about the new dsl.ac.uk email addresses. Although we had updated things last week that should have ensured the emails worked, no emails were getting through. It turned out that this was caused by a pointer record in the DNS that pointed to a subdomain rather than the main domain, which was confusing the email system. Hopefully the issue is now sorted.
Also this week I heard from Bryony Randall in English Literature about an AHRC follow-on funding project that I'd helped her write the proposal for. The proposal was accepted, which is really great news, and I'll be helping Bryony with the technical aspects of her project later this year. I also had a further discussion with Thomas Clancy about his Iona project, which is inching closer to submission. In addition I set up new App Store and Play Store developer accounts for a new app that people in Sport and Recreation are putting together with an external developer, fixed a bug in the Mary Queen of Scots' Letters CMS I'd created for Alison Wiggins, and tweaked one of the Levenshtein scripts I'd created for the HT / OED data linking for Fraser. All in all it was a pretty full-on week.
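For reference, the Levenshtein distance at the heart of those linking scripts is the classic dynamic-programming edit distance. A generic JavaScript version (a textbook sketch, not the actual HT / OED script, which is more involved) looks like:

```javascript
// Classic dynamic-programming Levenshtein edit distance: the minimum
// number of single-character insertions, deletions and substitutions
// needed to turn string a into string b.
function levenshtein(a, b) {
  const m = a.length, n = b.length;
  // prev[j] holds the distance from the first i-1 chars of a
  // to the first j chars of b
  let prev = Array.from({ length: n + 1 }, function (_, j) { return j; });
  for (let i = 1; i <= m; i++) {
    const curr = [i];
    for (let j = 1; j <= n; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      curr[j] = Math.min(
        prev[j] + 1,       // deletion
        curr[j - 1] + 1,   // insertion
        prev[j - 1] + cost // substitution (free if chars match)
      );
    }
    prev = curr;
  }
  return prev[n];
}
```

Scoring candidate matches between the HT and OED datasets then becomes a matter of ranking pairs by this distance, with lower scores meaning closer forms.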