Week Beginning 25th June 2018

This was a short week as I headed off on holiday on Thursday.  I'll be off for the next two and a bit weeks.  I spent about two of my three working days on the SCOSYA project, implementing the new 'group statistics' feature that I'd written a mini-requirements document for last week.  This new feature will allow users to create groups of locations by clicking on them, to save multiple groups, to edit and delete groups, and to open a pop-up to view statistics about a group.  I think it will be a really useful feature.  It has, however, been somewhat tricky to implement.  The Leaflet mapping library doesn't make it easy (currently at least) to work programmatically with map markers.  It's not possible to assign a marker an ID, or to iterate through all markers until you find the one you're looking for.  It's also very tricky to update the look of a marker after it has been added to the map.  Also, rather annoyingly, the latitude and longitude of particular markers don't exactly match the values I store in the database for my map locations, which is slightly odd and contributes to the difficulty of matching up my data with particular markers on the map.

However, I did manage to make some progress this week.  I created a new section in my 'Display Options' menu for group statistics, with a sub-section for creating a new group.  With this option active, the default map behaviour when a marker is clicked changes: rather than opening a pop-up, the script grabs some details relating to the location (such as the 'display name' from the pop-up) and stores these in an array.  Clicking a second time removes the details from the array.  I also figured out a way of updating the CSS class of a marker when it is clicked on.  This is actually not very easy to do in Leaflet.  Markers have access to a 'setClass' function which should allow you to add a custom class to a marker, but in actual fact this function is only available before a marker is added to the map – for existing markers calling it does nothing.  This seems really weird to me.  However, by using jQuery I was able to get around this limitation.  Within my 'onClick' function that gets called when the user clicks a marker, and assuming the click event variable has the name 'loc', I can update the class as follows:

$(loc.target.getElement()).attr("class","leaflet-interactive markerSelect");

If my ‘markerSelect’ class has styles to add a border round a marker I then have a highlighted marker.  A second click to deselect the marker calls the same code but without ‘markerSelect’ included.  This works pretty well.

With this in place I then created an AJAX script that the JavaScript posts data to when the user presses the ‘save’ button.  This takes the group name the user has entered and the array of locations and adds these to the database.
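
For illustration, here's a minimal sketch of the client-side half of this; the endpoint name ('saveGroup.php') and the element IDs are hypothetical rather than the real ones:

// Post the group name and the array of selected locations to the server.
// 'saveGroup.php', '#save-group' and '#group-name' are illustrative names.
var selectedLocations = []; // populated as the user clicks markers
$('#save-group').on('click', function () {
    $.post('saveGroup.php', {
        name: $('#group-name').val(),
        locations: JSON.stringify(selectedLocations)
    }, function (response) {
        console.log('Group saved:', response);
    });
});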

I then created another AJAX script that outputs a list of the user's groups and added these underneath the 'new' section.  You can see how this works in the following screenshot.

Unfortunately this is all I had time to implement this week.  I would have liked to have made more progress with the feature, but I just ran out of time.

Also this week I met with David Wilson, a developer working elsewhere in the College of Arts, to discuss approaches to creating project websites and databases.  I made a few further tweaks to the HT category select feature too.  When creating the screenshot last week I noticed that the 'recommended' section was displaying information about each word's categories that should really only have appeared in the timeline pop-up.  I fixed this.  I also updated the main timeline visualisation so that all results appear on it rather than just those for the current page of results.  I think this works a lot better and is a lot less confusing.  Fraser also wanted the visualisation updated so that words are clickable and lead through to the appropriate category page.  I looked into this but I'm afraid it's going to be more complicated than you might think.

The timeline library strips out all HTML from the labels.  I tried tinkering with the library to ensure it can process HTML in the label, but I pull in the labels as JSON data and JavaScript's JSON.parse function can't handle HTML in the data it's parsing.  I tried passing escaped HTML (e.g. '&lt;' instead of '<') and then converting it back after the JSON was parsed, and this does get the tags through to the visualisation, but unfortunately as the visualisation is an SVG and not HTML it doesn't know what to do with HTML tags.  I checked my Mapping Metaphor visualisations to see how I handled links in labels there, and what I did was use classes and IDs in order to process clicks in JavaScript rather than just using simple HTML links.  I can do this with the timeline, but I don't have time to do it before I go on holiday.  I'll see what I can do about it once I'm back.
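
For reference, the escaping trick looks roughly like this; a toy example with jQuery doing the entity conversion (the label content is made up):

// The tags arrive entity-escaped so JSON.parse accepts them, and are
// converted back afterwards using a throwaway textarea element.
var jsonString = '{"label":"&lt;a href=\\"#\\"&gt;sausage&lt;/a&gt;"}';
var data = JSON.parse(jsonString);
var unescaped = $('<textarea/>').html(data.label).text();
// unescaped is now '<a href="#">sausage</a>'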

Week Beginning 18th June 2018

This week I continued with the new ‘Search Results’ page for the Historical Thesaurus.  Last week I added in the mini-timelines for search results, but I wanted to bring some of the other updated functionality from the ‘browse’ page to the ‘search results’ page too.

There is now a new section on the search results page where sort options and such things are located.  This is currently always visible, as it didn’t seem necessary to hide it away in a hamburger menu as there’s plenty of space at the top of the page.  Options include the facility to open the full timeline visualisation, turn the mini-timelines on or off and set the sorting options.  These all tie into the options on the ‘browse’ page too, and are therefore ‘remembered’.

It took some time to get this working as the search results page is rather different to the browse page.  I also had to introduce a new sort option (by 'Thesaurus Category') as this is how things are laid out by default.  It's also been a bit of a pain as both word and category results are lumped together, but the category results always need to appear after the 'word' results, and the sort options don't really apply to them.  Also, I had to make sure the 'recommended' section got ordered as well as the main results.  This wasn't so important for searches with few recommended categories, such as 'sausage', but for things like 'strike' that have lots of 'recommendeds' I figured ordering them would be useful.  I also had to factor the pagination of results into the ordering options.  It's also now possible to bookmark / share the results page with a specific order set, allowing users to save or share a link to a page with results ordered by length of attestation, for example.  Here's a screenshot showing the results ordered by length of attestation:

I then set about implementing the full timeline visualisation for the search results.  As with the other updates to the search results page, this proved to be rather tricky to implement as I had to pull apart the timeline visualisation code I’d made for the ‘browse’ page and reformat it so that it would work with results from different categories.  This introduced a number of weird edge cases and bugs that took me a long time to track down.  One example that I’ve only just fixed and has taken about two hours to get to the bottom of:

When ordering by length of attestation all OE dates were appearing at the end, even though many were clearly the longest attested words.  Why was this happening?  It turns out that elsewhere in the code where ‘OE’ appears in the ‘fulldate’ I was replacing this with an HTML ‘span’ to give the letters the smallcaps font.  But having HTML in this field was messing up the ‘order by duration’ code, buried deep within a function called within a function within a function.  Ugh, getting to the bottom of that almost had me at my wit’s end.
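
The fix boiled down to stripping the markup before comparing durations.  Something like this simplified sketch, assuming 'results' is the array of result objects; the 'fulldate' field is real but the date parsing here is much cruder than the actual code:

// Remove any HTML (e.g. the smallcaps 'OE' span) from 'fulldate'
// before extracting years, so markup can't distort the sort.
function attestationLength(result) {
    var plain = result.fulldate.replace(/<[^>]*>/g, '');
    var years = plain.match(/\d{3,4}/g) || [];
    var first = parseInt(years[0], 10) || 0;
    var last = parseInt(years[years.length - 1], 10) || first;
    return last - first;
}
results.sort(function (a, b) {
    return attestationLength(b) - attestationLength(a); // longest attested first
});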

But I got it all working in the end, and the visualisation pop-up is now working properly on the search results page, including a new 'Thesaurus Category' ordering option.  I've also made the row label field wider and have incorporated the heading and PoS as well as the word.  'Thesaurus Category' ordering might seem a little odd as the catnum doesn't appear on the visualisation, but adding this in would make the row label very long.  Here's how the timeline visualisation for results looks:

Note that this new search results page isn't 'live' yet.  Fraser also wanted me to update how the search works to enable an 'exact' search to be performed, as currently a search for 'set' (for example) brings back things like 'set (of teeth)', which Fraser didn't want included.  I did a little further digging into this as I had thought we once allowed exact searches to be performed, and I was right.  Actually, when you search for 'set' you are already doing an exact search.  If it were a partial match search you'd have wildcards at the beginning and end, and you'd end up with more results than the website allows you to see.

Maybe an example with fewer results would work better, e.g. 'wolf'.  Using the new results page, here's an exact search: https://ht.ac.uk/category-selection/index-test.php?qsearch=wolf with 36 results, and here's a wildcard search: https://ht.ac.uk/category-selection/index-test.php?qsearch=*wolf* with 240 results.

Several years ago (back when Christian was still involved) we set about splitting up results to allow multiple forms to be returned, and I wrote some convoluted code to extract possible permutations so that things like ‘wolf/wolf of hell/devil’s wolf < (deofles) wulf’ would be found when doing an exact search for ‘wolf’ or ‘wolf of hell’ or whatever.

When you do an exact search it searches the 'searchterms' table that contains all of these permutations.  One permutation type ignores stuff in brackets, which is why things like 'set (about)' are returned when you do an exact search for 'set', while still handily keeping other entries like 'Wolffian ridge' and 'rauwolfia (serpentina)' out of exact searches for 'wolf'.  A search for 'set' is an exact match for one permutation of 'set (about)' that we have logged, so the result is returned.
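
As a rough illustration of the sort of permutations involved, here's a much-simplified re-imagining of the idea (not the original code, which handles many more cases):

// Split alternate forms on '/', drop the OE source after '<', and log a
// bracket-free permutation for forms like 'set (about)'.
function searchPermutations(word) {
    var modern = word.split('<')[0];
    var perms = [];
    modern.split('/').forEach(function (form) {
        form = form.trim();
        perms.push(form);
        var noBrackets = form.replace(/\s*\([^)]*\)/g, '').trim();
        if (noBrackets !== form) perms.push(noBrackets);
    });
    return perms;
}
console.log(searchPermutations("wolf/wolf of hell/devil's wolf < (deofles) wulf"));
// -> ["wolf", "wolf of hell", "devil's wolf"]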

To implement a proper 'exact' search I decided to allow users to surround a term with double quotes.  I updated my search code so that when double quotes are supplied it disregards the 'searchterms' table and instead only searches for exact matches in the 'wordoe' and 'wordoed' fields.  This strips out things like 'set (about)' but still ensures that words with OE forms, such as 'set < (ge)settan', are returned, as such words have 'set' in the 'wordoed' field and '(ge)settan' in the 'wordoe' field.  This new search seems to be working very well, but as with the other updates I haven't made it live yet.
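
In outline the decision looks like this.  The real code is PHP rather than JavaScript, and the table and column names other than 'searchterms', 'wordoe' and 'wordoed' are invented for the sketch:

// Quoted terms skip the permutations table and hit the word fields directly.
function buildSearch(term) {
    if (/^".+"$/.test(term)) {
        var exact = term.slice(1, -1); // strip the surrounding quotes
        return { sql: 'SELECT * FROM lexemes WHERE wordoed = ? OR wordoe = ?',
                 params: [exact, exact] };
    }
    // Default: match any stored permutation of the form.
    return { sql: 'SELECT l.* FROM lexemes l JOIN searchterms s ON s.lexeme_id = l.id WHERE s.searchterm = ?',
             params: [term] };
}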

One thing I noticed with the search results timeline visualisation that might need fixing:  The contents of the visualisation are limited to a single results page, so reordering the visualisation will only reorder a subset of the results.  E.g. if the results go over two pages and are ordered by 'Thesaurus Category' and you open the visualisation, then reorder the visualisation by 'length of attestation', you're only ordering those results that appeared on the first page when ordered by 'Thesaurus Category'.

So to see the visualisation with the longest period of attestation you first need to reorder the results page by this option and then open the visualisation.  This is possibly a bit clunky and might lead to confusion.  I can change how things work if required.  The simplest way around this might be to just display all results in the visualisation, rather than just one page of results.  That might lead to some rather lengthy visualisations, though.  I’ve asked Marc and Fraser what they think about this.

I’m finding the search results visualisation to be quite fun to use.  It’s rather pleasing to search for a word (e.g. the old favourite ‘sausage’) and then have a visualisation showing when this word was used in all its various senses, or which sense has been used for longer, or which sense came first etc.

Also this week the Historical Thesaurus website moved to its new URL.  The site can now be accessed at https://ht.ac.uk/, which is much snappier than the old https://historicalthesaurus.arts.gla.ac.uk URL.  I updated the 'cite' options to reflect this change in URL and everything seems to be working very well.  I also discussed some further possible uses for the HT with Fraser and Marc, but I can't really go into too many details at this point.

Also this week I fixed a minor issue on a page for The People's Voice project, and another for the Woolf short story site, and gave some more feedback about a couple of updates to the Data Management Plan for Faye Hammill.  I also had some App duties to take care of and gave some feedback on a Data Management Plan that Graeme had written for a project for someone in History.  I also created the interface for the project website for Matthew Creasy's Decadence and Translation Network project, which I think is looking rather good (but isn't live yet).  I had a chat with Scott Spurlock about his crowdsourcing project, which it looks like I'm going to start working on later in the summer, I spoke to David Wilson, a developer elsewhere in the college, about some WordPress issues, and I gave some feedback to Kirsteen McCue on a new timeline feature she is hoping to add to the RNSN website.  I also received an email from the AHRC this week thanking me for acting as a Technical Reviewer for them.  It turns out that I completed 39 reviews during my time as a reviewer, which I think is pretty good going!

Also this week I had a meeting with Megan Coyer where we discussed a project she’s putting together that’s related to Scottish Medicine in history.  We discussed possible visualisation techniques, bibliographical databases and other such things.  I can’t go into any further details just now, but it’s a project that I’ll probably be involved with writing the technical parts for in the coming months.

Eleanor Lawson sent me some feedback about my reworking of the Seeing Speech and Dynamic Dialects websites, so I acted on this feedback.  I updated the map so that the default zoom level is one level further out than before, centred on somewhere in Nigeria.  This means on most widths of screen it is possible to see all of North America and Australia.  New Zealand might still get cut off, though.  Obviously on narrower screens less of the world will be shown.

On the chart page I updated the way the scrollbar works.  It now appears at both the top and the bottom of the table.  This should hopefully make it clearer to people that they need to scroll horizontally to see additional content, and also make it easier to do the scrolling – no need to go all the way to the bottom of the table to scroll horizontally.  I also replaced one of the videos with an updated one that Eleanor sent me.
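
The double scrollbar is simple enough to wire up: a second scrollable div sits above the table and the two scroll positions are kept in sync.  A minimal sketch with jQuery, where the element IDs are made up and '#top-scroll' is assumed to contain an empty '.spacer' div:

// Make the dummy scrollbar the same width as the table, then mirror
// scroll positions in both directions.
$('#top-scroll .spacer').width($('#table-wrap table').width());
$('#top-scroll').on('scroll', function () {
    $('#table-wrap').scrollLeft($(this).scrollLeft());
});
$('#table-wrap').on('scroll', function () {
    $('#top-scroll').scrollLeft($(this).scrollLeft());
});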

On Friday I began to think about how to implement a new feature for the SCOSYA atlas.  Gary had previously mentioned that he would like to be able to create groups of locations and for the system to then be able to generate summary statistics about the group for a particular search that a user has selected.  I wrote a mini ‘requirements’ document detailing how this might work, and it took some time to think through all of the various possibilities and to decide how the feature might work best.  By the end of the day I had a plan and had emailed the document to Gary for feedback.  I’m hoping to get started on this new feature next week.

Week Beginning 11th June 2018

I met with Matthew Creasy from English Literature this week to discuss a project website for his recently funded 'Decadence and Translation Network' project.  The project website is going to be a fairly straightforward WordPress site, but there will also be a digital edition hosted through it, which will be somewhat similar to what I did for the Woolf short story for the New Modernist Editing project (https://nme-digital-ode.glasgow.ac.uk/).  I set up an initial site for Matthew and will work on it further once he receives the images he'd like to use in the site design.

I also gave some further help to Craig Lamont in getting access to Google Analytics for the Ramsay project, and spoke to Quintin Cutts in Computing Science about publishing an iOS app they have created.  I also met with Graeme Cannon to discuss AHRC Data Management Plans, as he's been asked to contribute to one and hasn't worked with a plan yet.  I also made a couple of minor fixes to the RNSN timeline and storymap pages and updated the 'attribution' text on the REELS map.  There's quite a lot of text relating to map attribution and copyright, so instead of cluttering up the bottom of the maps I've moved everything into a new pop-up window.  In addition to the statements about the map tilesets I've also added in a statement about our place-name data, the copyright statement that's required for the parish boundaries, a note about Leaflet, and attribution for the map icons too.  I think it works a lot better.

Other than these issues I mainly focussed on three projects this week.  For the SCOSYA project I tackled an issue with the 'or' search that was causing the search to not display results in a properly categorised manner when the 'rated by' option was set to more than one.  It took a while to work through the code, and my brain hurt a bit by the end of it, but thankfully I managed to figure out what the problem was.  Basically, when 'rated by' was set to 1 the code only needed to match a single result for a location.  If it found one that matched then the code stopped looking any further.  However, when multiple results needed to be found, the code didn't stop looking, but instead had to cycle through all other results for the location, including those for other codes.  So if it found two matches that met the criteria for 'A1' it would still go on looking through the 'A2' results as well, would realise these didn't match and set the flag to 'N'.  I was keeping a count of the number of matches but this part of the code was never reached if the 'N' flag was set.  I've now updated how the checking for matches works and thankfully the 'or' search now works when you set 'rated by' to more than 1.
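
In simplified terms the fix means counting matches across only the codes involved in the 'or' search, so that rows for other codes can never reset the result.  A JavaScript paraphrase of the idea (the real code is PHP, and the field names here are invented):

// A location matches if enough of its ratings for the searched codes
// meet the threshold; rows for other codes are simply skipped.
function locationMatches(rows, codes, minRating, ratedBy) {
    var count = 0;
    rows.forEach(function (row) {
        if (codes.indexOf(row.code) === -1) return; // not part of the 'or' search
        if (row.rating >= minRating) count++;
    });
    return count >= ratedBy;
}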

For the reworking of the Seeing Speech and Dynamic Dialects websites I decided to focus on the accent map and accent chart features of Dynamic Dialects.  For the map I switched to using the Leaflet.js mapping library rather than Google Maps.  This is mainly because I prefer Leaflet: you can use it with lots of different map tilesets, data doesn't have to be posted to Google for the map to work, and you can zoom in and out with the scrollwheel of a mouse without having to also press 'ctrl', which gets really annoying with the existing map.  I've removed the option to switch from map to satellite and streetview as these didn't really seem to serve much purpose.  The new base map is a free map supplied by Esri (a big GIS company).  It isn't cluttered up with commercial map markers when zoomed in, unlike Google.
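
Swapping the base map in is only a couple of lines.  A sketch, using the standard URL template for Esri's World Street Map tiles:

// Initialise the map with a world view and add Esri's tiles as the base layer.
var map = L.map('map').setView([20, 0], 2);
L.tileLayer('https://server.arcgisonline.com/ArcGIS/rest/services/World_Street_Map/MapServer/tile/{z}/{y}/{x}', {
    attribution: 'Tiles &copy; Esri'
}).addTo(map);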

You can now hover over a map marker to view the location and area details.  Clicking on a marker opens up a pop-up containing all of the information about the speaker and links to the videos as ‘play’ buttons.  Note that unlike the existing map, buttons for sounds only appear if there are actually videos for them.  E.g. on the existing map for Oregon there are links for every video type, but only one (spontaneous) actually works.

Clicking on a ‘play’ button brings down the video overlay, as with the other pages I’ve redeveloped.  As with other pages, the URL is updated to allow direct linking to the video.  Note that any map pop-up you have open does not remain open when you follow such a link, but as the location appears in the video overlay header it should be easy for a user to figure out where the relevant marker is when they close the overlay.

For the Accent Chart page I’ve added in some filter options, allowing you to limit the display of data to a particular area, age range and / or gender.  These options can be combined, and also bookmarked / shared / cited (e.g. so you can follow a link to view only those rows where the area is ‘Scotland’ the age range is ’18-24’ and the gender is ‘F’).  I’ve also added a row hover-over colour to help you keep your eye on a row.  As with other pages, click on the ‘play’ button and a video overlay drops down.  You can also cite / bookmark specific videos.
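
Making the filters bookmarkable just means keeping the selected options in the query string.  In outline (the parameter names are illustrative):

// Read filter state from the URL, e.g. ?area=Scotland&age=18-24&gender=F
var params = new URLSearchParams(window.location.search);
var filters = {
    area: params.get('area') || 'all',
    age: params.get('age') || 'all',
    gender: params.get('gender') || 'all'
};
// ... apply 'filters' to the table rows ...
// Write the state back whenever a filter changes, so the URL can be shared.
function updateUrl() {
    history.replaceState(null, '', '?' + new URLSearchParams(filters).toString());
}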

I’ve made the table columns on this page as narrow as possible, but it’s still a lot of columns and unless you have a very wide monitor you’re going to have to scroll to see everything.  There are two ways I can set this up.  Firstly the table area of the page itself can be set to scroll horizontally.  This keeps the table within the boundaries of the page structure and looks more tidy, but it means you have to vertically scroll to the bottom of the table before you see the scrollbar, which is probably going to get annoying and may be confusing.  The alternative is to allow the table to break out of the boundaries of the page.  This looks messier, but the advantage is the horizontal scrollbar then appears at the bottom of your browser window and is always visible, even if you’re looking at the top section of the table.  I’ve asked Jane and Eleanor how they would prefer the page to work.

My final project of the week was the Historical Thesaurus.  I spent some time working on the new domain names we’re setting up for the thesaurus, and on Thursday I attended the lectures for the new lectureship post for the Thesaurus.  It was very interesting to hear the speakers and their potential plans for the Thesaurus in future, but obviously I can’t say much more about the lectures here.  I also attended the retirement do for Flora Edmonds on Thursday afternoon.  Flora has been a huge part of the thesaurus team since the early days of its switch to digital and I think she had a wonderful send-off from the people in Critical Studies she’s worked closely with over the years.

On Friday I spent some time adding the mini timelines to the search results page.  I haven’t updated the ‘live’ page yet but here’s an image showing how they will look:

It's been a little tricky to add the mini-timelines in as the search results page is structured rather differently to the 'browse' page.  However, they're in place now, both for general 'word' results and for words within the 'Recommended Categories' section.  Note that if you've turned mini-timelines off in the 'browse' page they stay off on this page too.

We will probably want to add a few more things in before we make this page live.  We could add in the full timeline visualisation pop-up, which I could set up to feature all search results, or at least the results for the current page of search results.  If I did this I would need to redevelop the visualisation to try and squeeze in at least some of the category information and the PoS, otherwise the listed words might all be the same.  I will probably try to add in each word's category and PoS, which should provide just enough context, although subcat names like 'pertaining to' aren't going to be very helpful.

We will also need to consider adding in some sorting options.  Currently the results are ordered by ‘Tier’ number, but I could add in options to order results by ‘first attested date’, ‘alphabetically’ and ‘length of attestation’.  ‘Alphabetically’ isn’t going to be hugely useful if you’re looking at a page of ‘sausage’ results, but will be useful for wildcard searches (e.g. ‘*sage’) and other searches like dates.  I would imagine ordering results by ‘length of attestation’ is going to be rather useful in picking out ‘important’ words.  I’ll hopefully have some time to look into these options next week.

Week Beginning 4th June 2018

I'd taken Friday off as a holiday this week, and I was also off on Monday afternoon to attend a funeral.  Despite being off for a day and a half I still managed to achieve quite a lot this week.  Over the weekend Thomas Clancy had alerted me to another excellent resource developed by the NLS Maps people that plots the boundaries of all parishes in Scotland, which you can access here:  http://maps.nls.uk/geo/boundaries/#zoom=10.671666666666667&lat=55.8481&lon=-2.5155&point=0,0.  For REELS we had been hoping to incorporate parish boundaries into our Berwickshire map but didn't know where to get the coordinates from, and there wasn't enough time in the project for us to manually create the data.  I emailed Chris Fleet at the NLS to ask where they'd got their data from, and whether we might be able to access the Berwickshire bits of it.  Chris very helpfully replied to say the boundaries were created by the James Hutton Institute and are hosted on the Scottish government's Scottish Spatial Data Infrastructure Metadata Portal (see https://www.spatialdata.gov.scot/geonetwork/srv/eng/catalog.search#/metadata/c1d34a5d-28a7-4944-9892-196ca6b3be0c).  The data is free to use, so long as a copyright statement is displayed, and there's even an API through which the data can be grabbed (see here: http://sedsh127.sedsh.gov.uk/arcgis/rest/services/ScotGov/AreaManagement/MapServer/1/query).  The data can even be outputted in a variety of formats, including shape files, JSON and GeoJSON.  I decided to go for GeoJSON, as this seemed like a pretty good fit for the Leaflet mapping library we use.

Initially I used the latitude and longitude coordinates for one parish (Abbey St Bathans) and added this to the map.  Unfortunately the polygon shape didn't appear on the map, even though no errors were returned.  This was rather confusing until I realised that whereas Leaflet tends to use latitude and then longitude as the order of the input data, GeoJSON specifies longitude first and then latitude.  This meant my polygon boundaries had been added to my map, just in a completely different part of the world!  It turns out that in order to use GeoJSON data in Leaflet it's better to use Leaflet's in-built 'L.geoJSON' functions (see https://leafletjs.com/examples/geojson/).  With this in place, Leaflet very straightforwardly plotted out the boundaries of my sample parish.
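
The working version is pleasingly short.  A minimal sketch, assuming 'parishGeoJson' holds the data grabbed from the API (the styling values are just for illustration):

// L.geoJSON understands GeoJSON's [longitude, latitude] order itself,
// so no manual coordinate flipping is needed.
var parishLayer = L.geoJSON(parishGeoJson, {
    style: { color: '#e8852c', weight: 2, fill: false }
}).addTo(map);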

I had intended to write a little script that would grab the GeoJSON data for each of the parishes in our system from the API mentioned above.  However, I noticed that when passing a text string to the API it does a partial match, and can return multiple parishes.  For example, our parish 'Duns' also brings back the data for 'Dunscore' and 'Dunsyre'.  I figured therefore that it would be safer to just manually grab the data and insert it directly into our 'parishes' database.  This all worked perfectly, other than for the parish of Coldingham, which is a lot bigger than the rest, meaning the JSON data was also a lot larger.  The data was bigger than a server setting allowed me to upload to MySQL, but thankfully Chris McGlashan was able to sort that out for me.

With all of the parish data in place I styled the lines a sort of orange colour that would show up fairly well on all of our base maps.  I also updated the 'Display options' to add in facilities to turn the boundary lines on or off.  This also meant updating the citation, bookmarking and page reloading code too.  I also wanted to add in the three-letter acronyms for each parish.  It turns out that adding plain text directly to a Leaflet map is not actually possible, or at least not easily.  Instead the text needs to be added as a tooltip on an invisible marker, and the tooltip then has to be set as permanently visible, and then styled to remove the bubble around the text.  This still left the little arrow pointing to the marker, but a bit of Googling informed me that if I set the tooltip's 'direction' to 'center' the arrowheads aren't shown.  It all feels like a bit of a hack, and I hope that in future it's easier to just add text to a map in a more direct manner.  However, I was glad to figure out a solution, and once I had manually grabbed the coordinates where I wanted the parish labels to appear I was all set.
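
For anyone else needing plain text labels in Leaflet, the hack looks roughly like this (the coordinates and the 'ABB' label are illustrative):

// An invisible marker whose permanently-open tooltip acts as a text label.
// 'direction: center' suppresses the pointing arrow, and the CSS class
// strips the tooltip's bubble background and border.
L.marker([55.84, -2.38], { opacity: 0, interactive: false })
    .bindTooltip('ABB', {
        permanent: true,
        direction: 'center',
        className: 'parish-label'
    })
    .addTo(map);

Here's an example of how the map looks with parish boundaries and labels turned on: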

I had some other place-name related things to do this week.  On Wednesday afternoon I met with Carole, Simon and Thomas to discuss the Scottish Survey of Place-names, which I will be involved with in some capacity.  We talked for a couple of hours about how the approach taken for REELS might be adapted for other surveys, and how we might connect up multiple surveys to provide Scotland-wide search and browse facilities.  I can’t really say much more about it for now, but it’s good that such issues are being considered.

I spent about a day this week continuing to work on the new pages and videos for the Seeing Speech project.  I fixed a formatting issue with the ‘Other Symbols’ table in the IPA Charts that was occurring in Internet Explorer, which Eleanor had noticed last week.  I also uploaded the 16 new videos for /l/ and /r/ sounds that Eleanor had sent me, and created a new page for accessing these.  As with the IPA Charts page I worked on last week, the videos on this page open in an overlay, which I think works pretty well.  I also noticed that the videos kept on playing if you closed an overlay before the video finished, so I updated the code to ensure that the videos stop when the overlay is closed.

Other than these projects, I investigated an issue relating to Google Analytics that Craig Lamont was encountering for the Ramsay project, and I spent the rest of my time returning to the SCOSYA project.  I’d met with Gary last week and he’d suggested some further updates to the staff Atlas page.  It took a bit of time to get back into how the atlas works as it’s been a long time since I last worked on it, but once I’d got used to it again, and had created a new test version of the atlas for me to play with without messing up Gary’s access, I decided to try and figure out whether it would be possible to add in a ‘save map as image’ feature.  I had included this before, but as the atlas uses a mixture of image types (bitmap, SVG, HTML elements) for base layers and markers the method I’d previously used wasn’t saving everything.

However, I found a plugin called 'easyPrint' (https://github.com/rowanwins/leaflet-easyPrint) that does seem to be able to save everything.  By default it prints the map to a printer (or to PDF), but it can also be set up to 'print' to a PNG image.  It is a bit clunky, sometimes does weird things and only works in Chrome and Firefox (and possibly Safari, which I haven't tried, but definitely not IE or Edge).  It's not going to be suitable for inclusion on the public atlas for these reasons, but it might be useful to the project team as a means of grabbing screenshots.
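
Setting the plugin up only takes a couple of lines.  A sketch using the options I remember from its documentation (larger sizes such as A3 can be added as custom size mode objects):

// 'exportOnly' makes the control save a PNG rather than open a print dialog.
L.easyPrint({
    position: 'bottomright',
    exportOnly: true,
    filename: 'scosya-map',
    sizeModes: ['Current', 'A4Portrait', 'A4Landscape']
}).addTo(map);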

With the plugin added a new ‘download’ icon appears above the zoom controls in the bottom right.  If you move your mouse over this some options appear that allow you to save an image at a variety of sizes (current, A4 portrait, A4 landscape and A3 portrait).  The ‘current’ size should work without any weirdness, but the other ones have to reload the page, bringing in map tiles that are beyond what you currently see.  This is where the weirdness comes in, as follows:

  1. The page will display a big white area instead of the map while the saving of the image takes place.  This can take a few seconds.
  2. Occasionally the map tiles don’t load successfully and you get white areas in the image instead of the map.  If this happens pan around the map a bit to load in the tiles and then try saving the image again.
  3. Very occasionally when the map reloads it will have completely repositioned itself, and the map image will be of this location too.  Not sure why this is happening.  If it does happen, reposition the map and try again and things seem to work.

Once the processing is complete the image will be saved as a PNG.  If you select the 'A3' option the image will actually be of a much larger area than you see on your screen.  I think this will prove useful for getting higher resolution images and also for including Shetland, two issues Gary was struggling with.  Here's a large image with Shetland in place:

That’s all for this week.

Week Beginning 28th May 2018

Monday was a bank holiday so this was a four-day working week.  The big news this week was that we went live with the new timeline and mini-timeline feature for the Historical Thesaurus.  This is a feature I started working on just for fun during a less busy period in the week before Christmas and it's grown and grown since then into what I think is a hugely useful addition to the site.  It's great to see it live at last.  Marc has been showing the feature to people at a conference this week and the feedback so far has been very positive, which is excellent.  The only slight teething problem was that I inadvertently broke the Sparkline interface when I made this feature live (as the Sparkline page was using a test version of the site's layout script that I deleted when the timelines went live).  Thankfully that was a two-second job to fix.  Anyway, here's an example page with the timeline options available: https://historicalthesaurus.arts.gla.ac.uk/category/?type=search&qsearch=physician&word=physician&page=1#id=14766

I met with Gary Thoms this week to discuss the public interface for the SCOSYA atlas.  It looks like this is now going to be worked on later this year, possibly from September or October onward, with an aim of launching it in April next year.  We also talked about further updates to the staff version of the atlas that Gary would like to be incorporated, such as better options to save map images and facilities to select groups of locations and automatically display statistics about the group.  I’m hoping to spend some time on these updates over the next few weeks.

I also had a meeting with Thomas Clancy this week to discuss some possible future place-name projects that I might be involved with in some capacity, and I was in communication with SLD about some issues relating to the Google developer account for the Scots School dictionary.  I also fixed a minor error with the Corpus of Modern Scottish Writing that had cropped up during the move to HTTPS and gave further feedback to the latest (and possibly final) version of the Data Management Plan for Faye Hammill’s project.

Other than that I spent my time this week working on the redevelopment of the Seeing Speech and Dynamic Dialects websites for Jane Stuart-Smith.  I'd realised that we never decided how we'd redevelop the interface for the Dynamic Dialects website, so I spent some time setting this up.  As a starting point I took the same interface as the new Seeing Speech website, but added in the Dynamic Dialects navigation structure (with links to the chart and map at the top).  I wasn't sure what to do about the logo.  Unfortunately there is no version of the current logo on the server that doesn't have the 'Dynamic Dialects' text in front of it.  Instead I found a couple of free images that might work and created mockups of the interface with them so that Jane and Eleanor could see which might work best.

I then decided to focus on the redevelopment of the Seeing Speech IPA chart interface to the videos, as I figured that in terms of content there probably wouldn’t be many changes to be made.  The charts now appear within the overall site structure rather than on an isolated page.  I’ve split the four charts into separate tabs.  Within each tab there are then buttons for setting the video type and speaker.  The charts all now use the ‘Doulos SIL’ font automatically, so no need to worry about missing symbols.

I’ve added a line of help text above the tables just in case people don’t know they can click on a symbol to open videos.  I can change this text if required.  The charts themselves should be pretty much identical to the existing charts.  The only difference is I’ve removed the hover-over title text, as to me it didn’t seem necessary for things like ‘U+00F0: LATIN SMALL LETTER ETH’ to be visible.  One other tiny difference is I’ve greyed out the ‘Affricates and double articulation’ symbols in the ‘other’ tab as these don’t have videos.

Regarding the videos, these now open in an overlay rather than in a new browser window.  The page greys out and the overlay drops down from the top.  When you click outside of the overlay, or on the ‘close’ button in the top right of the overlay, the page fades back into view and the overlay slides up the screen and disappears.  Most browsers now also display a ‘full screen’ button in the video player options if people want to see a bigger video, and some browsers (e.g. Chrome) also give the user a ‘download’ option.  When the video overlay is open an extra ID is added to the browser’s address bar.  If you copy the full URL when the overlay is open you can then link to a specific video.  This means we could add ‘cite’ text to the overlay to allow people to cite specific videos.  When you close the overlay the information is removed from the address bar, to allow people to bookmark / cite the full page.
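
The linking mechanism is simple enough.  In outline (the function names and IDs are illustrative):

// Put the video's ID in the address bar while the overlay is open so the
// URL can be copied as a direct link; remove it again on close.
function openOverlay(videoId) {
    window.location.hash = videoId;
    // ... show the overlay and start the video ...
}
function closeOverlay() {
    history.replaceState(null, '', window.location.pathname + window.location.search);
    // ... hide the overlay and stop the video ...
}
// If a direct link is followed, open the relevant video straight away.
if (window.location.hash) {
    openOverlay(window.location.hash.substring(1));
}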

I haven’t copied all of the copyright text across as it seemed a bit confusing.  The link to the International Phonetic Association was broken and it was unclear why the chart has copyright attributed to three organisations.  The ‘Weston Ruter’ one is particularly confusing as the link just leads to a personal website for a WordPress developer.  So for now what is displayed is ‘Charts reprinted with permission from The International Phonetic Association’ (with a link to https://www.internationalphoneticassociation.org/).

In terms of responsiveness (i.e. things working on all screen sizes), I've tested things out on my phone and the charts and video overlays work fine.  The tabs end up stacked vertically, which I think is fine.  Once the screen narrows beyond a certain point the tables (particularly the pulmonic consonants table) stop getting narrower and instead a scrollbar appears underneath the table.  This ensures the structure of the table is never compromised, i.e. there's no dropping down of columns onto new lines or anything.  As this feature and the new site design are still in development I can't post any screenshots yet, but I think it's coming along nicely.  Eleanor noticed some strange formatting with one of the tables in Internet Explorer, so I'll have to investigate this next week.