robot comedy lab: journal paper

the robot stand-up work got a proper write-up. well, part of it got a proper write-up, but so it goes.

This paper demonstrates how humanoid robots can be used to probe the complex social signals that contribute to the experience of live performance. Using qualitative, ethnographic work as a starting point we can generate specific hypotheses about the use of social signals in performance and use a robot to operationalize and test them. […] Moreover, this paper provides insight into the nature of live performance. We showed that audiences have to be treated as heterogeneous, with individual responses differentiated in part by the interaction they are having with the performer. Equally, performances should be further understood in terms of these interactions. Successful performance manages the dynamics of these interactions to the performer’s and audiences’ benefit.

pdf download

diary | 25 aug 2015 | tagged: comedy lab · phd · qmat · research

the conversational rollercoaster

media and arts technology colleague saul albert put out a call for help for the conversational rollercoaster. happy to help as a last-hurrah for time in the same research group, but more significantly it’s an event conceived to take interaction and audiences seriously.

cribbed from an email, my quick take after was –
– I could show passers-by a scientific process happening live. A production line, almost.
– With the talkaoke table, not only did we have a source of conversation, but something that passers by had to navigate past.
– Watch enough people come past, you start to spot patterns
– Capture those moments, pore over the detail, and soon you can…
– Build a “theory of passing the talkaoke table without getting pulled in”
– Laws that clearly aren’t like the laws of physics, but for this specific situation do have similar predictive power.
– Why is it that they work?

diary | 23 sep 2016 | tagged: conversational rollercoaster · engaging audiences · qmat · research

accepted for ISPS2017

‘visualising performer–audience dynamics’ spoken paper accepted at ISPS 2017, the international symposium on performance science. this is doubly good, as i’ve long been keen to visit reykjavík and explore iceland.

diary | 13 apr 2017 | tagged: comedy lab · performer–audience dynamics · phd · qmat · research

submission

…finally.

diary | 09 may 2017 | tagged: phd · research · qmat

isps » performer–audience dynamics talk

had a lot of fun with my talk ‘visualising performer–audience dynamics’ at ISPS 2017. with a title like that, some play with the ‘spoken paper’ format had to be had.

pleasingly, people were coming up to me to say how much they enjoyed it for the rest of the conference. huzzah!

i recorded it, and have stitched it together with the slides. the video is here, on the project page.

diary | 01 sep 2017 | tagged: comedy lab · performer–audience dynamics · phd · qmat · research · iceland · talk

viva

the dissertation had done the talking, and the viva was good conversation about it wrapped up with a “dr. harris” handshake. phew, and woah. having been in a death-grip with the dissertation draft for so long, nothing in the whole experience could touch the wholesomeness of simply hearing “i read it all, and it’s good”.

supervisor –

Dear All,

I’m delighted to report that Toby Harris successfully defended his thesis “Liveness: An Interactional Account” this morning.
The external said: “that was a sheer pleasure”. (very) minor corrections.

Pat.


Pat Healey,
Professor of Human Interaction,
Head of the Cognitive Science Research Group,
Queen Mary University of London

external examiner –

This is a richly intriguing study of the processes of interaction between performers, audiences and environments in stand-up comedy – a nice topic to choose since it is one where, even more than in straight theatrical contexts, ‘liveness’ is intuitively felt to be crucial. But as Matthew Harris says, what constitutes ‘liveness’ and how precisely it operates and matters, remains elusive – if pugnaciously clung to!

The conclusions reached and offered – which more than anything insist on the value and necessity of seeing all audience contexts as socially structured situations – both ring right, and seem to be based well in the details of the data presented. And the cautions at the end, about the risks of moving to higher levels of abstraction (wherein ‘the audience’ might become increasingly massified, rather than understood processually), look good and valuable.

The specific claims made – that the ‘liveness’ of the performer matters little (e.g. by replacing him/her with a robot, or with a recording) – will nicely infuriate those with an over-investment in the concept, and will need careful presentation when this work is published. The subsequent experiment on the role of spotlighting or darkness in the kinds and levels of interaction audiences have with each other, and with the performer, is also nicely counter-intuitive.

internal examiner –

I greatly enjoyed reading this thesis. It strikes a good balance between theory and experiment and makes several well-defined contributions. The literature reviews show a keen insight and a good synthesis of ideas, and the motivation for each of the experiments is made clear. The writing is polished and engaging, and the order of ideas in each chapter is easy to follow.

diary | 06 oct 2017 | tagged: phd · research · qmat

postdoc: machine folk

Folk music is part of a rich cultural context that stretches back into the past, encompassing the real and the mythical, bound to the traditions of the culture in which it arises. Artificial intelligence, on the other hand, has no culture, no traditions. But it has shown great ability: beating grand masters at chess and Go, for example, or demonstrating uncanny wordplay skills when IBM Watson beat human competitors at Jeopardy. Could the power of AI be put to use to create music?

The article, by Bob Sturm and Oded Ben-Tal, goes on to say yes there is precedent, and here’s what we’re doing.

I’m now helping them, on a part-time contract. The idea is it’s part UI design for composing with deep learning, part community engagement (read: website), and part production–reception research.

diary | 18 oct 2017 | tagged: machine folk · research

folkrnn composer mk.i

off the digital anvil: a bare-bones web-app adaptation of the folk-rnn command line tool, the first step in making it a tool anyone can use. happily, folk-rnn is written in python – good in and of itself as far as i’m concerned – which makes using the django web framework a no-brainer.

- created a managed ubuntu virtual machine.
- wrangled clashing folk-rnn dependencies.
- refactored the folk-rnn code to expose the tune generation functionality through an API.
- packaged folk-rnn for deployment.
- created a basic django webapp:
	- UI to allow user to change the rnn parameters and hit go.
	- UI to show a generated tune in staff notation.
	- URL scheme that can show previously generated tunes.
	- folk-rnn-task process that polls the database (python2, as per folk-rnn).
	- unit tests.
- functional test with headless chrome test rig.
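the folk-rnn-task process above – a worker that polls the database for generation requests, runs the model, and stores the result – can be sketched as follows. this is a hypothetical, in-memory stand-in, with made-up names and a placeholder for the actual model call; it shows only the shape of the polling loop, not the real folkrnn code:

```python
# a minimal sketch of the polling-worker idea: poll a store for pending
# tune-generation jobs, run the model, save the result. all names here
# are illustrative, not the real folkrnn implementation.
import time

# stand-in for the database table of tune-generation requests
PENDING = [{"id": 1, "seed": 42, "result": None}]

def generate_tune(seed):
    """placeholder for the folk-rnn model call."""
    return f"abc-notation-tune-for-seed-{seed}"

def poll_once(jobs):
    """claim any unprocessed job, run generation, store the result."""
    for job in jobs:
        if job["result"] is None:
            job["result"] = generate_tune(job["seed"])
            return job
    return None  # nothing to do this tick

def run_worker(jobs, ticks=3, interval=0.0):
    """the long-running loop: poll, sleep, repeat."""
    for _ in range(ticks):
        poll_once(jobs)
        time.sleep(interval)

run_worker(PENDING)
```

in the real thing the store is the django database and the worker runs as a separate python2 process, but the poll-sleep-repeat structure is the same.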

diary | 15 nov 2017 | tagged: machine folk · research · code

renaissance garb means dr*spark

dressed up as a renaissance italian, doffed my hat, and got handed a certificate… that was a placeholder, saying the real one will be in the post. truly a doctor, and yet still one little thing outstanding!

best of all is that first-born is no longer my totem of not having got this done; the bigger and better she got, the more egregious the not-being-a-doctor was.

diary | 18 dec 2017 | tagged: phd · research · qmat

thesis published

it’s a funny thing, handing in a thesis, submitting corrections and so on, but not being able to link anyone to the work. finally, so long after may, but at least not so long after the viva, here it is. all 169 pages of it.

https://qmro.qmul.ac.uk/xmlui/handle/123456789/30624

diary | 20 dec 2017 | tagged: phd · research · qmat

folkrnn.org: 50x faster, and online

the web-app adaptation of the folk-rnn command line tool is now online, and generating tunes 50x faster – from ~1min to 1-2s. still bare-bones, an ongoing project, but at least playable with.

diary | 29 jan 2018 | tagged: machine folk · research · code

the new projectionists

to birmingham, invited to give my talk about the live in live cinema at the new projectionists. formative place, brum.

diary | 24 feb 2018 | tagged: live in live cinema · live cinema · talk · research · *spark | downloads: uk05-poster.jpg

the hack starts

the week-long hack session starts. claude and sophie talk through how their art practice has informed their research on social interaction, and we all discuss how that could inform graphical tools and techniques for the transcription, analysis and presentation of social interaction… a fun day with old friends doing good work.

having thought i might be making all sorts of acetate-and-pens-and-displays hacks, it becomes pretty clear that a tablet+pen app that could support their kind of approach is achievable, and would be a great platform to then experiment from.

diary | 22 mar 2018 | tagged: drawing interactions · research

hands-on with time

first challenge for the drawing interactions prototype app is to get ‘hands-on with time’. what does that mean? well, clearly controlling playback is key when analysing video data. but that also frames the problem in an unhelpful way, where playback is what’s desired. rather, the analyst’s task is really to see actions-through-time.

pragmatically, when i read or hear discussions of video data analysis, working frame-by-frame comes up time and time again, along with repeatedly studying the same tiny snippets. but if i think of the ‘gold standard’ of controlling video playback – the jog-shuttle controllers of older analogue equipment, or the ‘J-K-L’ of modern editing software – they don’t really address those needs.

so what might? i’ve been thinking about the combination of scrolling filmstrips and touch interfaces for years, promising direct manipulation of video. also, in that documentary the filmstrips are not just user-interface elements for the composer, but displays for the audience. such an interface might get an analyst ‘hands on with time’, and might better serve a watching audience. this is no small point, as the analysis is often done in groups, during ‘data sessions’. others would be able to tap the timeline for control – rather than one person owning the mouse – and all present would have an easy understanding of the flow of time as the app is used.

of course, maybe this is all post-hoc rationalisation. i’ve been wanting to code this kind of filmstrip control up for years, and now i have.

a little postscript: that panasonic jog-shuttle controller was amazing. the physical control had all the right haptics, but there was also something about the physical constraints of the tape playback system. you never lost track of where you were as the tape came to a stop or started speeding back up. time had momentum. so should this.
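the two ideas here – direct manipulation of a scrolling filmstrip, and time with momentum – reduce to a little arithmetic. a sketch, with an assumed filmstrip scale and friction constant; nothing here is from the actual app:

```python
# sketch of the filmstrip idea: the timeline is a strip of frames, so
# dragging it maps pixel offset directly to playback time, and a fling
# on release decays frame-by-frame so "time has momentum".

PIXELS_PER_SECOND = 100.0  # assumed filmstrip scale

def offset_to_time(pixel_offset):
    """direct manipulation: a drag of N pixels is N / scale seconds."""
    return pixel_offset / PIXELS_PER_SECOND

def decay_fling(velocity_px_per_s, friction=0.9, dt=1/60, min_v=1.0):
    """after release, step the velocity down each display frame until
    it falls below a threshold; return the extra scroll in pixels."""
    distance = 0.0
    v = velocity_px_per_s
    while abs(v) > min_v:
        distance += v * dt
        v *= friction
    return distance

# a 250 px drag moves the playhead 2.5 s; a fling coasts a bit further
t = offset_to_time(250)
coast = decay_fling(600.0)
```

the point of the momentum term is exactly the tape-deck feeling: the playhead decelerates smoothly rather than halting dead, so everyone watching keeps track of where in the recording they are.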

diary | 25 mar 2018 | tagged: drawing interactions · research · code

ready for the unveil

after an intense week, the drawing interactions app is ready to be unveiled. the iPad Pro and Apple Pencil turn out to be amazing hardware, and i’ve really enjoyed the deep dive into the kind of precise, expressive, powerful code that’s required for this kind of iOS app. it

  • plays a video
  • lets you navigate around the video by direct manipulation of the video’s filmstrip
  • when paused, you can draw annotations with a stylus
  • these drawings also become ‘snap’ points on the timeline
  • these drawings are also drawn into the timeline, partly to identify the snap points, and partly with the hope they can become compound illustrations in their own right
  • when not paused, you can highlight movements by following them with the stylus

i got there. huzzah! that last feature, drawing-through-time, i’m particularly pleased with. of course, there are bugs and plenty of things it doesn’t do. but it’s demoable, and that’s what we need for tomorrow’s workshop.
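the ‘snap’ points behaviour can be sketched in a few lines: each annotation made on a paused frame contributes a time to the timeline, and a scrub that lands near one snaps onto it. the names and the snap radius below are illustrative assumptions, not the app’s code:

```python
# sketch of timeline snap points: scrubbing near an annotated moment
# pulls the playhead exactly onto it, so analysts land back on the
# frames they marked up. radius is an assumed tolerance.

SNAP_RADIUS = 0.25  # seconds either side of an annotation

def snap(scrub_time, annotation_times, radius=SNAP_RADIUS):
    """return the nearest annotation time if within radius,
    else the raw scrub time."""
    if not annotation_times:
        return scrub_time
    nearest = min(annotation_times, key=lambda t: abs(t - scrub_time))
    return nearest if abs(nearest - scrub_time) <= radius else scrub_time
```

so scrubbing to 10.1s with annotations at 10s and 20s lands on the 10s annotation, while 15s is left untouched.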

diary | 29 mar 2018 | tagged: drawing interactions · research · code

workshop time

the drawing interactions project is not just the app. and even if it were, what is an app without users? so: a workshop, at ‘new directions in ethnomethodology’.

diary | 30 mar 2018 | tagged: talk · drawing interactions · research

sketching posture

the project takes a fresh approach to (e.g. conversation analytic) transcription, based on long-standing artistic and drafting skills. so, here we are in the workshop, all learning life drawing. while tracing people’s outlines from photo and video source material can get you a long way (something i quickly learnt coming into product design with an engineering background), it can also constrain what can be communicated – or even seen.

cue sophie, whose quick and economical illustrations convey qualities like posture and weight.

diary | 30 mar 2018 | tagged: drawing interactions · research

a room full of ethnographers drawing you

if you ever wondered what it might look like to have a room full of ethnographers learning life drawing with you as the model… well, it’s like this. an unexpected turn of events, to say the least.

i blame sophie, to the left in the photo =]

diary | 30 mar 2018 | tagged: drawing interactions · research

designer infographics?

the app, as conceived for a prototype, is all about exploratory research. of course, ultimately the insights and the backing evidence need to be distilled for publication.

happily, sylvaine tuncer, barry brown and others have been working on a design-informed exploration of ways to standardise presentation. right up my alley… if we can continue this project, there’s so much that could be done, and i’d love to do it.

diary | 30 mar 2018 | tagged: drawing interactions · research

conversational rollercoaster journal paper

The conversational rollercoaster: Conversation analysis and the public science of talk

How does talk work, and can we engage the public in a dialogue about the scientific study of talk? This article presents a history, critical evaluation and empirical illustration of the public science of talk. We chart the public ethos of conversation analysis that treats talk as an inherently public phenomenon and its transcribed recordings as public data. We examine the inherent contradictions that conversation analysis is simultaneously obscure yet highly cited; it studies an object that people understand intuitively, yet routinely produces counter-intuitive findings about talk. We describe a novel methodology for engaging the public in a science exhibition event and show how our ‘conversational rollercoaster’ used live recording, transcription and public-led analysis to address the challenge of demonstrating how talk can become an informative object of scientific research. We conclude by encouraging researchers not only to engage in a public dialogue but also to find ways to actively engage people in taking a scientific approach to talk as a pervasive, structural feature of their everyday lives.

Albert, S., Albury, C., Alexander, M., Harris, M. T., Hofstetter, E., Holmes, E. J. B., & Stokoe, E. (2018). The conversational rollercoaster: Conversation analysis and the public science of talk. Discourse Studies, 20(3), 397–424. https://doi.org/10.1177/1461445618754571

PDF available from Loughborough University Institutional Repository

diary | 16 may 2018 | tagged: conversational rollercoaster · engaging audiences · qmat · research
