
comedy lab » on tour, unannounced

an email comes in from a performance studies phd candidate asking if they could watch the whole robot routine from comedy lab: human vs. robot. damn right. i’d love to see someone write about that performance as a performance.

but better than that staging and its weird audience (given the advertised title: robo-fetishists and journalists?), there is comedy lab #4: on tour, unannounced. the premise: robot stand-up, to unsuspecting audiences, at established comedy nights. that came a year later, with the opportunity to use another robothespian (thanks, oxford brookes!). it addressed the ecological validity issues, and should simply be more fun to watch.

for on tour, unannounced we kept the performance the same – or rather, each performance used the same audience-responsive system to tailor the delivery in real time. there’s a surprising paucity in the literature about how audiences respond differently to the same production; the idea was that this should be interesting data. so i’ve taken the opportunity to extract from the data set the camera footage of the stage from each night of the tour. and now that is public, at the links below.

the alternative comedy memorial society

gits and shiggles

angel comedy

the robot comedy lab experiments form chapter 4 of my phd thesis ‘liveness: an interactional account’

Four: Experimenting with performance

The literature reviewed in chapter three also motivates an experimental programme. Chapter four presents the first, establishing Comedy Lab. A live performance experiment is staged that tests audience responses to a robot performer’s gaze and gesture. This chapter provides the first direct evidence of individual performer–audience dynamics within an audience, and establishes the viability of live performance experiments.

http://tobyz.net/project/phd

there are currently two published papers –

and finally, on ‘there is a surprising paucity…’, i’d recommend starting with gardair’s mention of mervant-roux.

diary | 03 may 2019 | tagged: comedy lab · phd · qmat · research

conversational rollercoaster journal paper

The conversational rollercoaster: Conversation analysis and the public science of talk

How does talk work, and can we engage the public in a dialogue about the scientific study of talk? This article presents a history, critical evaluation and empirical illustration of the public science of talk. We chart the public ethos of conversation analysis that treats talk as an inherently public phenomenon and its transcribed recordings as public data. We examine the inherent contradictions that conversation analysis is simultaneously obscure yet highly cited; it studies an object that people understand intuitively, yet routinely produces counter-intuitive findings about talk. We describe a novel methodology for engaging the public in a science exhibition event and show how our ‘conversational rollercoaster’ used live recording, transcription and public-led analysis to address the challenge of demonstrating how talk can become an informative object of scientific research. We conclude by encouraging researchers not only to engage in a public dialogue but also to find ways to actively engage people in taking a scientific approach to talk as a pervasive, structural feature of their everyday lives.

Albert, S., Albury, C., Alexander, M., Harris, M. T., Hofstetter, E., Holmes, E. J. B., & Stokoe, E. (2018). The conversational rollercoaster: Conversation analysis and the public science of talk. Discourse Studies, 20(3), 397–424. https://doi.org/10.1177/1461445618754571

PDF available from Loughborough University Institutional Repository

diary | 16 may 2018 | tagged: conversational rollercoaster · engaging audiences · qmat · research

thesis published

it’s a funny thing, handing in a thesis, submitting corrections and so on, but not being able to link anyone to the work. finally, so long after may, but at least not so long after the viva, here it is. all 169 pages of it.

https://qmro.qmul.ac.uk/xmlui/handle/123456789/30624

diary | 20 dec 2017 | tagged: phd · research · qmat

renaissance garb means dr*spark

dressed up as a renaissance italian, doffed my hat, and got handed a certificate… that was a placeholder, saying the real one will be in the post. truly a doctor, yet still one little thing outstanding!

best of all is that first-born is no longer my totem of not having got this done; the bigger and better she got, the more egregious the not-being-a-doctor became.

diary | 18 dec 2017 | tagged: phd · research · qmat

viva

the dissertation had done the talking, and the viva was a good conversation about it, wrapped up with a “dr. harris” handshake. phew, and woah. having been in a death-grip with the dissertation draft for so long, nothing in the whole experience could touch the wholesomeness of simply hearing “i read it all, and it’s good”.

supervisor –

Dear All,

I’m delighted to report that Toby Harris successfully defended his thesis “Liveness: An Interactional Account” this morning.
The external said: “that was a sheer pleasure”. (very) minor corrections.

Pat.


Pat Healey,
Professor of Human Interaction,
Head of the Cognitive Science Research Group,
Queen Mary University of London

external examiner –

This is a richly intriguing study of the processes of interaction between performers, audiences and environments in stand-up comedy – a nice topic to choose since it is one where, even more than in straight theatrical contexts, ‘liveness’ is intuitively felt to be crucial. But as Matthew Harris says, what constitutes ‘liveness’ and how precisely it operates and matters, remains elusive – if pugnaciously clung to!

The conclusions reached and offered – which more than anything insist on the value and necessity of seeing all audience contexts as socially structured situations – both rings right, and seems to be based well in the details of the data presented. And the cautions at the end, about the risks with moving to higher levels of abstraction (wherein ‘the audience’ might become increasingly massified, rather than understood processually) looks good and valuable.

The specific claims made – that the ‘liveness’ of the performer matters little (e.g. by replacing him/her with a robot, or with a recording) – will nicely infuriate those with an over-investment in the concept, and will need careful presentation when this work is published. The subsequent experiment on the role of spotlighting or darkness on the kinds and levels of interaction audiences have with each other, and with the performer are also nicely counter-intuitive.

internal examiner –

I greatly enjoyed reading this thesis. It strikes a good balance between theory and experiment and makes several well-defined contributions. The literature reviews show a keen insight and a good synthesis of ideas, and the motivation for each of the experiments is made clear. The writing is polished and engaging, and the order of ideas in each chapter is easy to follow.

diary | 06 oct 2017 | tagged: phd · research · qmat

isps » performer-audience dynamics talk

had a lot of fun with my talk ‘visualising performer–audience dynamics’ at ISPS 2017. with a title like that, some play with the ‘spoken paper’ format had to be had.

pleasingly, people were coming up to me to say how much they enjoyed it for the rest of the conference. huzzah!

i recorded it, and have stitched it together with the slides. the video is here, on the project page.

diary | 01 sep 2017 | tagged: comedy lab · performer–audience dynamics · phd · qmat · research · iceland · talk

submission

…finally.

diary | 09 may 2017 | tagged: phd · research · qmat

accepted for ISPS2017

‘visualising performer–audience dynamics’ spoken paper accepted at ISPS 2017, the international symposium on performance science. this is doubly good, as i’ve long been keen to visit reykjavík and explore iceland.

diary | 13 apr 2017 | tagged: comedy lab · performer–audience dynamics · phd · qmat · research

the conversational rollercoaster

media and arts technology colleague saul albert put out a call for help for the conversational rollercoaster. happy to help as a last hurrah for our time in the same research group; more significantly, it’s an event conceived to take interaction and audiences seriously.

cribbed from an email, my quick take after was –
– I could show passers-by a scientific process happening live. A production line, almost.
– With the talkaoke table, not only did we have a source of conversation, but something that passers by had to navigate past.
– Watch enough people come past, you start to spot patterns
– Capture those moments, pore over the detail, and soon you can…
– Build a “theory of passing the talkaoke table without getting pulled in”
– Laws that clearly aren’t like the laws of physics, but for this specific situation do have similar predictive power.
– Why is it that they work?

diary | 23 sep 2016 | tagged: conversational rollercoaster · engaging audiences · qmat · research

robot comedy lab: journal paper

the robot stand-up work got a proper write-up. well, part of it got a proper write-up, but so it goes.

This paper demonstrates how humanoid robots can be used to probe the complex social signals that contribute to the experience of live performance. Using qualitative, ethnographic work as a starting point we can generate specific hypotheses about the use of social signals in performance and use a robot to operationalize and test them. […] Moreover, this paper provides insight into the nature of live performance. We showed that audiences have to be treated as heterogeneous, with individual responses differentiated in part by the interaction they are having with the performer. Equally, performances should be further understood in terms of these interactions. Successful performance manages the dynamics of these interactions to the performer’s and audiences’ benefit.

pdf download

diary | 25 aug 2015 | tagged: comedy lab · phd · qmat · research

oriented-to test

need a hit-test for people orienting to others. akin to gaze, but the interest here is what it looks like you’re attending to. but what should that hit-test be? visualisation and parameter tweaking to the rescue…
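the hit-test i settled on isn’t written up here, but the shape of the problem can be sketched. a minimal version, assuming head pose gives each person a unit ‘facing’ vector: score every candidate by the angle between that vector and the direction to the candidate, and count a hit when it falls inside a cone. (the function name and the 30° half-angle are illustrative, not from the project code – the whole point of the visualisation was tweaking that parameter.)

```python
import numpy as np

def oriented_to(head_pos, facing, targets, half_angle_deg=30.0):
    """Return indices of targets within a cone around the facing vector.

    head_pos: (3,) position of the subject's head
    facing:   (3,) unit vector the head is pointing along
    targets:  (n, 3) candidate positions (other people, the stage, ...)
    """
    to_targets = targets - head_pos               # (n, 3) offsets to candidates
    dists = np.linalg.norm(to_targets, axis=1)
    directions = to_targets / dists[:, None]      # unit vectors toward each
    cos_angles = directions @ facing              # cosine of angle to each
    threshold = np.cos(np.radians(half_angle_deg))
    return np.nonzero(cos_angles >= threshold)[0]

# facing straight down +x: a target ahead is a hit, one behind is not
hits = oriented_to(np.zeros(3), np.array([1.0, 0.0, 0.0]),
                   np.array([[2.0, 0.1, 0.0], [-2.0, 0.0, 0.0]]))
```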

diary | 03 feb 2015 | tagged: comedy lab · performer–audience dynamics · phd · qmat · research

through the eyes

with the visualiser established, it was trivial to attach the free view camera to the head pose node and boom!: first-person perspective. to be able to see through the eyes of anyone present is such a big thing.
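the mechanics are simple if you think in transforms: parenting the camera to the head-pose node is equivalent to using the inverse of the head’s world transform as the view matrix, which is why the scene graph gives it for free. a sketch of that equivalence with numpy, outside any scene graph (column-vector convention; the numbers are illustrative, not from the dataset):

```python
import numpy as np

def world_from_head(rotation3x3, position3):
    """Build a 4x4 world transform from a head's rotation and position."""
    m = np.eye(4)
    m[:3, :3] = rotation3x3
    m[:3, 3] = position3
    return m

# a 90-degree yaw about z, head at (1, 2, 0)
c, s = 0.0, 1.0
pose = world_from_head(np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]]),
                       np.array([1.0, 2.0, 0.0]))

# first-person view matrix = inverse of the head's world transform
view = np.linalg.inv(pose)

# sanity check: the head's own position lands at the view-space origin
head_in_view = view @ np.array([1.0, 2.0, 0.0, 1.0])
```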

diary | 13 jan 2015 | tagged: comedy lab · performer–audience dynamics · phd · qmat · research

robot comedy lab: workshop paper

minos gave a seminar on his engineering efforts for robot stand-up, we back-and-forthed on the wider framing of the work, and a bit of that is published here. his write-up.

workshop paper presented at humanoid robots and creativity, a workshop at humanoids 2014.

pdf download

diary | 18 nov 2014 | tagged: comedy lab · phd · qmat · research

rotation matrix ambiguities

the head pose arrows look like they’re pointing in the right direction… right? well, of course, it’s not that simple.

the dataset processing script vicon exporter applies an offset to the raw angle-axis fixture pose, to account for the hat not being straight. the quick and dirty way to get these offsets is to say at a certain time everybody is looking directly forward. that might have been ok if i’d thought to make it part of the experiment procedure, but i didn’t, and even if i had i’ve got my doubts. but we have a visualiser! it is interactive! it can be hacked to nudge things around!

except that the visualiser just points an arrow along a gaze vector, and that doesn’t give you a definitive orientation to nudge around. this opens up a can of worms where everything that could have thwarted it working, did.

“The interpretation of a rotation matrix can be subject to many ambiguities.”
http://en.wikipedia.org/wiki/Rotation_matrix#Ambiguities

hard-won code –

DATASET VISUALISER

// Now write MATLAB code to console which will generate correct offsets from this viewer's modelling with SceneKit
for (NSUInteger i = 0; i < [self.subjectNodes count]; ++i)
{
	// Vicon Exporter calculates gaze vector as
	// gaze = [1 0 0] * rm * subjectOffsets{subjectIndex};
	// rm = Rotation matrix from World to Mocap = Rwm
	// subjectOffsets = rotation matrix from Mocap to Offset (ie Gaze) = Rmo

	// In this viewer, we model a hierarchy of
	// Origin Node -> Audience Node -> Mocap Node -> Offset Node, rendered as axes.
	// The Mocap node is rotated with Rmw (ie. rm') to comply with reality.
	// Aha. This is because in this viewer we are rotating the coordinate space not a point as per exporter

	// By manually rotating the offset node so its axes register with the head pose in video, we should be able to export a rotation matrix
	// We need to get Rmo as rotation of point
	// Rmo as rotation of point = Rom as rotation of coordinate space

	// In this viewer, we have
	// Note i. these are rotations of coordinate space
	// Note ii. we're doing this by taking 3x3 rotation matrix out of 4x4 translation matrix
	// [mocapNode worldTransform] = Rwm
	// [offsetNode transform] = Rmo
	// [offsetNode worldTransform] = Rwo

	// We want Rom as rotation of coordinate space
	// Therefore Offset = Rom = Rmo' = [offsetNode transform]'

	// CATransform3D is however transposed from rotation matrix in MATLAB.
	// Therefore Offset = [offsetNode transform]

	SCNNode* node = self.subjectNodes[i][@"node"];
	SCNNode* mocapNode = [node childNodeWithName:@"mocap" recursively:YES];
	SCNNode* offsetNode = [mocapNode childNodeWithName:@"axes" recursively:YES];

	// mocapNode has rotation animation applied to it. Use presentation node to get rendered position.
	mocapNode = [mocapNode presentationNode];

	CATransform3D Rom = [offsetNode transform];

	printf("offsets{%lu} = [%f, %f, %f; %f, %f, %f; %f, %f, %f];\n",
		   (unsigned long)i+1,
		   Rom.m11, Rom.m12, Rom.m13,
		   Rom.m21, Rom.m22, Rom.m23,
		   Rom.m31, Rom.m32, Rom.m33
		   );

	// BUT! For this to actually work, this requires Vicon Exporter to be
	// [1 0 0] * subjectOffsets{subjectIndex} * rm;
	// note matrix multiplication order

	// Isn't 3D maths fun.
	// "The interpretation of a rotation matrix can be subject to many ambiguities."
	// http://en.wikipedia.org/wiki/Rotation_matrix#Ambiguities
}

VICON EXPORTER

poseData = [];
for frame=1:stopAt
	poseline = [frameToTime(frame, dataStartTime, dataSampleRate)];
	frameData = reshape(data(frame,:), entriesPerSubject, []);
	for subjectIndex = 1:subjectCount

		%% POSITION
		position = frameData(4:6,subjectIndex)';

		%% ORIENTATION
		% Vicon V-File uses axis-angle represented in three datum, the axis is the xyz vector and the angle is the magnitude of the vector
		% [x y z, |xyz| ]
		ax = frameData(1:3,:);
		ax = [ax; sqrt(sum(ax'.^2,2))'];
		rotation = ax(:,subjectIndex)';

		%% ORIENTATION CORRECTED FOR OFF-AXIS ORIENTATION OF MARKER STRUCTURE
		rm = vrrotvec2mat(rotation);

		%% if generating offsets via calcOffset then use this
		% rotation = vrrotmat2vec(rm * offsets{subjectIndex});
		% gazeDirection = subjectForwards{subjectIndex} * rm * offsets{subjectIndex};

		%% if generating offsets via Comedy Lab Dataset Viewer then use this
		% rotation = vrrotmat2vec(offsets{subjectIndex} * rm); %actually, don't do this as it creates some axis-angle with imaginary components.
		gazeDirection = [1 0 0] * offsets{subjectIndex} * rm;

		poseline = [poseline position rotation gazeDirection];
	end
	poseData = [poseData; poseline];
end
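to make the ambiguity concrete outside the viewer and exporter, here’s a small numpy sketch (not project code) of the two traps the comments above wrestle with: a row-vector convention like the exporter’s `[1 0 0] * rm` is the transpose of the column-vector one (rotating the point vs rotating the coordinate space), and the offset has to multiply on the correct side because rotation composition doesn’t commute.

```python
import numpy as np

# a 90-degree rotation about z, and a 90-degree rotation about x
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
Rx = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, -1.0], [0.0, 1.0, 0.0]])

v = np.array([1.0, 0.0, 0.0])

# trap 1: point vs coordinate space. The column-vector form R @ v rotates
# the point; the row-vector form v @ R.T is the same number, which is why
# swapping conventions silently transposes every matrix you export.
assert np.allclose(Rz @ v, v @ Rz.T)

# trap 2: multiplication order. Applying Rz then Rx is Rx @ (Rz @ v) in
# the column-vector convention -- swap the order and the pose differs.
after_both = Rx @ (Rz @ v)
wrong_order = Rz @ (Rx @ v)
assert not np.allclose(after_both, wrong_order)
```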

diary | 19 aug 2014 | tagged: comedy lab · performer–audience dynamics · phd · qmat · research

writing up

if only writing up the phd was always like this. beautiful room, good friends, excellent facilitation by thinkingwriting.

diary | 12 aug 2014 | tagged: phd · qmat · research

virtual camera, real camera

of course, aligning the virtual camera of the 3D scene with the real camera’s capture of the actual scene was never going to be straightforward. easy to get to a proof of concept. hard to actually register the two. i ended up rendering a cuboid grid on the seat positions in the 3D scene, drawing by hand (well, mouse) what looked about right on a video still, and trying to match the two sets of lines by nudging coordinates and fields-of-view with some debug-mode hotkeys i hacked in.

in hindsight, i would have stuck motion capture markers on the cameras. so it goes.
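the nudging loop amounts to minimising reprojection error by eye. a sketch of the same idea done numerically, with a toy pinhole model (every number here is illustrative; the real scene used SceneKit’s camera and hand-drawn lines, not clicked points):

```python
import numpy as np

def project(points3d, cam_pos, focal_px, image_center):
    """Toy pinhole projection: camera at cam_pos looking down +z, no rotation."""
    rel = points3d - cam_pos
    return image_center + focal_px * rel[:, :2] / rel[:, 2:3]

# known 3D seat corners, and where they were marked by hand on a video still
seats = np.array([[0.0, 0.0, 5.0], [1.0, 0.0, 5.0], [0.0, 1.0, 6.0]])
clicked = np.array([[320.0, 240.0], [420.0, 240.0], [320.0, 323.3]])

# "nudge the field of view": grid-search the focal length whose projected
# grid best registers with the hand-drawn one
error, focal = min(
    (np.linalg.norm(project(seats, np.zeros(3), f, np.array([320.0, 240.0]))
                    - clicked), f)
    for f in np.arange(400.0, 601.0, 10.0)
)
```

the same search could extend over camera position and rotation – which is exactly why hotkey-nudging six-plus parameters by hand was so painful.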

diary | 16 jul 2014 | tagged: comedy lab · performer–audience dynamics · qmat · research

visualising everything

visualising head pose, light state, laugh state, computer vision happiness, breathing belt. and, teh pretty. huzzah.

diary | 21 jun 2014 | tagged: comedy lab · performer–audience dynamics · phd · qmat · research

comedy lab » angel comedy

third gig: angel comedy. again, an established comedy club and again a different proposition. a nightly, free venue, known to be packed. wednesdays were newcomers’ night which, again, was somewhat appropriate.

what i remember most vividly has nothing to do with our role in it: the compère warming up the crowd after the interval. it was a masterclass in rallying a crowd into an audience (probably particularly warranted given the recruitment message of ‘free’ combined with inexperienced acts). i rue to this day not recording it.

diary | 04 jun 2014 | tagged: comedy lab · phd · qmat · research

comedy lab » gits and shiggles

the second gig of our tour investigating robo-standup in front of ‘real’ audiences: gits and shiggles at the half moon, putney. a regular night there, we were booked amongst established comedians for their third birthday special. was very happy to see the headline act was katherine ryan, whose attitude gets me every time.

shown previously was artie on-stage being introduced. he (it, really) has to be on stage throughout, so we needed to cover him up for a surprise reveal. aside from the many serious set-up issues, i’m pleased i managed to fashion the ‘?’ in a spare moment. to my eye, it makes the difference.

artie has to be on stage throughout as we need to position him precisely in advance. that, and he can’t walk. the precise positioning is because we need to be able to point and gesture at audience members: short of having a full kinematic model of artie and the three-dimensional position of each audience member identified, we manually set the articulations required to point and look at every audience seat within view, while noting where each audience seat appears in the computer vision’s view. that view is actually a superpower we grant to artie: the ability to see from way above his head, and to do so in the dark. we positioned a small near-infrared gig-e vision camera in the venue’s rigging along with a pair of discreet infra-red floodlights. this view is shown above, a frame grabbed during setup that has hung around since.
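the calibration described above boils down to a lookup table: for each audience seat, a stored articulation preset and that seat’s pixel position in the overhead camera, so a detection in the image can be mapped to the nearest calibrated seat. a minimal sketch – all the seat ids, pixel positions and preset fields are made up for illustration, not the project’s data:

```python
import numpy as np

# per-seat calibration, gathered manually during setup:
# seat id -> (pixel position in overhead camera, articulation preset)
seat_table = {
    "A1": (np.array([120.0, 80.0]),
           {"head_pan": -20, "head_tilt": 15, "arm": "left_low"}),
    "A2": (np.array([200.0, 80.0]),
           {"head_pan": -5, "head_tilt": 15, "arm": "left_mid"}),
    "B1": (np.array([130.0, 190.0]),
           {"head_pan": -18, "head_tilt": 25, "arm": "left_high"}),
}

def preset_for(detection_xy):
    """Map a detection in the overhead view to the nearest calibrated
    seat and return that seat's stored articulation preset."""
    seat = min(seat_table,
               key=lambda s: np.linalg.norm(seat_table[s][0] - detection_xy))
    return seat, seat_table[seat][1]

# a face detected near seat A2's calibrated pixel position
seat, preset = preset_for(np.array([195.0, 85.0]))
```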

diary | 03 jun 2014 | tagged: comedy lab · phd · qmat · research

comedy lab » alternative comedy memorial society

getting a robot to perform stand-up comedy was a great thing. we were also proud that we could stage the gig at the barbican arts centre. prestigious, yes, but also giving some credibility to it being a “real gig”, rather than an experiment in a lab.

however, it wasn’t as representative of a comedy gig as we’d hoped. while our ‘robot stand-up at the barbican’ promotion did recruit a viably sized audience (huzzah!), the (human) comics said it was a really weird crowd. in short, we got journalists and robo-fetishists, not comedy club punters. which on reflection is not so surprising. but how to fix?

we needed ‘real’ audiences at ‘real’ gigs, without any recruitment prejudiced by there being a robot in the line-up. we needed to go to established comedy nights and be a surprise guest. thanks to oxford brookes university’s kind loan, we were able to load up artie with our software and take him on a three-day tour of london comedy clubs.

and so, the first gig: the alternative comedy memorial society at soho theatre. a comedian’s comedy club, we were told; a knowledgeable audience expecting acts to be pushing the form. well, fair to say we’re doing something like that.

diary | 02 jun 2014 | tagged: comedy lab · phd · qmat · research
