content tagged: qmat

event app as research

it doesn’t look like much at the moment, but this is the first step towards my university research group doing a study on real festival audiences at real festivals. i’m developing an interactive map / timetable app, which will have quite a few interesting features and opt-ins by the time we’re done. the promoters we’ve been talking to already have an interactive map of sorts, i’ve already done some interesting things visualising live events, and of course there’s my phd on audiences and interaction.

diary | 10 aug 2012 | tagged: code · open frameworks · ios · imc at festivals · vj · liveness · research · qmat

event app as research, shaping up

a slightly more compelling demo than last shown. development of this app has proved pretty painful, partly from engaging with openFrameworks and c++ at a level beyond demo, and partly from the flakiness of the ofxAddons i’ve tried to use. the 3D model loader ofxAssimpModelLoader turned into the bane of this project; as a core component of the app, the scope of its ill-effects was never clear until the debugging got truly brutal. i also had to ditch ofxTwitter, but at least i can contribute my work on the search functionality back into the immeasurably better codebase of ofxTwitterStream.

diary | 29 nov 2012 | tagged: open frameworks · imc at festivals · liveness · research · qmat

latour'd CHI

out of all the people in this world that i half-understand, bruno latour is by far my favourite. ‘visualisation and cognition: drawing things together’ is my favourite academic paper by far, where he explains in no small part the western world by tracing the practice of using bits of paper. ‘aramis, or the love of technology’ is one of my favourite books, which transcends the genre of ‘look at this team of people working to make their dent in the world’ in the most amazing ways that just won’t make sense in summary. at its extremes you go from reading straight technical documents to hearing a train philosophise.

the thread that runs through these is that rather than gesture about society, technology, culture and other abstractions, if you want to do something productive in the world of those terms, start at the manifest phenomena in their tiny instantiations and build up. it’s quite a shift in world view, but i’m signed up - hence looking at the ‘liveness’ of live events through what is exposed by people as they experience the event.

so being able to attend CHI and hear latour give the closing keynote was a gift. not that i’ve entirely wrapped my mind around what he was saying, to put it mildly: WHAT BABOON NOTEBOOKS, MONADS, STATE SURVEILLANCE, AND NETWORK DIAGRAMS HAVE IN COMMON: BRUNO LATOUR AT CHI

diary | 02 may 2013 | tagged: research · qmat

cogsci crowd app

the interactive-map-and-then-some app turned out to be a step too far for the organisation we hoped would make it their own, but there was still a festival and a need to determine just what smartphone sensors could tell you about the activity around a festival. and so another app was born, one to harvest any and all sensor data for real-time or subsequent analysis. with the interaction, media and communication group i’m part of now rebranded cognitive science, here is the cogsci crowd app, as it stands.

https://github.com/qmat/IMC-Crowd-App-Android
https://github.com/qmat/IMC-Crowd-Server

the UI presents a ‘crowd node’ toggle button, which corresponds to the app running a data logger and making a connection to a server counterpart. it’s called ‘crowd node’ because we hope this to be the beginning of a network of devices worn amongst the crowd, from which crowd dynamics can be analysed in realtime, and interventions staged.

being on android, this crowd node is a service running in the foreground, which means the app can come and go while the service runs. it maintains a notification, and while this is there, the phone won’t sleep until it powers down. the data logger registers for updates from all the sensors available on that device, and constantly scans for wifi base stations and bluetooth devices. getting some kind of audio fingerprint should be a useful future addition to the sensing.

the server connection mints session IDs that keep things anonymous while tracking instances of the app, and receives the 1000-line json-formatted log files either in bulk afterwards or as they’re written. in time, this should be a streaming connection for realtime use, with e.g. activity analysis and flocking algorithms running on the incoming data.
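for the curious, here’s a rough python sketch of the server-side bookkeeping described above – minting anonymous session IDs and appending the json-lines sensor records as they arrive. the record shape and file layout here are illustrative only; the real code lives in the repos linked above.

```python
# a sketch only: session minting and log ingest, with an illustrative record shape.
# the actual implementation is in the qmat/IMC-Crowd-Server repo linked above.
import json
import uuid
from pathlib import Path

LOG_DIR = Path("crowd_logs")

def mint_session_id() -> str:
    """issue an anonymous session ID, so a device can be tracked without identifying its owner."""
    return uuid.uuid4().hex

def ingest_log_chunk(session_id: str, chunk: str) -> int:
    """append a chunk of newline-delimited json sensor records to that session's log.

    each record is assumed to look something like:
      {"t": 1369476032.12, "sensor": "accelerometer", "values": [0.1, 9.7, 0.3]}
    """
    session_dir = LOG_DIR / session_id
    session_dir.mkdir(parents=True, exist_ok=True)
    records = [json.loads(line) for line in chunk.splitlines() if line.strip()]
    with (session_dir / "log.jsonl").open("a") as f:
        for record in records:
            f.write(json.dumps(record) + "\n")
    return len(records)
```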

diary | 25 may 2013 | tagged: imc at festivals · research · liveness · qmat

cogsci crowd app » biosensing

of course, if you’ve just written a sensor logging smartphone app, and you have some bio-sensing data logging kit in the research group, you’re going to use it, right?

diary | 25 may 2013 | tagged: imc at festivals · liveness · research · qmat

cogsci crowd app » field day

thanks to the promoters and the media and arts dtc, we had seven people running the crowd app attending the field day music festival in victoria park, london. science! fun!

…the analysis, however, is going to be less fun.

diary | 25 may 2013 | tagged: imc at festivals · liveness · research · qmat

sensing festivals paper

a sense of satisfaction to see someone i’ve been helping get on the research ladder accepted to a workshop and the paper we co-wrote going into the acm archive.

In order to sense the mood of a city, we propose first looking at festivals. In festivals such as Glastonbury or Burning Man we see temporary cities where the inhabitants are engaged afresh with their environment and each other. Our position is that not only are there direct equivalences between larger festivals and cities, but in festivals the phenomena are often exaggerated, and the driving impulses often exploratory. These characteristics well suit research into sensing and intervening in the urban experience. To this end, we have built a corpus of sensor and social media data around an 18,000-attendee music festival and are developing ways of analysing and communicating it.

“Sensing Festivals as Cities”, a position paper for ‘SenCity: uncovering the hidden pulse of a city’ workshop, accepted for publication in UbiComp '13 conference proceedings.

diary | 30 jun 2013 | tagged: imc at festivals · liveness · research · qmat

comedy lab: human vs robot

Come and see some more stand-up comedy, in the name of science – and this time, there’s a robot headlining!

What makes a good performance? By pitting stand-up comics Tiernan Douieb and Andrew O’Neill against a life size robot in a battle for laughs, researchers at Queen Mary, University of London hope to find out more — and are inviting you along.
In a collaboration between the labs of Queen Mary’s Cognitive Science Research Group, RoboThespian’s creators Engineered Arts, and the open-access spaces of Hack The Barbican, the researchers are staging a stand-up gig where the headline act is a robot, as a live experiment into performer-audience interaction.
This research is part of work on audience interaction being pioneered by the Cognitive Science Group. It is looking at the ways in which performers and audiences interact with each other and how this affects the experience of ‘liveness’. The experiment with Robothespian is testing ideas about how comedians deliver their material to maximize comic effect.

Shows at 6pm, Wednesday 7th and Thursday 8th August, Barbican Centre. Part of Hack the Barbican.

Poster attached. Aside from the science, the designer in me is quite content with how that little task turned out.

diary | 02 aug 2013 | tagged: comedy lab · phd · qmat · research | downloads: comedy_lab_robot.pdf

comedy lab: tiernan douieb

“good evening ladies and gentlemen, welcome to the barbican centre. “comedy lab: human vs robot” will be starting shortly in level minus one. part of hack the barbican, it is a free stand-up gig with robot headlining.”

so said i, on the public address system across all the spaces of the barbican centre. didn’t see that coming when i went to find out how to request an announcement.

the gig started, people came – this photo makes it look a bit thin, you can’t see all the seated people – and tiernan did his warm-up thing. and most brilliantly, didn’t run a mile when we brought up the idea of another comedy lab, and getting a robot to tell jokes.

diary | 07 aug 2013 | tagged: comedy lab · phd · qmat · research

comedy lab: andrew o'neill

first act proper: andrew o’neill. go watch the opening of this show, it’s perfect: http://www.youtube.com/watch?v=aGjbmywaKMI

highlight of this show had to be turning to the many kids who had appeared at the front, and singing his bumface song. to be clear, the bumface song is far from his funniest gag, not even anything much beyond the antics of a school playground. but what is so interesting is how that content is transformed in that moment of live performance and audience state into a bubble of joy. that’s what we’re after. he had lots of techniques for eliciting response from a slightly wary audience.

it’s why we’ve chosen the genre for these live experiments, but it bears repeating: stand-up comedy really is so much more than the jokes.

diary | 07 aug 2013 | tagged: comedy lab · phd · qmat · research

comedy lab: robothespian

“I never know how to start, which is probably because I run off windows 8” – and there were more laughs than groans!

as part of the media and arts technology phd you spend six months embedded somewhere interesting, working on something interesting. i did a deep dive into web adaptations and the semantic mark-up of stories at the bbc. kleomenis katevas has spent five months at engineered arts working on realtime interaction with their robothespian, and what better test could there be than a re-staging of comedy lab.

beyond tiernan’s script and kleomenis’s programming of the robot, what was most exciting was to see the robot do colombine gardair’s ‘woooo’ gesture, and the audience respond exactly as they do in covent garden. that’s the first time we’ve tried out something we’ve learnt about performance from doing this line of research… and it worked.

robothespian’s first gig was straight delivery of the script and ‘press play’ stagecraft. it went surprisingly well - it really did get laughs and carried the audience to a fair degree. tomorrow, we turn on the interactivity…

diary | 07 aug 2013 | tagged: comedy lab · phd · photo · qmat · research

comedy lab: instrumenting audiences

getting a robot to tell jokes is no simple feat. programming and polishing a script for the robot to deliver is challenge enough, but trying to get that delivery to be responsive to the audience, to incorporate stagecraft that isn’t simply a linear recording… now that is hard. of course, in the research world, we like hard, so reading the audience and tailoring the delivery appropriately is exactly what we set out to do.

having had robothespian deliver what was essentially a linear script for his first night’s performance, for his second performance we turned on the interactivity. we had a camera and microphone giving us an audio-visual feed of the audience, and processed this to give us information to make decisions about robothespian’s delivery. a simple example is waiting until any audience audio – laughing, you hope – dies down before proceeding to the next section of the routine. more interesting to us is what having a humanoid robot allows us to do, as eye contact, body orientation, gesture and so on form so much of co-present human-human interaction. for that you need more than a single audio feed measuring the audience as a whole: you need to know exactly where people are and what they’re doing.

in the photo you can see our first iteration of solving this, using the amazingly robust fraunhofer SHORE software, which detects faces and provides a number of metrics for each recognised face, such as male/female, eyes open/closed, and most usefully for instrumenting a comedy gig: a happiness score, which is effectively a smileometer. from this, robothespian delivered specific parts of the routine to the audience member judged most receptive at that point, was able to interject encouragement and admonitions, gestured scanning across the audience, and so on.
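to make the ‘deliver to the most receptive audience member’ decision concrete, here is a rough python sketch of the kind of selection logic involved. the per-face record shape and the smoothing are illustrative assumptions, not the actual pipeline that drove robothespian.

```python
# a sketch of choosing whom the robot addresses next, assuming an illustrative
# per-face record roughly in the shape a face analyser like SHORE reports.
from dataclasses import dataclass

@dataclass
class Face:
    seat_id: str      # which pre-mapped audience seat this face falls in
    happiness: float  # 0-100 "smileometer" score for this frame

def most_receptive(faces: list[Face], smoothed: dict[str, float], alpha: float = 0.3) -> str | None:
    """return the seat of the audience member with the highest smoothed happiness.

    an exponential moving average per seat stops the choice flickering
    frame to frame on detector noise.
    """
    for face in faces:
        prev = smoothed.get(face.seat_id, face.happiness)
        smoothed[face.seat_id] = alpha * face.happiness + (1 - alpha) * prev
    return max(smoothed, key=smoothed.get) if smoothed else None
```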

research being hard, it seems turning the interaction on backfired, as the gross effect was to slow down the delivery, taking too long between jokes. but this is a learning process, and tweaking those parameters is something we’ll be working on. and – a big point i’ve learnt about research – you often learn more when things go wrong, or by deliberately breaking things, than when things work or go as expected. so there’ll be lots to pore over in the recordings here, comparing performer–audience and audience–audience interaction between human and robot.

diary | 08 aug 2013 | tagged: comedy lab · phd · qmat · research

comedy lab: evening standard article

nice article in the london evening standard on comedy lab, link below and photo of it in the paper attached:
http://www.standard.co.uk/news/london/scientists-create-robot-to-take-on-comedians-in-standup-challenge-8753779.html

here’s the q & a behind the article, our answers channeled by pat healey

What does using robots tell us about the science behind stand-up comedy?
Using robots allows us to experiment with the gestures, movements and expressions that stand-up comedians use and test their effects on audience responses.

What’s the aim of the experiment? Is it to design more sophisticated robots and replace humans?
We want to understand what makes live performance exciting, how performers ‘work’ an audience; the delivery vs. the content.

Is this the first time an experiment of this kind has been carried out? How long is the research project?
Robot comedy is an emerging genre. Our performance experiment is the first to focus on how comedians work their audiences.

Tell me more about RoboThespian. Does he just say the comedy script or is he (and how) more sophisticated? Does he walk around the stage/make hand movements/laugh etc?
This research is really about what’s not in the script - we’re looking at the performance; the gestures, gaze, movement and responsiveness that make live comedy so much more than reading out jokes.

How does his software work?
We use computer vision and audio processing to detect how each person in the audience is responding. The robot uses this to tailor who it talks to and how it delivers each joke - making each performance unique.

What have you learned already from the show? Does the robot get more laughs? Does he get heckled? What has been the feedback from the audience afterwards?
I think Robothespian had a great opening night.

Do you see robots performing stand-up in future?
It will take some time to emerge but yes, I think this will come. Interactive technology is used increasingly in all forms of live performance.

diary | 09 aug 2013 | tagged: comedy lab · phd · qmat · research | downloads: comedylab-eveningstandardprint.jpeg

comedy lab: new scientist article

“Hello, weak-skinned pathetic perishable humans!” begins the stand-up comic. “I am here with the intent of making you laugh.”
A curiously direct beginning for most comics, but not for Robothespian. This humanoid robot, made by British company Engineered Arts, has the size and basic form of a tall, athletic man but is very obviously a machine: its glossy white face and torso taper into a wiry waist and legs, its eyes are square video screens and its cheeks glow with artificial light.
Robothespian’s first joke plays on its mechanical nature and goes down a storm with the audience at the Barbican Centre in London. “I never really know how to start,” it says in a robotic male voice. “Which is probably because I run off Windows 8.”
The performance last week was the brainchild of Pat Healey and Kleomenis Katevas at Queen Mary University of London, who meant it not only to entertain but also to investigate what makes live events compelling.
As we watched, cameras tracked our facial expressions, gaze and head movements. The researchers will use this information to quantify our reactions to Robothespian’s performance and to compare them with our responses to two seasoned human comics – Andrew O’Neill and Tiernan Douieb – who performed before the robot. […]

full article: http://www.newscientist.com/article/dn24050-robot-comedian-stands-up-well-against-human-rivals.html

bit miffed that the brainchild line has been re-written to sound definitively like it’s pat and minos only, but hey. in the context of that sentence, it should be my name: comedy lab is my programme, prodding what makes performance and the liveness of live events compelling is my phd topic.

diary | 16 aug 2013 | tagged: comedy lab · phd · qmat · research

comedy lab: first results

hacked some lua, got software logging what i needed; learnt python, parsed many text files; forked a cocoa app, classified laugh state for fifteen minutes times 16 audience members times two performances; and so on. eventually, a dataset of measured audience response for every tenth of a second. and with that: results. statistics. exciting.
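as a flavour of the wrangling involved, here is a rough python sketch of the alignment step: turning per-event log lines into one laugh-state value per audience member per tenth of a second. the event format here is illustrative, not the actual log format.

```python
# a sketch of binning classified laugh states into 0.1s steps; the event
# tuple format is illustrative, not the real logs.
def bin_laugh_states(events, n_members=16, duration_s=900, dt=0.1):
    """events: iterable of (timestamp_s, member_id, laugh_state) tuples.

    returns {member_id: [state per 0.1s bin]}, carrying the last seen state
    forward so every bin has a value.
    """
    n_bins = int(duration_s / dt)
    series = {m: [None] * n_bins for m in range(n_members)}
    for t, member, state in sorted(events):
        idx = min(int(t / dt), n_bins - 1)
        series[member][idx] = state
    for member, states in series.items():
        last = "none"
        for i, state in enumerate(states):
            if state is None:
                states[i] = last
            else:
                last = state
    return series
```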

a teaser of that is above; peer review needs to run its course before announcements can be made. as a little fun, though, here is the introduction of the paper the first results are published in – at some point before it got re-written to fit house style. this has… more flavour.

Live performance is important. We can talk of it “lifting audiences slightly above the present, into a hopeful feeling of what the world might be like if every moment of our lives were as emotionally voluminous, generous, aesthetically striking, and intersubjectively intense” \cite{Dolan:2006hv}. We can also talk about bums on seats and economic impact — 14,000,000 and £500,000,000 for London theatres alone in recent years \cite{Anonymous:2013us}. Perhaps more importantly, it functions as a laboratory of human experience and exploration of interaction \cite{FischerLichte:2008wo}. As designers of media technologies and interactive systems this is our interest, noting the impact of live performance on technology \cite{Schnadelbach:2008ii, Benford:2013ia, Reeves:2005uw, Sheridan:2007wc, Hook:2013vp} and how technology transforms the cultural status of live performance \cite{Auslander:2008te, Barker:2012iq}. However, as technology transforms the practice of live performance, the experiential impact of this on audiences is surprisingly under-researched. Here, we set out to compare this at its most fundamental: audience responses to live and recorded performance.

diary | 18 sep 2013 | tagged: comedy lab · phd · qmat · research

science photo prize

thanks to this shot, science outreach, and a national competition, i have a new camera. first prize! huzzah!

the full story is here — http://www.qmul.ac.uk/media/news/items/se/126324.html

screenshot above from — http://www.theguardian.com/science/gallery/2014/mar/31/national-science-photography-competition-in-pictures

diary | 31 mar 2014 | tagged: phd · comedy lab · photo · qmat · research

comedy lab dataset viewer

happy few days bringing up a visualiser app for my PhD. integrating the different data sources of my live performance experiment had brought up some quirks that didn’t seem right. i needed to be confident that everything was actually in sync and spatially correct, and, well, it got to the point where i decided to damn well visualise the whole thing.

i hoped to find a nice python framework to do this in, which would neatly extend the python work already doing most of the processing on the raw data. however i didn’t find anything that could easily combine video with a 3D scene. but i do know how to write native mac apps, and there’s a new 3D scene framework there called SceneKit…

so behold Comedy Lab Dataset Viewer. it’s not finished, but it lives!

  • NSDocument based application, so i can have multiple performances simultaneously
  • A data importer that reads the motion capture data and constructs the 3D scene and its animation
  • A stack of CoreAnimation layers compositing 3D scene over video
  • 3D scene animation synced to the video playback position

diary | 16 may 2014 | tagged: comedy lab · performer–audience dynamics · phd · qmat · research

comedy lab » alternative comedy memorial society

getting a robot to perform stand-up comedy was a great thing. we were also proud that we could stage the gig at the barbican arts centre. prestigious, yes, but also giving some credibility to it being a “real gig”, rather than an experiment in a lab.

however, it wasn’t as representative of a comedy gig as we’d hoped. while our ‘robot stand-up at the barbican’ promotion did recruit a viably sized audience (huzzah!), the (human) comics said it was a really weird crowd. in short, we got journalists and robo-fetishists, not comedy club punters. which on reflection is not so surprising. but how to fix?

we needed ‘real’ audiences at ‘real’ gigs, without any recruitment prejudiced by there being a robot in the line-up. we needed to go to established comedy nights and be a surprise guest. thanks to oxford brookes university’s kind loan, we were able to load up artie with our software and take him on a three day tour of london comedy clubs.

and so, the first gig: the alternative comedy memorial society at soho theatre. a comedian’s comedy club, we were told; a knowledgeable audience expecting acts to be pushing the form. well, fair to say we’re doing something like that.

diary | 02 jun 2014 | tagged: comedy lab · phd · qmat · research

comedy lab » gits and shiggles

the second gig of our tour investigating robo-standup in front of ‘real’ audiences: gits and shiggles at the half moon, putney. a regular night there, we were booked amongst established comedians for their third birthday special. was very happy to see the headline act was katherine ryan, whose attitude gets me every time.

shown previously was artie on-stage being introduced. he (it, really) has to be on stage throughout, so we needed to cover him up for a surprise reveal. aside from the many serious set-up issues, i’m pleased i managed to fashion the ‘?’ in a spare moment. to my eye, makes the difference.

artie has to be on stage throughout as we need to position him precisely in advance. that, and he can’t walk. the precise positioning is because we need to be able to point and gesture at audience members: short of having a full kinematic model of artie and the three-dimensional position of each audience member, we manually set the articulations required to point and look at every audience seat within view, while noting where each audience seat appears in the computer vision’s view. that view is actually a superpower we grant to artie: the ability to see from way above his head, and to do so in the dark. we position a small near-infrared GigE Vision camera in the venue’s rigging along with a pair of discreet infra-red floodlights. this view is shown above, a frame grabbed during setup that has hung around since.
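here is a rough python sketch of that manual mapping – each seat paired with a hand-set pose and a region of the overhead camera image – so a face detection can be traced back to a seat and the right articulation looked up. the joint names, pixel boxes and values are illustrative placeholders, hand-tuned per venue in reality.

```python
# a sketch of the manual seat mapping described above; poses and camera boxes
# are illustrative, not the actual calibration.
SEAT_MAP = {
    "A1": {"pose": {"head_yaw": -25, "head_pitch": 10, "arm": "left_point_low"},
           "camera_box": (120, 340, 60, 60)},   # x, y, w, h in overhead camera pixels
    "A2": {"pose": {"head_yaw": -10, "head_pitch": 10, "arm": "left_point_low"},
           "camera_box": (200, 340, 60, 60)},
    # ... one entry per audience seat within view
}

def seat_for_detection(x: float, y: float) -> str | None:
    """map a face detection in the overhead camera back to an audience seat."""
    for seat, entry in SEAT_MAP.items():
        bx, by, bw, bh = entry["camera_box"]
        if bx <= x <= bx + bw and by <= y <= by + bh:
            return seat
    return None

def pose_for_seat(seat: str) -> dict:
    """look up the hand-set articulation that points and looks at that seat."""
    return SEAT_MAP[seat]["pose"]
```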

diary | 03 jun 2014 | tagged: comedy lab · phd · qmat · research

comedy lab » angel comedy

third gig: angel comedy. again, an established comedy club and again a different proposition. a nightly, free venue, known to be packed. wednesday was newcomers’ night, which, again, was somewhat appropriate.

what i remember most vividly has nothing to do with our role in it, but was rather the compère warming up the crowd after the interval. it was a masterclass in rallying a crowd into an audience (probably particularly warranted given the recruitment message of ‘free’ combined with inexperienced acts). i rue to this day not recording it.

diary | 04 jun 2014 | tagged: comedy lab · phd · qmat · research
