Two Proofs of Concept
In preparation for office hours with Andrew today, I wanted to make a couple of proofs of concept demonstrating Alexa commenting on my HTTP(S) traffic. I did this in two ways:
- I had my good friend Adam on a Zoom call where I screen-shared the .pcap output from my Raspberry Pi running mitmproxy and capturing my phone's traffic. I told him to improvise on what he saw from the domain names, transfer types, and packet sizes as someone who is making a ton of money off my traffic (I think I told him to imagine a western oil baron hollerin’ “WE’VE STRUCK OIL!”). His commentary was output from Zoom over Bluetooth to my Alexa speaker. While working with this data feed, I noticed that many websites create a backlog because of the size and quantity of their transfers. If I were doing this again, I would try to find a way to filter the interface so that each site’s main domain name only shows up once (a rough sketch of what that filter might look like is below this list).
- I then wanted to do a similar process computationally with an Alexa skill. I ran into a barrier getting the traffic data out of mitmproxy in a way that filtered only what I wanted. I was planning to program out some commentary based on the live data capture, but I chose instead to spend more time learning the Alexa Skills Kit and Lambda function hosting, since I was still very new to both. I followed this Lambda tutorial and referred to the basic skill-building guide to make this function, which lives on Lambda. Connecting the two was the biggest challenge, but it simply required adding an endpoint in the Alexa developer console and an Alexa Skills Kit trigger for the function. I also used the built-in AMAZON.YesIntent for the VUI, which connects to the YesIntentHandler in my code. The function simply reads out a string of what Adam said in the first proof of concept (a bare-bones sketch of that handler setup is also below this list). It was an interesting experiment trying to make the AVS output as close to Adam’s natural cadence as possible; I’m sure this was part of John Cayley’s work with The Listeners. I also took some juvenile delight in finding workarounds to make Alexa say “shitt” and “fuq.”
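For future me, here is roughly what I mean by that mitmproxy filter. This isn’t code I actually ran, just a minimal addon sketch; the filename and the last-two-labels heuristic for the “main” domain are my own placeholders:

```python
# dedupe_hosts.py -- sketch of a mitmproxy addon that logs each main domain
# only the first time it appears, instead of flooding the feed with every request.
# Run with: mitmproxy -s dedupe_hosts.py
from mitmproxy import ctx, http


class DedupeHosts:
    def __init__(self):
        self.seen = set()

    def request(self, flow: http.HTTPFlow) -> None:
        host = flow.request.pretty_host
        # rough heuristic: treat the last two labels as the main domain
        # (this misses co.uk-style suffixes; tldextract would do it properly)
        main = ".".join(host.split(".")[-2:])
        if main not in self.seen:
            self.seen.add(main)
            ctx.log.info(f"new domain: {main}")


addons = [DedupeHosts()]
```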
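And for reference, the general shape of the Lambda function. This isn’t my exact file, just a stripped-down sketch using the ASK SDK for Python; the COMMENTARY string, the launch prompt, and the SSML pause are stand-ins for Adam’s actual monologue:

```python
# lambda_function.py -- sketch of the skill: a launch prompt, then AMAZON.YesIntent
# triggers the YesIntentHandler, which reads out the canned commentary.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_request_type, is_intent_name

# placeholder for Adam's monologue; SSML breaks were how I nudged the cadence
COMMENTARY = 'WE\'VE STRUCK OIL! <break time="500ms"/> Look at all this traffic.'


class LaunchRequestHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        speech = "Want to hear what your traffic is worth?"
        return handler_input.response_builder.speak(speech).ask(speech).response


class YesIntentHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return is_intent_name("AMAZON.YesIntent")(handler_input)

    def handle(self, handler_input):
        return (
            handler_input.response_builder
            .speak(COMMENTARY)
            .set_should_end_session(True)
            .response
        )


sb = SkillBuilder()
sb.add_request_handler(LaunchRequestHandler())
sb.add_request_handler(YesIntentHandler())
lambda_handler = sb.lambda_handler()  # the entry point Lambda is pointed at
```

The Alexa developer console just needs the function’s ARN set as the skill endpoint, and the function needs an Alexa Skills Kit trigger on the Lambda side, which is the connection step described above.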
Artistically, I don’t know if there is a lot of merit in this. It feels very general, kinda dumb, and not very pointed. I really liked the frantic energy Adam brought to his recording; it felt close to the anxious listening I imagine for these characters. Between the two, Adam’s is obviously more comical and a much more on-the-nose criticism. Alexa’s feels much creepier despite the comical content, and less alienating; I feel more inside the commentary when I listen to Alexa’s. While trying to glean some artistic benefit from this experiment, I still feel these proofs are incredibly rough and that I have barely scratched the surface of the themes I really want to explore in a performance. Moving forward, I’m hoping to create a skill that will allow me to dance with Alexa in preparation for 50 days of making.
P.S. The Echo that Amazon said was lost actually arrived and works. I may send it back if they end up charging me, but for now I have TWO Echoes.