Hey, try capturing reminders with this spreadsheet-hackable, voice-enabled, free-to-use call bot:
On your next Zoom, Google Meet, or Microsoft Teams call, go here and paste your meeting URL. (Sign in with Google first.) Then hit: Invite Action Cat
You'll get a live Google Sheets spreadsheet of all your meeting actions. Action Cat listens to your conversation. Say "remind me…" and see it update.
The spreadsheet link appears once you admit the bot to your call. (Copy and paste the link into chat so others can see. The doc is public by default. Lock it down if you'd prefer they request access.) The full transcript is updated with minimal lag on the second tab, timestamped and broken down by speaker. Transcription is magically good.
🚧 Use Chrome on a desktop for now. Safari needs pop-up windows set to "Allow" for sign-in to work, and this isn't mobile-friendly yet. 🚧
You can hack the way the bot works live on the call. Change the "remind me" phrase to pick up actions using a different keyword. Edit the formulas.
e.g. at the end of your call, search for "?" on the whole transcript to make sure all your questions have been answered. Check who has spoken most. Or least. Or =ANYTHING() else. It's just a spreadsheet. Edit the wiring of the whole thing. Build a smart speaker for virtual meetings and call out to GPT-3, whatever. The document is yours. Tell us how you use it.
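A few starter formulas, as a sketch rather than gospel: they assume the transcript tab has timestamps in column A, speaker names in column B, and the spoken text in column C, so check your copy and adjust the column letters if the layout differs.

Count mentions of the trigger phrase: =COUNTIF(C:C, "*remind me*")
List every line that contains a question mark: =FILTER(C:C, REGEXMATCH(C:C, "\?"))
Rank speakers by how many lines they've said: =QUERY(A:C, "select B, count(C) where B is not null group by B order by count(C) desc", 1)

Drop them into an empty area on a spare tab and tweak to taste.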
Reduct is a collaborative, transcription-based video platform. You can upload or live capture more videos, tag transcripts, and make video supercuts. Start your free trial to get all the functionality.
Some background from Matt
I met the Reduct team as they were building the Live Capture tool (launched Dec 2022). Reduct is a deep product suite for multiplayer video editing, and the transcription is uncannily good (like, you can jump between phonemes and the exact video frame).
Live transcription seemed transformative beyond even the immediate feature. So we got to wondering: is there a way to let everyone experiment with this? Amazingly, the team were up for a collab. The result is Action Cat.
Action Cat is a call bot with wires hanging out the back. It's useful from minute 1. Who doesn't track tasks in meetings? And quick for people to try.
Then hackable by anyone who can write a spreadsheet formula. We found ourselves interacting with the transcript during our collab calls, sometimes playing with it to add up durations, sometimes checking back to see if we'd addressed a point… a mini feedback loop inside the conversation.
Action Cat scratches a few personal design itches:
User-serviceable parts inside. The Adaptive Design movement argued for end-user customisation of everything from architecture to software… both to let users tune features, and also as distributed R&D to discover new applications. Editable, shareable spreadsheets are perfect adaptive artefacts. Read more: Revisiting Adaptive Design, a lost design movement (Aug 2020).
Features as non-player characters. As we interact with AIs, and as we move to an always-multiplayer web, maybe it makes sense to work with bots as NPC teammates? Call bots are a great way to explore this. Blog post here: Let me recruit AI teammates into Figma (Oct 2022).
Tools for small groups. I feel like the small group (3–12 people) isn't given enough attention in design. Shared goals in social software (Oct 2020) is one idea… but what other tools could we build for these fluid, high-trust contexts? Are there cyborg prostheses for conversation and group action? There's a lot to mine.
I'm into Action Cat because it's a sharp and shareable use of Reduct's cutting-edge tech, with an expressive second read. After a great experience with Reduct, I'm also up for more collabs. If you've got some new technology or product to play with, then email me and let's chat.
Why Action Cat?
Because it sounds like a character, not a feature, and also because I'm into the idea that you could cat a conversation and pipe it into grep or whatever. Also why not 😽
Influences
Architecture and interaction design, via adaptation and hackability (May 2006) by Dan Hill: In adaptive design, designers must enable the experience/object to "learn", and users to be able to "teach" the experience/object. So, it's a two-way interaction, in which the user wants to adapt the product, to make it useful to him or her. Therefore the designer must concentrate on enabling this adaptation in order to achieve a useful experience, rather than attempting to direct the experience towards usefulness themselves. Designers shouldn't aim to control, but to enable.
us+ (2013) by Lauren Lee McCarthy and Kyle McDonald: us+ is a Google Hangout video chat app that uses audio, facial expression, and linguistic analysis to optimize conversations based on the Linguistic Inquiry Word Count (LIWC) database, and the concept of Linguistic Style Matching (LSM). The app displays a visualization, provides pop-up notifications to each participant, and takes actions (like auto-muting) when the conversation gets out of balance.
Wildcard: Spreadsheet-Driven Customization of Web Applications (Oct 2020) by Geoffrey Litt and Daniel Jackson: In this paper, we present spreadsheet-driven customization, a technique that enables end users to customize software without doing any traditional programming. The idea is to augment an application's UI with a spreadsheet that is synchronized with the application's data. (Action Cat isn't bidirectional, but it's a step in that direction.)
Programming Portals (Nov 2022) by Maggie Appleton: Programming portals are small, scoped areas within a graphical interface that give users access to command lines and text-based programming. They open a little window into the underlying functionality of an interface.
Thank you
Thanks to the Reduct team for being up for exploring new interfaces on a real platform. Thanks especially to Robert Ochshorn and Ned Burnell for this collab.