2025 in Review: The Year of Showing Up
This year, I joined a band, shot principal photography for a murder mystery, and became militant about ending talks on time. It was a good year.
Looking back at 2025, I’m struck by how much of it involved being in rooms I hadn’t been in before: conference stages, wastewater treatment plants, tiny six-seater planes, and rehearsal spaces where I was definitely new to singing in a band. If there’s a thread connecting everything, it’s that I kept saying yes to things that excited me, and most of them turned out better than expected.
Here’s how it went.
Conferences & Community
NEMS 2025
In April, I co-organized the New England Manipulation Symposium with Lael Odhner and Kaitlyn Becker. My job was coordinating paper acceptances, scheduling speakers, helping Kait wrangle the space at MIT, and giving a talk about the future of intelligent robotics.
The thing I’m most proud of? We ended ten minutes early. I was militant about cutting off talks that ran over, and people actually thanked me for it. The one thing I’d change for next time is leaving more room for hallway conversations between sessions. The whole day came together beautifully: a packed room, great energy, and a group photo where everyone looks genuinely happy to be there.
GTC in March
I was invited to NVIDIA’s GPU Technology Conference to meet with others in the field. The keynotes were worth attending in person, and it was good to reconnect with colleagues working on similar problems.
CoRL in Seoul
I didn’t present at CoRL this year, but attended to see what’s happening in the field. The vibe was very much “everyone is collecting data for robotics.” There’s been a notable shift back to hardware, not in the sense of building better robots, but in creating better teleoperation and data collection systems. Computer scientists who used to focus purely on software are now designing hardware for data acquisition.
I came away with mixed feelings. On one hand, the bet on data-driven methods is clearly accelerating. On the other hand, I saw a lot of companies selling multi-degree-of-freedom hands and humanoid robots without clear plans for how to train them or what problems they’d actually solve: solutions looking for problems. There’s also a pervasive issue of overinflated claims, with researchers presenting general-purpose capabilities that aren’t yet supported by what their robots can actually do.
One talk that stuck with me was Sangbae’s, which critiqued the field’s lack of understanding of fundamental problems and encouraged deeper reflection on what we’re actually trying to achieve. It echoed something I’ve been thinking about: the field would benefit from more focus on high-quality data and clearly defined success metrics for specific problems, rather than broad, unsolvable goals like “solving manipulation.”
While I was in South Korea, a colleague suggested we visit the DMZ. Standing at the border, seeing the two countries side by side, was sobering in a way that’s hard to articulate.
Columbia Robotics Hackathon
In November, I returned to Columbia as a judge for the MakeCU hackathon. What struck me was the sheer growth from last year. Students came from out of town to participate, and the projects were ambitious. One team finally implemented something I’d dreamed about in college: a smart lock compatible with dorm rooms. Seeing students solve problems I’d only imagined was a highlight.
Lions in AI Panel
Earlier this year, I joined a panel at the Columbia Alumni Association of Boston alongside fellow Columbia alumni working in AI. Barnet Sherman moderated a discussion about AI and automation across different industries. I represented the robotics perspective. The audience questions were sharp, and I left feeling good about helping demystify what’s actually happening in the field right now.
Dr. Waku Interview
I was featured on Dr. Waku’s YouTube channel to discuss the “ChatGPT moment” in robotics, or rather, why we haven’t had one yet. We talked about where robotics and AI still need development before we see the kind of breakthrough that makes everything feel different.
Writing
Launching the Blog
Stefanie Tellex and I had been collaborating for a while. She’s a professor at Brown University who studies how robots understand language, which made her the perfect co-conspirator for a blog about what to tell them. We published “A Survey of Robotic Language Grounding: Tradeoffs Between Symbols and Embeddings” at IJCAI in 2024, and then George Konidaris asked us to write a chapter for his upcoming book, Designing an Intelligence. We published that chapter earlier this year as Elephants Don’t Write Sonnets: The Physically Grounded Turing Test.
We discovered that we genuinely enjoyed writing together. So we kept it going.
In August, we officially launched What to Tell the Robot (What for Watkins and Tell for Tellex) with the publication of our book chapter. The blog has become a space for us to work through ideas about robotics, AI, and the things we think matter.
Elephants Don’t Write Sonnets
Our flagship post lays out the thesis of our book chapter: that the original Turing Test is no longer sufficient, and we need a new benchmark. We call this the Physically Grounded Turing Test. The argument is that elephants don’t play chess or write sonnets, but we all agree they’re intelligent. True intelligence requires embodiment, perception, and action in the physical world, not just manipulating language.
The Deer Island Marvel
One of my favorite pieces this year was about visiting the Deer Island Wastewater Treatment Plant. Stefie and I toured this $3.8 billion facility that processes 360 million gallons of wastewater daily, and I couldn’t stop thinking about the robotics opportunities hiding in plain sight. The engineering is staggering, with 12 egg-shaped digesters, each 90 feet in diameter, but much of the inspection and maintenance is still manual. Workers enter digesters on rafts to clean them. Plastic removal happens by hand. There’s real work to be done here.
Adventures
Culebra
Despite spending more than a collective year of my life in Puerto Rico, I’d never made it to Culebra until this August. We took a six-seater plane, and I’m pretty sure neither the runway nor the plane was excited about the total weight of my six-member immediate family and two dogs. The island is tiny, even smaller than Key West, which I’d visited for the first time just five months earlier, in March. Beautiful views, though, and worth the questionable takeoff and landing to celebrate my mom’s birthday.
Quebec City by EV
I drove to and from Quebec City in an electric vehicle this year. Quebec was beautiful in August, and I got to see some stunning natural sights. My girlfriend, Amelia, and I also discovered the wonders of using ChatGPT as a tour guide: we could take a picture of anything, get detailed information about it, and in the same breath ask where the best food nearby was. Traveling by EV, on the other hand, was by far the worst choice; it took us 12 hours to get there instead of 6, despite Canada being extremely EV-friendly.
Facts & Figures
My coworker Kevin Karol wrote a murder mystery dance party called Facts & Figures, which premiered at the Boston Fringe Festival. I did the principal photography.
I learned more than I expected: how to work with stage lighting, how to get better angles on principal actors, the timing of theatrical photography, and how to edit images shot in low light. I got an entirely new perspective on the acting I did at the Edinburgh Fringe Festival 15 years ago. It’s something I’d love to do again.
RAI Band
My company started a band this year, and I volunteered for vocals. I’m in the rock portion of the band, and we’ve been practicing “Tribute” by Tenacious D. I love singing as part of a group and rehearsing the songs, though I have a lot of room to grow. There’s something satisfying about learning a skill that has nothing to do with your day job.
Piano
I’ve kept up with my piano lessons with Tatiana Bercu. It’s been gratifying to finally be able to play Gottes Zeit ist die allerbeste Zeit by Bach. Sight reading becomes easier every week!
What I Learned
If I had to distill this year into a lesson, it would be about the importance of multidisciplinary teams and the value of implementing software yourself.
In the 2000s and 2010s, software had something technology never had before: virtually free replication to customers. With no physical media needed, many talented people got excited about building software companies. Those same people are now looking at robotics.
On the flip side, many robotics companies have focused primarily on hardware, building out the mechanical systems and expecting customers to figure out how to program them. Neither approach works as well as you’d hope.
What I’ve seen work, both this year and over my career, is bringing multidisciplinary teams together from day one. Having hardware people talking to software people from the start produces better robots. Even better, as a manager, having AI tools that can assist with implementation gives you so much more leverage. Implementing software yourself, even assisted, is how you keep learning. There is something irreplaceable about staying hands-on. You understand problems differently when you’ve built even part of the thing yourself.
Looking Ahead to 2026
I’m looking forward to building greater things.
That’s vague, I know. But after a year of showing up in new places and saying yes to unfamiliar challenges, I have a clearer sense of what I want to build and who I want to build it with. The blog will keep growing. The research will continue. And I’ll probably say yes to a few more things that scare me.
Thanks for reading. If you want to follow along, subscribe to What to Tell the Robot, and I’ll see you in the new year.
I want to acknowledge both Stefanie Tellex and Reena Leone for reviewing the post before publication and making constructive suggestions.