The Jules team finished off another ship week with the release of an initial version of their API, allowing you to integrate with your friendly asynchronous squid engineer.
I have been interested in having a native app that lets me interact with Jules and get notifications pinged to me when activities are ready for the next step. That way I can nudge the system along to the next thing.
So, I found myself in front of some English Premier League games on a Sunday morning, laptop on my… well lap.
By the time the last ball was kicked in the Brentford vs. Man City game, I had a working application that was designed by Stitch, and coded up with Jules… a favorite pair of mine these days 🙂
Starting with the Spec

As per usual these days, I started by defining a simple spec that captures what I am looking to build: its design and implementation. Sometimes I work with an AI to go deep on the definition up front, and other times I stay shallower and flesh out the spec as I add more features.
This is also where I capture useful context such as the documentation for the alpha Jules API, so it is always available to an agent.
Designed By Stitch

Next, I feed the spec, which contains the screens that I am looking to design, into Stitch. Given that Jules has a strong design aesthetic, I could take a screenshot of the home page, which includes all of the whimsy of the squid and their adventures, and pass it into Stitch as the design inspiration to go along with the spec.
That was all I needed this time, unlike when I am in a new area and want to explore a bunch of design styles.
I tend to follow a layout pattern for screens when there isn’t a complex flow.
The first screen on the left is the one I ended up choosing, and the others to the right are copies where I asked for specific edits or a series of variants. It’s a rare project when I don’t ask for variants, as this is the fun of Stitch! It’s cheap to explore!
And here’s the project to explore yourself!
Coded by Jules
Finally, it’s time to build. I set up a repo that contains my spec as a README.md, along with the screens I want to build, downloaded from Stitch.
This time, I decided to code a SwiftUI-based native iPhone app. Hmm, the Stitch screens come as paired images and HTML/CSS code. Fortunately, LLMs are REALLY good at translating. They can listen in English and speak back in Spanish. And they can read HTML and come back with Swift. It’s impressive.
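To give a flavor of that translation, here’s a minimal sketch of what one of those Stitch HTML/CSS cards might become in SwiftUI. The view name, fields, and styling are hypothetical stand-ins, not the actual generated code.

```swift
import SwiftUI

// Hypothetical stand-in for one Stitch card translated into SwiftUI.
// Names and styling are illustrative, not the generated code.
struct ActivityCard: View {
    let title: String
    let status: String

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text(title)
                .font(.headline)
            Text(status)
                .font(.subheadline)
                .foregroundStyle(.secondary)
        }
        .padding()
        .frame(maxWidth: .infinity, alignment: .leading)
        .background(.thinMaterial, in: RoundedRectangle(cornerRadius: 12))
    }
}
```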
Now, I have to admit that the early alpha API doesn’t have allllll of the documentation available yet, so I first asked for an API client, then had it run through and output sample payloads, which I saved away to look through and for the AI to keep in its back pocket to flesh out the full client.
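For a sense of what a thin first-pass client looks like, here’s a minimal sketch. The base URL, path, header name, and response fields are all assumptions on my part, since the alpha docs aren’t fully published; treat them as placeholders rather than the real API surface.

```swift
import Foundation

// Minimal sketch of a thin API client. The base URL, endpoint path,
// header name, and response fields are placeholders — the alpha Jules
// API docs aren't fully published, so none of this is the real surface.
struct Session: Decodable {
    let id: String
    let title: String?
}

struct JulesClient {
    let apiKey: String
    let baseURL = URL(string: "https://example.com/v1alpha")!  // placeholder

    func listSessions() async throws -> [Session] {
        var request = URLRequest(url: baseURL.appendingPathComponent("sessions"))
        request.setValue(apiKey, forHTTPHeaderField: "X-Api-Key")  // assumed header
        let (data, _) = try await URLSession.shared.data(for: request)
        // Printing the raw JSON is one way to save sample payloads for later reference.
        print(String(data: data, encoding: .utf8) ?? "")
        return try JSONDecoder().decode([Session].self, from: data)
    }
}
```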
Now I had designs, API samples, docs, and a spec. It was time to get Jules coding… and while waiting for some of the initial work, I noodled away and added issues to tag Jules into later.
Now, I admit that it’s a bit more frustrating working on an iPhone app compared to the Web… I wasn’t able to drive it with Puppeteer, nor get preview links to test with in a wonderful happy cloud. Instead, I was running locally and pulling changes down (and pushing them back).
But a short time later, I was enjoying a new shiny squid app on my phone. I love this new world of personal software where you can go from an idea to it running in your hand, in the exact way you want it.
Here are some of the screens in all their glory!



More adventures with my friend are just a tap away!

/fin

