I am an indie dev. Every launch I shipped needed App Store screenshots in many languages. English first. Then French, German, Japanese, Korean, Spanish, Portuguese. After a few launches I knew the routine by heart. I also knew how much it hurt.
The old workflow
Open Figma. Duplicate the screenshot template ten times. Paste translations from a doc. Fix overflow on German. Resize text on Japanese. Export at the right size for every device class. Repeat for every screen. Repeat for every language. A weekend gone.
I tried shortcuts. I asked ChatGPT for translations. They were good. I asked it to render the screenshot in another language. It gave me a square image with the wrong fonts and a different phone shape. The App Store rejected it.
The missing tool
Generic AI tools are great at language. They are bad at one specific thing: keeping the exact device resolution Apple and Google ask for. iPhone 16 Pro Max is 1320×2868. Not 1024×1024. Not 1280×720. That requirement is the heart of the problem.
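The check that generic tools keep failing fits in a few lines. A minimal sketch, assuming a small hand-picked table of portrait sizes; `REQUIRED_SIZES` and `is_store_ready` are illustrative names, not lokal's actual API, and the table is not a complete list of Apple's device classes:

```python
# Required portrait screenshot sizes in pixels, per device class.
# Only two common classes shown here; the full list lives in Apple's
# screenshot specifications.
REQUIRED_SIZES = {
    "iphone_6_9": (1320, 2868),  # iPhone 16 Pro Max class
    "iphone_6_5": (1284, 2778),  # older Pro Max class
}

def is_store_ready(width: int, height: int, device_class: str) -> bool:
    """Return True only if the image matches the required size exactly.

    No scaling tolerance: the stores want the exact pixel dimensions,
    which is precisely what generic image generators do not guarantee.
    """
    return (width, height) == REQUIRED_SIZES[device_class]
```

A 1024×1024 square from an image model fails this check immediately, which is why it never makes it past review.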
So I built lokal. Same screenshot, same shape, same size, just in the new language. One job per workspace, no copy-paste.
Today
I still use lokal for every app I ship. If something breaks, my users see it the same day I do. If a feature is missing, I add it on the next free evening. That is the deal.