The end-user experience is often the highlight when discussing local-first software. It's true that many applications don't work offline today, and that's a key motivation for companies to adopt local-first approaches, but there's another aspect that excites me even more.
In the early days of personal computers, applications weren't collaborative. There was no reliable, fast internet and no software distribution over the network; you bought a disk with the software on it. Each application had its own persistent local data store. The tools for building applications were less rich back then, but the architecture was much simpler than today's typical one: just a frontend with its own database.
As devices and connectivity multiplied, applications became more collaborative: users could share and communicate effortlessly. Consequently, a large portion of software moved to the cloud, along with its databases, to support the rich experiences people now expect. And as software usage grows, so does the complexity required to build robust backend systems.
We've seen (and will continue to see) tremendous progress in the tools that help developers build such software. It's astonishing what software can do today. Big tech companies keep inventing tools and best practices for building scalable, reliable systems.
But if you step back and think, most applications don't need such sophistication. It's great that big tech companies have architectures supporting billions of users, but most applications just need simple software and a database.
Let's divide the typical application development process into three steps:
1. Prototype the end-user application with no backend (single-user mode).
2. Move the database to the cloud and create a backend application to make the application collaborative.
3. Build infrastructure to develop fast and reliable software that scales.
Many applications don't (and don't need to) reach step 3; they stop at step 2. Yet the gap between steps 1 and 2 is unnecessarily large today. Hosting an application in the cloud isn't technically challenging with modern tools, but the architectural shift is mentally taxing, so developers often skip the simplicity of step 1 and build the step 2 architecture from the start.
This seems reasonable, but it introduces unnecessary complexity. Consider a single database update: the client has to call the API; the backend has to handle the request, validate it, update the database, and return the proper error code when something goes wrong; and the client has to handle those errors and update its cache. Add optimistic updates on top, because waiting for the whole round trip is often too slow. That's just the basic functionality. Your application also needs an internet connection to work reliably, which makes it brittle, and making it work offline is a lot of extra work. It's a poor experience for both users and developers.
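To make that concrete, here's a rough sketch of that flow in TypeScript. The `Todo` type, the client cache, the `/api/todos/:id` endpoint, and the `updateTodoInDb` helper are all hypothetical stand-ins, not any particular app's actual API:

```ts
// A sketch of the client-server flow described above (hypothetical names throughout).
import express from "express";

type Todo = { id: string; title: string };

// --- Client side: optimistic update, request, rollback and cache reconciliation ---
const cache = new Map<string, Todo>(); // client-side cache keyed by todo id

async function renameTodo(id: string, title: string): Promise<void> {
  const previous = cache.get(id);
  if (previous) cache.set(id, { ...previous, title }); // optimistic update

  try {
    const res = await fetch(`/api/todos/${id}`, {
      method: "PATCH",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ title }),
    });
    if (!res.ok) throw new Error(`server responded ${res.status}`);
    cache.set(id, (await res.json()) as Todo); // reconcile cache with server state
  } catch (err) {
    if (previous) cache.set(id, previous); // roll back the optimistic update
    console.error("Couldn't save your change", err); // surface the error to the user
  }
}

// --- Server side: handle the request, validate, update the database, map errors ---
declare function updateTodoInDb(id: string, title: string): Promise<Todo>; // hypothetical

const app = express();
app.use(express.json());

app.patch("/api/todos/:id", async (req, res) => {
  const { title } = req.body;
  if (typeof title !== "string" || title.trim() === "") {
    return res.status(400).json({ error: "title is required" }); // validation error
  }
  try {
    res.json(await updateTodoInDb(req.params.id, title));
  } catch {
    res.status(500).json({ error: "update failed" }); // database error
  }
});
```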
With a local-first architecture, you eliminate that added complexity while still making the application collaborative. This architectural simplicity is not talked about enough. When you move to step 2, you enable sync by setting up a sync service, which doesn't require changing your application's architecture. Only when you need to scale, hopefully with paying customers, do you set up your own cloud infrastructure.
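For contrast, here's a minimal sketch of the same update in a local-first app. The `localDb` store and `startSync` helper are hypothetical stand-ins for whatever embedded database and sync service you choose:

```ts
// The same rename in a hypothetical local-first setup: write locally, let sync replicate.
type Todo = { id: string; title: string };

// Stand-ins for an embedded reactive store and a sync client; the real API
// depends on the library you pick.
declare const localDb: {
  todos: { update(id: string, patch: Partial<Todo>): Promise<Todo> };
};
declare function startSync(options: { url: string }): void;

async function renameTodo(id: string, title: string): Promise<Todo> {
  // The write lands in the local database immediately, online or offline.
  return localDb.todos.update(id, { title });
}

// Step 2 becomes configuration rather than an architectural rewrite:
// point the app at a sync service and changes replicate to collaborators.
startSync({ url: "wss://sync.example.com" });
```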