I completely agree with this take. There's an immense feeling of calm when your digital life revolves around files you completely own. I've managed my entire productivity around a single org-mode file for the past five years and it perfectly adapts to my needs. Add Syncthing on top of that and I have a robust system I can use on mobile that works perfectly for me.
I feel I could leave and come back in 20 years and still use my org-mode productivity file, Emms[1] playlist, verb[2] HTTP request book, etc. exactly the same as I left off. What would my data in Notion, Postman, etc. all look like in 20 years if left unattended?
[1] https://www.gnu.org/software/emms/ [2] https://github.com/federicotdn/verb
This is why I'm building my own Postman replacement that runs requests from files stored locally (it looks something like the ES query console). It will give me a lot of flexibility to change and share my requests and configurations, as well as actually letting me store my files locally.
(I'm also building an assertion feature that lets me assert the shape/data of the request/response, which will come in very useful when testing API changes.)
Here's what a request to the pokeapi looks like with a query parameter to limit the results:
```
@baseUrl = "https://pokeapi.com/api/v2"
GET /pokemon
- query:
```
Take a look at .http files. They are a file-based alternative to Postman. I don't think they have the assertion feature, though.
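From memory, a minimal .http file (the JetBrains / VS Code flavour) looks roughly like this; the pokeapi URL and the limit value here are just placeholders:

```
# file-level variables are declared with @name = value
@baseUrl = https://pokeapi.co/api/v2

### List pokemon, capped with a query parameter
GET {{baseUrl}}/pokemon?limit=20
Accept: application/json

### Requests are separated by ### lines, so one file can hold a small collection
GET {{baseUrl}}/pokemon/ditto
```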
miningape 3 minutes ago
This is incredibly similar (we even landed on the same @base syntax)! Thanks for sharing, I think I'll have to look up some parsers to see if there are any tricks I've missed.
I'm also not sure whether .http files provide a way to share configurations across a set of files or inherit properties, similar to how in Postman you can share a set of properties within a folder.
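For what it's worth, the JetBrains tooling seems to have at least an environment-file mechanism for sharing values across files (an http-client.env.json sitting next to the .http files), though I'm not sure it can do folder-style inheritance the way Postman does. Treat this sketch as unverified:

```
# http-client.env.json (shared by the .http files alongside it) would contain
# something like:
#   { "dev":  { "baseUrl": "https://pokeapi.co/api/v2" },
#     "prod": { "baseUrl": "https://api.example.com/v2" } }

### Any request file can then reference the shared variable
GET {{baseUrl}}/pokemon?limit=20
```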
ashishb 3 hours ago
That's exactly why, when I created MusicSync (Obsidian for music), I relied on keeping absolutely minimal state in the app.
All the file organization and maintenance happens in the user's Google Drive and the phone's SD card.
https://ashishb.net/all/why-i-built-an-alternative-to-google...
The problem with files is that they usually don't provide what users need or expect in 2024.
Having multiple users editing one file at the same time is hard, especially if they're non-technical and they don't understand git diffs. To make that work, you need CRDTs (or operational transforms), and those can't really be represented nicely in plain text formats.
Even something like a music library, where you have n devices authorized to make changes, each device keeps an offline copy, and all changes get "synced up" when devices come online, is just far, far easier to implement with a server guarding a database than with a raw file on some cloud drive.
miningape 9 minutes ago
I do agree for general users, but the post seems targeted at a more technical audience. I don't expect HR or finance to care about file types and storage beyond being something they know how to work with. I do expect developers to care though, and developers also (generally) prefer "self-managed" solutions like git over CRDT for editing files.
I really disagree with this idea in software design that even when you are making a tool for technical people, it must be implemented in the way that non-technical people expect. Like how Postman is trying to be a Google Docs for HTTP requests. Programming languages are our favourite technical tools, and we don't expect features that are nice for non-techies; we expect features that confuse non-techies but make our lives easier!
reddalo 3 hours ago
This is exactly why I like to document my APIs using Bruno [1] instead of Postman or Insomnia.
Bruno creates plain-text files that I can easily read with any text editor; and as an additional bonus, I can version my files using Git.
[1] https://www.usebruno.com/
I noticed that good old Visual Studio and some OSS tools support .http files in the IntelliJ format, so hopefully we can converge on them; the format is quite plain and obvious and doesn't contain too many surprises.
https://www.jetbrains.com/help/idea/exploring-http-syntax.ht...
Text files are amazing for longevity - however, they are obviously limited in the type of data they can contain (text).
Personally I create a lot of visual data such as images, drawings and video, as well as audio. I try to have everything accessible in the most rudimentary lossless format, but in this domain there are so many tradeoffs. It would be interesting to read (or perhaps write) a similar post for this sort of data.
kelvinjps10 8 minutes ago
Just link to the files? In Markdown you can just link to them.
rishikeshs 3 hours ago
I had the same thought while writing this. I do not have an answer other than plain text.
But in terms of preservation and archiving, I'm thinking of storing the files in binary as a backup.
XorNot 3 hours ago
I have been swinging around to the opinion that SQLite + files is perhaps the universal data format after all. Files give you efficient blob storage, and the SQLite database can in fact encode most types of constraints and structure.
rhl314 2 hours ago
Text files are amazing, and when you need to structure data you can use SQLite.
I am using it for https://loadjitsu.io/
Still looking for a good solution to seamlessly sync a local SQLite database to the cloud for backup when the user wants it.
benfortuna 3 hours ago
Isn't this "philosophy" just a rehash of Open File Formats:
https://en.wikipedia.org/wiki/List_of_open_file_formats
Albeit, limited to just plain text formats.
I think when I scribbled this thought, the idea was more about formats that MIGHT endure the test of time. Yes, being copyright-free matters a lot. But I'm very skeptical of, say, formats like WebP. Will it last for decades? I don't know!
TacticalCoder 2 hours ago
File over online app. I'm all for it.
But I'm not concerned about the ability to read this or that obscure file format in three decades: just look at the retro community accessing files in old formats for the C64 or whatever old machine.
In a way we now have "app as file": be it a container build file or a complete VM, we can emulate pretty much anything and everything as long as it doesn't depend on something online.
Any app, on any OS, as long as it doesn't require a proprietary online server, can be emulated or virtualized.
I can run the old DOS programs I wrote back in 1990 or so, decoding weird picture file formats. I've got an emulator running on a Pi hooked to an adapter in my vintage arcade cab emulating thousands of arcade games.
If anything, it is easier to access all those old apps and file formats today than back in the day, because you can manipulate them from a much more powerful system.
Rant done; off to Proxmox to create a container and install QEMU to emulate a Raspberry Pi 2 (it's more convenient to test in an emulator and then deploy later on the real thing).