

In a blog post titled "AI crawlers need to be more respectful", they report that blocking all AI crawlers immediately cut their traffic by 75%, from 800 GB/day to 200 GB/day, saving the project around $1,500 a month.
“AI” companies are a plague on humanity. From now on, I’m mentally designating them as terrorists.
This is the best advice for anyone who wants to learn the command line. I used to write extensive documentation for myself, as though I were writing a blog post. Now I just jot down a one-line description of the change and hope I remember enough about it for it to make sense later. But writing the longer notes was really important for internalizing things at the very beginning.
Glad you’re having fun!
Edit: Oh, and I suggest you look for guides on setting up rsync to work like Time Machine.