I’m thinking about social media backups again after Prismo’s database loss and one of my own test blogs crashing.

I can and do automate backups on the VPS where I host my main blogs.
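For the sites I control, that part’s easy. As a sketch of the kind of nightly job I mean (assuming a typical blog setup with a MySQL database and an uploads directory; every path and name here is a placeholder), something like this runs from cron:

```python
#!/usr/bin/env python3
"""Nightly blog backup: dump the database, archive uploaded media.

Run from cron, e.g.:  15 3 * * * /usr/local/bin/blog-backup.py
All paths and names below are placeholders.
"""
import datetime
import os
import subprocess
import tarfile

STAMP = datetime.date.today().isoformat()
BACKUP_DIR = "/var/backups/blog"                 # hypothetical destination
DB_NAME = "blog"                                 # hypothetical database
UPLOADS = "/var/www/blog/wp-content/uploads"     # hypothetical media dir

os.makedirs(BACKUP_DIR, exist_ok=True)

# Dump the database; mysqldump reads credentials from ~/.my.cnf so
# they never appear on the command line.
with open(f"{BACKUP_DIR}/{DB_NAME}-{STAMP}.sql", "wb") as out:
    subprocess.run(["mysqldump", DB_NAME], stdout=out, check=True)

# Archive uploaded media alongside the database dump.
with tarfile.open(f"{BACKUP_DIR}/uploads-{STAMP}.tar.gz", "w:gz") as tar:
    tar.add(UPLOADS, arcname="uploads")
```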

I can manually back up my social media accounts, but IIRC none of them offer automatic scheduling. I have to remember to do it: log into the site, find the right control panel (which sometimes changes!), and request a backup.

I’d like to be able to schedule recurring backups on Mastodon, Twitter, etc. Send me an email each month with a link when it’s ready.

OK, from the site’s point of view, you don’t want to keep generating backups for abandoned accounts. Here are some ideas:

  • Skip the process if I haven’t posted since the last archive.
  • Instead of scheduling a recurring job, schedule a new one 30 days out each time I download an archive.

Refine as needed.
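Putting those two tweaks together, the decision logic could be as simple as this sketch (the Account fields and even the 30-day interval are stand-ins for whatever a site actually tracks):

```python
import datetime
from dataclasses import dataclass
from typing import Optional

ARCHIVE_INTERVAL = datetime.timedelta(days=30)

@dataclass
class Account:
    # Hypothetical fields; stand-ins for whatever the site records.
    last_post_at: Optional[datetime.datetime]
    last_archive_at: Optional[datetime.datetime]
    last_download_at: Optional[datetime.datetime]

def should_build_archive(acct: Account, now: datetime.datetime) -> bool:
    # Idea 1: skip if the account hasn't posted since the last archive.
    if acct.last_post_at is None:
        return False
    if acct.last_archive_at and acct.last_post_at <= acct.last_archive_at:
        return False
    # Idea 2: count 30 days from the last *download*, not a fixed
    # calendar, so archives nobody ever fetches don't pile up.
    anchor = acct.last_download_at or acct.last_archive_at
    return anchor is None or now - anchor >= ARCHIVE_INTERVAL
```

A real implementation would then queue the archive job and send the “your archive is ready” email once it finishes.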

Now, those of us with a little more tech savvy can automate some things with IFTTT. Not the native backup process, but we can set up rules to listen for new posts and automatically save the content somewhere else. But while I can reliably save the text of every post from Twitter, Mastodon, etc., saving media depends on what I’m saving it to. Often the best you can do with IFTTT is embed, not copy. (And that’s if the media is even available in the source feed. Pixelfed’s RSS doesn’t include image URLs, and Mastodon’s RSS/Atom includes them in a way IFTTT doesn’t recognize.)
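For comparison, here’s roughly what a DIY poller might look like in Python instead of IFTTT, using the feedparser library. Mastodon marks attachments up with Media RSS tags, which feedparser surfaces as media_content even though IFTTT ignores them; the feed URL and destination folder are placeholders:

```python
"""Poll a Mastodon account's RSS feed and save any attached media.

Requires: pip install feedparser requests
The feed URL and destination folder are placeholders.
"""
import os
import feedparser
import requests

FEED_URL = "https://mastodon.example/@someuser.rss"   # hypothetical account
DEST = "media-archive"

os.makedirs(DEST, exist_ok=True)
feed = feedparser.parse(FEED_URL)

for entry in feed.entries:
    # Mastodon uses Media RSS <media:content> tags; other feeds may use
    # plain enclosures instead, so check both.
    urls = [m["url"] for m in getattr(entry, "media_content", []) if "url" in m]
    urls += [e["href"] for e in getattr(entry, "enclosures", []) if "href" in e]
    for url in urls:
        name = os.path.join(DEST, url.rsplit("/", 1)[-1])
        if os.path.exists(name):
            continue  # already archived on an earlier run
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        with open(name, "wb") as f:
            f.write(resp.content)
```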

Eh, maybe I should just read up on ActivityPub and see if I can make a subscribe-to-archive bot.
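The receiving end might not even be that complicated. Here’s a sketch of just the inbox half of such a bot, with everything a real one would need (WebFinger, an actor document, HTTP signature verification, actually sending the Follow) waved away:

```python
"""Bare-bones ActivityPub inbox that archives incoming posts.

A sketch of the subscribe-to-archive idea only: it assumes the bot has
already followed the account, and skips WebFinger, the actor document,
and HTTP signature verification. Requires: pip install flask
"""
import json
import time

from flask import Flask, request

app = Flask(__name__)

@app.route("/inbox", methods=["POST"])
def inbox():
    activity = request.get_json(force=True)
    # New posts from followed accounts arrive as Create activities
    # wrapping the actual post (a Note or similar object).
    if activity.get("type") == "Create":
        with open(f"archive-{int(time.time())}.json", "w") as f:
            json.dump(activity.get("object"), f, indent=2)
    return "", 202

if __name__ == "__main__":
    app.run(port=8000)
```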

In the last few months, Prismo lost its database and one of my own test blogs crashed. And Google+ has less than two weeks left before Google pulls the plug on it.

Back up your social media accounts! Most sites have some sort of archive utility, and even if what you get isn’t suitable for moving to another site, at least you’ll have a copy in case they change their business model, screw up a data migration, get washed away in a flood or just shut down.

And if you can, consider donating to the Internet Archive to help protect other sites you rely on or would just like to see again. Websites go offline every day. Sometimes even the big ones.

I liked Rogue One: A Star Wars Story quite a bit. Despite having a very different tone from either the original trilogy or the prequels, it’s still recognizable as a Star Wars film, and successfully weaves in and out of the events leading up to A New Hope.

There’s a somewhat odd setup for where they actually find the Death Star plans, though. Spoilers after the cut.


Emerald City Comicon’s website was hacked and deleted this week…along with all their backups.

Ouch.

Ticketing is all handled offsite by EventBrite, so tickets and financial info are safe. They’ve redirected their URL to the Facebook page while they rebuild their website.

Lesson learned: Isolate your backups.

I don’t just mean physically. Yes, you need to keep some offsite in case the reason you lost your server is that the building caught fire. But isolate the online access as well. If you back up your site by pushing the backups from your server to a remote location (either self-hosted or cloud storage like Dropbox or Amazon S3), the credentials for that remote location are stored on your server somewhere. What could an attacker do with them?

Consider: If someone breaks into your web server, what else can they do in addition to vandalizing your site? Can they access other databases? Can they hop onto your internal network? Retrieve or alter private files? Can they get at your backups? If so, can they get at all of your backups, including private documents?

The answers are going to depend on your network and backup setup. But they’re questions you need to start asking.
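One answer worth considering for the push-to-cloud case: give the web server write-only credentials. Here’s a sketch using boto3 and S3, assuming an IAM policy behind the access key that grants only s3:PutObject on the backup bucket (bucket and paths are placeholders):

```python
"""Push a backup to S3 with deliberately limited credentials.

The upload code is ordinary; the isolation lives in the (assumed) IAM
policy behind the access key, which grants only s3:PutObject on this
bucket, with no Get, List, or Delete. Requires: pip install boto3
"""
import datetime

import boto3

s3 = boto3.client("s3")  # credentials via environment or AWS config

stamp = datetime.date.today().isoformat()
s3.upload_file(
    f"/var/backups/blog/site-{stamp}.tar.gz",   # hypothetical local path
    "example-backup-bucket",                    # hypothetical bucket
    f"site-backups/site-{stamp}.tar.gz",
)
```

With bucket versioning turned on as well, an attacker who steals that key can add junk but can’t read or destroy the archives you already have. Pulling backups from a separate, locked-down machine gets you the same isolation from the other direction.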