Yo @fossilesque@mander.xyz, did you create that? Mind if I share it with my archivists' bubble over on Mastodon?
Ya, she absolutely is.
Most certainly, officer!

Can confirm.

Research suggests that cats are already liquid at room temperature, earning the researcher a well-deserved Ig Nobel Prize.
https://improbable.com/ig/winners/?amp=1#ig2017
https://www.drgoulu.com/wp-content/uploads/2017/09/Rheology-of-cats.pdf
DasFaultier@sh.itjust.works to
Linux@lemmy.ml•Which of the 3 standard compression algorithms on Unix (gz, xz, or bz2) is best for long term data archival at their highest compression?
1 · 2 months ago
Oh yeah, there really was, thank you. :)
DasFaultier@sh.itjust.works to
Linux@lemmy.ml•Which of the 3 standard compression algorithms on Unix (gz, xz, or bz2) is best for long term data archival at their highest compression?
5 · 2 months ago
> and I figure, given the username, that you can speak German haha
Yup, that's right. :D I'll stick with English anyway, though, so this English-language thread can follow along.
Yeah, you're right, I wasn't particularly on-topic there. :D I tried to address your underlying assumptions as well as the actual file format question, and it kinda derailed from there.
Sooo, file format… I think you're restricting yourself too much if you only consider the formats that ship with a default Unix install. You also have conflicting goals there: compression (make the most of your storage) vs. resilience (have a format that is stable in the long term). Someone here recommended lzip, which is definitely a solid answer as far as compression ratio goes.

The Wikipedia article I linked features a table that compares compressed archive formats, so that might be a good starting point for finding resilient formats. Look out for formats with at least an integrity check and ideally a recovery record, as these seem to be more important than compression ratio. Once you have settled on a format, run some tests to find the best compression algorithm for your material (see the sketch after the list below). You might also want to measure throughput/time while you're at it, to find variants that offer a reasonable compromise between compression and performance. If you're so inclined, try to read a few format specs to find suitable candidates.

You're generally looking for formats that:
- are in widespread use
- are specified/standardized publicly
- are of a low complexity
- don’t have features like DRM/Encryption/anti-copy
- are self-documenting
- are robust
- don’t have external dependencies (e.g. for other file formats)
- are free of any restrictive licensing/patents
- can be validated.
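To make the "run some tests" step concrete, here's a minimal Bash sketch, assuming gzip, bzip2, xz, and lzip are installed; sample.tar is a placeholder for a tarball of data representative of your archive:

```bash
#!/usr/bin/env bash
# Quick-and-dirty benchmark: compressed size and wall-clock time
# per candidate tool. sample.tar is a placeholder; point it at a
# tarball that is representative of your actual data.
set -euo pipefail

input=sample.tar
orig_size=$(stat -c%s "$input")

for tool in gzip bzip2 xz lzip; do
    out="$input.$tool"
    start=$(date +%s)
    "$tool" -9 -c "$input" > "$out"   # -9: highest compression level
    elapsed=$(( $(date +%s) - start ))
    size=$(stat -c%s "$out")
    ratio=$(( 100 * size / orig_size ))
    printf '%-6s %12d bytes  %3d%% of original  %4ds\n' \
        "$tool" "$size" "$ratio" "$elapsed"
done
```

Whatever wins here is only a candidate; the resilience criteria from the list above still come first.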
For more technical detail on how an actual archive handles these challenges, you might want to read https://slubarchiv.slub-dresden.de/technische-standards-fuer-die-ablieferung-von-digitalen-dokumenten and the PDF files with specifications linked there (all in German).
DasFaultier@sh.itjust.works to
Linux@lemmy.ml•Which of the 3 standard compression algorithms on Unix (gz, xz, or bz2) is best for long term data archival at their highest compression?
1 · 2 months ago
They all will, if the filesystem images aren't pre-compressed themselves, and if OP is archiving raw image formats (DNG, CR2, …).
DasFaultier@sh.itjust.works to
Linux@lemmy.ml•Which of the 3 standard compression algorithms on Unix (gz, xz, or bz2) is best for long term data archival at their highest compression?
671 · 2 months ago
You're asking the right questions, and there have been some great answers on here already.
I work at the crossover between IT and digital preservation in a large GLAM institution, so I'd like to offer my perspective. Sorry if there are any peculiarities in my comment; English is my second language.
First of all (and as you've correctly realized), compression is an antipattern in DigiPres and adds risk that you should only accept if you know what you're doing. Some formats do offer integrity information (MKV/FFV1 for video comes to mind, or the BagIt archival information package structure), including formats that use lossless compression, and these should be preferred.
You might want to check this list to find a suitable format: https://en.wikipedia.org/wiki/List_of_archive_formats -> Containers and compression
Depending on your file formats, it might not even be beneficial to use a compressed container, e.g. if you're archiving photos/videos that already exist in compressed formats (JPEG/JFIF, H.264, …).
You can make your data more resilient by choosing appropriate formats not only for the compressed container but also for the payload itself. Find the significant properties of your data and pick formats accordingly, not the other way round. Convert before archival if necessary (the term is normalization).
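For video, a common normalization target is the FFV1-in-Matroska combination mentioned above; a minimal sketch, assuming ffmpeg is installed (filenames are placeholders):

```bash
# Lossless normalization: FFV1 video + FLAC audio in a Matroska container.
# input.mov and output.mkv are placeholder names.
ffmpeg -i input.mov -c:v ffv1 -level 3 -c:a flac output.mkv
```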
You might also want to reduce the risk of losing the entirety of your archive by compressing each file individually. Bit rot is a real threat, and you probably want to limit the impact of flipped bits. Error rates for spinning HDDs are well studied and understood, and even relatively small archives tend to be within the size range for bit flips. I can't seem to find the sources just now, but IIRC it was something like 1 bit in 1.5 TB for disks at write time.
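A minimal sketch of the per-file approach, assuming lzip and with ARCHIVE_DIR as a placeholder for your archive root:

```bash
# Compress every file on its own, so a flipped bit can at worst
# corrupt a single file instead of one giant archive.
# -9: best compression; -k: keep the originals until verified.
find ARCHIVE_DIR -type f ! -name '*.lz' -exec lzip -9 -k {} +

# lzip stores integrity information, so each file can be tested later:
find ARCHIVE_DIR -name '*.lz' -exec lzip -t {} +
```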
Also, there’s only so much you can do against bit rot on the format side, so consider using a filesystem that allows you to run regular scrubs and so actually run them; ZFS or Btrfs come to mind. If you use a more “traditional” filesystem like ext4, you could at least add checksum files for all of your archival data that you can then use as a baseline for more manual checks, but these won’t help you repair damaged payload files. You can also create BagIt bags for your archive contents, because bags come with fixity mechanisms included. See RFC 8493 (https://datatracker.ietf.org/doc/html/rfc8493). There are even libraries and software that help you verify the integrity of bags, so that may be helpful.
The disk hardware itself is a risk as well; having your disk lying around for prolonged periods of time might have an adverse effect on bearings etc. You don't have to keep it running every day, but regular scrubs might help to detect early signs of hardware degradation. Enable SMART if possible. Don't skimp on disk quality. If at all possible, purchase two disks (different make & model) to store the information.
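For those regular checks, commands along these lines (device, pool name, and mount point are placeholders):

```bash
# Start a long SMART self-test on the archive disk,
# then read the results once it has finished:
smartctl -t long /dev/sdX
smartctl -a /dev/sdX

# Scrub, depending on the filesystem:
zpool scrub tank          # ZFS; 'tank' is a placeholder pool name
btrfs scrub start /mnt    # Btrfs; '/mnt' is a placeholder mount point
```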
DigiPres is first and foremost a game of risk reduction and an organizational process, even if we tend to prioritize the technical aspects of it. Keep that in mind at all times.
And finally, I want to leave you with some reading material on DigiPres and personal archiving in general.
- https://www.langzeitarchivierung.de/Webs/nestor/DE/Publikationen/publikationen_node.html (in German)
- https://meindigitalesarchiv.de/ (in German)
- https://digitalpreservation.gov/personalarchiving/ (by the Library of Congress, who are extremely competent in DigiPres)
I’ve probably forgotten a few things (it’s late…), but if you have any further questions, feel free to ask.
EDIT: I answered a similar thread a few months ago; see https://sh.itjust.works/comment/13922388
DasFaultier@sh.itjust.works to
Science Memes@mander.xyz•Nothing to see here. Just a pine cone.English
2 · 3 months ago
So do I! I sometimes ask them: “What are you laughing at?!”
DasFaultier@sh.itjust.works to
Science Memes@mander.xyz•Nothing to see here. Just a pine cone.English
22 · 3 months ago
> you can’t really put a value judgement on evolution
I can and I will and you can’t stop me!
(/s for safety)
DasFaultier@sh.itjust.works to
Science Memes@mander.xyz•Nothing to see here. Just a pine cone.English
6 · 3 months ago
I mean, it sure would sound hilarious.
“Tik tok tik tok tik tok kchchch AAAAAAAHH!!!”
DasFaultier@sh.itjust.works to
Linux@lemmy.ml•I must have died and gone to heaven [nushell]
12 · 3 months ago
(…) ’cause it was quarter past eleven
on a Saturday in 1999
🎶🎶
To answer your questions: I work in Bash, because it's what's largely used at work and I don't have the nerve to constantly make the switch in my head. I tried nushell for a few minutes a few months ago, and I think it might actually be great as a human interface, but maybe not so much for scripting, idk.
DasFaultier@sh.itjust.works to
Science Memes@mander.xyz•These new Captchas are getting out of hand.English
10 · 4 months ago
Ultimate “Is It Cake?” challenge.
Huh, I wonder why having the most chaotic animals known to mankind guide explosives to a precise location never took off…
My whole life has been a lie!
It's been a while for me and I can't try things out atm, but I think vSphere SSH access is only for managing the appliance itself, not objects like VMs in a vSphere cluster. For those, you would have to use the Python SDK or PowerCLI.
If you run VMware, you can use PowerCLI to interact with your vSphere servers; PowerCLI requires PowerShell and uses similar syntax. I haven't tried it on Linux yet, but I would assume that's a valid use case.
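As a minimal sketch of what that could look like on Linux, assuming PowerShell Core is installed as pwsh and using vcenter.example.com as a placeholder hostname:

```bash
# One-time: install the PowerCLI module inside PowerShell Core.
pwsh -Command 'Install-Module VMware.PowerCLI -Scope CurrentUser'

# Connect to a vCenter (placeholder hostname; prompts for credentials)
# and list the VMs it manages.
pwsh -Command '
  Connect-VIServer -Server vcenter.example.com
  Get-VM | Select-Object Name, PowerState
'
```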
Thx, and also thank you for the many great memes that you continue to share every day!