ThanksForAllTheFish

joined 2 years ago

Mastodon is a social media app, a bit like Twitter, but instead of one company owning the whole thing, it is made of lots of smaller communities that can talk to each other. You join one community, but you can still follow and talk to people on other communities, a bit like how someone with a Gmail address can email someone with an Outlook address.

[–] ThanksForAllTheFish@sh.itjust.works 1 points 1 week ago* (last edited 1 week ago)

Seriously, look into the time blindness trait, I'm fully specced into it and it's going terribly.

[–] ThanksForAllTheFish@sh.itjust.works 7 points 3 weeks ago (6 children)

Ignore me if I'm being stupid, but could you just not give it internet? A lot of TVs have high-spec CPUs/APUs these days and complicated firmware; surely the ability to update the firmware is necessary for patches and feature improvements. They probably think it's silly not to include software if they can, but I agree the software experience is often a bit of a letdown. LG's been good, but admittedly I block all telemetry on my network so I wouldn't notice any downsides.

Also removes the pressure when cooking and helps the shell to separate.

[–] ThanksForAllTheFish@sh.itjust.works 2 points 4 weeks ago* (last edited 4 weeks ago)

I buy from refurbishers and modders. It's more expensive, but I got some fully refurbished and tested PS2 controllers this way.

True, in this case trash-cli is the sane command though; it has a much different job than rm. One is remove forever, no take-backs; the other is more mark for deletion. It's good to have both options imo. There are a lot of low-level interfaces that are dangerous; if they're not the correct tool for the job, then they don't have to be used. Trying to make every low-level tool safe for all users just leads to a lot of unintended consequences and inefficiencies. kill or ip address del can be just as bad, but netplan try or similar also exist.
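The rm vs trash-cli distinction is basically unlink-now versus move-aside. A toy sketch of the move-aside idea (this is not trash-cli's actual implementation, which also records original paths so files can be restored; TRASH_DIR and the trash function here are made up for illustration):

```shell
# Illustrative only: a minimal "mark for deletion" in the spirit of trash-cli.
TRASH_DIR=$(mktemp -d)
trash() {
  # Move files into the trash directory instead of unlinking them,
  # tagging each with a timestamp so repeated names don't collide.
  for f in "$@"; do
    mv -- "$f" "$TRASH_DIR/$(basename "$f").$(date +%s)"
  done
}

tmp=$(mktemp)
echo "important data" > "$tmp"
trash "$tmp"   # the file is set aside, still recoverable from $TRASH_DIR
```

The real tool's commands (trash-put, trash-list, trash-restore) follow the same principle: deletion becomes a reversible move until you explicitly empty the trash.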

[–] ThanksForAllTheFish@sh.itjust.works 1 points 2 months ago* (last edited 2 months ago)

I understand that they were intending to unpack from / and they unpacked from /home/ instead. I'm just arguing that the unpack was already a potentially dangerous action, especially if it had the potential to overwrite any system file on the drive. It's in the category of "don't run stuff unless you are certain of what it will do". For this reason it would make sense to have some way of checking it was correct before running it. Any rm commands to clean up files will need similar checks before running as well. Yes this is slower, but I would argue deleting /etc by mistake and fixing it is slower still.

I'm suggesting 3 things:

  • Confirm the contents of the tar
  • Confirm where you want to extract the contents
  • Have backups in case this goes wrong somehow

Check the contents:

  • use "tar -t" to print the contents before extracting; this lists all the files in the tar without extracting them. Read the output and check you are happy with it

Confirm where:

  • run pwd first, or specify "-C '/output-place/'" during extraction, to prevent output to the wrong folder

Have backups:

  • Assume this potentially dangerous process of extracting to /etc (you know this because you checked) may break some critical files there, so make sure this directory is properly backed up first, and check these backups are current.
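The three checks above can be walked through in a sandbox. A sketch, where all paths are throwaway stand-ins created under a temp dir rather than real system locations:

```shell
# Sandboxed walk-through of: check contents, back up, extract with -C.
set -eu
work=$(mktemp -d)
mkdir -p "$work/src" "$work/etc-copy" "$work/backup"
echo "conf v1" > "$work/src/app.conf"
tar -cf "$work/archive.tar" -C "$work/src" app.conf

# 1. Confirm the contents: list without extracting
tar -tf "$work/archive.tar"

# 2. Have backups: snapshot the destination before touching it
cp -a "$work/etc-copy/." "$work/backup/"

# 3. Confirm where: extract with an explicit -C instead of trusting pwd
tar -xf "$work/archive.tar" -C "$work/etc-copy"
```

On a real system the destination would be something like /etc and the backup step would be a proper, tested backup rather than a cp.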

I'm not suggesting that everyone knows they should do this. But I'm saying that problems are only avoidable by being extra careful, and with experience people build a knowledge of what may be dangerous and how to prevent that danger. If pwd is /, be extra careful; typos here have greater consequences. Always typing the full path, always using tab completion, and using trash-cli instead of rm would be ways to make rm safer.

If you're going to be overwriting system files as root, or deleting files without checking, I would argue that's where the error happened. If they want to do this casually without checking first, they have to accept it may cause problems or loss of data.

[–] ThanksForAllTheFish@sh.itjust.works -2 points 2 months ago (2 children)

Could make one archive intended to be unpacked from /etc/ and one archive intended to be unpacked from /home/Alice/, that way they wouldn't need to be root for the user bit, and there would never be an etc directory to delete. And if they run tar's test mode (t) and pwd first, they could check the intended actions were correct before running the full tar. Some tools can be dangerous, so the user should be aware, and have safety measures.
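A sketch of that split-archive idea, where "$root" is a temp dir standing in for the real filesystem and all file names are invented for the example:

```shell
# Two archives, each rooted at its intended extraction point.
root=$(mktemp -d)
mkdir -p "$root/etc/myapp" "$root/home/alice"
echo "cfg" > "$root/etc/myapp/app.conf"
echo "notes" > "$root/home/alice/notes.txt"

# System archive: paths relative to /etc, so it's unpacked with -C /etc (as root)
tar -cf "$root/system.tar" -C "$root/etc" myapp/app.conf
# User archive: paths relative to the home dir, so no root needed to unpack
tar -cf "$root/user.tar" -C "$root/home/alice" notes.txt

# Test mode (-t) plus pwd lets you verify intent before extracting anything
tar -tf "$root/system.tar"
tar -tf "$root/user.tar"
```

Because each archive stores only relative paths, neither one can ever contain a top-level etc directory to clobber or delete.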

[–] ThanksForAllTheFish@sh.itjust.works 5 points 2 months ago (2 children)

The biggest flaw with cars is when they crash. When I crash my car due to user error, because I made a small mistake, this proves that cars are dangerous. Some other vehicles like planes get around this by only allowing trusted users to do dangerous actions, why can't cars be more like planes? /s

Always backup important data, always have the ability to restore your backups. If rm doesn't get it, ransomware or a bad/old drive will.

A sysadmin deleting /bin is annoying, but it shouldn't take them more than a few mins to get a fresh copy from a backup or a donor machine. Or to just be more careful instead.

[–] ThanksForAllTheFish@sh.itjust.works 6 points 2 months ago* (last edited 2 months ago) (1 children)

Kind of true, but it's more like an ant scent-marking. They know this is a place that's safe (rats have been here). With good absorbent natural substrate in the cage, and wiping down their free-roam area if needed, it's mostly unnoticeable. I like to compare it to a person needing to change their t-shirt every day: probably not a deal-breaker for interacting, unless you don't clean for way too long.

[–] ThanksForAllTheFish@sh.itjust.works 1 points 2 months ago* (last edited 2 months ago)

This is wild advice; their algorithm will say "this person is addicted to matches and will literally match with anyone, sell him the unlimited swipes package and downgrade his match chance exposure to keep him hanging on for more". Based on my experience, though it's been 5 years since I used it.

[–] ThanksForAllTheFish@sh.itjust.works 3 points 3 months ago (1 children)

"Method http died unexpectedly" with status 127 means APT's HTTP method helper (/usr/lib/apt/methods/http) failed to start; exit 127 usually means the binary, or a library it needs, is missing or corrupt after the disk filled. A chroot alone won't fix it: free up space, run dpkg --configure -a first, and if http still fails, reinstall apt (manually via dpkg if needed), then apt --fix-broken install.

You'll have to download them manually as other people have mentioned, and resolve any missing dependencies during install the same way. Also check your network still works.

ping -c 3 1.1.1.1
 
 