To boldly go... nowhere
Now it is a dumb TV with a 30-second boot-up time and a clunky menu for changing inputs.
MOTION TO MAKE A RULE THAT USING ALL CAPS IS AGAINST THE RULES
Phone-based AR is a mess because while OpenXR did make some specs around it, Google and Apple never adopted them, so you need to use their proprietary bullshit instead. I don't have any experience with it myself.
https://forum.godotengine.org/t/does-godot-4-support-ar-for-android-and-ios/88658/3
Honestly I wouldn't sweat too much about your scene structure. Just make the thing and when something starts becoming painful to use, then it might be time to reorganise things.
The incremental search feature on the FileSystem tab in Godot is a godsend for finding things in messy projects. Which all projects end up being to one degree or another.
If you want some sort of example structure to follow, you could take a look at how the godot-xr-tools / godot-xr-template stuff is done, though it might be overkill. Obviously that is for VR stuff but most of the structure is general.
Every additional customer using their product currently causes them to lose money hand over fist.
You should still boycott them anyway though, they care about user numbers and user activity.
They aren't weaponizing anything against consumers. They don't care about consumers anymore; consumers are irrelevant to them.
They think if they just spend more money they will win the AGI race and therefore the whole economy forever. Consumers don't factor into it at all.
How so? I can easily just delete the whole s3 bucket.
I'm aware, but I myself have < 3TB and if I actually need it I'll be more than happy to pay. It's my "backup of last resort"; I keep other backups on site and infrequently on a portable HDD offsite.
I use aws s3 deep archive storage class, $0.001 per GB per month. But your upload bandwidth really matters in this case, I only have a subset of the most important things backed up this way otherwise it would take months just to upload a single backup. Using rclone sync instead of just uploading the whole thing each time helps but you still have to get that first upload done somehow...
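For a sense of scale, a quick sanity check on the storage cost, using the ~$0.001 per GB per month figure above (the sizes here are just illustrative):

```python
# Rough monthly cost of S3 Deep Archive storage at the rate quoted above.
PRICE_PER_GB_MONTH = 0.001  # USD, approximate Deep Archive storage rate

def monthly_cost(size_tb: float) -> float:
    """Storage cost in USD per month for size_tb terabytes (1 TB = 1000 GB)."""
    return size_tb * 1000 * PRICE_PER_GB_MONTH

print(monthly_cost(3))  # a 3 TB backup set costs about $3/month to keep
```

Retrieval and restore requests cost extra, which is why it only makes sense as a last-resort tier.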
I have a complicated system where:
- borgmatic backups happen daily, locally
- those backups are stored on a btrfs subvolume
- a Python script will make a read-only snapshot of that volume once a week
- the snapshot is synced to s3 using rclone with --checksum --no-update-modtime
- once the upload is complete the btrfs snapshot is deleted
I've also set up encryption in rclone so that all the data is encrypted and unreadable by AWS.
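The weekly cycle boils down to three commands. A minimal sketch of the glue script, where the paths, snapshot naming, and the `s3crypt:` remote name are made up for illustration (in a real script you'd hand each argv list to `subprocess.run(check=True)`):

```python
from datetime import date

# Hypothetical paths and remote name -- adjust to your own setup.
SUBVOL = "/mnt/backups"       # btrfs subvolume holding the borgmatic archives
SNAPDIR = "/mnt/snapshots"    # where the weekly read-only snapshots live
REMOTE = "s3crypt:backups"    # rclone crypt remote wrapping the S3 bucket

def build_commands(subvol: str, snap: str, remote: str) -> list[list[str]]:
    """The three steps of the weekly cycle, in order:
    read-only snapshot -> rclone sync -> delete snapshot."""
    return [
        ["btrfs", "subvolume", "snapshot", "-r", subvol, snap],
        ["rclone", "sync", snap, remote, "--checksum", "--no-update-modtime"],
        ["btrfs", "subvolume", "delete", snap],
    ]

snap = f"{SNAPDIR}/weekly-{date.today().isoformat()}"
for cmd in build_commands(SUBVOL, snap, REMOTE):
    print(" ".join(cmd))
```

Syncing from a read-only snapshot means the data can't change mid-upload, and `--checksum` / `--no-update-modtime` keep rclone from re-uploading unchanged files just because snapshot timestamps differ.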
Only Windows (maybe Mac)
It's a sad state of affairs, but if you limit the date range of searches to pre-2022 you get much better results. Like this: https://noai.duckduckgo.com/?q=stuff&noai=1&df=2000-02-02..2022-11-30