afk_strats

joined 2 years ago
[–] afk_strats@lemmy.world 0 points 5 hours ago* (last edited 5 hours ago)

This is something I learned the hard way.

Consumer hardware is limited by multiple factors when it comes to PCIe connectivity.

  • Physical layout. Simply, how many slots you have to plug into, their size, and their configuration
  • Supported lanes from the CPU
  • Chipset (motherboard) limitations

Your graphics card might be a 16-lane card (referred to as "x16"), but sometimes not all of them are used. The aforementioned 5060 Ti, I believe, only uses x8. Some devices like graphics cards can use a physically smaller slot with an adapter at the cost of some performance (a few frames of in-game performance).

Similarly, your motherboard might have an x16 slot at the top and another x16 at the bottom. That second slot might only function as x8 or even x4. Does this matter? Sort of. Inter-card communication, aka peer-to-peer communication, can affect performance, and that can compound with multiple cards.
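
You can check what width a card actually negotiated (as opposed to the slot's physical size) with `lspci -vv`, which reports it on the `LnkSta` line. A minimal sketch; `parse_width` is just a helper name I made up, and the sample line mimics pciutils output:

```shell
#!/bin/sh
# Extract the negotiated link width (e.g. "x8") from an lspci LnkSta line.
parse_width() {
    echo "$1" | sed -n 's/.*Width \(x[0-9]*\).*/\1/p'
}

# Example LnkSta line like `sudo lspci -vv` prints for a GPU running at x8:
sample='LnkSta: Speed 16GT/s (downgraded), Width x8 (downgraded)'
parse_width "$sample"   # -> x8

# On a live system you'd run something like:
#   sudo lspci -vv -s 01:00.0 | grep -E 'LnkCap|LnkSta'
# LnkCap shows what the card supports, LnkSta what it actually negotiated.
```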

Even worse, some motherboards may have all sorts of connectivity but come with limitations like only 2 of the bottom 4 slots (PCIe and m.2) working at a time. ASK ME HOW I KNOW.

Your CPU controls PCIe. It has a hard cap on how many PCIe devices it can handle and at what speed. AMD tends to be better here.

Enterprise gear suffers from none of this bs. Enterprise CPUs have a ton of PCIe lanes, and enterprise motherboards usually match the physical size of their PCIe slots to their capacity and support full bifurcation*

PCIe lanes are used up by and consumable by m.2, MCIO, and OCuLink, to name a few. That means you can connect a graphics card to any one of those if you can figure out the wires and power**

  • ** Bonus: bifurcation and how my $200 consumer motherboard runs 6 graphics cards.

Bifurcation is a motherboard feature that lets you split PCIe capacity, so an x16 slot can support two x8 devices. My motherboard lets me do this on just the main slot, and in a strange x8/x4/x4 configuration. I have an MCIO adapter (google it) which plugs into that slot and gives me 3 PCIe connectors with those corresponding widths.

It also has 2 m.2 slots which connect to the CPU. One of them I use for an NVMe SSD like a normal person. The other holds an m.2-to-PCIe adapter which gives me an x4 PCIe slot. For those keeping track, that's 24 PCIe lanes so far. That's the maximum my processor, an Intel 265K, can handle.

But wait! The motherboard also has a kind of PCIe router (the chipset), and that thing can handle 8 more lanes! So I use the bottom 2 PCIe slots on my motherboard for 2 cards at x4 each. The thing that kills me is that there are more m.2 ports, but the mobo will not be able to use more than 2 of those devices at once. AND even though that bottom PCIe slot is sized at x16, electrically it's x4.
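
The lane math above, as a quick sketch (the numbers are this post's setup, not a spec):

```shell
#!/bin/sh
# Lane budget for the build described above.
cpu_main_slot=16   # x16 CPU slot, bifurcated x8/x4/x4 via an MCIO adapter
cpu_m2_nvme=4      # CPU-attached m.2 -> NVMe SSD
cpu_m2_adapter=4   # CPU-attached m.2 -> m.2-to-PCIe x4 adapter
cpu_total=$((cpu_main_slot + cpu_m2_nvme + cpu_m2_adapter))
echo "CPU lanes used: $cpu_total"        # 24, the CPU's cap

chipset_lanes=8    # the chipset ("PCIe router") adds its own pool
echo "Chipset lanes: $chipset_lanes"
echo "Cards: 3 (x8/x4/x4) + 1 (m.2 adapter) + 2 (chipset x4) = 6"
```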

Do your research (level1techs is great) and read the manuals to really understand this stuff before you buy.

My mobo for reference ASUS: TUF GAMING Z890-PRO WIFI

[–] afk_strats@lemmy.world 0 points 5 hours ago* (last edited 5 hours ago)

Vulkan helps with speed. Most benchmarks bear that out. Concurrency is a mixed bag. You can get some with llama.cpp, but vLLM is the concurrency king.

Just a couple of weeks ago, llama.cpp released tensor parallelism, which helps, but it's still an experimental feature.
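
For reference, a sketch of what a multi-GPU launch looks like. The flag names (`--split-mode row`, `--tensor-split`, `-ngl`) are from llama.cpp builds I've seen and may have changed since, so verify against `llama-server --help` on your version:

```shell
#!/bin/sh
# Hypothetical two-GPU llama.cpp launch; model path is a placeholder.
MODEL=./model.gguf
# --split-mode row splits individual tensors across GPUs (tensor parallelism);
# --tensor-split 1,1 weights the two GPUs equally; -ngl 99 offloads all layers.
CMD="llama-server -m $MODEL -ngl 99 --split-mode row --tensor-split 1,1"
echo "$CMD"   # inspect before running
```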

Unfortunately, I don't know of any diffusion runners that work on Vulkan. If someone has expertise, let me know!

[–] afk_strats@lemmy.world 0 points 12 hours ago* (last edited 12 hours ago) (6 children)

I'm going to be brutal with you. I spent a few thousand dollars on 176GB of AMD VRAM because I was happy getting VRAM for cheap and I hate Nvidia. It works, and it's nice to be able to run bigger models at usable performance, but if you need serious concurrency or good support for diffusion, you NEED Nvidia. AMD (and likewise Intel) just doesn't have the ecosystem support for non-server GPUs. Again, coming from someone who's using this shit daily.

If you understand this limitation, then yes, those B70s are cool, as is the AMD Pro 9700, which might have slightly better support rn. You may consider Nvidia V100s, which are old and cheap. I always recommend people start with 3090s (as a general powerhouse) or a pair of 5060 Tis (for really good LLM support) though. It will make your life easier if you can live with the VRAM limitation.

[–] afk_strats@lemmy.world 4 points 1 week ago

Yeah. People will notice. People will speculate. Wild differences between people's INTERESTS tend to lead to relationship problems... usually. I think wild age differences are only weird when combined with differences in power and interest. Imo

[–] afk_strats@lemmy.world 11 points 1 week ago (1 children)

So RAM, GPUs, and SSDs will become cheap and available again...

...

Right...

...

Right!?!?!

[–] afk_strats@lemmy.world 0 points 1 week ago* (last edited 1 week ago)

Not exactly. Some people have used Claude's thinking patterns to train other open models.

This one has been in the top 5 on huggingface for weeks https://huggingface.co/Jackrong/Qwen3.5-27B-Claude-4.6-Opus-Reasoning-Distilled

[–] afk_strats@lemmy.world 0 points 2 weeks ago* (last edited 2 weeks ago) (6 children)

I've done some testing with the two large models and my initial impression is that they seem very similar in quality to Qwen3.5 35B and 27B. Some notable exceptions:

  1. llama.cpp has speculative decode support on day 1, and it speeds up inference noticeably.
  2. Day 1 base model release will undoubtedly lead to faster finetunes
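
A sketch of what a speculative-decoding launch looks like: a small draft model proposes tokens that the big model verifies. The `-md`/`--model-draft` flag name is from llama.cpp builds I've used and may differ in yours; model paths are placeholders:

```shell
#!/bin/sh
# Hypothetical speculative-decode launch: big model + small draft model.
CMD="llama-server -m big-model.gguf -md small-draft.gguf -ngl 99"
echo "$CMD"   # confirm flags against your build's --help before running
```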

Can't wait for the inevitable Claude/Gemini distils.

My verdict is that even though these models benchmark slightly lower than their Qwen equivalents, their performance and support will likely drive me to pick them.

[–] afk_strats@lemmy.world 11 points 2 weeks ago

Three for one dad joke? Gee golly willikers! That's a steal!

[–] afk_strats@lemmy.world 16 points 3 weeks ago* (last edited 2 weeks ago) (3 children)

474

Popular imagery?

Edit. Ugh sorry for posting AI without warning. Can't believe I was fooled by a signature

[–] afk_strats@lemmy.world 20 points 3 weeks ago (2 children)

"unlimited PTO"

*looks inside

”4 weeks of PTO unless you have VP approval except you'll never get it”

[–] afk_strats@lemmy.world 23 points 1 month ago

Holy potatoes, Batman! That's an ancient meme

[–] afk_strats@lemmy.world 8 points 1 month ago (1 children)

Bojack Horseman predicted (?) this almost a full decade ago in Season 3 (July 2016).

225
submitted 3 months ago* (last edited 3 months ago) by afk_strats@lemmy.world to c/linuxmemes@lemmy.world
 

As was actually rare at the time, I was born into a household which had a personal computer. For as long as I can remember, computers have fascinated me. They still do. But that fascination came with an increasingly adversarial relationship with Windows and distrust of Apple. That changed in 2025, my first full year living with Linux as my primary OS and booting no Windows machines. I'm excited about computing again. I am more dedicated to FOSS than ever. Here are some of MY takeaways in listicle format for no reason:

  1. Working on Linux is VERY good. Office suites are great. I'm partial to ~~Open~~OnlyOffice. Developing is a joy because everything feels like it's made to work with a few commands. This is in strong contrast to whatever Office/"Copilot" is and to my experience developing on Windows.

  2. I work in an IT-ish field, and I've become a lot more knowledgeable about sysadmin and netadmin type stuff. Not an expert, but enough to have more confidence when something does come up. A lot of this comes from being in the terminal more. I understand Windows is going in that direction too, but it won't push users there. Some is from self-hosting.

  3. Multimedia is a mixed bag. Krita, Blender, and Godot are incredible tools but if you are a professional who relies on software for your job, some of the FOSS alternatives don't fit a majority of users. I personally don't think Darktable is reliable enough to replace Lightroom because I've had too many crashes on too many machines with it. Despite that, I'm still looking to get rid of Adobe.

  4. Gaming on Linux is "good" to "great", but not perfect. In some cases, Proton beats Windows, yes. In most cases, games just work on Steam. I think for the amount of tinkering I put in, I could run a barebones W11 for gaming and get better overall performance than my CachyOS. I don't, because I can live with less than perfect, and kernel-level anti-cheat can pound sand.

  5. I dodge an unknown but substantial amount of anguish from not having ads, ai, and surprise updates forced on me. I am sensitive to ads and am upset every time I see one. I'm always shocked to see them on other people's computers. My work computers (Mac) have forced updates and forced restarts which are jarring. My computers feel like my own.

  6. I find and (hopefully) fix all kinds of problems. My Discord muted itself randomly because of a Wayland bug a few times. There's an open issue about dxvk getting framerate drops after about an hour of gameplay. That one sucks. One of my door sensors stopped working with Home Assistant despite it being perfect in mqtt2z; it's a confirmed bug as of 3 weeks ago. This stuff is annoying, but I take it as the cost of not trusting black boxes with my hardware.

To wrap it up, I think Linux is better than ever, more accessible than ever, and probably better than Windows for most people. I would recommend it to my mom, who only uses basic office tools and a browser, and have recommended it to my tech-savvy friend who got tired of Windows Update ruining his super custom W11 setup... but I would obviously caution my DOTA-addicted DM or my dad who runs part of his business on Access (cringe, I know). It feels human, empowering, and is good because of the way it is today, not just because of its ideals.

I hope this made you reflect on your Linux experience and maybe on how you can contribute to or help the community.

Edit: OnlyOffice, not OpenOffice Edit2: WHY did I post on memes?!? Someone take away my late night/early morning posting privileges

 

Z8 50mm 1/250 sec at f/8.0, ISO 2000

Something I love about the plazas for some of the older buildings in New York is this extra decoration. A sculpture, or some seating, a few trees, etc. Gives the city some character and allows some space for a food vendor.

 

Z8 75mm 1/12800 sec at f/4.0, ISO 2500

This is literally the first thing I saw after walking out of the subway station when I visited Manhattan for the first time: a sysadmin convention! Jk, it was a furry meetup, I think.

Which brings me to my takeaway... that New York is a true habitat made for modern humans. It feels more people centric and people act like others exist - at least more so than my experiences in the suburbs or small towns. Must be something about proximity or lack of car-induced separation. Or I've watched too many Not Just Bikes videos.

I didn't know what the protocol was. Can you go up and ask for photos? Is that weird?

 

Z8 24mm 1/1000 sec at f/16.0, ISO 200

I promise I was safe while taking this picture

 

The 30 Rock intro theme was on repeat in my head.

Z8 24mm 1/800 sec at f/11.0, ISO 2000

 

Z8 50mm 1/500 sec at f/11.0, ISO 2000

1
Building (lemmy.world)
submitted 3 months ago* (last edited 3 months ago) by afk_strats@lemmy.world to c/photography@lemmy.world
 

Z8 120mm 1/1000 sec at f/8.0, ISO 2000

New York

 

I loved New York, but I didn't enjoy Times Square. It's an overly commercialized tourist trap. It felt loud, crowded, and soulless. It reminded me of Las Vegas. (Sorry.) This picture somewhat captures my anxiety and discomfort there.

Nikon Z8 120mm 1/1250 sec at f/9.0, ISO 2000

 
1
submitted 4 months ago* (last edited 4 months ago) by afk_strats@lemmy.world to c/photography@lemmy.world
 

I'm kicking off a few posts from around the city with this very flat shot from floor 102 of the One World Observatory.

Z8 75mm f/10 1/1000 sec at ISO 2000

 

TL;DR: tell me if this is a waste of time before I spend forever tinkering on something that will always be janky.

I want to run multiple OSes on one machine, including Linux, Windows, and maybe OSX, from a host with multiple GPUs + iGPU. I know there are multiple solutions, but I'm looking for advice, opinions, and experience. I know I can google how-to guides, but is this worth pursuing?

I currently dual boot Bazzite and Ubuntu, for gaming and development respectively. I love Bazzite's ease of updates, and Ubuntu is where it's at for testing and building frontier AI/ML tools.

What if I kept my computer running a thin hypervisor 24/7 and switched VMs based on my working context? I could pass through hardware as needed.

Proxmox? XCP-NG? Debian + QEMU? Anyone living with these as their computing machines (not homelabs/server hosts)?

This is inspired by Chris Titus's (YouTube) setup on Arch, but 1) I don't know Arch, 2) I have a fairly beefy i7 265K 192GB build while he's on an enterprise Xeon DDR5 build, so we're in a different power class, and 3) I have a heterogeneous mix of graphics cards I'm hoping to pass through depending on workload.
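
Whichever hypervisor wins, GPU passthrough lives or dies by IOMMU grouping, so it's worth checking before committing. A sketch using the standard sysfs layout (`list_iommu_groups` is just a name I picked); each GPU you pass through should sit in its own group, or share one only with functions you also pass:

```shell
#!/bin/sh
# Print each IOMMU group and the PCI devices inside it.
list_iommu_groups() {
    base=${1:-/sys/kernel/iommu_groups}
    if [ ! -d "$base" ]; then
        echo "IOMMU disabled or unsupported (enable VT-d/AMD-Vi in firmware)"
        return 0
    fi
    for dev in "$base"/*/devices/*; do
        [ -e "$dev" ] || continue
        group=$(basename "$(dirname "$(dirname "$dev")")")
        echo "group $group: $(basename "$dev")"
    done
}
list_iommu_groups
```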

Use cases:

  • Bazzite + 1 gpu for gaming
  • Ubuntu + 1 or more GPUs for work
  • Windows + 0 or more GPUs for music production (paid VSTis) and kernel-level anti-cheat games (GTAV, etc.)
  • OSX? Lightroom? GPU?

Edit: Thank you all for your thoughts and contributions

Edit: what I've learned

  • this is viable but might be a pain
  • a Windows VM for getting around anti-cheat in games defeats the purpose. I'd need a dual boot for that use case
  • Hyper-V is a no. Qubes / QEMU / libvirt, yes
  • may want to just put everything on separate disks and boot / VM into them as needed

Edit: distrobox/docker works great but doesn't fit all my needs because I can't install kernel-level modules in them (AFAIK)

 