brokenlcd

joined 2 years ago
[–] brokenlcd@feddit.it 2 points 1 month ago* (last edited 1 month ago)

Aw, thank you :-) But to be honest, I ain't that good at this. All I know is mostly from reading wikis and following work already done by others.

The real beasts are the ones who actually discover that this is possible, e.g. the console modding community (I'm still impressed at how dumb a way they found to mod the Switch: it literally shorts the CPU to make it skip checks) or the guys at the win-raid forums, where I think the whole research into this upgrade being possible started in the first place.

English isn't my first language, so I'm not sure if I managed to get my point across. But essentially I'm saying: there's always a bigger fish in the sea, and in this world you never stop learning. There will always be someone who knows more than us.

Also, if you do have a lot of these machines... they really aren't bad ones. Win 10 LTSC runs like a dream on them, and Linux is right at home too. It's only really a bottleneck for me because I literally use my machine for fluid simulation for my studies. But even for games, a half-height GPU like the RX 6400 gives you a decently stout PC if you aren't aiming for full-res upscaled AAA games. Hell, the Steam Deck has shown just how much we can do with more modest specs.

[–] brokenlcd@feddit.it 2 points 1 month ago

A little bit of both. I generally like to mess with this kind of thing. Plus, if I manage to save those 40-50€, it's money I can invest in a decent HDD and move away from the WD Green from 2009 I'm currently using in there.

Plus, if I do manage to find enough info, I may try to implement a rudimentary version of coffeetime, so at least the community has an open-source variant to do this. Especially with B0 chips like the one I'm trying to use, it's not that difficult of a process and is mostly drop-in hardware-wise. The only hiccup I've found is that the software side is lacking, barring a sketchy binary that roams around some forums.

[–] brokenlcd@feddit.it 1 points 1 month ago

I'd recommend MSI Afterburner's curve optimizer on Windows, and a pyNVML script on Linux.

In the end I managed to half-ass a solution using LACT to make the fans stay off for most of the thermal excursion and fire at max as soon as 65°C is reached. It's a dumb and loud way to do it, but it currently works.

I'm not sure if pyNVML is what's used inside of LACT, but from what I remember undervolting on Linux is a bit messy, since you have to set a maximum clock and then shift it upwards with nvmlDeviceSetGpcClkVfOffset. I've yet to do that though, since exams started hammering me. Plus I was a bit hesitant to mess with clocks because of the card's warranty; I don't know if it could affect it.
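For reference, this is roughly the script I have in mind (a minimal sketch, assuming a recent pynvml that actually exposes nvmlDeviceSetGpcClkVfOffset, and root privileges; the clock and offset numbers are placeholders I haven't tested on my card):

```python
# Minimal pynvml sketch of the "lock the max clock, then shift the V/F curve up" trick.
# Assumes a recent pynvml build (the GPC clock offset call isn't in older releases) and root.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Cap the graphics clock so the card stops lower on the voltage/frequency curve...
pynvml.nvmlDeviceSetGpuLockedClocks(handle, 210, 1700)  # min/max MHz, placeholder values

# ...then shift the whole curve upwards, so the capped clock runs at a lower voltage.
pynvml.nvmlDeviceSetGpcClkVfOffset(handle, 150)  # offset in MHz, placeholder value

pynvml.nvmlShutdown()
```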

On the thermal side there was also the problem of my desk suffocating the card, so I've cut a pair of holes in the desk and added two PWM-controlled fans. And I'm currently building a PCB with a Pi Pico to let the PC make them spin faster when the card is under load, through a fancontrol module. (I'll have to polish it up and share the code eventually; I bet there's another madman who'll find such a custom fan controller useful.)
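The firmware side is dead simple, something along these lines (a MicroPython sketch of the idea only; the pin number and the line-based serial protocol are assumptions, since the real board isn't finished yet):

```python
# MicroPython sketch for the Pi Pico fan controller: the PC sends a duty cycle
# (0-100, one value per line) over USB serial and the Pico drives the fans' PWM line.
# GPIO 16 and the protocol are placeholders, not the final design.
import sys
from machine import Pin, PWM

fan = PWM(Pin(16))   # PWM-capable pin wired to the fans' control line
fan.freq(25000)      # 25 kHz, the standard 4-pin PC fan PWM frequency

while True:
    line = sys.stdin.readline().strip()
    try:
        duty = max(0, min(100, int(line)))  # clamp to 0-100 %
    except ValueError:
        continue                            # ignore anything that isn't a number
    fan.duty_u16(duty * 65535 // 100)
```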

May I recommend a duct too? I have my 3090 "sealed" against the edge of the case with weather-stripping foam, and it pulls in ambient air from a different spot than where everything is exhausted.

Yeah, unfortunately that isn't really applicable to my case. Right now the card pulls fresh air from the top of the case, which is refreshed by the desk fans, and it exhausts to the back and front of the case. I can't really make a proper forced path since the panel is fragile and the back is fully open. Though I'm planning to put an extra fan in the gap between the PSU and GPU to force it to pull more air from the top while exhausting the hot air towards the desk fans. (Sorry if I made no sense; my English isn't that good unfortunately.)

Hopefully when I'm free enough I can try to get an undervolt going though.

 

Around six months ago (and luckily before the whole RAM shortage) I managed to scrounge up enough money to build this monstrosity of a machine, based on what I thought was a Lenovo ThinkCentre M700... More on that further down.

The whole mod works wonderfully, but the problem I'm facing is that the poor i5-6500 it came with just cannot keep up with what I'm doing and bottlenecks the whole machine.

Without any mods, the best CPU I can put into it is an i7-6700, which is still a 6th-gen CPU... and it's still about 70€ where I live. Meanwhile, for some reason I can find plenty of i3-9100s for 20-30€, which from what I understand are B0-stepping chips, don't require pin modding to be used, and should still be a good upgrade.

The last problem is the BIOS. The BIOS on this machine is not meant to support such a new chip, but I remember reading about people having success with a program called "coffeetime" to shoehorn in the microcode needed to use newer CPUs.

When I went to sanity-check what the machine's BIOS said, I found out it's actually an M800, not an M700. This raises a problem, since its chipset is a Q150, which has a stricter / more in-depth Intel ME that, from what I managed to find, requires some kind of bypass.

Do you think this is still feasible to do? And do you know if there is any safe source for coffeetime / some guide to do this mod by hand? Having random software whose source I can't read modify the BIOS of my machine feels a bit iffy.

[–] brokenlcd@feddit.it 4 points 1 month ago

Liking computers in general and switching to Linux at 15 out of desperation.

After that, all it took was getting a shitbox PC as a hand-me-down to make me go "Linux is also used on servers, right? Shouldn't be too difficult to set something up." And that's how I got the bug.

[–] brokenlcd@feddit.it 6 points 1 month ago

I once wired my whole-ass house for Ethernet (before realizing I was colorblind, no less) instead of studying.

Never underestimate how you can use study procrastination as a driving force for other shit. (Unless you're a dipshit like me and do it with an exam imminent.)

[–] brokenlcd@feddit.it 2 points 1 month ago

Just coming off a 2-week exam crunch. Hits way closer to home than I'm comfortable admitting.

[–] brokenlcd@feddit.it 2 points 2 months ago

Currently crunching like mad for an exam, so Justice and femtanyl to hype myself up while studying. (For me they work well since they don't have lyrics / the lyrics aren't easily comprehensible.)

And before going to sleep, either Caravan Palace or Jamie Berry to calm back down, plus a couple of songs from Mina Celentano.

[–] brokenlcd@feddit.it 13 points 4 months ago

I still remember when my cousin's cat chewed through the pellet stove's power cord... I don't think I've ever heard a sound that so perfectly matched the expression "screams from hell". Luckily the RCD tripped so the poor bastard survived. But he sure as hell isn't touching wires again.

[–] brokenlcd@feddit.it 5 points 4 months ago* (last edited 4 months ago) (1 children)

Acer TravelMate 4070, used as a control unit for a cut-and-bend machine (don't ask me why)... Holy shit, that bastard has outlived around three of the machines that started work alongside it.

What I'm saying is: Acer was good, but like all things, enshittification ruined it.

HP though... I don't think I've ever seen a good HP, even among the ancient ones. I feel like the only HP stuff that redeems itself is the calculators and some of the old test equipment.

[–] brokenlcd@feddit.it 5 points 5 months ago* (last edited 5 months ago)

The last two books I've read are my aerodynamics book for an exam, and Harry Potter and the Philosopher's Stone to help my sibling write a review for English class.

Sooo... Potter will see how his magic fares against a rocket.

[–] brokenlcd@feddit.it 15 points 5 months ago

Well, sub-zero is a range. 0 K, or absolute zero, is below 0°C, so sub-zero can be equal to absolute zero in one instance of the range.

[–] brokenlcd@feddit.it 5 points 6 months ago* (last edited 6 months ago) (1 children)

I started a game review blog here on Lemmy, but I'm having trouble finding games I want to discuss lately

Ooh, I remember you! It was fun seeing people have takes on some older games I had actually played while on the train, mainly Kona and Pacific Drive.

My 2¢ for when you manage to get out of this purgatory is Valley: parkour/puzzle based, built around exploring what happened to the place. Story-driven and pretty good from what I remember; I played it back when it came out.

The other one is Ultrakill, a frenetic-as-fuck boomer shooter. Most of the fun comes from learning tricks and acing levels and challenges, so it's not everyone's cup of tea. Still, Hakita's a musician and it shows a lot: the soundtrack is great, and the game was originally made to promote an album, IIRC. Played it before the last round of exams sucked all will from me.

Anything in particular that's also stuck on your list?

 

I've recently managed to set up a modded Skyrim install. I'd wanted to set it up and play it for months. And now... zero. Same thing happened with Bloodlines a while back.

During the day the spark of wanting to play comes, but as soon as I get home it just disappears and I end up doing other things. It feels like wanting to play the game is more appealing than actually playing it.

How do y'all manage it?

18
submitted 6 months ago* (last edited 6 months ago) by brokenlcd@feddit.it to c/selfhosted@lemmy.world
 

I'm trying to set up nginx as a reverse proxy to aggregate multiple services running on different ports on the server, using nginx to let me reach each service by going to a specific subdirectory, so I can keep only one port open in the router between my lab and the main house network.

I'm using the following config file, based on an example I found, with a landing page to let me get to the other services:

used config file


server {
    listen 80;
    server_name 10.0.0.114;  # Replace with your domain or IP

    # Redirect HTTP to HTTPS
    return 301 https://$host$request_uri;
}

server {
    listen 1403 ssl;  # Listen on port 1403 for HTTPS
    server_name 10.0.0.114;  # Replace with your domain or IP

    ssl_certificate /certs/cert.pem;  # Path to your SSL certificate
    ssl_certificate_key /certs/key.pem;  # Path to your SSL certificate key

    location / {
        root /var/www/html;  # Path to the directory containing your HTML file
        index index.html;  # Default file to serve
    }

    location /transbt {
        # Configuration for transmission
        proxy_pass http://10.89.0.3:9091/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

But the problem I'm having is that, while nginx does get me to Transmission's login prompt just fine, after logging in it tries to redirect me to 10.0.0.114:1403/transmission/web instead of staying at 10.0.0.114:1403/transbt, and breaks the page. I've found a configuration file that should work, but it manually redirects each subdirectory Transmission tries to use, and adds proxy_pass_header X-Transmission-Session-Id;, which I'm not sure what it's accomplishing: github gist

Is there a way to do it without needing to declare it explicitly for each subdirectory? Especially since I need to set up other services, and I doubt I'll find config files for those as well. It's my first time setting up nginx, and I haven't been able to find anything to make it work.

Edit: I forgot to mention: the server is still inside a NAT and isn't reachable from the outside. The SSL certificate is self-signed and is just peace of mind, because a lot of things connect to the home net and the services I plan to use only support HTTP on their own.

 

I managed to build my first proper PC in the most hacky way, and it works wonderfully (the one from the previous post).

The only problem I have left now is that during the heaviest workload I need it to run, the card reaches 77°C, and I'm not sure if it's dangerous for the card to be cycled between 77°C and 51°C while it writes to the HDD, due to thermal stress.

The problem isn't the airflow of the case, but the fact that the PC sits on an under-desk shelf: the heat is pushed backwards and out by the GPU and PSU fans, but the hot air still rises toward the top, where the card intakes air.

I'm already seeing if I can put fans in the cubby under the desk, but I'm also looking into undervolting the GPU so it heats up less, since from what I understand the performance loss is minimal up to a certain point.

The problem with that is that Nvidia doesn't expose the core voltage in the Linux drivers (...Torvalds was right on this front). I found that there is a workaround to do it with LACT, but I'm afraid it's going to mess with the card's warranty, or the card itself. What do you think?
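(For what it's worth, the least invasive thing I could script in the meantime is probably just capping the board power limit through pynvml rather than touching voltages at all. A rough sketch of that idea, with a placeholder wattage and assuming root:)

```python
# Rough sketch: cap the board power limit via NVML instead of undervolting directly.
# Needs root; the 130 W target is just an example value, not something I've validated.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# The driver reports the allowed limit range in milliwatts.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

target_mw = max(min_mw, min(max_mw, 130_000))  # clamp the example 130 W into range
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

print(f"Power limit set to {target_mw / 1000:.0f} W "
      f"(allowed range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```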

 

After finally having some free time between exams and work, and enough money to build it, I decided to assemble a decent PC, both for inference and general usage. Due to a limited budget I chose to pick up a refurbished ThinkCentre M700 and a 12GB 3060. The problem? The ThinkCentre is an SFF PC, so it would never have fit the card, plus, since it uses a proprietary PSU, I couldn't upgrade it to something that could run the card.

So that's when the quest began to see how I could ever shoehorn a card + PSU into this mess.

The first thing that arrived was the ThinkCentre, so I got to work trying to find a way to turn on both the PC's and the GPU's PSUs at the same time. I needed a power source that would come on as soon as the PC turned on, to drive a relay and thus turn on the GPU PSU.

Luckily the PC had two SATA power connectors; I opted to put an SSD on one of them, so its 12V line was free. It was a bit annoying since it used a CPU Molex, but the box of scrap parts took care of that:

I ended up putting the relay on the 12V line to turn on the other supply,

and the original connector that was in the PC on the 5V line to power the SSD.

Then it came time to fit the harness inside the PC. I managed to snake it in... even if I had to mess with zip ties since I had spliced the SSD wire the wrong way around. But in the end, the PC side came out pretty well:

Fast forward a couple of weeks (courtesy of the postal system shipping my package to the other side of the country by mistake), and I got the card, the PSU and the riser.

Since I wasn't able to find a riser that turned 90° to the right, I had to place the GPU above the PSU and make a bracket to hold it up, since the riser cable was stiff as a rock. Plus I had the idea that, after it was all buttoned up, the PSU fan would pull air through the GPU as well, somewhat aiding it.

After mocking it up with books, it didn't look too bad, so I went on with it.

So now I had to make the bracket and the holes in the top cover to let both the riser and the switched line out of the case, and figure out how to hold and protect this whole mess... so, off to the workshop I work at.

Luckily they let me in on Sundays so I could use all the tools we have in there. (The joys of working as a small artisan :-D)

I have to admit, having a card worth so much in the midst of aluminium shavings felt wrong in a way I can't explain; in a "laptop next to a pool" way.

First things first, the holes in the case: I just roughly marked where they were supposed to go and added leeway to allow the panel to slide open. The riser hole was done with an angle grinder, while the switched-line hole was done with a christmas tree (step) drill bit, out to 12mm:

Now I had to find something to cover up the sharp edges of the cut, so as not to destroy the riser cable or my fingers. Luckily we had just bought new band saw blades, and the blade protectors fit this job perfectly:

Now to the PSU and bracket for the GPU: my idea was to add two plates to anchor the GPU to the PSU, using the card's PCI mount to bolt it on, and then add some brackets to let the PSU screw in where the case screws went, locking it all in place:

It's ugly as sin, but in the end it was going to be covered up, so it didn't matter.

The card was locked in place with a nut and bolt in the hole where the screw to secure the card would normally go, and a bolt/washer/wing-nut set to hold the other side, in between the two slot "teeth" the card has.

Now I just needed something to hold up the back of the card, since holding it by the faceplate alone felt like an extremely dumb idea.

An L extrusion with some of the blade protector on top did the job; I was even able to use the PSU's fan screws to lock it in place:

Now it was mechanically sturdy; it just lacked a shell to cover it up. Among the scraps I found a sheet of something that would work. I only know it by brand name, but it's essentially a foam panel sandwiched between two aluminium plates; if you cut only one side, you can bend it and it looks pretty good. So I went with it.

I added L brackets on the PC panel with rivets to hold it steady, and made some holes in the panel to let the card exhaust both out the front and back.

(Frankly, if it wasn't for the PSU cables I would have made it out of plexiglass, since seeing the card suspended like this is beautiful.)

Now it was just time to bring it out of the workshop and button it all up:

And that's it. I'm surprised it only took around a week to build it all, excluding the exodus the GPU had to make to reach me.

After running it for the first time with my usual model (a Nemo 12B) I have to say... holy shit if there isn't a difference between running at 3 tok/s on the Deck and 30 tok/s here. I was expecting an increase, but not a 10x one. Right now I'm converting some 24B models to EXL3 3bpw to finally see how they fare.

The only problem I have left now is that during the conversion (the heaviest workload I've managed to throw at it) the card reaches 77°C, and I'm not sure if it's dangerous for the card to be cycled between 77°C and 51°C while it writes to the HDD, due to thermal stress.

The problem isn't the airflow of the case, but the fact that the PC sits on an under-desk shelf: the heat is pushed backwards and out by the GPU and PSU fans, but the hot air still rises toward the top, where the card intakes air.

I'm already seeing if I can put fans in the cubby under the desk, but I'm also looking into undervolting the GPU so it heats up less, since from what I understand the performance loss is minimal up to a certain point.

The problem with that is that Nvidia doesn't expose the core voltage in the Linux drivers (...Torvalds was right on this front). I found that there is a workaround to do it with LACT, but I'm afraid it's going to mess with the card's warranty, or the card itself. What do you think? (I'm going to post the question separately as well, so people don't have to go through a bible's worth of build montage.)

I want to thank all the peeps in the !localllama@sh.itjust.works and !pcmasterrace@lemmy.world communities for helping me understand the technicalities of this whole mess, since I've never had hardware this powerful at hand.

Especially @Smokeydope@lemmy.world and @brucethemoose@lemmy.world from the localllama community, for helping me figure out if it was even worthwhile to do this and for giving me clues for setting up an environment to run it all.

And @fuckwit_mcbumcrumble@lemmy.dbzer0.com from the pcmasterrace community, for helping me figure out airflow issues.

10
submitted 8 months ago* (last edited 8 months ago) by brokenlcd@feddit.it to c/pcmasterrace@lemmy.world
 

I'm hacking together a gaming/fluid-simulation PC from a Lenovo ThinkCentre M700 SFF. I've already got everything set up. The problem is that the only riser I managed to find has a 90° bend to the left (looking from where the bracket would be), so the only way to make it fit is to mount the 3060 fans-up, with the extra PSU powering it sitting underneath, with ~7 cm of clearance. (I'll make some stands to hold it properly.) What I'm not sure about is whether it will make a big difference that the card will push hot air downwards. The power supply fan runs continuously and draws air downwards and out the back, but I don't know if it will be enough.

What do you think, is it going to make it run a lot hotter?

 

I have an unused Dell OptiPlex 7010 I wanted to use as a base for an inference rig.

My idea was to get a 3060, a PCI riser and a 500W power supply just for the GPU. Mechanically speaking, I had the idea of making a backpack of sorts on the side panel to fit both the GPU and the extra power supply, since unfortunately it's an SFF machine.

What's making me wary of going through with it is the specs of the 7010 itself: it's a DDR3 system with a 3rd-gen i7-3770. I have the feeling that as soon as it ends up offloading some of the model into system RAM it's going to slow down to a crawl. (Using koboldcpp, if that matters.)
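For context, this is the back-of-the-envelope math I keep doing (a rough sketch with made-up example numbers, just to show the kind of reasoning, not real measurements):

```python
# Rough estimate of how much of a quantized model fits in VRAM before koboldcpp
# has to spill layers into the (slow, DDR3) system RAM. All numbers are examples.
model_size_gb = 13.0       # e.g. a mid-size model at a low-ish quant, rough guess
n_layers = 48              # layer count of the hypothetical model
vram_budget_gb = 12 - 1.5  # 3060 VRAM minus context/compute buffers (assumption)

gb_per_layer = model_size_gb / n_layers
layers_on_gpu = min(int(vram_budget_gb // gb_per_layer), n_layers)

print(f"~{gb_per_layer:.2f} GB per layer, "
      f"{layers_on_gpu}/{n_layers} layers fit on the GPU; "
      f"the rest would run from system RAM")
```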

Do you think it's even worth going through with?

Edit: I may have found a ThinkCentre that uses DDR4 and that I can buy if I manage to sell the 7010, though I still don't know if it will be good enough.

 

They recently installed a lockeritalia locker near where I live, and I was intrigued by the fact that you can also receive DHL packages through it. Which is handy, because I order components from a retailer that only uses DHL. The problem is that I can't find anywhere an explanation of how to use them, not even on the official lockeritalia website.

Does anyone have an idea of how they're used?

 

I remember seeing someone on the Steam Deck community mention a patch that reduced the size of the assets and, consequently, the size of the whole game; does it actually exist or did I just hallucinate it?

 

In October 2023 I bought a Steam Deck, but in the last few days I've noticed a spot on the screen where the pixels no longer light up. Already a month after purchase I had noticed that the left haptic feedback was practically imperceptible, but since I didn't want to go through the whole rigmarole of sending it back, I disconnected the haptics and kept using it.

Now, though, I don't have a choice; but I'd like to hear the experience of others who have sent theirs back, since I've seen that in Europe many have had problems with the return, with Decks getting stuck at the warehouse without any further information on the status of the return.

 
