ByteSorcerer

joined 2 years ago
[–] ByteSorcerer@beehaw.org 11 points 1 day ago* (last edited 1 day ago)

Blue light is visible, so emitting more or less blue light alters how colours are perceived on the display. That's also why enabling the blue light filter makes the screen look yellow.
Screens calibrated to the same colour temperature and with equal brightness should emit the same amount of blue light regardless of which display technology they use.
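To see why lowering the colour temperature looks yellow, here's a rough Python sketch using a well-known black-body curve fit (Tanner Helland's approximation). The constants are approximate and this isn't a calibration standard, just enough to show the trend:

```python
import math

def kelvin_to_rgb(temp_k: float) -> tuple[int, int, int]:
    """Approximate sRGB colour of light at temp_k kelvin.

    Based on a common curve fit (Tanner Helland's approximation);
    good enough to show the trend, not for real calibration.
    """
    t = temp_k / 100.0

    # Red channel: saturated below ~6600 K, falls off above.
    if t <= 66:
        r = 255.0
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)

    # Green channel.
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)

    # Blue channel: zero for very warm light, saturated above ~6600 K.
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda x: int(max(0.0, min(255.0, x)))
    return clamp(r), clamp(g), clamp(b)

# A "blue light filter" essentially drops the colour temperature,
# which mostly cuts the blue channel, hence the yellow tint:
for k in (6500, 4500, 3000):
    print(k, kelvin_to_rgb(k))
# 6500 (255, 254, 250)  -> roughly neutral white
# 4500 (255, 217, 187)  -> warmer, noticeably less blue
# 3000 (255, 177, 109)  -> distinctly yellow
```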

[–] ByteSorcerer@beehaw.org 3 points 4 weeks ago

The total voltage or amperage of a battery pack doesn't tell you anything about the individual cells. For the same total power, you can put more cells in series and get a higher voltage at a lower current, or more in parallel and get a higher current at a lower voltage. But the individual cells run at the same voltage in either configuration (iirc between 3 and 4V), and the current per cell will also be the same for a given load, regardless of how the pack is wired.

The main thing a higher battery pack voltage accomplishes is that the cables connected to the battery don't need to be as thick, since the required thickness of a cable depends mainly on the current, not the voltage.
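As a toy illustration (all the numbers are made-up round figures, not from any real pack), here's how the same 12 cells can be wired either way while each cell sees exactly the same current:

```python
# Toy illustration: the same 12 cells, wired three ways, delivering
# the same load power. Cell voltage and load are made-up round figures.
CELL_V = 3.7      # nominal volts per cell (typical Li-ion)
N_CELLS = 12
LOAD_W = 444.0    # some fixed load power in watts

for series, parallel in [(12, 1), (6, 2), (3, 4)]:
    assert series * parallel == N_CELLS
    pack_v = series * CELL_V          # series stacks voltage
    pack_i = LOAD_W / pack_v          # current the cables must carry
    cell_i = pack_i / parallel        # parallel strings share the current
    print(f"{series}s{parallel}p: pack {pack_v:5.1f} V, "
          f"cable current {pack_i:5.1f} A, per-cell current {cell_i:5.1f} A")

# 12s1p: pack  44.4 V, cable current  10.0 A, per-cell current  10.0 A
# 6s2p:  pack  22.2 V, cable current  20.0 A, per-cell current  10.0 A
# 3s4p:  pack  11.1 V, cable current  40.0 A, per-cell current  10.0 A
```

Same cells, same per-cell current; only the cable current changes with the wiring, which is why the higher-voltage pack gets away with thinner cables.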

[–] ByteSorcerer@beehaw.org 3 points 1 month ago (2 children)

I've heard of a teacher using that as a test to see which students are using AI: if a student turns in a report full of em-dashes, the teacher puts them in front of a laptop running Word and asks, "can you please show me how you type those long dashes you used all over your report?"

If they can't do it, their report is considered AI-generated or plagiarism (which the school treats as equivalent). If they can do it, they get the benefit of the doubt, but as of when I heard this, not a single student had passed the test yet.

It's a better and likely far more accurate test than those complete bullshit "AI detectors".

[–] ByteSorcerer@beehaw.org 1 points 1 month ago

It's not easy to scale up chip production, because it relies on extremely precise machines that take a long time to build, and on many different processing steps between raw materials and a finished chip.

If you wanted to set up a new RAM chip factory with competitive performance, it'd be an investment of over a billion USD at the bare minimum, and it'd take a few years to set up all the processes before the first chips could roll off the assembly line.

If the bubble has popped by then, your new factory would probably run at a loss, because it's nearly impossible to compete with companies that have had decades to optimise their production processes.

Even if the bubble hasn't popped by then, the next problem will likely be the wafer supply. Just as only a few companies have the infrastructure to build modern, high-performance computer chips, only a few companies have the infrastructure to produce silicon wafers of a high enough quality to build those chips from. And they have only just enough capacity to supply their current customers.

So to solve the wafer problem, someone needs to be willing to invest at least a few hundred million USD to build a new wafer factory, which would again struggle to compete once the shortage is over. And wafers are far from the only resource with that issue.

TL;DR: it'd be a huge investment and a huge gamble, and would likely just end up moving the problem anyway.

[–] ByteSorcerer@beehaw.org 7 points 1 month ago

Unfortunately that's not likely to happen, since AI is used more and more to write software, and AI doesn't tend to write very efficient software.

[–] ByteSorcerer@beehaw.org 0 points 1 month ago

Yeah, it's not a miracle drug, but it does let you diet on easy mode. What you're supposed to do while on the drug is switch to healthier food and healthier eating habits. The drug should make that easier, since it removes the cravings you'd normally get from switching to less calorie-dense meals, and once you're used to healthier meals it should be easier to keep that up after you're off the drug.

But you do have to actively build those healthier eating habits while on the drug to get that lasting effect. It offers an easier way to switch to a healthier lifestyle, but if you don't actually make that switch then the effect will indeed just wear off immediately once you stop taking the drug.

[–] ByteSorcerer@beehaw.org 0 points 1 month ago

Around here public chargers are so ridiculously overpriced that even if you use slow chargers, you end up paying more per km with a plug-in hybrid than if you only put fuel in it. And if you use a fast charger it's of course even more expensive. You only get a lower cost per km if you can charge at home.
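As a rough back-of-the-envelope sketch of how that maths works out (every price and consumption figure here is an assumption, so plug in your local numbers):

```python
# Back-of-the-envelope cost per km; every number here is an
# assumption -- substitute your local prices and consumption.
PETROL_EUR_PER_L = 1.90
HYBRID_L_PER_100KM = 5.0          # assumed consumption on petrol
EV_KWH_PER_100KM = 18.0           # assumed consumption on battery

petrol = PETROL_EUR_PER_L * HYBRID_L_PER_100KM / 100

for label, eur_per_kwh in [("home charging", 0.30),
                           ("public slow charger", 0.60),
                           ("public fast charger", 0.80)]:
    electric = eur_per_kwh * EV_KWH_PER_100KM / 100
    print(f"{label:20s}: {electric:.3f} EUR/km vs petrol {petrol:.3f} EUR/km")

# home charging       : 0.054 EUR/km vs petrol 0.095 EUR/km  -> cheaper
# public slow charger : 0.108 EUR/km vs petrol 0.095 EUR/km  -> pricier
# public fast charger : 0.144 EUR/km vs petrol 0.095 EUR/km  -> pricier
```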

I think the concept behind plug-in hybrids is great: the battery of an EV is by far the most expensive part, and also by far the most polluting part to produce. So a car that acts like an EV, with a battery just big enough for your daily commute and a back-up power system for longer trips (and to avoid range anxiety), makes a lot of sense. But unfortunately they're held back from reaching their potential by lacking charging infrastructure and high electricity costs.

[–] ByteSorcerer@beehaw.org 5 points 3 months ago* (last edited 3 months ago)

Alternative method to know things from the future: wait.

[–] ByteSorcerer@beehaw.org 3 points 3 months ago (4 children)

I've also experimented with this. In my experience, getting the NPCs to behave the way you want with just a prompt is hard and inconsistent, and quickly falls apart when the conversation gets longer.

I've gotten much better results by starting from a small model and fine-tuning it on lore-accurate conversations (you can use your conversations with larger models as training materials for that). In theory you can improve it further with RLHF, but I haven't tried that myself yet.

The downside of this is of course that you're limited to open-weight models for which you have enough compute resources available to fine-tune them. If you don't have a good GPU, the free Google Colab tier gives you access to a GPU with about 15GB of VRAM. The free tier has a daily limit on GPU time though, so set up your training code to save checkpoints regularly so you can continue training another day if you run out. Using LoRA instead of doing a full fine-tune also reduces the memory and compute required (or in other words, lets you use a larger and better model with the resources you have). A rough sketch of what that can look like is below.
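A minimal sketch, assuming the Hugging Face transformers + peft + datasets stack; the model name, the dialogue file, the target modules, and all hyperparameters are placeholders you'd tune for your own setup:

```python
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"   # placeholder small model
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token      # causal LMs often lack a pad token
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA: train small adapter matrices instead of all the weights.
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],       # attention projections (model-specific)
    task_type="CAUSAL_LM",
))

# "npc_dialogues.txt" is a placeholder: one lore-accurate conversation
# per line, e.g. exported from your sessions with a larger model.
data = load_dataset("text", data_files="npc_dialogues.txt")["train"]
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="npc-lora",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        num_train_epochs=3,
        learning_rate=2e-4,
        save_steps=200,        # frequent checkpoints, to survive Colab timeouts
        fp16=True,
    ),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # pass resume_from_checkpoint=True to continue a previous run
```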

[–] ByteSorcerer@beehaw.org 0 points 4 months ago (1 children)

We have only old buses and trams, with no information displays or announcements. We figure out where our stop is by counting stops, following along on a navigation app, or looking out of the window and trying to work out where we are if we know the area well enough. So where would we fit on this scale?

[–] ByteSorcerer@beehaw.org 6 points 4 months ago

I am typing this on a 5-year-old Android phone. It has 128GB of storage and 8GB of RAM, very decent cameras, a beautiful OLED screen, and a processor that is more than fast enough for everything I do with it. And even now the battery still lasts two days with normal use. It cost me about €300 at the time.

Unfortunately the Android version is getting so far behind that some apps are starting to have issues, so I have been checking out some Black Friday deals for new phones, but they look very disappointing.

In the current market it seems like I'd have to pay about €500 to effectively get a side-grade. All the €300 offerings look like a straight-up downgrade in every way apart from the more recent Android version.

So I think I'll hold on to this one a while longer. Hardware-wise it's still in perfect condition, and if software support really becomes an issue then perhaps I'll try out a custom ROM.

[–] ByteSorcerer@beehaw.org 5 points 4 months ago

The main reason is tech debt and proprietary software. Most companies have decades of software infrastructure built on Microsoft-based systems. Transitioning all of that to Linux is a massive investment, especially when you factor in the downtime it causes and the temporary drop in productivity while everyone gets trained and builds up experience with the new platform.

And then you have to deal with proprietary software. A lot of niche corporate or industrial hardware only supports Windows. You also probably interact regularly with customers who use Windows and share files that can only be opened in Windows-only proprietary software.

Linux also frequently struggles with driver issues and other quirks, putting an increased burden on the IT department.

Basically you're looking at a massive investment in the short term, plus significantly reduced productivity in the long run. And all that mostly to save a bit on software licensing costs, which are only a fraction of the operating costs for most companies. Just sticking with Windows ends up being the more economical choice for most of them.
