

Sharkey is a Misskey fork


You can download the Arch Wiki on Kiwix (for Android); it’s like 30 megabytes


No, I don’t think so. There is cleanup required on the rails, of course, but I think it’s used fairly regularly in some places when the tracks are wet


A lot of trams carry sand that they can drop on the rails to get more grip when they need to brake really fast. That might be what happened there


I’m someone who has gotten sick in cars before (rarely), but I have done a lot of crazy stuff in VR and never felt remotely sick. I’m an outlier though


As a native English speaker, I certainly don’t process the words of a lot of songs without a conscious effort
Lyrics are so often indecipherable as well. https://m.youtube.com/watch?v=jGLYJQJh9c8
One that I remember: I had heard the song “Believer” a number of times before learning its name, and I always thought they were saying “pick me up and pick me up and leave me, and leave me”. I don’t think I even tried to decipher the rest lol


I think the “proper” way to simplify it is “would’ve”, which is pronounced the same as ‘would of’
A lot of mistakes have become incorporated into the language in the past. Maybe ‘would of’ is just too blatantly wrong for that to ever happen, though
Maybe not really a ‘mistake’, more of a normal shortening, but my personal favorite English-ism is “bye” being descended directly from “God be with you”. People just kept collapsing it more and more over time.
Edit: also “pease” -> “peas” -> “a pea”


It says in the article:

> 15% of these subreddits contained content likely posted by bots or corporate trolls specifically designed to promote companies or organizations.

The title is just misinfo, I guess


GPU stuff and ideally easier parallelism. The same binary could be executed on a GPU from any vendor, any CPU, anything that supports OpenCL, and could maybe even be extended to support FPGAs in the future.


On the high-performance compute / GPGPU side, the AdaptiveCpp JIT compiler seems very good for cross-platform operation


Look at EV prices in China for a more accurate depiction of the battery progress that is being made
Apparently the government EV subsidy for outright purchases ended in 2022, but they’re good enough at manufacturing now that EVs are still exceptionally cheap. 70-80% of the world’s lithium-ion production also takes place in China, so it makes sense.
There are a lot of reasons I don’t like the Chinese government, but from what I’ve seen they have been doing a whole lot better than the rest of the world at investing in the future of technology. The number of top-rated CS and EE schools in China is doing a whole lot on its own.


They’re actively manufactured for consumers there, and cheap and available enough to be relatively competitive with lithium-ion


I hope ISDT releases a firmware update for the Q6 Nano if RC sodium-ion packs become available.
Although AFAIK the energy density per volume and weight isn’t quite there yet


It’s the same as about:profiles
Just an easy way to separate people’s browsing histories, cookies, bookmarks, etc., I guess. And you can have them sync independently as well, for when other people want to use the same computer


Well, I guess he has tried to make his views fairly plain on his blog. It’s just a bit hard to find unless you’re looking for it


I do want to point out how hard it is to even find out about the views of these people. If you just look up the names of the projects and aren’t specifically looking for this information, there’s no way you’ll find anything about it.
Even looking up the name of David Heinemeier Hansson, the more vocally bad of these, I had to go to the 5th link to find anything even vaguely mentioning his views


Speaking of interesting sensing capabilities, there’s also the sea turtles that can detect magnetic fields, although I don’t think people understand the actual mechanism yet


Ollama is the usual one; they have install instructions on their GitHub, I think, plus a model repository, etc.
You can run something on your CPU if you don’t care about speed, or on your GPU, although you can’t run the more capable models without a decent amount of VRAM
For models to use, I recommend checking out the Qwen-distilled versions of DeepSeek R1


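A minimal sketch of that workflow, assuming Ollama is already installed and that `deepseek-r1:7b` (one of the Qwen distills) is the published tag on the Ollama model library — check `ollama list` or the library page for the current names:

```shell
# Pull a distilled DeepSeek R1 model (the 7B variant is one of the Qwen distills).
# The exact tag is an assumption - verify it on the Ollama model library first.
ollama pull deepseek-r1:7b

# Run it interactively, or pass a prompt for a one-shot answer.
ollama run deepseek-r1:7b "Say hello in one sentence."
```

Smaller tags run fine on CPU, just slowly; the larger distills are where VRAM starts to matter.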
I made a little desktop app in Godot once for sorting through D&D monsters. I can’t really release it, though, because it requires you to have the whole official Monster Manual saved as JPEGs for it to work
I was able to get the layout pretty nice, but it still kinda breaks at some resolutions because I didn’t write any custom layout code


Idk, on Lemmy back-and-forth conversations are usually just a few replies at most, not particularly long-format, unless you count length by all of the separate comments/branches
Unlike Discord, for example