![](https://lemmy.today/pictrs/image/817e51c0-c1be-4c74-8884-fd73d6631b2b.png)
![](https://fry.gs/pictrs/image/c6832070-8625-4688-b9e5-5d519541e092.png)
Video of the accidental ascent
PuTTY
I mean, Windows can do the thin client side fine. I’d personally somewhat-prefer to use Linux for that, but that’s not really my sticking point. I’m normally keeping my software, data, stuff like that on the server, and just running two remotely-connected terminals and a web browser on my client. Virtually all the software can run on the server. My problem is Windows on the server side; like, it’s just not reasonable to use a Windows machine remotely via a command-line for anything other than some very basic administrative tasks, and using a GUI remotely once latency goes up or bandwidth down is just painful.
I’m not saying that that’s an unreasonable policy for companies to have, but I will bet that only a very small portion of individuals normally do that for personal smartphones.
and to remote in.
This is the approach I use with laptops domestically, and I think that there’s something to be said for it. Like, the laptop itself doesn’t store important information. A remote server does. The laptop is just a thin client. If the laptop gets lost or stolen – which I’ve had happen – I revoke the credentials. No important information is lost, and no important information is exposed.
Whole-disk laptop encryption has improved things too from an exposure standpoint (albeit not a loss standpoint), though I don’t use it myself (don’t want to spend any battery life on it). I assume that smartphones have some form of reasonably-secure storage hardware, but I don’t know if it involves encryption.
What I found irritating – and this is years back now – was an employer who didn’t care if I took a laptop in or out or what information I stored on it (as long as it was a work system), but who refused to provide remote access to the network, so I couldn’t just keep the important information on the work network. I mean, I get it if they want to have some sort of isolated DMZ, require an externally-accessible server to live there, and not provide VPN access to the general network, but not having the ability to have remote network access to work systems at all is just incredibly obnoxious.
I think that some of it is that Windows is not phenomenal to use remotely. Yeah, there are solutions, but they aren’t great if you’re on a high-latency, low-reliability, or low-bandwidth link. I try to use console Linux for as much of my stuff as possible. That whole ecosystem was designed around thin-client, remote use.
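For what it’s worth, a couple of client-side knobs help a lot on those links. A hypothetical `~/.ssh/config` fragment (the host name and values here are illustrative, not anyone’s actual setup):

```
# Hypothetical ~/.ssh/config fragment for a high-latency or flaky link
Host workbox
    HostName work.example.com   # illustrative hostname
    Compression yes             # trade CPU for bandwidth on slow links
    ServerAliveInterval 15      # probe the server every 15 seconds
    ServerAliveCountMax 4       # give up after about a minute of silence
```

mosh is also worth a look for really bad links, since it echoes keystrokes locally and survives roaming and dropped connections.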
https://www.verizon.com/coverage-map/
This shows that Verizon has good coverage in part of Wanaque, moderate coverage in part, and no coverage in part.
The planning board’s decision was based on health concerns due to the possible negative environmental impact of telecommunication on the residents, especially the children studying at the school who could potentially be exposed to electromagnetic radiation. The town felt the residents would be ‘unsafe’ due to radio frequencies and rejected the company’s notion of building the tower on the land.
I mean, I think that the planning board is idiotic, but I don’t see why T-Mobile cares enough to fight it. If they don’t build it, okay. It looks like the school in question is right in the middle of town. Then Wanaque is going to have crummy cell coverage. Let them have bad cell coverage and build a tower somewhere else. It’s not like this is the world’s only place that could use better cell coverage. The main people who benefit from the coverage are Wanaque residents. Sure, okay, there’s some secondary benefit to travelers, but if we get to the point that all the dead zones that travelers pass through out there are covered, then cell providers can go worry about places that are determined not to have cell coverage.
If I were a cell company, I’d just get together with the rest of the industry and start publishing a cell-coverage score for cities. Put it online in some accessible database format, so that when places like city-data.com put up data on a city, they also show that the city has poor cell coverage, and would-be residents are aware of the fact.
I don’t really use calendaring all that much, but when I do, I use org-mode agenda in emacs, which seems to do all the stuff that he’s complaining about not having. It does a lot more than I use.
That being said, I get that that’s probably not what he’s after if he’s not an emacs user.
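For anyone curious, the agenda just picks up timestamped headlines from your org files; a minimal example (dates made up):

```
* TODO Renew car registration
  DEADLINE: <2025-03-14 Fri>
* Dentist
  SCHEDULED: <2025-03-10 Mon 09:30>
```

With `org-agenda-files` pointing at that file, `M-x org-agenda` gives you the weekly view, deadline warnings, and so on.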
You can buy discs online.
I mean, you can probably create something akin to a god, but I don’t see how its being open source or not would change that.
I don’t really have a problem with this – I think that it’s rarely in a consumer’s interest to choose a locked phone. Buying a locked phone basically means that you’re getting a loan to pay for hardware that you pay back with a higher service price. But I’d point out that:
You can get unlocked phones and service now. I do. There are some privacy benefits to doing so – my cell provider doesn’t know who I am (though they could maybe infer it from usage patterns of their network and statistical analysis). It’s not a lack of unlocked service that’s at issue. To do this, Congress is basically arguing that the American consumer is just making a bad decision to purchase a plan-combined-with-a-locked-phone and forcing them not to do so.
Consumers will pay more for cell phones up front. That’s not necessarily a bad thing – it maybe makes the carrier market more competitive to not have a large portion of consumers locked to one provider. But there are also some benefits to having the carrier selecting cell phones that they offer in that the provider is probably in a better position to evaluate what phone manufacturers have on offer in terms of things like failure rates than do consumers.
I don’t see why they wouldn’t, or couldn’t, do this.
There are only 52 organizations that Firefox trusts to act as CAs. An ISP isn’t normally going to be on there.
https://wiki.mozilla.org/CA/Included_Certificates
https://ccadb.my.salesforce-sites.com/mozilla/CACertificatesInFirefoxReport
If the cert presented by a remote website isn’t signed by one of those 52 organizations, your browser is going to throw up a warning page instead of showing content. KT Corporation, the ISP in question, isn’t one of those organizations.
They can go create a CA if they want, but it doesn’t do them any good unless it’s trusted by Firefox (or whatever browser people use, but I’m using Firefox, and I expect that basically the same CAs will be trusted by any browser, so…)
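You can see the same default-deny behavior in Python’s `ssl` module, which mirrors what browsers do (a minimal sketch; browsers ship their own root store rather than using the OS’s, but the chain-of-trust check is the same idea):

```python
import ssl

# The default client context loads the platform's trusted root CAs and
# refuses connections whose cert doesn't chain up to one of them.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # chain validation is mandatory
print(ctx.check_hostname)                    # the cert must also match the hostname
```

An ISP’s home-made CA simply isn’t in that trusted set, so anything it signs fails the chain check before any page content is shown.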
If ISP routers are anything like those in the West, that means the ISP controls the DNS servers, the DNS settings on the router can’t be changed, and it likely blocks 1.1.1.1, 8.8.8.8, and so on (along with blocking secure DNS), as Virgin Media does in the UK, for example. That definitely opens up a massive attack vector: an ISP could spin up its own website with a verified cert and malware, and have DNS resolve to it when users try to access either the software needed to access this Grid System or, if it’s a web portal, the portal itself.
Browser page integrity – if you’re using https – doesn’t rely on DNS responses.
If I go to “foobar.com”, there has to be a valid cert for “foobar.com”. My ISP can’t get a valid cert for foobar.com unless it has a way to insert its own CA into my browser’s list of trusted CAs (which is what some business IT departments do so that they can snoop on traffic, but an ISP probably won’t be able to do, since they don’t have access to your computer) or has access to a trusted CA’s key, as per above.
They can make your browser go to the wrong IP address, but they can’t make that IP address present information over https that your browser believes to belong to a valid site.
I’d also add, on an unrelated note, that if the concern is bandwidth usage, which is what the article says, I don’t see why the ISP doesn’t just throttle users based entirely on bandwidth usage. Like, sure, there are BitTorrent users who use colossal amounts of bandwidth and will cause problems for pricing based on overselling bandwidth, which is the norm for consumer broadband.
But you don’t need to do some kind of expensive, risky, fragile, and probably liability-issue-inducing attack on BitTorrent if your concern is bandwidth usage. Just start throttling down bandwidth as usage rises, regardless of protocol. Nobody ever gets cut off, but if they’re using way above their share of bandwidth, they’re gonna have a slower connection. Hell, go offer to sell them a higher-bandwidth package. You don’t lose money, nobody is installing malware, you don’t have the problem come right back as soon as some new bandwidth-munching program shows up (YouTube?), etc.
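The protocol-agnostic version of that is trivial; a toy sketch of usage-based tiering (the thresholds and caps here are invented for illustration):

```python
# Toy sketch: pick a per-user rate cap from how much they've transferred
# this billing period. Tier thresholds and caps are made-up numbers.
TIERS = [
    (100, 1000),   # under 100 GB used: 1000 Mbit/s
    (500, 300),    # under 500 GB: 300 Mbit/s
    (1000, 100),   # under 1 TB: 100 Mbit/s
]
FLOOR_MBPS = 25    # heavy users still get something; nobody is cut off

def rate_cap_mbps(gb_used_this_month: float) -> int:
    for threshold_gb, cap in TIERS:
        if gb_used_this_month < threshold_gb:
            return cap
    return FLOOR_MBPS

print(rate_cap_mbps(50))    # light user gets full speed
print(rate_cap_mbps(2000))  # heavy user gets throttled, never disconnected
```

Note that this never looks at what protocol the traffic is, so it doesn’t care whether the next bandwidth hog is BitTorrent, YouTube, or something that doesn’t exist yet.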
I don’t really understand the attack vector the ISP is using, unless it’s exploiting some kind of flaw in higher-level software than BitTorrent itself.
A torrent should be identified uniquely by a hash in a magnet URL.
When a BitTorrent user obtains a hash, as long as it’s from an https webpage, the ISP shouldn’t be able to spoof the hash. You’d have to either get your own key added to a browser’s keystore or have access to one of the trusted CA’s keys for that.
Once you have the hash, you should be able to find and validate the Merkle hash tree from the DHT. Unless you’ve broken SHA and can generate collisions – which an ISP isn’t going to – you shouldn’t be able to feed a user a bogus hash tree from the DHT.
Once you have the hash tree, you shouldn’t be able to feed a user any complete chunks that are bogus unless you’ve broken the hash function in BitTorrent’s tree (which I think is also SHA). You can feed them up to one byte short of a chunk to try to sandbag a download, but once they get all the data, they should be able to reject a chunk that doesn’t hash to the expected value in the tree.
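That last check is just a hash comparison; a minimal sketch of what a client does with each completed piece (using SHA-1 here, as BitTorrent v1 does; v2 moved to SHA-256 Merkle trees):

```python
import hashlib

def piece_is_valid(piece: bytes, expected_sha1: bytes) -> bool:
    # A client discards any completed piece whose hash doesn't match the
    # value from the already-authenticated torrent metadata.
    return hashlib.sha1(piece).digest() == expected_sha1

expected = hashlib.sha1(b"legitimate data").digest()
print(piece_is_valid(b"legitimate data", expected))                # accepted
print(piece_is_valid(b"legitimate data plus malware", expected))   # rejected
```

So an ISP injecting bytes into the stream just wastes its own bandwidth: the tampered piece fails the check and gets re-fetched from another peer.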
I don’t see how you can reasonably attack the BitTorrent protocol, ISP or no, to try and inject malware. Maybe some higher level protocol or software package.
Five Guys does milkshakes with bacon. I’d think that bacon ice cream would work.
Faster isn’t always better – there’s software from the era that relied on hardware limitations to throttle itself – but I’d think that emulators probably have pretty good support for such throttling.
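DOSBox, for example, lets you pin the emulated CPU speed in its config file (the cycle count below is an arbitrary example, not a recommendation):

```
[cpu]
# "cycles=auto" guesses; a fixed cycle count emulates a machine of roughly
# that speed, which keeps speed-sensitive old software from running
# absurdly fast on modern hardware.
cycles=fixed 3000
```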
I can see people wanting to use retro software, but what surprises me is this being preferable to modern hardware running old software in emulation.
Especially a laptop, because I doubt that power management is that amazing on DOS.
Maybe there is something out there for which this addresses compatibility problems, but…
Too much Balatro.
If someone hasn’t written a software package to do so already, it’s probably possible to write one to dump and clone all the comments and posts on a server.
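A sketch of what that might look like against Lemmy’s HTTP API (the endpoint and parameter names are my guesses from the v3 API, not verified; a real tool would loop over pages, fetch each one, and store the results):

```python
# Hypothetical: build paginated URLs for dumping a community's posts via
# what I believe is Lemmy's v3 HTTP API. Instance and community names are
# illustrative.
def post_page_url(instance: str, community: str, page: int, limit: int = 50) -> str:
    return (f"https://{instance}/api/v3/post/list"
            f"?community_name={community}&page={page}&limit={limit}&sort=Old")

print(post_page_url("lemmy.today", "technology", 1))
```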
Nvidia is worth about $3 trillion at this point. France’s annual GDP is about the same.