Google is certainly planning on it being viable.
They’ve been merging RISC-V support into Android and have documented the minimum extensions over the base ISA that must be implemented for Android certification
Yeah, that’s bizarre. I’d never have guessed /home was created by tmpfiles
The RK3588 is pretty nifty, and its Mali G610 is the first Mali GPU where ARM themselves have contributed the firmware upstream and have helped Collabora with Panfrost development
Bleeding edge, still, but kernel 6.10 and Mesa 24.1 have GPU support
HDMI TX and DSI/CSI are still in-progress
I’m working off the assumption you are using one GPU for the host and one for the guest
The guest one is permanently blacklisted on the host, and you can select the passthrough settings in the GUI
If you’re dynamically detaching the GPU, my statement was incorrect
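For the permanent route, reserving the guest GPU usually comes down to binding it to vfio-pci before the host driver can claim it. A sketch of what that config might look like (the PCI IDs below are made-up examples; substitute the vendor:device pairs for your card and its audio function from `lspci -nn`):

```
# /etc/modprobe.d/vfio.conf
# Bind the guest GPU (and its HDMI audio function) to vfio-pci at boot.
# 10de:1b80 / 10de:10f0 are placeholder IDs - use your own from lspci -nn.
options vfio-pci ids=10de:1b80,10de:10f0
# Make sure vfio-pci loads before the regular GPU driver grabs the card
softdep nvidia pre: vfio-pci
```

After rebuilding the initramfs and rebooting, the card should show `Kernel driver in use: vfio-pci` in `lspci -k`.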
If your motherboard supports it, it’s really easy
Ensure IOMMU is enabled and run the little script in section 2.2 to see if you can isolate the graphics card
https://wiki.archlinux.org/title/PCI_passthrough_via_OVMF
After that, you can do everything in the virt-manager GUI
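If you don’t want to dig through the wiki first, the isolation check is essentially just walking `/sys/kernel/iommu_groups` and printing what shares a group with what. A sketch (not necessarily the wiki’s exact script) - you want the GPU in its own group, or sharing only with its own audio function:

```shell
#!/bin/bash
# List every IOMMU group and the PCI devices inside it.
# If nothing prints, IOMMU is probably not enabled in firmware/kernel cmdline.
shopt -s nullglob
for g in /sys/kernel/iommu_groups/*; do
    echo "IOMMU Group ${g##*/}:"
    for d in "$g"/devices/*; do
        # ${d##*/} strips the path, leaving the PCI address for lspci
        echo -e "\t$(lspci -nns "${d##*/}")"
    done
done
```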
If you’re relying on iMessage for privacy, ensure you and everyone you’re messaging have gone to iCloud settings and enabled “Advanced Data Protection”
I can’t say for all of them, I just knew that e.g. the Z790 chipset still ran the Ethernet PHY, audio DSP, SPI, their version of TrustZone, etc. through the chipset
https://www.funkykit.com/wp-content/uploads/2022/10/intel-z790-chipset-diagram.jpg
If you have the block diagrams for the laptop ones, I’d be curious
I enjoy that they literally did. The article says the OTA update is just to ignore a hardware sensor
Which begs the question, why was that sensor needed originally?
I haven’t looked that closely at laptop CPUs
My guess would be partially because there are fewer possible interfaces, and they’re directly connecting the CPU to a separate Ethernet/WiFi MAC, USB hub controller, and audio DSP rather than having a separate chipset arbitrating who’s talking to the CPU and doing some of those functions?
For most intents and purposes
SoC is from the embedded system development world - as more and more coprocessors were being put onto the same chip to save board space and improve power efficiency, it wasn’t “just” a CPU - it had the CPUs, GPUs, DSPs, and other coprocessors in one
x86 has moved a lot closer to this architecture over the years, but you still generally have a separate chipset controller on the motherboard the CPU interfaces with
Which is why they’re trying to make their own now
The design/manufacturing of a chip is separate from the lithography machine itself
This is the first lithography machine Russia has built. They’d be getting the 90nm ones probably from ASML
I’m so curious to see how a Qualcomm gambit plays out for Microsoft.
With the ethos at Qualcomm being to support a chip for one year, then move on, I have trouble believing they’ll update the drivers for a major Windows release
Google browbeat them for nearly 10 years, and then ended up going with the majority-Samsung-designed chip called Tensor just to compete against Apple on years of updates
The comment over on hackaday pointing to it being bricked possibly being down to font licensing is funny if true
System on a chip. Think like a Qualcomm or Samsung processor, or the new M line from Apple
I literally linked you to the Reddit comment, and pointed out that Google’s response used the same measurements as the comment
Are you an LLM?
In this case, it seems pretty likely. We know Google paid Reddit to train on their data, and the result used the exact same measurement from this comment suggesting putting Elmer’s glue in the pizza:
https://old.reddit.com/r/Pizza/comments/1a19s0/my_cheese_slides_off_the_pizza_too_easily/
And their deal with Reddit: https://www.cbsnews.com/news/google-reddit-60-million-deal-ai-training/
It’s also funny to use “giga” in the marketing when everyone else is dealing with petabytes
A “fun” one I ran into was all our tests passing on my desk, but failing in the test farm
After a month, we realized that having an HDMI cable plugged into the unit was corrupting the SD card due to a memory overwrite in the graphics stack
You have to enable developer mode and install with `--bypass-low-target-sdk-block` now. Dunno if they’ll remove that eventually
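In practice that’s a flag on `adb install` (the APK filename here is just a placeholder):

```shell
# With developer mode + USB debugging enabled on the device:
adb install --bypass-low-target-sdk-block old-app.apk
```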