cm0002@lemmy.world to Programmer Humor@programming.dev · 11 months ago
Tell me the truth ... (piefed.jeena.net)
houseofleft@slrpnk.net · 11 months ago
Wait till you hear about every ascii letter. . .
answersplease77@lemmy.world · 11 months ago
what about them?
Iron Lynx@lemmy.world · 11 months ago (edited)
ASCII was originally a 7-bit standard. If you type ASCII on an 8-bit system, the leading bit of every byte is always 0. (Edited to specify context)
At least ASCII is forward compatible with UTF-8.
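A quick Python sketch of both points above (the string literals are just illustrative examples):

```python
# Every ASCII character fits in 7 bits, so the leading (high) bit
# of its byte is always 0.
for ch in "Hello, ASCII!":
    b = ord(ch)
    assert b < 0x80           # value fits in 7 bits
    assert b & 0x80 == 0      # leading bit is 0

# Forward compatibility: pure-ASCII text produces the exact same
# bytes whether encoded as ASCII or as UTF-8.
text = "plain ascii text"
assert text.encode("ascii") == text.encode("utf-8")
```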
Jankatarch@lemmy.world · 11 months ago
Is ascii base-7 fandom’s strongest argument…
houseofleft@slrpnk.net · 11 months ago
ASCII needs seven bits, but is almost always encoded as bytes, so every ASCII letter has a throwaway bit.
FuckBigTech347@lemmygrad.ml · 11 months ago
Some old software does use 8-bit extended ASCII for special/locale-specific characters. There’s also the UTF-8 scheme, where the high bit of each byte signals whether it is part of a multi-byte sequence.
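The high-bit scheme mentioned above can be seen directly in UTF-8's byte patterns; here is a small Python check (using "é", U+00E9, as an example two-byte character):

```python
# UTF-8 byte patterns:
#   0xxxxxxx -> single-byte character (plain ASCII)
#   11xxxxxx -> leader byte of a multi-byte sequence
#   10xxxxxx -> continuation byte of a multi-byte sequence
data = "é".encode("utf-8")                    # encodes to 0xC3 0xA9
assert len(data) == 2

assert data[0] & 0b1100_0000 == 0b1100_0000   # leader byte (11xxxxxx)
assert data[1] & 0b1100_0000 == 0b1000_0000   # continuation byte (10xxxxxx)
assert all(b & 0x80 for b in data)            # high bit set: not ASCII
```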