Talk:18-bit computing
This is the talk page for discussing improvements to the 18-bit computing article. This is not a forum for general discussion of the article's subject.
This article is rated Start-class on Wikipedia's content assessment scale.
Size of char/byte in these systems (first UNIX)
18-bit is the word size. "Two bytes + 2 kiddies"[1] was probably meant just as a size comparison. But a byte could actually be 18/2 = 9 bits in this scheme, or 18/3 = 6 bits (see the packing sketch after this thread). As the PDP-7 was an 18-bit machine and the first Unix machine, I wonder what they did. I think Unix has used at least 7-bit ASCII since the C-era UNIX, but the first version was pre-C assembly. Yes, C99 requires a byte to be at least 8 bits, but were 6-bit bytes allowed at some point? comp.arch (talk) 12:00, 25 July 2014 (UTC)
- I also wonder what the "standard" way of storing letters of text was in these systems.
- I speculate that (assembly-language programs running on) 18-bit machines probably stored characters in exactly the same ways as 36-bit machines. (See the Wikipedia article "36-bit" for a surprisingly long list of ways.)
- --DavidCary (talk) 22:53, 18 June 2015 (UTC)
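- For illustration only, here is a minimal sketch (in modern C on a wider host, not period code) of the two packings discussed above: three 6-bit character codes per 18-bit word, or two 9-bit bytes per word. The helper names pack3x6 and pack2x9 and the SIXBIT = ASCII − 040 mapping are assumptions for the example, not a claim about what the PDP-7 software actually did.

```c
#include <stdint.h>
#include <stdio.h>

/* An 18-bit word held in the low bits of a 32-bit host integer. */
typedef uint32_t word18;

/* Pack three 6-bit character codes into one 18-bit word, high character
   first (a common convention on DEC 36-bit machines; assumed here). */
static word18 pack3x6(unsigned c0, unsigned c1, unsigned c2) {
    return ((c0 & 077u) << 12) | ((c1 & 077u) << 6) | (c2 & 077u);
}

/* Pack two 9-bit bytes into one 18-bit word, high byte first. */
static word18 pack2x9(unsigned b0, unsigned b1) {
    return ((b0 & 0777u) << 9) | (b1 & 0777u);
}

int main(void) {
    /* "ABC" in SIXBIT-style coding: code = ASCII - 040, so 'A' = 041. */
    word18 w6 = pack3x6('A' - 040, 'B' - 040, 'C' - 040);
    /* Two 7-bit ASCII characters, one per 9-bit byte. */
    word18 w9 = pack2x9('A', 'B');
    printf("3x6-bit: %06o  2x9-bit: %06o\n", (unsigned)w6, (unsigned)w9);
    return 0;
}
```

- With "ABC" and "AB" as inputs this prints 414243 for the three-character packing and 101102 for two ASCII characters in 9-bit bytes (both in octal, the usual notation on these machines).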