OT-ish: History of the Ctrl key . . please?

Grazie wrote on 7/16/2005, 12:06 PM
The Control - Ctrl key - tab. When & why did it first appear?

Please make this simple . .TIA . .Grazie

Comments

ScottW wrote on 7/16/2005, 1:00 PM
Back in the days of the teletype and serial communications at 300 baud or less....

The ASCII character set contains a number of printable and non-printable characters; the non-printable characters are typically known as "control characters," and they usually have special meanings depending on the context. Control characters may also act as format effectors (a tab character is an example of a control character that is also a format effector; because it's such a useful character, many keyboards assigned it its own key, and tab and CTRL/T used to have the same effect on computers before Windows came along).

Since the control characters are, by definition, non-printable, some way needed to be provided to access/create them from the keyboard, which is a print-oriented device. Hence the CTRL key - pressing the CTRL key in conjunction with one of the printable keys generates a control character rather than the usual printable character. Now, this isn't really quite what happens with something like Windows, but the application does know that what was typed wasn't a printable character.
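Scott's mechanism can be sketched in a few lines of Python (an illustration of the classic ASCII convention, not of how Windows processes keystrokes): CTRL effectively keeps only the low five bits of the key's code, folding it into the 0-31 control range.

```python
# Classic ASCII CTRL folding: keep only the low five bits of the key,
# which lands the code in the 0-31 control range.
def ctrl(key: str) -> int:
    return ord(key.upper()) & 0x1F

print(ctrl('I'))  # 9  -> HT, horizontal tab
print(ctrl('H'))  # 8  -> BS, backspace
print(ctrl('['))  # 27 -> ESC
```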

The concept of control characters appeared with the development of the telegraph - Morse code actually has some characters that could be considered "control characters" in the sense that they aren't printable.

<edit>
Thinking about it, the concept of a control character may pre-date even the telegraph.

--Scott
JohnnyRoy wrote on 7/16/2005, 1:26 PM
And what fun we had embedding them. The backspace key is Ctrl+H. Back in the early 1970s, IBM mainframes that ran the VM operating system had a command called SEND. You would use it with a userid and text to send (... you guessed it) an instant message! (and you thought instant messaging was something new - it's over 35 years old).

You would send a message like:
SEND bob How about lunch at 11:30?
And the person with the user bob would receive a message like:
MSG FROM ROFRANO: How about lunch at 11:30?
BUT, if you typed:
SEND bob ^h^h^h^h^h^h^h^hTHEBOSS: I want to see you in my office now!!!
The ^h (Ctrl+H) would become backspaces and back over the name of the real sender and the message would instead look like:

MSG FROM THEBOSS: I want to see you in my office now!!!
Then you would wait for Bob to go running down to the boss's office (as you chuckled). And you thought mainframe programmers were a dull bunch.
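For anyone curious, the prank is nothing more than literal backspace characters embedded in the message text. A crude Python simulation (the pop-on-backspace rendering is an assumption about how the receiving terminal displayed it):

```python
def render(s: str) -> str:
    """Crudely simulate a terminal where backspace erases the previous character."""
    out = []
    for ch in s:
        if ch == "\b":
            if out:
                out.pop()
        else:
            out.append(ch)
    return "".join(out)

# "ROFRANO:" is 8 characters, so 8 Ctrl+H's back over it completely.
line = "MSG FROM ROFRANO:" + "\b" * 8 + "THEBOSS: I want to see you in my office now!!!"
print(render(line))  # MSG FROM THEBOSS: I want to see you in my office now!!!
```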

I used to program on a DecWriter, which was like a teletypewriter. It predated the CRT screen, and control keys were the only way to correct mistakes. (which is why I remember the backspace so well) ;-)

~jr
JJKizak wrote on 7/16/2005, 3:45 PM
The teletype machines were a real wonderment. Everybody wondered how that all mechanical thing could work so fast and not fly apart. A real testament to mechanical technology. The MK-1A all mechanical US Navy computer was another wonderment. I know because I had to work on it and the teletype machines. What makes me wonder even further is that who in the world today could even conceive of something like those machines let alone make them work.

JJK
ScottW wrote on 7/16/2005, 3:55 PM
ALT is similar to CTRL in that it gives you access to a different space. With 7-bit ASCII you have 128 possible codes; control characters were contained in the values 0-31 (decimal), with the printable characters consuming the rest of the available space (with the exception of the delete character, which is a special case).

Along comes 8-bit ASCII; the ALT key gives you a way to access the 8th bit. Holding down the ALT key basically turns on bit 8, then pressing a key on the keyboard fills in the lower 7 bits; combining the ALT key with the keys on the keyboard gives you complete access to the 8-bit character space.
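As a sketch (real keyboards send scan codes, so this shows the effect on the character space rather than the actual hardware path):

```python
# ALT as described above: turn on bit 8 (0x80) above the 7-bit code.
def alt(key: str) -> int:
    return ord(key) | 0x80

print(hex(alt('A')))  # 0xc1, a code in the upper half of the 8-bit space
```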

--Scott

Coursedesign wrote on 7/16/2005, 4:19 PM
I used the world's first computer forum, somewhat like this one, in the 1970s, long before there was an Internet.

It was called KOM and was developed at the QZ Computer Center in Stockholm, Sweden.

Everybody who used it thought it felt completely natural immediately, and it became a monster hit.

You could access it while at the center, or if you were terminally hip (no pun intended) you had your own terminal at home, and connected via a blindingly fast acoustically coupled 110 bps modem. After another half decade or so you needed sunglasses, because of the new blazing 1200/75 modems that downloaded at 1200bps (120 characters per second!) and uploaded at 75 bps (keyboard input of course).

Ctrl-I was tab, and Ctrl-T was DC4 (Device Control 4), btw.

I remember on the DEC-10 mainframes of that era, after a terminal user pressed a key, it took the computer 1 millisecond to process it to figure out which key was pressed...

I used mostly IBM mainframes initially though, with 1MB of RAM (the biggest one even had 2 MB, ooooh!) for primary memory.

No stinkin' silicon chips here, this was handknitted core memory. Not a joke at all, little old women knitted thin copper wires through ferrite core beads into a woven pattern on a frame. Eight million beads to get a megabyte....

I had to pay to use these suckers (I developed electronic circuits, speaker enclosures and more), and mistakes were expensive. When starting your own program from a terminal, it was important to set a time limit in case you had written sloppy code that hung in an infinite loop. I usually set this to 15 seconds for small programs that I expected to run in maybe 5 seconds. Then if I got the message "Time Limit Exceeded" I knew I was only hit for 15 bucks. Better than running unprotected and getting a mortgage-sized bill for a program that didn't even complete...

I don't remember when the ALT key first appeared; I wonder if it was to replace the use of the ESC key as a mode changer? Many terminal control functions needed more than the few control codes available in the ASCII alphabet, so they used ESC sequences such as ESC-A (ESC followed by A) to signify one function, etc.

ALT-A etc. is mode-less, i.e. it's just a single code, not a sequence. This necessitates going beyond the original 128 ASCII character codes. They were maxed out at 128 because that used up 7 bits, and the last bit in an 8-bit byte (they don't have to be 8 bits!) was reserved for parity, because all long connections were so unreliable.
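That parity use of the eighth bit is easy to illustrate; with even parity the top bit is set whenever the seven data bits contain an odd number of 1s:

```python
# Even parity: set the eighth bit so the byte holds an even number of 1 bits.
def with_even_parity(code7: int) -> int:
    ones = bin(code7 & 0x7F).count("1")
    return (code7 & 0x7F) | (0x80 if ones % 2 else 0)

print(bin(with_even_parity(ord('A'))))  # 0b1000001: two 1s already, parity bit stays clear
print(bin(with_even_parity(ord('C'))))  # 0b11000011: three data 1s, parity bit set
```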

One slightly unusual computer I used was the DEC PDP-15, an 18-bit "mini" (a mini computer was one that filled up only a smaller room...).

Why 18 bits? Because then you could store three six-bit characters in one 18-bit word. Remember, memory was a really scarce resource in those days, so it made sense for some people at the time to live with six bits per character, which was enough if you only used uppercase anyway... (six bits = 64 codes, only enough for upper case + control codes).
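The packing is plain bit shifting; a sketch (using the low six bits of each code as a stand-in for a real six-bit character set):

```python
# Pack three 6-bit character codes into one 18-bit word, PDP-15 style.
def pack(a: int, b: int, c: int) -> int:
    assert all(0 <= x < 64 for x in (a, b, c))
    return (a << 12) | (b << 6) | c

def unpack(word: int):
    return (word >> 12) & 0x3F, (word >> 6) & 0x3F, word & 0x3F

w = pack(1, 2, 3)
print(w, unpack(w))  # 4227 (1, 2, 3)
```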
ScottW wrote on 7/16/2005, 5:30 PM
Thanks for catching my CTRL-T mistake - it didn't feel right when I wrote it, and now I know why. It's been way too long...

ESC is a defined control character; it didn't give access to the different cell spaces within the 7 or 8 bits like the CTRL or ALT keys do. ESC was/is typically used to create, well, meta commands. For example, if you want to change character sets, you prefix the data stream with the appropriate ESC sequence (ESC followed by a variable number of additional characters) that specifies the character sets you are switching to. These types of sequences were defined by standards bodies such as ISO or CCITT (who also defined the character sets). IIRC, ASCII is really ISO 646 (or 646 is ASCII, or 7-bit ASCII).

Many character cell terminals of the time used ESC sequences to control cursor movement as well as other functions, since with the advent of the 24x80 display (or sometimes 25x80) you could address any of the 1920 available character cells using an X Y coordinate system (or even relatively with some terminals).
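Those cursor-addressing sequences survive today as ANSI/VT100 escape sequences, e.g. ESC [ row ; col H to move the cursor (this shows the ANSI form; not every terminal of that era used it):

```python
# ANSI/VT100 cursor positioning: ESC [ <row> ; <col> H
def goto(row: int, col: int) -> str:
    return f"\x1b[{row};{col}H"

print(repr(goto(10, 40)))  # '\x1b[10;40H'
```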

Honeywell also made some slightly unusual computers with a 36 bit machine word. Great for doing BCD calculations.

--Scott
Grazie wrote on 7/16/2005, 8:59 PM
Thank you all very much - INDEED! :) G
p@mast3rs wrote on 7/16/2005, 9:11 PM
Why do we have a Scroll Lock key? Also the Insert key. What benefits do these provide, or what did they once provide?
ScottW wrote on 7/16/2005, 9:40 PM
Display terminals used to scroll (in Windows: go to Start, choose Run, type CMD, then type DIR; the CMD window is a character cell interface and represents the type of terminal that people used to deal with). Scroll Lock would send a CTRL/S (aka XOFF) to stop data transmission from the host, thus pausing the display of information. CTRL/Q (XON) would resume data transmission.
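Those two codes are ordinary ASCII control characters (DC3 and DC1), and they fall straight out of the CTRL-key folding described earlier in the thread:

```python
XOFF = 0x13  # Ctrl+S, DC3: ask the sender to pause transmission
XON  = 0x11  # Ctrl+Q, DC1: resume transmission

# They are exactly what CTRL folding (low five bits) produces:
assert ord('S') & 0x1F == XOFF
assert ord('Q') & 0x1F == XON
print(XOFF, XON)  # 19 17
```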

Insert key - In text editors, there's a difference between insert and overstrike (this is a difference that MS Word and probably other editors still honor). The Insert key would toggle between these two modes. Insert mode inserts text at the cursor position; overstrike mode overwrites text. Insert might also be used to insert text that had been cut and copied to the clipboard (a Windows term), aka the paste buffer.

Really, many of the keys on keyboards (that you may not recognize) are a result of developments from Digital Equipment (DEC) - some of the keys are very specific to the days of the 24x80 character cell terminal and full screen editing (which is not the same as WYSIWYG editing), many of which have been re-mapped to the Windows environment that most folks are familiar with today.

--Scott
MH_Stevens wrote on 7/16/2005, 11:45 PM
Grazie: All this talk of teletype reminds me of:

AR SNA L 1 S CUN T HOR PE 2 ..............

..................

W EST BRO MW HICH 3 LI VERP O O L 0

Do you remember this?

Grazie wrote on 7/16/2005, 11:50 PM
Unfortunately yes . .. .

That bobbing up and down of the printer head. Well, in its day it was "Of-The-Now" . . . Even now we have a fake dibbily dibbily sound to mimic the read/type-outs - we are just totally weird, us humans.

Grazie
Coursedesign wrote on 7/16/2005, 11:58 PM
One early text editor I used on an HP 1000F minicomputer functioned a bit differently from today's slick offerings.

You wrote a source program and hit CTRL-S to save. The 5MB disk drive (full 19" rack width, 3RU high) whirred, and the terminal showed "103,15".

Quick, where did I put that pad and pencil? Here, lemme write this down. ENCL SRC 103,15.

This meant that next time I wanted to access the source file for the ENCL program, I would look up its location in the directory (my note pad next to the terminal), and then type in EDIT 103,15, which would then pull the source file up on the screen from track 103, sector 15. Do a bit more work and save again. "87,4" was the response, so I erased the 103,15 next to ENCL SRC and penciled in 87,4. Exactly the same functionality as in a modern file system, except I was the one doing the work...

To compile the program, the first step was to enter "PTP 87,4". This read the text starting at 87,4 and spewed out a paper tape across the room. This completed, I ran FTN PTP to get the Fortran compiler to suck in the paper tape again and have it spew object code onto paper tape output across the room, at which point it was time to get the ballpoint pen out and mark which tape was source and which was object. Then a mere linking, and a final output tape was produced that could be executed on demand...

Later when Unix became available, I felt like I was spoiled rotten. A file system? You mean no pad and pencil? This is too good!

After tiring of the early full screen editor "vi" (which probably stood for Visual Interface), I started using emacs, which I think inspired the much later, and rather nice for its time, program Wordstar. Anyway, in emacs, if you saved a document before you had given it a name, it didn't just put up a "pop up" to ask you what name you wanted to save it under. It just gladly saved it under the filename "gazonk.tmp", which was somewhat effective in teaching you not to be so stooopid again... :O)

Trivia question for oldtimers and noobs both: What does UNIX stand for?

Should be easy for this highly enlightened crowd!

Added: the "management" answer that the name was a pun on Multics (an earlier fairly advanced interactive OS) doesn't apply here.
apit34356 wrote on 7/17/2005, 12:55 AM
Pmasters, about the scroll keys: during the early 1970s, the very high-end IBM 370 models offered a visual editor that would store a number of display pages internally; the scroll keys permitted accessing the terminal memory without mainframe interaction, though the special terminals were directly wired into a high-speed I/O server that was inside the "computer room." In 1974, TI offered the first true visual editor for a minicomputer (though Xerox gets all the publicity because of Apple), which had 180-kilobyte bandwidth and was directly wired into the mini; today's PC video cards mimic this design. Though nice and massive looking, the IBM terminal was a poorly designed visual editor; it was far more productive to offload the programs from the 370s to the TI mini, edit, then upload (of course, we are talking about a high-speed IBM I/O port connection, not 110 or 300 baud).
johnmeyer wrote on 7/17/2005, 6:41 AM
And then there is the question as to why IBM decided, when they introduced the AT computer back in 1984, to move the Ctrl key from its position to the left of the "A" to some undisclosed location. Ever since, I've had to re-map every keyboard on every computer I've ever owned to get the Ctrl key back where my fingers can find it -- especially since I still use the Wordstar keyboard layout for all my word processor needs (I have MS Word macros I developed a decade ago that let you control Word with Wordstar keystrokes).

riredale wrote on 7/17/2005, 9:54 AM
I just eat up this trivia.

My first real job was as a tape drive cleaner/punch card sorter at a local Burroughs factory in Pasadena, California, in 1966. Do you remember when only a few big companies ruled the computer world? Besides IBM, the big gorilla, there was the "BUNCH": Burroughs, Univac, NCR, Control Data, and Honeywell.

The Burroughs people were extremely proud of their OS which ran on the big Burroughs 5400 (5500?) mainframe. It was properly called the MCP (Master Control Program) and the big mainframe was intentionally starkly different from the big IBM iron of the day. While the IBM boxes had hundreds of indicators and switches (you still catch glimpses on some old cheezy sci-fi sets) the Burroughs operator's console consisted of just an elegantly-shaped swoopy desk with a teletype unit and a control panel built into the desk that had just two inch-square push buttons, labelled "Run" and "Stop."

Lots more memories from those days. Maybe there's a forum somewhere that is ideal for reminiscing..
Coursedesign wrote on 7/17/2005, 2:24 PM
While the IBM boxes had hundreds of indicators and switches (you still catch glimpses on some old cheezy sci-fi sets)

Most of them really did. I remember working for the Phone Company main data center. I worked in the IBM section, with a 370/158 accessing a dozen tape stations with vacuum columns, and feeding eight IBM 1403 line printers continuously printing phone bills 24 hours/day, 7 days/week.

You may think there must have been a LOT of maintenance on these printers, since they were all running 24/7 all year.

Sure. Once a month, a guy came from IBM to vacuum the paper dust out of them. But wait, there's more. Half of the printers had been running since the inception of the data center seven years earlier, so their covers got a new paint job at that time.

Many people thought this was the most reliable printer ever made. The characters to be printed were on a rotating chain, and each character position on the line was struck when the right character passed by on the chain. The designers said it was equivalent to sitting on a train going 60 mph and picking daisies one by one through every opening of a picket fence...

Contrast this with the ultramodern dual Univac 1107 at the far end of the same hall, which was used for 411 number information. They had two mainframes side-by-side, with one on standby for use when the first one failed, which was about every day. Huge lamp-and-switch panels, but I got curious when I saw a Polaroid camera hanging from a screw attached to the front panel. Huh? The operator said this was to capture the machine status when it croaked... He would jump to his feet and run over, grabbing the camera and hopefully getting a quick snapshot before the lights went out...
JJKizak wrote on 7/17/2005, 4:19 PM
You can bet they will not make anything that reliable again. No money in it for the bean counters to count.

JJK
Jay Gladwell wrote on 7/17/2005, 5:55 PM

Ah! Back in my day, we didn't have to mess with any of this "key" stuff. All you needed was a hammer, chisel and a slab of stone that wasn't prone to cracking. Although correcting misspellings was a bear!


apit34356 wrote on 7/17/2005, 6:03 PM
The high-end models of 360s and 370s were water-cooled machines, and some of the older 360s used iron cores for memory - talk about read/write cycle times. One interesting thing in the computer hardware upgrade market during the 1970s was when rare metal dealers discovered that the IBM ICs had gold connections and gold solder points.
Coursedesign wrote on 7/17/2005, 7:39 PM
the high end models of 360s and 370s were water cooled machines
That was the reason we had a fountain outside... Many wondered...

I don't remember the ICs having gold connections, are you talking about circuit board connectors? There were a lot of those...

I remember much later, we bought a DEC VAX 11/785 for $750,000. After putting it to good use for 3 years, we needed a faster machine (a DEC 8650 was the ticket then), so we asked for highest bidders on our like-new $750,000 computer.

Highest bid?

$500.00

Next highest bid?

-$500.00

Huh? That's right, us paying to have it removed and recycled.

Most DEC computers had their gold recycled and the rest turned into roof tiles for McDonald's. No kidding.

In the meantime, Ken Olsen, then head of Digital Equipment, was asking rhetorically, "Why would anybody want to use a computer at home?"

I visited DEC HQ during their golden era, and saw a very very modest pace until suddenly, in the middle of our Friday afternoon meeting, the fire alarm went off. All employees dropped what they were working on and headed down the corridor at high speed. I was swept along by the sudden tide of employee energy, and everybody gathered in what I figured must be the designated gathering point for emergencies.

Turned out I was almost right. The "fire alarm" at 3pm on Friday indicated that it was time to consume company ice cream from huge barrels, with topping dispensers that shamed any ice cream parlor I have ever seen.

After that, we waddled back to continue our meeting...
apit34356 wrote on 7/17/2005, 9:04 PM
Coursedesign, high-speed IC chips manufactured by IBM and TI used gold wire to connect the silicon chip to the external pins, before being encased in plastic or ceramic packaging.

The DEC line was very popular; do you remember PRIME computers? During the '70s I had the pleasure of working with Gene Amdahl, bringing out the Amdahl 460 line. I helped rewrite the microcode for the instruction pipeline of the 460 before joining the TI 16-bit microprocessor project in 1974, helping design the hardware and firmware multiply/divide instruction set. After a month, TI moved me to their supercomputer project, which really pissed off my friends at IBM(????). TI's target was Control Data and Cray, not IBM, but IBM was really pissed off about TI entering supercomputing and stopped buying ICs from them. Since IBM had really given TI its big break in the IC business (i.e. the Microsoft story), it was sad. IBM was able to slow TI's market acceptance, something IBM did not do with MS.
Coursedesign wrote on 7/17/2005, 10:13 PM
I remember PRIME computers, and somewhere in storage I have a poster with all the minicomputer manufacturers of the era, a really really unusual piece. :O)

Amdahl was a genius also, must have been a lot of fun to work with him.

I can't recall what TI was doing in minicomputers. I do remember their 9900 microprocessor, a very elegant architecture with even memory-to-memory instructions.

Gold wires inside the chip packaging, of course, but how were they able to recycle so little gold cost-effectively from a nasty ceramic or plastic encasing?
apit34356 wrote on 7/18/2005, 1:56 AM
Amdahl was a very hard worker and a good promoter, but very weak in the boardroom. I liked Gene a lot, though IBM believed that he should have stayed in-house; of course they wanted another 5-10 years out of the 360/370 line. Gene's project forced IBM to intro the 370 sooner than planned. One interesting dinner I remember was where Gene was able to convince F.of.Japan to use a new fab process, really radical in the '70s in IC design but really cost-effective in high-speed ICs, though the two were considered to be at opposite ends in supercomputing design. I wish I had a film unit there to capture the expressions of the F.of.Japan people when they realized what this new design approach meant: they were now in the high-end computer market and a new leader in high-speed ICs.

The 9900 design permitted a large number of micro-CPUs to share memory/program space. TI believed that memory size and speed would increase "ln", but the speed of dynamic memory (because of cycle/refresh issues) never approached basic CPU cycle speed (of course, static memory can). This design issue, plus IBM pushing Intel in an "in your face" move and TI's limited resources..., resulted in TI selling the computer division to HP during the late '80s. We built a large number of military systems using the 9900 design.

About the gold: gold will not chemically bond under extremely high temperatures; all you do is crush the IC, chemically dissolve or just melt the mix, and the gold separates out. The purest gold available was used in the IBM ICs and circuit boards. If you are familiar with isotopes and elements, certain isotopes of gold have unique properties.