Early computers lacked any form of operating system. The user had sole use of the machine and would arrive armed with program and data, often on punched paper tape. The program would be loaded into the machine, and the machine set to work until the program completed or, perhaps more likely, crashed. Programs could generally be debugged via a front panel using switches and lights; it is said that Alan Turing was a master of this on the early Manchester Mark I machine. Turing drew the seminal concepts of operating systems from the concept of the Universal Turing Machine.
Later, machines came with libraries of support code that were linked to the user's program to assist in operations such as input and output. This was the genesis of the modern-day operating system. However, machines still ran a single job at a time; at Cambridge University in England, the job queue was at one time a washing line from which tapes were hung with clothes pegs, the color of a peg indicating the priority of the job.
As machines became more powerful, the time needed to run a program diminished, and the time to hand the equipment off to the next user became very large by comparison. Accounting for and paying for machine usage moved from checking the wall clock to having the computer itself do the timing. Run queues evolved from people waiting at the door, to stacks of media waiting on a table, to the machine's own hardware, such as switching which magnetic tape drives were online or stacking punch cards on top of the previous job's cards in the reader. Operating the computer went from a task performed by the program developer to a job for full-time, dedicated machine operators. When commercially available computer centers found they had to deal with accidental or malicious tampering with the accounting information, equipment vendors were encouraged to enhance their runtime libraries to prevent misuse of the system's resources. Accounting