Mainframes, Computing on Big Iron (in English)

Format: Physical book
Language: English
Number of pages: 178
Binding: Paperback
Dimensions: 22.9 x 15.2 x 1.0 cm
Weight: 0.27 kg
ISBN13: 9781520216454

Patrick Stakem (Author) · Independently Published · Paperback

New Book

Price: $ 26.070 (list price: $ 36.210)

You save: $ 10.140 (28% discount)
  • Condition: New
  • 100+ units remaining
Origin: United States (import costs included in the price)
Ships from our warehouse between Friday, July 26 and Friday, August 02.
You will receive it anywhere in Chile within 1 to 3 business days after shipment.

Review of the book "Mainframes, Computing on Big Iron (in English)"

This book covers the topic of mainframe computers, Big Iron, the room-sized units that dominated and defined computing in the 1950s and 1960s. The coverage is mainly of efforts in the United States, although significant efforts in the U.K., Germany, and elsewhere were also involved. Coverage is given to IBM and the seven dwarfs: Burroughs, Control Data, General Electric, Honeywell, NCR, RCA, and Univac. There is also coverage of machines from Bendix, DEC, Philco, Sperry-Rand, and Sylvania. The predecessor architectures of Charles Babbage and his Difference Engine and Analytical Engine are discussed, as well as the mostly one-off predecessors Colossus, Eniac, Edvac, Whirlwind, ASCC, Sage, and Illiac-IV.

How did we get where we are? Initially computers were big, unique, heavy mainframes with a dedicated priesthood of programmers and system engineers to keep them running. They were enshrined in specially air-conditioned rooms with raised floors and access control. They ran one job at a time, taking punched cards as input and producing reams of wide green-striped paper output. Data were collected on reels of magnetic tape or large trays of punched cards. Access to these very expensive resources was necessarily limited. Computing was hard, but the advantages were obvious: we could collect and crunch data like never before, and compute things that would have worn out our slide rules.

The book is focused mostly on computers that the author had experience with, although it does cover some of the one-off predecessors that led to the mainframe industry of the 1960s. Thus, this book is not comprehensive. It probably missed your favorite. Not every machine from every manufacturer is discussed.

Computers were built for one of two purposes: business accounting or scientific calculation. There was also research in the fledgling area of Computer Science, an area not yet well defined. The computers used peripherals from Unit Record equipment, designed for business data processing. Data were typed on cards, sorted, and printed mechanically. This was a major improvement over the manual method. Herman Hollerith figured this out and improved the processing of the U.S. Census of 1890. This took 1 year, as opposed to 8 years for the previous census. Hollerith set up a company based in Georgetown (part of the District of Columbia) on 29th Street to manufacture punched-card equipment. There is a plaque on the building which housed the Tabulating Machine Company, later known as IBM. At the same time, business and science were both using mechanical calculators to handle computations. These were little changed over a hundred years or so. The technology base changed from mechanical to relay to tube, and things got faster. The arithmetic system changed from decimal to binary, because the switching elements in electronics were two-state. The next step was to put a "computer" between the card reader and the printer, and actually crunch the data.

Then, a better idea evolved. Most of the time, the "big iron" was not computing, it was waiting. So, if we could devise a way to profitably use the idle time, we would increase the efficiency of the facility. This led to the concept of time-sharing. There was a control program whose job it was to juggle the resources so that a useful program was always running. This came along about the time that remote terminals were hooked to the mainframe, to allow access from multiple, different locations. In a sense, the computer facility was virtualized; each user saw his very own machine (well, for limited periods of time, anyway). If the overhead of switching among users was not too great, the scheme worked. This evolved into a "client-server" architecture, in which the remote clients had some compute and storage capability of their own, but still relied on the big server.

And, in the background, something else amazing was happening. Mainframes were built from relays and vacuum tubes, magnetic co

Frequently asked questions about the book

All books in our catalog are originals.
The book is written in English.
The binding of this edition is paperback.
