
Overview of Computer Organization Concepts

Nov 21, 2024

Lecture Notes on Structured Computer Organization

Introduction to Structured Computer Organization

  • Definition: Structured Computer Organization is a way of organizing computers as a hierarchy of levels, each performing specific functions.
  • Virtual Machines: Hypothetical machines, one per level, each defined by the language it executes; higher levels hide the details of the levels below.
  • Languages and Levels: Computer languages are designed at multiple levels to bridge the gap between human needs and machine capabilities.

Milestones in Computer Architecture

  • Historical Development:
    • Mechanical Computers (1642-1945): Started with Blaise Pascal's calculating machine.
    • Vacuum Tubes (1945-1955): First generation with machines like ENIAC.
    • Transistors (1955-1965): Second generation marked by smaller and more reliable machines.
    • Integrated Circuits (1965-1980): Third generation with IBM's System/360.
    • Very Large Scale Integration (VLSI) (1980-present): Current generation with personal computers and smartphones.

Computer Systems

  • Processors:
    • Organization: Includes control unit, arithmetic logic unit (ALU), and registers.
    • Instruction Execution: Follows a fetch-decode-execute cycle (a sketch follows this list).
    • RISC vs. CISC: RISC (Reduced Instruction Set Computer) uses a small set of simple instructions that execute quickly, while CISC (Complex Instruction Set Computer) supports a larger set of more complex instructions.
  • Memory:
    • Hierarchy: Registers, cache, main memory, secondary memory (disk storage).
    • Cache Memory: Speeds up access by keeping frequently accessed data close to the processor (a small cache simulation follows this list).
  • I/O Systems: Include devices like keyboards, mice, and printers, each with specific control and data pathways.
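
Below is a minimal sketch of the fetch-decode-execute cycle for a hypothetical accumulator machine; the opcodes (LOAD, ADD, STORE, HALT) and the two-field instruction format are invented for illustration and do not correspond to any real instruction set.

```python
# Hypothetical opcodes for a toy accumulator machine (illustrative only).
LOAD, ADD, STORE, HALT = 0, 1, 2, 3

def run(program, memory):
    pc = 0      # program counter
    acc = 0     # accumulator register
    while True:
        opcode, operand = program[pc]   # fetch the next instruction
        pc += 1
        if opcode == LOAD:              # decode, then execute
            acc = memory[operand]
        elif opcode == ADD:
            acc += memory[operand]
        elif opcode == STORE:
            memory[operand] = acc
        elif opcode == HALT:
            return memory

# Example: memory[2] = memory[0] + memory[1]
print(run([(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)], [5, 7, 0]))  # [5, 7, 12]
```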
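
A toy direct-mapped cache illustrates the memory-hierarchy idea: repeated accesses to the same addresses are served quickly from the cache instead of main memory. The line count, the tag/index split, and the memory contents are arbitrary choices made for the example.

```python
# Toy direct-mapped cache in front of a "main memory" (a plain Python list).
class DirectMappedCache:
    def __init__(self, memory, num_lines=4):
        self.memory = memory
        self.num_lines = num_lines
        self.lines = {}                 # line index -> (tag, cached value)
        self.hits = self.misses = 0

    def read(self, address):
        index = address % self.num_lines    # which cache line the address maps to
        tag = address // self.num_lines     # identifies which block occupies that line
        if index in self.lines and self.lines[index][0] == tag:
            self.hits += 1                  # fast path: served from the cache
            return self.lines[index][1]
        self.misses += 1                    # slow path: fetch from main memory
        value = self.memory[address]
        self.lines[index] = (tag, value)    # fill the line for future accesses
        return value

memory = list(range(100, 116))
cache = DirectMappedCache(memory)
for addr in [0, 1, 0, 1, 4, 0]:
    cache.read(addr)
print(cache.hits, cache.misses)             # 2 hits, 4 misses
```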

The Digital Logic Level

  • Gates and Boolean Algebra:
    • Fundamental Circuits: Built from basic gates (AND, OR, NOT, NAND, NOR) and analyzed using Boolean algebra.
    • Implementation: Complex circuits can be constructed from simple gates using the principles of Boolean algebra (see the full-adder sketch after this list).
  • Basic Digital Logic Circuits:
    • Combinational Circuits: Output is determined solely by the current inputs; no state is stored.
    • Arithmetic Circuits: Include adders and arithmetic logic units (ALUs).
  • Memory Organization:
    • Construction of memory cells, registers, and arrays.
    • Flip-Flops: Basic memory elements for storing a single bit of binary data (a flip-flop sketch also follows this list).
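
As a sketch of building complex circuits from simple gates, the following code expresses a 1-bit full adder purely in terms of AND/OR/XOR functions and chains copies of it into a ripple-carry adder. The gate functions and the LSB-first bit encoding are illustrative assumptions, not a gate-level netlist.

```python
# Basic gates modeled as functions on 0/1 integers.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three 1-bit inputs; return (sum_bit, carry_out)."""
    partial = XOR(a, b)
    sum_bit = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return sum_bit, carry_out

def ripple_carry_add(a_bits, b_bits):
    """Add two equal-length bit lists, least significant bit first."""
    carry, result = 0, []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 3 (011) + 5 (101), written LSB-first, gives 8 (1000).
print(ripple_carry_add([1, 1, 0], [1, 0, 1]))   # [0, 0, 0, 1]
```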
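
The flip-flop idea can be sketched as a small state machine: the stored bit changes only on a rising clock edge and is held otherwise. This models the behaviour of an edge-triggered D flip-flop rather than any particular gate arrangement.

```python
class DFlipFlop:
    """Behavioural model of an edge-triggered D flip-flop (illustrative)."""

    def __init__(self):
        self.q = 0            # stored bit (the flip-flop's output)
        self.prev_clock = 0   # last clock level seen

    def tick(self, d, clock):
        # Capture the data input only on a rising clock edge (0 -> 1).
        if clock == 1 and self.prev_clock == 0:
            self.q = d
        self.prev_clock = clock
        return self.q

ff = DFlipFlop()
print(ff.tick(d=1, clock=0))   # 0: clock low, input ignored
print(ff.tick(d=1, clock=1))   # 1: rising edge, bit stored
print(ff.tick(d=0, clock=1))   # 1: no edge, stored value held
```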

Parallel Computer Architectures

  • On-Chip Parallelism: Use of multiple processing units on a single chip.
  • Shared-Memory Multiprocessors: Multiple CPUs accessing a common memory space (see the sketch at the end of this section).
  • Grid Computing: Distributed computing across a network of computers.
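
A minimal sketch of the shared-memory idea using Python threads: several workers update a counter that lives in one common address space, with a lock standing in for the synchronization that real shared-memory multiprocessors need. The thread count and iteration count are arbitrary.

```python
import threading

counter = 0                     # shared state visible to every thread
lock = threading.Lock()

def worker(increments):
    global counter
    for _ in range(increments):
        with lock:              # serialize updates to the shared variable
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)                  # 40000
```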