Evolution of Programming Languages and Data Representation and Storage

Unit I: Introduction to Computing

Programming in C

Understanding how languages evolve and how computers store and represent different types of data

Evolution of Programming Languages

Programming came into existence to enable humans to instruct machines to perform tasks efficiently and accurately. Over time, programming languages and practices have evolved to meet the growing complexity of computing tasks and the increasing demands of software development.
šŸ­
Essential Skill
in Every Industry
⚔
Driving Innovation
& Automation
šŸš€
Technological
Advancement

What is Code?

In computing, "code" refers to a set of instructions written in a programming language that tells a computer what to do. It consists of statements, expressions, and other constructs that are used to create programs.

Early History of Programming

Mid-19th Century: The Foundation

Charles Babbage conceptualized the idea of a programmable mechanical computer, known as the Analytical Engine. Although Babbage's design was never completed, it laid the foundation for modern computing.

[Image: Babbage's Analytical Engine design]

[Image: Punched cards used for programming the Analytical Engine]

Ada Lovelace: First Programmer

Ada Lovelace, often considered the world's first computer programmer, wrote an algorithm for the Analytical Engine in the mid-19th century. Her work on the engine's programming is considered pioneering in the field of computing.

[Image: Ada Lovelace (1815-1852), mathematician and writer]

Mid-20th Century: Electronic Era

The development of electronic computers led to the creation of early programming languages such as assembly language and machine code. These languages directly corresponded to the instructions executed by the computer's hardware.

"Programming is a way to communicate with computers, defining a set of instructions that are compiled together for the CPU to complete specific tasks."

Machine Language & Communication

Machine Code

A computer is a machine that understands only machine language: binary code made up of 0s and 1s.

10110000 01100001
11000000 10101010
01010101 11110000

In the real world we communicate using different languages. Similarly, each programming language has its own collection of keywords and syntax for constructing instructions.

Why So Many Languages?

  • Writing machine (binary) code is difficult for humans
  • Programming languages provide convenient, human-readable way
  • Different languages serve different purposes
  • Evolution driven by specific needs and improvements

Compilation Process

Programming languages rely on a compiler or an interpreter to translate high-level code into the low-level language that machines can understand.
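As a minimal illustration, the classic first C program below is pure high-level code; a compiler such as GCC translates it into machine instructions before it can run (the file name hello.c and the gcc command are just one common setup):

    /* hello.c -- one high-level statement becomes many machine instructions.
     * Compile and run (assuming GCC): gcc hello.c -o hello && ./hello
     */
    #include <stdio.h>

    int main(void)
    {
        printf("Hello, world!\n");
        return 0;
    }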

Five Generations of Programming Languages

First Generation

Machine Languages

  • Simplest type of computer language
  • Uses binary code (0s and 1s)
  • Interfaces directly with hardware
  • Machine-specific applications
  • Only executes on original hardware

Second Generation

Assembly Languages

  • Human-readable mnemonics
  • Easier than binary code
  • Requires assembler for conversion
  • Used for operating systems
  • Device drivers development

Third Generation

Procedural Languages

  • High-level programming languages
  • Human language-like syntax
  • Examples: C, C++, Java, FORTRAN, PASCAL
  • Requires compiler or interpreter
  • More user-friendly

Advanced Generations (4GL & 5GL)

Fourth Generation Languages

Focus on WHAT to do

  • Everyday human language syntax
  • Focus on tasks, not implementation
  • Database handling
  • Report generation
  • GUI development
Examples: SQL, Python, Perl, Ruby, MATLAB

Fifth Generation Languages

Visual Programming & AI

  • Latest stage in programming evolution
  • Visual programming tools
  • Constraint-based logic
  • Define goals, system generates code
  • Artificial Intelligence focus
Examples: Prolog, OPS5, Mercury

Classification Summary

Low-Level Languages: First two generations (1GL, 2GL)
High-Level Languages: Next three generations (3GL, 4GL, 5GL)

Programming Language Levels

Programming languages are divided into three categories based on speed, ease of use, memory usage, and level of abstraction.

  • Low-Level: Machine language (no abstraction)
  • Mid-Level: Assembly language (less abstraction)
  • High-Level: Human-like languages (higher abstraction)

High-Level Languages

A high-level language is a computer language that users can understand. It is very similar to human languages and has a set of grammar rules that make instructions easier to read and write.

Abstraction Concept

Machine language provides no abstraction, assembly language provides less abstraction, and high-level languages provide higher abstraction by hiding internal implementation details.

Programming Paradigms

A programming paradigm is an approach to solving problems using a programming language. Paradigms differ in readability, complexity, customization, and optimization.

Imperative Paradigm (Focus on HOW)

The oldest and most basic programming approach. Code describes a step-by-step process for program execution. Developers focus on how to get an answer step by step.

Types: Procedural and Object-Oriented Programming
Examples: Java, C, Python, Ruby
  • Generally readable, though programs can grow complex
  • Easy to customize
  • Focus on step-by-step instructions

[Image: Classification of programming paradigms (source: Medium)]
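As a rough sketch in C, the fragment below shows the imperative style: it spells out how to compute a sum, one step at a time (the array and variable names are illustrative):

    #include <stdio.h>

    int main(void)
    {
        int nums[] = {2, 4, 6, 8};
        int sum = 0;

        /* Imperative style: we state HOW to get the result, step by step */
        for (int i = 0; i < 4; i++) {
            sum += nums[i];
        }

        printf("sum = %d\n", sum);  /* prints: sum = 20 */
        return 0;
    }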

Declarative Paradigm (Focus on WHAT)

Writing declarative code forces you to ask first what you want out of your program rather than how to achieve it.

Types: Logic and Functional Programming
Examples: SQL, Haskell, Prolog
  • Better optimization
  • Focus on desired outcome
  • Less concerned with implementation

Multi-Paradigm Support

Over the years, some imperative languages have received updates allowing them to support declarative-style programming. Examples: Java, Python

Future of Programming

Paradigm Shifts Throughout History

The evolution of programming has been marked by major transitions:

  • Procedural Programming → Object-Oriented Programming (OOP)
  • OOP → Functional Programming
  • Sequential → Concurrent Programming
  • Traditional → Reactive and Event-Driven Programming

šŸ¤– Artificial Intelligence & Machine Learning

AI and ML technologies are reshaping industries. Growing demand for skills in natural language processing, computer vision, and deep learning.

šŸ› ļø Low-Code/No-Code Development

Democratizing software development, enabling non-technical individuals to create applications. Traditional programming skills remain valuable alongside these platforms.

"The future of programming is characterized by innovation, adaptation, and continuous learning. Developers who embrace emerging technologies and cultivate diverse skill sets will thrive in this dynamic landscape."

The Journey Continues

From Charles Babbage's mechanical computer to today's AI-powered systems, programming has evolved from simple machine instructions to sophisticated problem-solving tools that drive innovation across every industry.

5 Generations

From Machine Code to AI-Driven Languages

Multiple Paradigms

Imperative, Declarative, and Multi-Paradigm

Continuous Evolution

Adapting to New Computing Challenges

Key Takeaways

  • Programming languages evolved to make human-computer interaction easier
  • Each generation brought higher levels of abstraction
  • Modern paradigms focus on both "what" and "how" to solve problems
  • Future trends point toward AI integration and democratized development
  • Continuous learning and adaptation are essential for developers
"Programming: Transforming Human Ideas into Digital Reality"

Part II: Data Representation and Storage

Understanding how computers store and represent different types of data

What We'll Cover

Binary Number System - The language of computers
Data Types - Different kinds of data computers handle
Memory Organization - How data is stored in computer memory
Character Encoding - How text is represented
Number Systems - Binary, Decimal, Hexadecimal conversions

Why Do Computers Use Binary?

  • Electronic switches: ON = 1, OFF = 0
  • Reliability: two states are easy to tell apart
  • Digital logic: a natural fit for Boolean operations

Everything in a computer is ultimately represented as 0s and 1s

Binary Number System

Decimal 13 → Binary 1101

How it works:

1101 = 8Ɨ1 + 4Ɨ1 + 2Ɨ0 + 1Ɨ1 = 8 + 4 + 0 + 1 = 13

Fundamental Data Types in C

char

Size: 1 byte (8 bits)

Range: -128 to 127

Use: Single characters

int

Size: 4 bytes (32 bits)

Range: -2³¹ to 2³¹-1

Use: Whole numbers

float

Size: 4 bytes (32 bits)

Precision: ~7 decimal digits

Use: Decimal numbers

double

Size: 8 bytes (64 bits)

Precision: ~15 decimal digits

Use: High-precision decimals
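The sizes above are typical rather than guaranteed; the C standard fixes only minimums (an int may be 2 bytes on some platforms). A quick sizeof check, sketched below, reports the sizes on your own machine:

    #include <stdio.h>

    int main(void)
    {
        /* sizeof reports the storage each type actually uses on this system */
        printf("char:   %zu byte(s)\n", sizeof(char));    /* always 1 */
        printf("int:    %zu byte(s)\n", sizeof(int));     /* typically 4 */
        printf("float:  %zu byte(s)\n", sizeof(float));   /* typically 4 */
        printf("double: %zu byte(s)\n", sizeof(double));  /* typically 8 */
        return 0;
    }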

Memory Organization

Each memory location has an address

Addresses: 1000  1001  1002  1003  1004  1005  1006  1007

Byte

8 bits
Smallest addressable unit

Word

Usually 4 bytes
Size of int on most systems

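To make addressing concrete, the short sketch below prints the address of each byte of a small array; the actual addresses will differ from the illustrative 1000-1007 above:

    #include <stdio.h>

    int main(void)
    {
        char data[4] = {'a', 'b', 'c', 'd'};

        /* Each byte of the array lives at its own unique address,
         * and consecutive bytes have consecutive addresses. */
        for (int i = 0; i < 4; i++) {
            printf("data[%d] is at address %p\n", i, (void *)&data[i]);
        }
        return 0;
    }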

How Integers are Stored

Positive: 42

0 0 1 0 1 0 1 0

Negative: -42

1 1 0 1 0 1 1 0

Two's Complement: Method used to represent negative numbers
Sign Bit: Most significant bit indicates positive (0) or negative (1)
Range: With n bits, we can represent -2^(n-1) to 2^(n-1)-1
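A small sketch (assuming 8-bit bytes, as on virtually all modern systems) prints these two bit patterns directly:

    #include <stdio.h>

    /* Print the 8 bits of a byte, most significant bit first */
    static void print_bits(unsigned char byte)
    {
        for (int i = 7; i >= 0; i--) {
            printf("%d", (byte >> i) & 1);
        }
        printf("\n");
    }

    int main(void)
    {
        signed char pos = 42;
        signed char neg = -42;

        print_bits((unsigned char)pos);  /* 00101010 */
        print_bits((unsigned char)neg);  /* 11010110 -- two's complement */
        return 0;
    }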

Floating Point Representation

IEEE 754 Standard (32-bit float)

S | EEEEEEEE | MMMMMMMMMMMMMMMMMMMMMMM

Sign (1 bit): 0 = positive, 1 = negative
Exponent (8 bits): power of 2, biased by 127
Mantissa (23 bits): fractional part (significant digits)
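One common C idiom for peeking at these fields is type punning through a union; the sketch below assumes float and unsigned int are both 32 bits, which holds on virtually all modern platforms:

    #include <stdio.h>

    int main(void)
    {
        /* Assumption: float and unsigned int are both 32 bits wide */
        union { float f; unsigned int u; } pun;
        pun.f = 3.14f;

        unsigned int sign     = pun.u >> 31;           /* top bit */
        unsigned int exponent = (pun.u >> 23) & 0xFF;  /* next 8 bits */
        unsigned int mantissa = pun.u & 0x7FFFFF;      /* low 23 bits */

        printf("sign = %u, exponent = %u (unbiased %d), mantissa = 0x%06X\n",
               sign, exponent, (int)exponent - 127, mantissa);
        return 0;
    }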

Character Encoding

Character 'A' → ASCII Code 65 → Binary 01000001

ASCII: American Standard Code for Information Interchange (0-127)
Extended ASCII: 8-bit encoding (0-255)
Unicode: Universal character encoding (supports all languages)
UTF-8: Variable-length encoding for Unicode
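In C, a char is just a small integer, so the mapping from character to numeric code can be printed directly (a minimal sketch):

    #include <stdio.h>

    int main(void)
    {
        char c = 'A';

        /* The same byte viewed as a character, a decimal code, and hex */
        printf("character: %c\n", c);    /* A    */
        printf("decimal:   %d\n", c);    /* 65   */
        printf("hex:       0x%X\n", c);  /* 0x41 */
        return 0;
    }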

Number System Conversions

Decimal to Binary

Method: Divide by 2, collect remainders

Example:
13 Ć· 2 = 6 R 1
 6 Ć· 2 = 3 R 0
 3 Ć· 2 = 1 R 1
 1 Ć· 2 = 0 R 1

Result: 1101 (read the remainders from bottom to top)

Binary to Decimal

Method: Sum powers of 2

Example: 1101
1Ɨ2³ + 1Ɨ2² + 0Ɨ2¹ + 1Ɨ2⁰
8 + 4 + 0 + 1

Result: 13
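The divide-by-2 method above translates directly into a short C routine; the sketch below handles non-negative values only:

    #include <stdio.h>

    /* Print the binary digits of n using the divide-by-2 method.
     * Remainders come out least-significant first, so we recurse
     * to print them in the correct (reversed) order. */
    static void print_binary(unsigned int n)
    {
        if (n > 1) {
            print_binary(n / 2);
        }
        printf("%u", n % 2);
    }

    int main(void)
    {
        print_binary(13);  /* prints 1101 */
        printf("\n");
        return 0;
    }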

Hexadecimal

Base 16: 0-9, A-F

Example: 13₁₀ = D₁₆

Binary: 1101ā‚‚ = D₁₆

Use: Compact representation

Octal

Base 8: 0-7

Example: 13₁₀ = 15ā‚ˆ

Binary: 1101ā‚‚ = 15ā‚ˆ

Use: Historical importance
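printf can render the same value in all three bases, which makes a handy sanity check for conversions like those above:

    #include <stdio.h>

    int main(void)
    {
        int n = 13;

        /* The same value in decimal, hexadecimal, and octal */
        printf("decimal: %d\n", n);  /* 13 */
        printf("hex:     %X\n", n);  /* D  */
        printf("octal:   %o\n", n);  /* 15 */
        return 0;
    }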

Practical Storage Examples

Integer: 42

Memory: 4 bytes

Binary (low byte): 00101010

Hex: 0x0000002A

Character: 'C'

ASCII: 67

Memory: 1 byte

Binary: 01000011

Hex: 0x43

String: "Hi"

H: ASCII 72 (0x48)

i: ASCII 105 (0x69)

Memory: 3 bytes (including the terminating \0)

Storage: [72][105][0]

Float: 3.14

IEEE 754:

Sign: 0 (positive)

Exponent: Biased

Mantissa: Fraction bits

Memory: 4 bytes
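The sketch below (with illustrative variable names) verifies several of these layouts at run time:

    #include <stdio.h>

    int main(void)
    {
        int  n = 42;
        char c = 'C';
        char s[] = "Hi";

        printf("n = %d, hex 0x%08X, %zu bytes\n", n, n, sizeof(n));
        printf("c = %c, ASCII %d, hex 0x%X\n", c, c, c);

        /* sizeof(s) counts the terminating '\0', so "Hi" needs 3 bytes */
        printf("\"%s\" occupies %zu bytes: [%d][%d][%d]\n",
               s, sizeof(s), s[0], s[1], s[2]);
        return 0;
    }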

Key Takeaways

Binary Foundation: All computer data is stored as binary (0s and 1s)
Data Types: Different types require different amounts of memory and storage formats
Memory Addressing: Each byte in memory has a unique address
Character Encoding: Text is stored using encoding schemes like ASCII and Unicode
Number Systems: Understanding binary, decimal, and hexadecimal conversions is essential
Next: We'll explore how to use these concepts in C programming!