1. Introduction to System Programming

System programming involves building software that manages and supports the operation of a computing system. It is a core area of computer science concerned with writing and maintaining system software. This note introduces the fundamental concepts of system programming, focusing on processes, threads, multithreading, and the related ideas of concurrency and parallelism.

System Programming Definition

System programming refers to developing software that provides and manages an operating system's functionality. It involves creating programs that interact directly with the operating system, providing services to other software and controlling hardware.

Process in System Programming

A process can be viewed as a program in a state of execution. Processes are managed by the operating system and can be observed in a system's task manager. They are the dynamic entities of a computer system, consuming resources such as CPU time and memory to perform specific tasks.

Code Example:

Consider the following C snippet, which creates a child process with the fork() system call.

#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    pid_t pid;

    /* fork a child process */
    pid = fork();

    if (pid < 0) { /* error occurred */
        fprintf(stderr, "Fork Failed\n");
        return 1;
    }
    else if (pid == 0) { /* child process: fork() returns 0 here */
        printf("I am the child %d\n", pid);
    }
    else { /* parent process: fork() returns the child's PID here */
        printf("I am the parent of child %d\n", pid);
        wait(NULL); /* wait for the child to finish */
    }
    return 0;
}

Threads in System Programming

A thread is the smallest unit of execution within a process. It is a flow of execution through the process code with its own program counter, registers, and stack.

Multithreading

Multithreading refers to the ability of a processor to execute multiple threads concurrently. It is a widespread programming and execution model that allows multiple threads to exist within the context of a single process, sharing the process resources but able to execute independently.

Concurrency and Parallelism

Concurrency is the ability to make progress on multiple instruction sequences in overlapping time periods. Two or more tasks can start, run, and complete in overlapping time periods without necessarily ever running at the same instant; multitasking on a single-core machine is a typical example.

// Concurrency in C with POSIX threads (compile with -pthread)
#include <stdio.h>
#include <pthread.h>

void *printMessage(void *message) {
   printf("%s\n", (const char *)message);
   pthread_exit(NULL);
}

int main(void) {
   pthread_t thread1, thread2;

   const char *message1 = "Thread 1";
   const char *message2 = "Thread 2";

   /* start both threads; they run concurrently with main */
   pthread_create(&thread1, NULL, printMessage, (void *)message1);
   pthread_create(&thread2, NULL, printMessage, (void *)message2);

   /* wait for both threads to finish */
   pthread_join(thread1, NULL);
   pthread_join(thread2, NULL);

   return 0;
}

Parallelism is the simultaneous execution of multiple computations. It occurs when tasks literally run at the same instant, for example on a multicore processor.

// Parallelism in C with OpenMP (compile with -fopenmp)
#include <stdio.h>
#include <omp.h>

int main(void) {
    /* run the enclosed block on two threads in parallel */
    #pragma omp parallel num_threads(2)
    {
        int id = omp_get_thread_num();
        printf("Hello from thread %d\n", id);
    }
    return 0;
}

System programming is a complex yet fascinating field, involving a deep understanding of both software and hardware interactions. This note only scratches the surface, and there are many more concepts like scheduling, memory management, file systems, etc., to explore.

Reference

The content in this document is based on the original notes provided in Azerbaijani. For further details, you can refer to the original document using the following link:

Original Note - Azerbaijani Version