Section 1



Last updated

6 years ago

Date created

Mar 14, 2020

Cards (27)

Section 1

(27 cards)

What are the two main mechanisms used to protect critical sections?

Front

Mutex locks and semaphores. Note that mutex locks and semaphores by themselves are not efficient unless they block while waiting for access, as opposed to "busy waiting." The operating system provides the mechanism for this service, but it is the developer's job to apply it.
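A minimal sketch of a mutex protecting a critical section, using Python's `threading.Lock` (which blocks rather than busy-waits); the counter and thread counts are illustrative, not from the deck:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:          # entry section: acquire the mutex (blocks, no busy wait)
            counter += 1    # critical section: exactly one thread at a time
        # exit section: the context manager releases the lock

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(counter)  # 40000 — without the lock, concurrent updates could be lost
```

The `with lock:` block is the critical section; the OS suspends a thread that cannot acquire the lock instead of letting it spin.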

Back

What is the classic simplified deadlock scenario?

Front

(look at pic)

Back

What is a physical address?

Front

The actual address in memory used to fetch instructions or data. The act of translating a logical address into a physical address is called address binding.

Back

What is contiguous memory allocation?

Front

Processes are stored in single, unbroken chunks. When processes terminate, holes are formed that can be reused for future processes. Over time, memory can become filled with many small holes, each too small for a new process to fit contiguously (this is called external fragmentation). Compaction can be used to defragment memory.
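A toy model of contiguous allocation with a first-fit policy (one common placement strategy; the hole list and request sizes are made up for illustration):

```python
# Free memory modeled as a list of (start, size) holes.
holes = [(0, 100), (150, 50), (250, 300)]   # hypothetical free regions

def first_fit(holes, request):
    """Allocate `request` units from the first hole big enough (first-fit)."""
    for i, (start, size) in enumerate(holes):
        if size >= request:
            if size == request:
                del holes[i]                              # hole consumed exactly
            else:
                holes[i] = (start + request, size - request)
            return start
    return None   # no single hole fits: external fragmentation

addr1 = first_fit(holes, 80)   # carved from the first hole
print(addr1, holes)            # 0 [(80, 20), (150, 50), (250, 300)]
addr2 = first_fit(holes, 60)   # skips the two holes that are too small
print(addr2, holes)            # 250 [(80, 20), (150, 50), (310, 240)]
```

Note how repeated allocations shrink the holes; a request of 70 would now fail even though 310 total units remain free, which is exactly the external-fragmentation problem compaction addresses.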

Back

Name the two main types of IPC and describe at least one consideration in their implementation

Front

1. Shared memory - how to manage synchronization between two or more processes. 2. Message passing - how to establish the link, how many links to allow, what the capacity of the link is, how to bound the size of the data passed between processes, and whether the link is unidirectional or bidirectional.
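A minimal message-passing sketch: two threads communicating over a bounded link, modeled here with Python's `queue.Queue` (the capacity, messages, and end-of-stream sentinel are illustrative assumptions):

```python
import threading, queue

link = queue.Queue(maxsize=2)   # bounded link: send blocks when it is full
received = []

def producer():
    for msg in ["hello", "world", None]:   # None signals end of stream
        link.put(msg)                      # send (blocks while the link is full)

def consumer():
    while (msg := link.get()) is not None: # receive (blocks while the link is empty)
        received.append(msg)

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads: t.start()
for t in threads: t.join()
print(received)  # ['hello', 'world']
```

The `maxsize` parameter is the "capacity of the link" consideration from the card; a capacity of zero would make every send rendezvous with a receive.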

Back

What is paging?

Front

Paging solves the problem of external fragmentation. Memory is divided into frames of a fixed size (512 bytes to 1 GB); processes are divided into pages of the same size. The logical address is <page #, offset>. Pages can be scattered throughout memory. Internal fragmentation is the trade-off.
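The <page #, offset> translation can be sketched in a few lines; the page size and page-to-frame mapping below are made-up examples:

```python
PAGE_SIZE = 512                     # assumed page/frame size in bytes
page_table = {0: 7, 1: 3, 2: 9}     # hypothetical page -> frame mapping

def translate(logical_addr):
    page, offset = divmod(logical_addr, PAGE_SIZE)   # logical address = <page #, offset>
    frame = page_table[page]                         # look up the frame
    return frame * PAGE_SIZE + offset                # physical address

print(translate(0))     # 3584  (page 0 -> frame 7, offset 0)
print(translate(520))   # 1544  (page 1 -> frame 3, offset 8)
```

Because the offset is carried over unchanged, only the page number needs translating, which is what makes scattered frames workable.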

Back

What is caching and why is it used?

Front

Caching is when the OS moves data from a larger/slower memory location to a smaller/faster memory location. This is done so that data which the OS anticipates being necessary soon is available for quick access when it is needed. Operating systems use a variety of algorithms to anticipate which data will be necessary soon
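One of the algorithms alluded to is least-recently-used (LRU) eviction; a minimal sketch using `collections.OrderedDict` (the capacity and keys are illustrative):

```python
from collections import OrderedDict

class LRUCache:
    """Tiny least-recently-used cache: one of many policies an OS might use."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None                   # miss: caller must go to slower storage
        self.data.move_to_end(key)        # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False) # evict the least recently used entry

cache = LRUCache(2)
cache.put("a", 1); cache.put("b", 2)
cache.get("a")          # "a" is now most recently used
cache.put("c", 3)       # evicts "b", the least recently used
print(cache.get("b"))   # None (miss)
print(cache.get("a"))   # 1   (hit)
```

LRU approximates "anticipating which data will be necessary soon" by assuming recently used data will be used again.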

Back

Why are threads an important part of an operating system?

Front

They allow a single process to support concurrency or parallelism. For example, a word processor can support one thread that receives keyboard/mouse input from a user, and can also run a spell checker or autosave in the "background"

Back

What is the difference between concurrency and parallelism?

Front

Concurrency: one processor supports more than one task by allowing all the tasks to make progress. Parallelism: multiple processors perform more than one task simultaneously. The multiple processors can also support concurrency.

Back

What is the "tricky" part of the "SJRF" scheduling algorithm? How do you solve it?

Front

Determining how long each process will need in its next CPU burst. It is typically solved by predicting the next burst length, e.g., with an exponential average of the measured lengths of previous bursts.
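The standard exponential-average predictor can be written in a few lines; the smoothing factor, initial guess, and burst history below are example values:

```python
def predict_next_burst(history, alpha=0.5, tau0=10.0):
    """Exponential average: tau_{n+1} = alpha * t_n + (1 - alpha) * tau_n."""
    tau = tau0                          # initial guess for the very first burst
    for t in history:                   # fold in each measured burst length
        tau = alpha * t + (1 - alpha) * tau
    return tau

print(predict_next_burst([6, 4, 6, 4]))  # 5.0
```

Larger `alpha` weights recent bursts more heavily; `alpha = 0` ignores measurements entirely and `alpha = 1` trusts only the last burst.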

Back

Address Binding: Compile time, Load time, Execution time

Front

Compile Time: The compiler decides what absolute address will be used. The compiler must know where the process will reside in memory Load Time: Compiler generates relocatable code. Final binding is done when the process is loaded. Execution Time: Process can be moved in memory during execution. Binding must be delayed until run time

Back

What are the 5 Process States?

Front

1. New 2. Executing 3. Waiting 4. Ready 5. Terminated

Back

Name and describe the 6 scheduling algorithms from the text

Front

1. First Come First Serve (FCFS) - a process/thread is allowed to run until its CPU burst is complete, regardless of the time. 2. Shortest Job First (SJF) - as processes enter the ready queue, the shortest job gets priority. 2a. Shortest Job Remaining First (SJRF) - use preemption to swap out the running process when another has less time remaining. 3. Priority Scheduling - order (and preempt) based on user-given priority (starvation is an issue). 4. Round-Robin - use time quanta on a not-to-exceed basis, then order FCFS. 5. Multilevel Queue Scheduling - use multiple queues for different classes of processes, sorted based on characteristics of their CPU bursts. 6. Multilevel Feedback Queue - same as above, but allow processes to move between queues.
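The FCFS-versus-SJF trade-off shows up in average waiting time; a sketch using the common textbook burst lengths 24, 3, 3 (assumed to all arrive at time 0):

```python
bursts = [24, 3, 3]   # CPU bursts in arrival order: P1, P2, P3

def avg_wait(order):
    """Average waiting time when bursts run back-to-back in `order`."""
    elapsed, total = 0, 0
    for b in order:
        total += elapsed   # each process waits for everything scheduled before it
        elapsed += b
    return total / len(order)

print(avg_wait(bursts))           # FCFS: (0 + 24 + 27) / 3 = 17.0
print(avg_wait(sorted(bursts)))   # SJF:  (0 + 3 + 6) / 3   = 3.0
```

Running the short jobs first drops the average wait from 17 to 3 time units, which is why SJF is provably optimal on this metric when burst lengths are known.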

Back

What are the 5 Operating System responsibilities in connection with Process Management

Front

1. Creating and deleting processes 2. Suspending and resuming processes 3. Providing a mechanism for process synchronization 4. Providing a mechanism for process communication 5. Providing a mechanism for deadlock handling

Back

What three requirements must be met in order to "solve" the Critical-Section Problem?

Front

1. Mutual Exclusion 2. Progress (no deadlock) 3. Bounded Waiting (no starvation)

Back

What is the translation look-aside buffer (TLB)?

Front

It resides in cache. When the CPU looks for a page to frame translation, it checks the TLB first. If the page is not in the TLB, it goes to the page table in memory. The page number is then stored in the TLB along with the corresponding frame number. Access time is almost cut in half with a high TLB hit rate.
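The "almost cut in half" claim can be checked with the usual effective-access-time formula; the TLB and memory latencies below are assumed example numbers:

```python
MEM = 100   # ns per main-memory access (assumed)
TLB = 10    # ns per TLB lookup (assumed)

def effective_access_time(hit_ratio):
    hit = TLB + MEM        # TLB hit: one memory access for the data
    miss = TLB + 2 * MEM   # TLB miss: page-table access, then the data access
    return hit_ratio * hit + (1 - hit_ratio) * miss

print(effective_access_time(0.80))   # 130.0 ns
print(effective_access_time(0.99))   # ~111 ns, approaching a single access
```

With no TLB every access would cost two memory references (210 ns here), so a high hit rate does roughly halve the effective access time.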

Back

What is a thread?

Front

An individual task that receives time from the CPU to execute its instructions sequentially. A thread shares time with other threads within the same process as well as threads in other processes

Back

What are the 9 Operating System Services?

Front

1. User Interface 2. Program Execution 3. I/O Operations 4. File-System Manipulation 5. Communication 6. Error Detection 7. Resource Allocation 8. Accounting 9. Protection and Security

Back

The typical OS has two Process Schedulers. Name them and describe their role.

Front

1. Short-term Scheduler - responsible for selecting the next process to be executed and allocating CPU resources 2. Long-term Scheduler - responsible for selecting which processes go on the ready queue.

Back

What is aging and why/when is it used?

Front

Gradually increasing the priority of a process. It is used when a lower priority process is continually preempted by higher priority processes and is at risk of being starved out.
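A minimal aging sketch, assuming the common convention that a lower number means higher priority (process names, priorities, and the step size are illustrative):

```python
waiting = {"P1": 10, "P2": 3}   # hypothetical process -> priority value

def age(waiting, step=1, floor=0):
    """Each tick, raise the priority of every still-waiting process."""
    for name in waiting:
        waiting[name] = max(floor, waiting[name] - step)

for _ in range(5):   # five scheduling ticks without being dispatched
    age(waiting)
print(waiting)  # {'P1': 5, 'P2': 0} — both creep toward the front of the queue
```

Given enough ticks, any waiting process reaches the top priority, so it cannot be starved indefinitely.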

Back

Sockets are a common and convenient form of IPC (even between remote systems). Name and describe three types of sockets

Front

1. Connection-oriented - use the Transmission Control Protocol (TCP) and have a client-server relationship. 2. Connectionless - use the User Datagram Protocol (UDP) and pass data peer to peer. 3. Multicast - use UDP and pass data one to many.
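A minimal connection-oriented (TCP) client-server pair over the loopback interface; the echo-and-uppercase behavior is just an example payload:

```python
import socket, threading

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # TCP socket
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def serve():
    conn, _ = server.accept()   # server side of the client-server relationship
    with conn:
        conn.sendall(conn.recv(1024).upper())   # echo the request, uppercased

t = threading.Thread(target=serve)
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))   # the connection setup UDP would skip
client.sendall(b"hello")
reply = client.recv(1024)
client.close(); t.join(); server.close()
print(reply)  # b'HELLO'
```

A connectionless (UDP) version would use `SOCK_DGRAM` and `sendto`/`recvfrom` with no `connect`/`accept` handshake.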

Back

A process in memory has 5 distinct parts or sections. Name and describe them

Front

1. Text section - executable code 2. Program counter - instruction currently being executed 3. Data section - global variables 4. Stack - local variables and function parameters 5. Heap - dynamically allocated memory (new, malloc, calloc)

Back

What are the three main purposes of an Operating System?

Front

1. Provide an environment in which a user can execute programs conveniently and efficiently 2. Allocate resources fairly and efficiently 3. Act as a control program to a) supervise the execution of programs to protect the system and b) to manage and control the operation of I/O systems

Back

What role does hardware play in helping to solve the Critical-Section problem?

Front

Hardware provides instructions that allow software to either test and modify the contents of a word, or to swap the contents of two words, atomically.

Back

Give a generic example of a system call for 1) Process control, 2) File Management, and 3) Communication

Front

1. CreateProcess(), fork(), wait() 2. Open(), Close(), Read(), Write() 3. Send(), Receive()

Back

How do multiprocessor systems achieve load balancing?

Front

Push migration: a specific task periodically checks the load on each processor and moves processes from one to another. Pull migration: an idle processor finds a waiting task on a busy processor and begins executing it.

Back

What is a logical address?

Front

(Also called virtual address) The address that the CPU generates and sends to the Memory Management Unit (MMU)

Back