Friday 15 May 2015

Synchronization in Java

 Throughout the discussion of threads thus far, you've really only learned about threads from an asynchronous perspective. In other words, you've only been concerned with getting threads up and running and not worrying too much about how they actually execute. You can only think in these terms when you are dealing with a single thread or with threads that don't interact with the same data. In reality, there are many instances where it is useful to have multiple threads running and accessing the same data. In this type of scenario, the asynchronous programming approach just won't work; you must take extra steps to synchronize the threads so they don't step on each other's toes.

The problem of thread synchronization occurs when multiple threads attempt to access the same resources or data. As an example, imagine the situation where two threads are accessing the same data file; one thread may be writing to the file while the other thread is simultaneously reading from it. This type of situation can create some very unpredictable, and therefore undesirable, results.
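
To make the problem concrete, here's a quick sketch (the class name and numbers are just for illustration) of two threads bumping a shared counter with no synchronization at all. Because count++ is really a read, an add, and a write, the updates from the two threads can interleave, and the final total usually comes up short of the expected 200,000:
public class RaceDemo {
  private static int count;

  public static void main(String[] args) throws InterruptedException {
    Runnable work = () -> {
      for (int i = 0; i < 100_000; i++) {
        count++;    // not atomic: read the value, add one, write it back
      }
    };
    Thread t1 = new Thread(work);
    Thread t2 = new Thread(work);
    t1.start();
    t2.start();
    t1.join();    // wait for both threads to finish
    t2.join();
    System.out.println("Expected 200000, got " + count);
  }
}
Run it a few times and you'll almost certainly see a different (and smaller) number each time, which is exactly the kind of unpredictability we're talking about.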


Note
When a data object is shared between competing threads and used to coordinate their activity, it is sometimes referred to as a condition variable.

When you are dealing with threads that are competing for limited resources, you simply must take control of the situation to ensure that each thread gets equal access to the resources in a predictable manner. A system where each thread is given a reasonable degree of access to resources is called a fair system. The two situations you must try to avoid when implementing a fair system are starvation and deadlock. Starvation occurs when a thread is completely cut off from the resources and can't make any progress; the thread is effectively frozen. Whereas starvation can apply to any number of threads individually, deadlock occurs when two or more threads are waiting for a mutual condition that can never be satisfied; they are starving each other.

A Hypothetical Example

A popular hypothetical example that more clearly demonstrates the problem of deadlock is the dining philosophers. The story goes that there are five hungry philosophers sitting around a table preparing to eat. In front of each philosopher is a bowl of rice, while between each philosopher there is a chopstick. To take a bite of rice, a philosopher needs two chopsticks: one from the left and one from the right. In this situation, the philosophers are equivalent to threads, with the chopsticks representing the limited, shared resources they all need access to. Their desired function is to eat the rice, which requires access to a pair of chopsticks.

The philosophers are only allowed to pick up one chopstick at a time, and they must always pick up the left chopstick and then the right. When a philosopher gets both chopsticks, he can take a bite of rice and then put down both chopsticks. This sounds like a pretty reasonable system of sharing the limited resources so everyone can eat. But consider what happens when each philosopher goes after the chopsticks with equal access to them. Each philosopher immediately grabs the chopstick to his left, resulting in every philosopher having a single chopstick. They all then reach for the chopstick on their right, which is now being held by the philosopher to their right. They are all waiting for another chopstick, so they each just sit holding a single chopstick indefinitely. Both figuratively and literally, they are starving each other!
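
If you'd like to see the stalemate in code rather than prose, here's a bare-bones sketch (the class name and the use of plain Objects as chopsticks are just for illustration). Each philosopher is a thread that locks his left chopstick and then reaches for his right one:
public class DiningPhilosophers {
  public static void main(String[] args) {
    final Object[] chopsticks = new Object[5];
    for (int i = 0; i < chopsticks.length; i++) {
      chopsticks[i] = new Object();
    }
    for (int i = 0; i < 5; i++) {
      final Object left = chopsticks[i];             // always grab the left first...
      final Object right = chopsticks[(i + 1) % 5];  // ...then reach for the right
      new Thread(() -> {
        while (true) {
          synchronized (left) {
            Thread.yield();          // give the other philosophers a chance to grab theirs
            synchronized (right) {
              // take a bite of rice
            }
          }
        }
      }, "Philosopher " + i).start();
    }
    // If all five philosophers grab their left chopstick at the same moment,
    // each one blocks forever waiting for the right one: deadlock.
  }
}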

This is a very good example of how a seemingly fair system can easily go awry. One potential solution to this problem is to force each philosopher to wait a varying amount of time before attempting to grab each chopstick. This approach definitely helps, and the philosophers will probably get to eat some rice, but the potential for deadlock, and therefore starvation, is still there. You are counting on blind luck to save the day and keep the philosophers well fed. In case you didn't guess, this isn't the ideal approach to solving deadlock problems.

You have two approaches to solving deadlock in a situation like this: prevention or detection. Prevention means designing the system so that deadlock is impossible. Detection, on the other hand, means allowing for deadlock but detecting it and dealing with its consequences when they arise. As with a medical illness, it doesn't take a huge mental leap to realize that prevention usually involves much less pain than detection, which amounts to a sort of chemotherapy for deadlock. My vote is clearly for avoiding deadlock in the first place. Besides, trying to detect deadlock can often be a daunting task in and of itself.

Getting back to the famished philosophers, the root of the problem is the fact that there is no order imposed on the selection of chopsticks. By assigning a priority order to the chopsticks, you can easily solve the deadlock problem; just assign increasing numbers to the chopsticks. Then force the philosophers to always pick up the chopstick with the lower number first. This results in the philosopher sitting between chopsticks 1 and 2 and the philosopher sitting between chopsticks 1 and 5 both going for chopstick 1. Whoever gets it first is then able to get the remaining chopstick, while the other philosopher is left waiting. When the lucky philosopher with two chopsticks finishes his bite and returns the chopsticks, the process repeats itself, allowing all the philosophers to eat. Deadlock has been successfully avoided!
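
Translated into code, the fix is tiny: number the chopsticks (0 through 4 in the sketch below, reusing the chopsticks array from the earlier sketch) and have each philosopher lock the lower-numbered chopstick of his pair first:
for (int i = 0; i < 5; i++) {
  int leftIndex = i;
  int rightIndex = (i + 1) % 5;
  // Always lock the lower-numbered chopstick first.
  final Object first = chopsticks[Math.min(leftIndex, rightIndex)];
  final Object second = chopsticks[Math.max(leftIndex, rightIndex)];
  new Thread(() -> {
    while (true) {
      synchronized (first) {
        synchronized (second) {
          // take a bite of rice
        }
      }
    }
  }, "Philosopher " + i).start();
}
The philosopher whose pair is chopsticks 4 and 0 now goes for chopstick 0 first, just like his neighbour, so the circular wait can never form.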

Synchronizing Threads

If you're thinking the dining philosophers example seems fairly simple, you're right. But don't get too confident yet: real-world thread synchronization situations can get extremely messy. Fortunately, Java provides a very powerful solution to the whole issue: the synchronized modifier. The synchronized modifier is used to flag certain parts of your code as synchronized, resulting in limited, predictable access for threads. More specifically, only one thread at a time is allowed access to a synchronized section of code.
For synchronized methods, it works like this: each synchronized method is given a lock, similar to a physical lock, that determines whether the method can be entered. When a thread attempts to call the method, it first checks whether the lock is already held; if it is, the thread is blocked and has to wait. If the lock is free, the thread acquires it, executes the method, and releases the lock when it finishes, at which point the next waiting thread gets its turn. Pretty simple, right?

Note
Locks can apply to methods and to entire classes, but not directly to individual blocks of code. You can specify an object or class to be locked for a particular block of synchronized code, but the block itself isn't what's locked.
Synchronized sections of code are called critical sections, implying that access to them is critical to the successful threaded execution of the program. Critical sections are also sometimes referred to as atomic operations, meaning that they appear to other threads as if they occur at once. In other words, just as an atom is a discrete unit of matter, atomic operations effectively act like a discrete operation to other threads, even though they may really contain many operations inside.
You can use the synchronized modifier to mark critical sections in your code and make them thread-safe. Following are some examples of using the synchronized modifier:
// These operands are instance fields, shared by every thread using this object.
private float a, b;

// Only one thread at a time may execute addEmUp on a given instance.
public synchronized void addEmUp() {
  a += b;
  b += a;
}

// The rectangle (a java.awt.Rectangle) is shared state; the object itself
// serves as the lock for the synchronized block below.
private final Rectangle rect = new Rectangle(0, 0, 100, 100);

public void moveEmOut() {
  synchronized (rect) {
    rect.width -= 2;    // protected: only one thread at a time gets in here
  }
  rect.height -= 2;     // outside the block, so not protected at all
}
The first example shows how to secure an entire method and make it synchronized; only one thread is allowed to execute the addEmUp method at a time. The moveEmOut method, on the other hand, contains a synchronized block of code within it. The synchronized block protects the width of the rectangle from being modified by multiple threads at once. Notice that the rect object itself is used as the lock for the block of code. Also notice that the modification of the rectangle's height isn't included in the synchronized block, and is therefore subject to access by multiple threads at once.

Note
It's important to note that even though there are legitimate situations where you will need to make a block of code synchronized, in general it is better to apply synchronization at the method level. Employing method synchronization as opposed to block synchronization facilitates a more object-oriented design and results in code that is easier to debug and maintain.
There is a subtle problem when using synchronized methods that you may not have thought about. Check out the following code sample:
public class CountEngine {
  private static int count;    // class data, shared by every instance

  // The lock here is the instance (this), not the class.
  public synchronized void decCount() {
    count--;
  }
}
The decCount method is synchronized, so it appears that the count member variable is protected from misuse. However, count is a class variable, not an instance variable, because it is declared static. The lock for a synchronized instance method is the instance itself, so two threads calling decCount on two different instances hold two different locks, and the class data isn't protected at all. The solution is to synchronize a block using the class as the locked object, like this:
public class CountEngine {
  private static int count;    // class data, shared by every instance

  public void decCount() {
    // Lock on the Class object so the static data is protected no matter
    // which instance the method is called on.
    synchronized (CountEngine.class) {
      count--;
    }
  }
}
Notice that the class literal, CountEngine.class, is used as the lock for the synchronized block; because there is only one Class object per loaded class, every thread contends for the same lock no matter which instance it happens to be working with. (You may also see getClass() used here; it works until someone subclasses CountEngine, at which point subclass instances lock on a different Class object.) This is a good example of where block synchronization gives you something that synchronizing the instance method can't.
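
Incidentally, when a method touches nothing but class data, there is one more option worth knowing about: declare the method static and synchronized, and the lock is taken on the Class object automatically, no block required. A quick sketch:
public class CountEngine {
  private static int count;

  // A static synchronized method locks on CountEngine.class, so the
  // class data is protected without an explicit synchronized block.
  public static synchronized void decCount() {
    count--;
  }
}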

Volatile Variables


In rare cases where you don't mind threads modifying a variable whenever they please, Java provides a means of maintaining the variable's integrity. The volatile modifier allows you to specify that a variable will be modified asynchronously by threads. The purpose of the volatile modifier is to protect against the corruption that can occur when a variable is cached in CPU registers; in an asynchronous environment, a thread can otherwise end up working with a stale, register-held copy of the variable instead of the latest value. The volatile modifier tells the runtime system to always reference the variable directly from memory, reading it from and writing it back to memory on each access, just to be safe. It's fairly rare that you'll need to use the volatile modifier, but if you feel like living on the edge, it's there for your enjoyment!
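
To give you an idea of where it comes in handy, here's a small sketch (the class and method names are just for illustration) of a worker thread that polls a flag. Marking the flag volatile guarantees the loop actually notices when another thread clears it:
public class Poller implements Runnable {
  // volatile: every read and write of 'running' goes to main memory, so the
  // loop below sees the update made by another thread.
  private volatile boolean running = true;

  public void run() {
    while (running) {
      // do some work
    }
  }

  public void stopRunning() {
    running = false;    // takes effect on the polling thread's next read of the flag
  }
}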
