
Introduction to Parallel Programming in Java

Hey guys! This post explains what parallel programming is, how to use it in Java, and some other information related to the field. So let's get into it!


Figure 1: Parallel Programming In Java! 🚀

What is Parallel Programming?

Parallel programming is the practice of running multiple tasks at the same time using multiple compute resources. It is important to note that in most cases those resources are separate cores of the same CPU, although parallel programs can also be spread across multiple processors or even multiple machines.

When the work is divided up according to the function each task performs, this is known as functional parallelism (as opposed to data parallelism, where the same operation is applied to different chunks of data).

Why use Parallel Programming?

While there are numerous benefits of using parallel programming, some of the main ones include:

  • Overcoming the maximum clock speed of processors: when CPUs have hit their clock-speed ceiling, we can still increase throughput by doing work in parallel
  • Use in artificial intelligence: parallel programming can give autonomous (self-driving) cars reaction times faster than those of most humans (around 0.25 seconds on average)
  • Ability to use resources on a Wide Area Network (WAN) and the Internet
  • Raising previous limits on how much data can be stored and processed
  • Better performance for the cost (an improved price-to-performance ratio)
  • More effective utilization of hardware resources
  • Eliminating single points of failure

Utilization of parallelism in Java

Java supports parallelism through the following concepts:

  • Java Fork/Join framework: this framework is based on the idea of divide-and-conquer algorithms and has two parts: the "fork" and the "join". The "fork" part consists of recursively breaking a task into smaller and smaller subtasks until a subtask can no longer be broken down. When this point is reached, the "join" part of the framework is initiated, which combines the results of the subtasks into a single result. In the case that the task returns void, the program simply waits until all subtasks have finished executing. Here is a diagram demonstrating this framework (a minimal code sketch follows it):

Figure 2: Java Fork/Join framework visualized
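
To make the "fork" and "join" steps concrete, here is a minimal sketch (the class name, threshold value, and array contents are just illustrative assumptions, not a prescribed recipe): it recursively splits the job of summing an array, forks one half, computes the other half directly, and joins the partial results.

import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// A minimal sketch of the Fork/Join pattern: recursively split an array sum
// until the chunks are small enough, then join the partial results.
public class SumTask extends RecursiveTask<Long> {
    private static final int THRESHOLD = 1_000; // chosen arbitrarily for illustration
    private final long[] numbers;
    private final int start, end;

    public SumTask(long[] numbers, int start, int end) {
        this.numbers = numbers;
        this.start = start;
        this.end = end;
    }

    @Override
    protected Long compute() {
        if (end - start <= THRESHOLD) {
            long sum = 0;
            for (int i = start; i < end; i++) {
                sum += numbers[i];
            }
            return sum;                     // small enough: compute directly
        }
        int middle = (start + end) / 2;
        SumTask left = new SumTask(numbers, start, middle);
        SumTask right = new SumTask(numbers, middle, end);
        left.fork();                        // "fork": run the left half asynchronously
        long rightResult = right.compute(); // compute the right half in this thread
        return left.join() + rightResult;   // "join": combine the partial results
    }

    public static void main(String[] args) {
        long[] numbers = new long[10_000];
        for (int i = 0; i < numbers.length; i++) {
            numbers[i] = i;
        }
        long sum = ForkJoinPool.commonPool().invoke(new SumTask(numbers, 0, numbers.length));
        System.out.println("Sum = " + sum);
    }
}

Below the threshold the work is done sequentially, which is the usual way to stop the recursion from creating too many tiny tasks.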

  • Threads and processes: Java lets you run multiple tasks using the Thread class, where a thread can be thought of as a sequence of statements being executed. Using this class, different kinds of threads can be created, such as user threads and daemon threads. User threads are high-priority threads that the JVM waits for before exiting, while daemon threads are low-priority background threads (garbage collection, for example) that the JVM does not wait for. A small sketch using both kinds appears after Figure 3 below
  • Locks: Java uses the concept of a Lock to protect shared data against race conditions. A race condition, generally defined, is a scenario where parallel execution of a Java program by multiple threads causes concurrency issues in the form of inconsistent data from the use of shared objects, non-thread-safe operations on collections, and so on. The same sketch after Figure 3 shows a ReentrantLock guarding a shared counter
  • Volatile keyword: the volatile keyword makes a variable's value visible across threads: every read and write of a volatile variable goes straight to and from main memory instead of a CPU cache. In case you're wondering, thread-safety is basically the idea that a piece of code or data can be safely used from concurrent, multi-threaded code. (Note that volatile guarantees visibility, not atomicity, so compound operations such as incrementing a counter still need a lock.) To explain the volatile keyword in an easily comprehensible way, let's go through an example:
public class RunningShoe {
    private String shoeName;
    private String shoeModel;
    private int shoeSize;

    // Reads and writes of this volatile field go straight to/from main memory,
    // so other threads always see its latest value.
    private volatile ShoeType shoeType;
}

// Minimal enum so the example compiles; the constants are just for illustration.
enum ShoeType { ROAD, TRAIL, TRACK }

In the example above, whenever a value is assigned to the shoeType variable, that value is written straight to main memory. On top of that, the Java Memory Model guarantees that when a volatile variable is written, the values of all other variables visible to the writing thread at that point are also written to main memory (also known as RAM). Here is a diagram of the Java memory model that sheds some light on how this maps onto modern computer architecture:


Figure 3: Basic memory model
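
To tie the Threads and Locks bullets above together, here is a minimal sketch (the class name, thread names, and loop counts are illustrative assumptions): two user threads increment a shared counter while holding a ReentrantLock, and a daemon thread runs low-priority background work that the JVM will not wait for.

import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;

// A minimal sketch combining the Thread and Lock ideas above.
public class ThreadAndLockDemo {
    private static final Lock lock = new ReentrantLock();
    private static int counter = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 10_000; i++) {
                lock.lock();           // acquire the lock before touching shared state
                try {
                    counter++;
                } finally {
                    lock.unlock();     // always release the lock in a finally block
                }
            }
        };

        // User threads: the JVM waits for these to finish before exiting.
        Thread workerA = new Thread(increment, "worker-a");
        Thread workerB = new Thread(increment, "worker-b");

        // Daemon thread: a low-priority background worker the JVM does not wait for.
        Thread housekeeping = new Thread(() -> {
            while (true) {
                try {
                    Thread.sleep(1_000); // pretend to do periodic cleanup
                } catch (InterruptedException e) {
                    return;
                }
            }
        }, "housekeeping");
        housekeeping.setDaemon(true);

        workerA.start();
        workerB.start();
        housekeeping.start();

        workerA.join();                 // wait for both user threads to finish
        workerB.join();

        System.out.println("Counter = " + counter); // always 20000 thanks to the lock
    }
}

Because every increment happens inside the lock, the two worker threads cannot interleave their read-modify-write steps, so the final count is always 20,000.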

Disadvantages of Parallel programming

Although parallel programming is generally thought of as beneficial, it also has a number of disadvantages. Here are a few of them:

  • Systems that use parallel programming can be complex and design-intensive to create

  • Parallel programming can cause many concurrency issues such as race conditions, deadlocks, and memory inconsistency. Let's briefly go over the first two:

    • Race conditions: occur when multiple threads executing a program in parallel interleave in an unlucky order. To give an example of this issue, imagine a banking application where two threads can deposit funds into any bank account. If both threads happen to deposit funds into the same account at the same time, the result is inconsistent data: while one thread is reading the current balance in order to add its deposit, the second thread has already deposited funds into the same account, so the first thread is working from an outdated balance. What is left in the end is two threads with two different ideas of the balance for the same account, and one deposit is silently lost. This is what is referred to as a race condition (a code sketch of this scenario, and its fix, follows this list)
    • Deadlocks: occur when two threads each hold a resource the other one needs and neither will give theirs up. This results in both threads waiting indefinitely for the other to unlock its resource, an event that will never happen. Here is an example of this problem: Thread A acquires resource A and Thread B acquires resource B. Thread A then tries to acquire resource B and waits for it to be released, while Thread B tries to acquire resource A and waits for Thread A to release it. And so the problem is revealed: each thread is waiting for the other to release the lock on the resource it currently owns, in order to access the resource it doesn't have. This is called a deadlock
  • Higher than average power consumption from a hardware perspective

  • Potentially more expensive as a result of increased data transfer, synchronization, communication, and thread-management overhead
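
As a follow-up to the banking example in the race-condition bullet above, here is a hypothetical sketch (the class and method names are made up for illustration): the unsafe deposit method performs a read-modify-write without any synchronization, so two concurrent deposits can both read the same old balance and one update is lost, while the synchronized version makes the operation atomic.

// A hypothetical account class illustrating the race condition described above.
public class BankAccount {
    private long balance = 0;

    // NOT thread-safe: two threads can read the same old balance and one
    // deposit is lost (a "lost update").
    public void unsafeDeposit(long amount) {
        long oldBalance = balance;       // read
        balance = oldBalance + amount;   // write based on a possibly stale read
    }

    // Thread-safe: synchronized makes the read-modify-write atomic, so
    // concurrent deposits can no longer interleave.
    public synchronized void deposit(long amount) {
        balance += amount;
    }

    public synchronized long getBalance() {
        return balance;
    }

    public static void main(String[] args) throws InterruptedException {
        BankAccount account = new BankAccount();
        Runnable depositor = () -> {
            for (int i = 0; i < 100_000; i++) {
                account.deposit(1); // switch to unsafeDeposit(1) to observe lost updates
            }
        };
        Thread t1 = new Thread(depositor);
        Thread t2 = new Thread(depositor);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println("Balance = " + account.getBalance()); // 200000 with deposit()
    }
}

Swapping deposit(1) for unsafeDeposit(1) in the loop will usually print a balance below 200,000, which is exactly the lost-update behaviour described above.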

Conclusion

Whew! If you made it this far, congrats! You have a dedication to learning that is sure to pay off 👍

Although there are many more details about Parallel Programming in Java that weren't covered in this post, for the sake of brevity let us end it here. Hopefully you now have a better idea of what Parallel Programming in Java is, along with its advantages and disadvantages.

Thanks for reading this blog post on Parallel Programming in Java!

If you have any questions or concerns, please feel free to leave a comment on this post and I will get back to you when I find the time.

If you found this article helpful please share it and make sure to follow me on Twitter and GitHub, connect with me on LinkedIn and subscribe to my YouTube channel.