
Amazon Web Services Simple Queue Service (AWS SQS) Using the Java 2 Software Development Kit

Introduction

AWS SQS message queues are a way to exchange messages between applications. Senders send data objects to a queue, and receivers receive objects from a queue. Amazon’s Simple Queue Service (AWS SQS) is an AWS service that offers scalability and reliability by being distributed across Amazon’s infrastructure.

A message queue decouples applications. A message producer only knows about the queue and knows nothing about the queue’s consumers. Likewise, a message consumer only knows about the queue and knows nothing about the queue’s producers or other consumers. Moreover, producers and consumers know nothing about each other’s timing; they communicate asynchronously.
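Before bringing in AWS, the decoupling idea can be sketched in plain Java, with a JDK BlockingQueue standing in for SQS; the class and message names below are invented for illustration.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class DecouplingDemo {

    // The producer and consumer share only the queue;
    // neither holds a reference to the other.
    static BlockingQueue<String> queue = new ArrayBlockingQueue<>(10);

    static void produce(String observation) {
        queue.offer(observation); // producer side: drop a message and move on
    }

    static String consume() {
        return queue.poll();      // consumer side: pick up whatever is waiting
    }

    public static void main(String[] args) {
        produce("observation-1");                      // Station side
        System.out.println("received: " + consume());  // SasquatchFinder side
    }
}
```

The two sides never call each other directly, which is exactly the property SQS provides between separate processes.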

For more on queues and message-passing in general, there are many resources online. Here is a good reference from MIT: Reading 22: Queues and Message-Passing.

Use Case

Suspend disbelief, or, more accurately, simply build the system regardless of what you think of the soundness of the business plan. Famous entrepreneur John Bunyan from Washington State has a plan to get rich and famous by finally proving conclusively that Bigfoot – or Sasquatch for the cultured – exists and uses the state’s extensive system of hiking trails to move around.

Against his accountant’s advice, he liquidated half his fortune to install a series of hidden cameras along Washington State’s hiking trails that take photos every fifteen minutes. As he is a busy man, he does not have time to analyze all the photos personally, and so he wants image analysis software to analyze them. If the software registers a Sasquatch, he wants the image sent to his email account so he can personally confirm whether it shows a Sasquatch.

Now, if 10,000 cameras each take a picture every 15 minutes, that is 40,000 images per hour. Assume each image takes up to five minutes to process. Hopefully you can see that we have a scalability issue.
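As a sanity check on those figures, a little arithmetic (using the scenario’s assumed numbers: 10,000 cameras, one photo per camera every 15 minutes, and up to five minutes of processing per image) shows just how much parallel capacity a single naive consumer lacks:

```java
public class ThroughputMath {

    // Images arriving per hour across all cameras.
    static int imagesPerHour(int cameras, int minutesBetweenPhotos) {
        return cameras * (60 / minutesBetweenPhotos);
    }

    // Parallel workers needed if each image ties up a worker this long.
    static int workersNeeded(int imagesPerHour, int minutesPerImage) {
        int imagesPerWorkerPerHour = 60 / minutesPerImage;
        return (int) Math.ceil((double) imagesPerHour / imagesPerWorkerPerHour);
    }

    public static void main(String[] args) {
        int perHour = imagesPerHour(10_000, 15); // 40,000 images arrive per hour
        int workers = workersNeeded(perHour, 5); // each worker clears only 12 images/hour
        System.out.println(perHour + " images/hour; ~" + workers + " parallel workers to keep up");
    }
}
```

A queue lets us add or remove those workers without the cameras ever knowing.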

There are various ways to deal with this scalability issue, but as this is a tutorial on SQS, we use AWS SQS. And, as I am fond of admonishing in all my tutorials, if the “business case” seems suspect, then suspend disbelief and focus on the AWS code.

Design

Enough apologizing for the business case, let’s focus on the application’s design. The following diagram illustrates the dilemma.

  • Every n minutes a Station sends an observation to an AWS SQS queue.
  • There are one or more SasquatchFinder components whose job is to pick up an observation from the queue and process it.
  • Station is the producer while SasquatchFinder is the consumer.
Stations send observations to the queue and SasquatchFinders get observations from the queue.
Queues implement an asynchronous Producer/Consumer design pattern.

We can formalize our requirements with a simple class diagram. A Station creates an Observation. A SasquatchFinder processes an Observation.

Class diagram illustrating the design.

All communication with AWS from external processes is via its REST API, and AWS SQS is no different. Moreover, SQS queues only accept textual data, yet a common need is for a queue to carry binary data, such as an image. JSON, a textual data transport format, helps here.

We can translate the Observation into a JSON document. The image is converted to Base64 encoding so it can be represented as text. Note that the encodedImage in this tutorial is always truncated with <snip>, as the Base64 string is quite long.

{
  "timestamp": "1558493503",
  "latitude": "46.879967",
  "longitude": "-121.726906",
  "encodedImage": "/9j/4AA <snip> 3QEUBGX/9k="
}

Base64 Encoding

Images are binary. However, any binary data can be represented as a string, provided it is encoded and decoded correctly. Base64 is an encoding scheme that converts binary data to a string. It is useful because it allows embedding binary data, such as an image, in a textual file, such as a webpage or JSON document. AWS SQS queues only allow textual data, so if you wish to put an image on an AWS SQS queue, you must convert it to a string. The easiest way to accomplish this is Base64: encode the binary data to a string when transporting it, and decode the string back to binary when storing it. For an example of Base64 and DynamoDB, refer to this site’s tutorial: Using the AWS DynamoDB Low-Level Java API – Spring Boot Rest Application.
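The round trip is a one-liner in each direction with the JDK’s java.util.Base64; the byte values below are just stand-ins for real image bytes.

```java
import java.util.Base64;

public class Base64Demo {

    // Encode binary image bytes to a Base64 string suitable for a JSON field.
    static String encode(byte[] imageBytes) {
        return Base64.getEncoder().encodeToString(imageBytes);
    }

    // Decode the Base64 string back to the original bytes on the consumer side.
    static byte[] decode(String encodedImage) {
        return Base64.getDecoder().decode(encodedImage);
    }

    public static void main(String[] args) {
        byte[] fakeImage = {(byte) 0xFF, (byte) 0xD8, (byte) 0xFF, (byte) 0xE0}; // JPEG header bytes
        String encoded = encode(fakeImage);
        System.out.println(encoded);                       // prints "/9j/4A==", text-safe for the queue
        System.out.println(decode(encoded).length + " bytes restored");
    }
}
```

Notice the "/9j/4A" prefix: JPEG files share the same leading bytes, so their Base64 encodings share that prefix, which is why the encodedImage values in this tutorial all begin the same way.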

Station – Producer

Before coding the application, let’s create a queue. You can create a queue via the Java 2 SDK; however, here we create the queue manually and then use it to send and receive messages.

Create SQSQueue

  • Navigate to the SQS console and select Standard Queue.
  • Click the Configure Queue button.
  • Name the queue SasquatchImageQueue.
  • Accept the defaults for the Queue Attributes.
  • After creating the queue you should see a screen similar to the following.
  • Click on the Permissions tab and notice that we have not created a permission. We return to the Permissions tab after creating the two necessary users.

There are two types of queues offered by AWS SQS, Standard Queues and First In First Out (FIFO) Queues. Standard queues provide what is called best-effort ordering. Although messages are usually delivered in the order they are received, there are no guarantees. Moreover, messages can also be processed more than once. FIFO queues, in contrast, guarantee first in first out delivery and processing only once.

In this tutorial we primarily use standard queues. However, toward the end of this tutorial we illustrate using a FIFO queue.

Create SQSQueue Users

We need to create two users, one to interact with the queue for sending messages and another for receiving messages. If you have created IAM users before, note we do not assign the user to any group or assign any policies. Instead, we allow the queue to determine its permissions. Of course, we assign the user programmatic access and download the credentials file.

  • Navigate to the IAM console and create a new user called SasquatchProducerUser that has programmatic access.
  • Save the user’s credentials locally.
  • Create a second user called SasquatchConsumerUser that also has programmatic access.
  • Save the user’s credentials locally.
  • You should have two users created with programmatic access.

Queue Permissions

Initially, only a queue’s creator, or owner, can read or write to a queue. The creator must grant permissions, which we do using a queue policy. We write the policy using the AWS SQS Console, although you could write it manually if you wished.

Consumer Permissions

  • Navigate to the SasquatchConsumerUser summary screen and copy the Amazon Resource Name (ARN).

The ARN should appear similar to the following.

arn:aws:iam::743327341874:user/SasquatchConsumer

The Amazon Resource Name (ARN) uniquely identifies an Amazon resource; in this case, the SasquatchConsumer user.

  • Return to the SQS console and select the SasquatchImageQueue and click on the Permissions tab.
  • Click Add a Permission.
  • In the resultant popup, paste the ARN in the Principal text box.
  • Check the DeleteMessage, GetQueueUrl, and ReceiveMessage Actions.
  • Click Save Changes.
  • After creating the SasquatchConsumerUser, navigate to the SasquatchProducerUser and copy the ARN for the producer.
arn:aws:iam::743327341874:user/SasquatchProducerUser
  • Navigate back to the SQS Queue and add this user to the queue as a permission. Allow the ChangeMessageVisibility, DeleteMessage, GetQueueAttributes, GetQueueUrl, PurgeQueue, and SendMessage Actions.
  • After adding the permissions for both users the queue should appear similar to the following image.

If you are still uncertain as to adding a permission to a queue, here is a tutorial by Amazon: Adding Permissions to an Amazon SQS Queue. You can also add Server-Side Encryption, as this tutorial illustrates: Creating an Amazon SQS Queue with Server-Side Encryption (SSE).

Although we do not discuss Policy documents, the following illustrates that a JSON document underlies the settings we set using the console. It is, however, important you understand policy documents, as they are at the heart of AWS security. For more information on AWS SQS Policies refer to this documentation: Using Identity-Based (IAM) Policies for Amazon SQS.

One thing to note is that here we assigned permissions to the queue using AWS SQS rather than the consumer or producer user we created. We could have just as easily used an IAM Policy, as the documentation in the link in the preceding paragraph discusses.
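For illustration only (this JSON is hand-written to mirror the console settings above, not copied from the console), a queue policy granting the consumer’s three actions might look roughly like the following; the ARNs reuse the account ID, user, and queue names from this tutorial:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::743327341874:user/SasquatchConsumer" },
      "Action": ["sqs:DeleteMessage", "sqs:GetQueueUrl", "sqs:ReceiveMessage"],
      "Resource": "arn:aws:sqs:us-east-1:743327341874:SasquatchImageQueue"
    }
  ]
}
```

The console generates an equivalent document for you; the producer’s permission adds the send-side actions in the same shape.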

Sending Message Via Console

Although there is rarely a business reason to do so, for testing purposes you can manually add a message to a queue. We will not use the message, but let’s explore sending one using the AWS SQS Console.

  • Refer to the observations.json document and copy one of the observations. Of course, in the code listing below the image is truncated.
    {
      "stationid": 221,
      "date": "1992-03-12",
      "time": "091312",
      "image": "/9j/4AA <snip> 0Wxkf/9k="
    }
  • Select the queue and from Queue Actions select Send a Message.
  • Copy a single message from observations.json and add the entry to the Message Body.
  • Click Send Message and within a minute the Messages Available column should show one message on the queue.
  • Purge the queue by selecting Purge Queue from Queue Actions.

Java Project – Producer

As discussed, a producer, well, produces messages. If we fully implemented the design above, we would have many Stations and many SasquatchFinders. However, to keep the tutorial simple, we limit ourselves to one Station in one project.

Project Setup

Although I developed the tutorial using Eclipse, you can use your own IDE or even the command-line. However, you really should use Maven or Gradle. Here we use Maven. It is assumed you are familiar with using Maven to build Java projects.

POM

  • Create a new project named SQSTutorialProducer.
  • Create or overwrite the POM file with the following POM.
<project xmlns="http://maven.apache.org/POM/4.0.0"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.tutorial.aws</groupId>
  <artifactId>SQSTutorialProducer</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <properties>
    <java.version>1.8</java.version>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
  <dependencyManagement>
    <dependencies>
      <dependency>
	<groupId>software.amazon.awssdk</groupId>
	<artifactId>bom</artifactId>
	<version>2.5.25</version>
  	  <type>pom</type>
  	  <scope>import</scope>
	  </dependency>
    </dependencies>
  </dependencyManagement>
  <dependencies>
    <dependency>
      <artifactId>auth</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
    <dependency>
      <artifactId>aws-core</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
    <dependency>
      <artifactId>sqs</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
   <dependency>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-api</artifactId>
       <version>1.7.5</version>
   </dependency>
   <dependency>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-simple</artifactId>
       <version>1.6.4</version>
   </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-dependency-plugin</artifactId>
	<executions>
  	  <execution>
	    <id>copy-dependencies</id>
	    <phase>prepare-package</phase>
	    <goals>
	      <goal>copy-dependencies</goal>
	    </goals>
	    <configuration>
	      <outputDirectory>${project.build.directory}/lib</outputDirectory>
              <overWriteReleases>false</overWriteReleases>
              <overWriteSnapshots>false</overWriteSnapshots>						 
              <overWriteIfNewer>true</overWriteIfNewer>
	    </configuration>
        </execution>
    </executions>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
      <archive>
        <manifest>
	  <addClasspath>true</addClasspath>
	  <classpathPrefix>lib/</classpathPrefix>
	  <mainClass>com.aws.tutorial.sqs.main.Station</mainClass>
	</manifest>
      </archive>
    </configuration>
  </plugin>
</plugins>
</build>
</project>

In the POM we use the AWS BOM so we can avoid specifying individual AWS library versions. We add dependencies for the required AWS libraries. We also configure Maven to build an executable jar, with the required dependencies copied to a lib directory that the jar’s manifest references on its classpath.

Notice the following.

  <properties>
    <java.version>1.8</java.version>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>

If we do not specify Java 1.8 or higher, the compilation will fail, as the AWS builders are static interface methods, which do not exist in older Java versions. Although the code might compile on your machine, you could have issues if you have multiple Java SDKs on your computer. By explicitly setting the version, source, and target, we avoid any potential compilation issues.
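The feature in question can be seen in isolation: this toy interface (not SDK code) compiles only under Java 8 or later because of the static builder() method.

```java
public class StaticInterfaceMethodDemo {

    interface Client {
        // Static interface methods were introduced in Java 8; the SDK's
        // entry points such as SqsClient.builder() use the same mechanism.
        static Client builder() {
            return () -> "built"; // lambda implements the single abstract method
        }

        String status();
    }

    public static void main(String[] args) {
        System.out.println(Client.builder().status()); // prints "built"
    }
}
```

Compiling this with source/target below 1.8 fails on the static method, which is the same failure mode you would hit with the SDK itself.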

Station

Let’s create a simple executable Java class named Station. This will simulate a bona fide message producer.

  • First create a com.aws.tutorial.sqs.main package.
  • Create a class named Station with a main method in the created package.
  • Have the main method print a message indicating that the class executed.
package com.aws.tutorial.sqs.main;

public class Station {
  public static void main(String[] args) {
    System.out.println("Station running....");
  }
}

Executable Jar

  • Compile and package the project. If running from the command-line you would type the following.
$ mvn clean compile package
  • After building, execute the program from the command-line. The printout should appear.
$ java -jar SQSTutorialProducer-0.0.1-SNAPSHOT.jar 
Station running....

Now that we have created the producer’s basic structure, we can modify it to send an SQS message.

Sending A Message

In this example we send a message to the queue using the SDK. The data payload is a string of JSON data. We use hardcoded data to send to the queue; obviously, in a real-world application the data would come from a different source. To simulate sending messages from a bona fide producer, a delay is introduced between sending each message.

  • Before modifying the program, create a new class named TestData in the com.aws.tutorial.sqs.main package.
  • Copy three observations from the observations.json file.
  • Or, if you do not wish to escape the strings yourself, use the TestData.java from this tutorial’s Git project. Note: if you use Eclipse, it escapes the strings for you when you paste a string immediately after the opening quotation mark. The images’ Base64 code is shortened so they can be easily displayed.
package com.aws.tutorial.sqs.main;

public class TestData {

	public static String observationOne = "    {\n" + 
			"      \"stationid\": 221,\n" + 
			"      \"date\": \"2019-03-12\",\n" + 
			"      \"time\": \"091312\",\n" + 
   		        "      \"image\": \"/9j/4A <snip> \"\n" + 
			"    }";
	
	public static String observationTwo = "    {\n" + 
			"      \"stationid\": 222,\n" + 
			"      \"date\": \"2016-02-09\",\n" + 
			"      \"time\": \"091312\",\n" + 
			"      \"image\": \"/9j/4A <snip> \"\n" +  
			"    }";
	
	public static String observationThree = "    {\n" + 
			"      \"stationid\": 223,\n" + 
			"      \"date\": \"2017-12-22\",\n" + 
			"      \"time\": \"091312\",\n" + 
			"      \"image\": \"/9j/4A <snip> \"\n" + 
			"    }";
}
  • Modify Station to have a constructor that takes three strings, the key, secret key, and the queue’s URL.
  • Create two member variables, one of type SqsClient and the other String.
  • In the Station constructor initialize the SqsClient.
  • Create a method named sendMessage that sends the message to the queue.
  • Finally, modify main to send all three messages in TestData.java and pause between sending each message.
package com.aws.tutorial.sqs.main;

import javax.annotation.PreDestroy;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.SendMessageRequest;
import software.amazon.awssdk.services.sqs.model.SendMessageResponse;

public class Station {

  SqsClient sqsClient;
  String queueUrl;
	
  public Station(String key, String secretKey, String queueUrl) {
    AwsBasicCredentials awsCreds = AwsBasicCredentials.create(key, secretKey);

    this.sqsClient = SqsClient.builder()
      .credentialsProvider(StaticCredentialsProvider
      .create(awsCreds)).region(Region.US_EAST_1).build();

    this.queueUrl = queueUrl;
  }
	
  public String sendMessage(String message) {
    SendMessageRequest request = SendMessageRequest.builder()
      .queueUrl(this.queueUrl).messageBody(message)
      .delaySeconds(5).build();		

    SendMessageResponse response = this.sqsClient.sendMessage(request);
    return response.messageId();
  }

  @PreDestroy
  public void preDestroy() {
    this.sqsClient.close();
  }

  public static void main(String[] args) {
    System.out.println("Station running....");
    Station station = new Station("AKIA22EODDUZONNX2EMP",
      "LUXJ5WQjW0p4bk1gC5oGBUi41rxA7oSvWWA/8SqH",
      "https://sqs.us-east-1.amazonaws.com/743327341874/SasquatchImageQueue");

    String id = station.sendMessage(TestData.observationOne);
    System.out.println("sent message: " + id);
    try {
      Thread.sleep(10000);
    } catch (InterruptedException e) {
        e.printStackTrace();
      }
    id = station.sendMessage(TestData.observationTwo);
    System.out.println("sent message: " + id);
    try {
      Thread.sleep(5000);
    } catch (InterruptedException e) {
      e.printStackTrace();
    }
    id = station.sendMessage(TestData.observationThree);
    System.out.println("sent message: " + id);
    }
}
  • Compile and run the application and you should see the following output.
Station running....
sent message: b861220e-a37a-424d-880c-5dd67a052967
sent message: 5185e68b-a16f-4300-8ee5-7ef5cca0eb53
sent message: 161f7444-ae7b-4890-b022-0447933054c3
  • Navigate to the queue in the AWS Console and you should see three messages in the Messages Available column.

The producer has only one SqsClient instance, which is initialized in the Station constructor and closed in a method annotated with @PreDestroy. This annotation marks a method that a managing container should call just before destroying an instance; note that in a standalone program such as this one, nothing invokes it automatically.

Credentials

The client requires credentials to operate. This is the user account that the application uses to authenticate itself to the AWS SDK. Here we hardcode the credentials for simplicity. For more information on AWS Java 2 SDK and credentials, refer to SDK Documentation.
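Instead of hardcoding, the SDK’s default credential provider chain can pick up keys from environment variables or from the shared credentials file at ~/.aws/credentials, which has this well-known format (placeholder values shown):

```
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```

With credentials supplied this way, you can omit the credentialsProvider call entirely and let the client resolve them from the environment.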

SqsClient

The SqsClient is an interface that extends SdkClient and is the client for accessing the AWS SQS service. You use an SqsClientBuilder to build the client, passing the credentials and the region.

 this.sqsClient = SqsClient.builder()
      .credentialsProvider(StaticCredentialsProvider.create(awsCreds))
      .region(Region.US_EAST_1)
      .build();

All requests to SQS must go through the client. The different types of requests are named accordingly. For instance, requesting to send a message requires a SendMessageRequest, and requesting to delete a message requires a DeleteMessageRequest. If you have worked with other services in the Java 2 SDK, such as DynamoDB or S3, then this pattern should be familiar.

SendMessageRequest

The SendMessageRequest wraps requests to send messages to the queue. You build the request using a SendMessageRequest builder. Above, we set the queue’s URL, the message’s body, and how long to delay before sending the message. We obtained the queue’s URL from the AWS SQS Console.

SendMessageRequest request = SendMessageRequest.builder()
  .queueUrl(this.queueUrl).messageBody(message)
  .delaySeconds(5).build();
The URL is in the Details tab of the queue in the AWS Console.

SendMessageResponse

The client sends the request and receives a response, which the SendMessageResponse wraps. The sendMessage method then returns the messageId, and main prints the value to the console.

SendMessageResponse response = this.sqsClient.sendMessage(request);
return response.messageId();

Now that we have created three messages and sent them to SQS, we can write a consumer to consume them. Let’s create a Java project named SQSTutorialConsumer.

Java Project – Consumer

Consumers, well, consume messages. Let’s create a consumer for the messages on the queue. As with the producer, we greatly simplify the consumer by creating an executable class that runs from the command-line.

Project Setup

Let’s create a Java Maven project for the Consumer.

POM

  • Create a Java project named SQSTutorialConsumer as a Maven project.
  • Create a POM file with the following.
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.tutorial.aws</groupId>
  <artifactId>SQSTutorialConsumer</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <properties>
    <java.version>1.8</java.version>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
  <dependencyManagement>
    <dependencies>
      <dependency>
	<groupId>software.amazon.awssdk</groupId>
	<artifactId>bom</artifactId>
	<version>2.5.25</version>
  	  <type>pom</type>
  	  <scope>import</scope>
	  </dependency>
    </dependencies>
  </dependencyManagement>
  <dependencies>
    <dependency>
      <artifactId>auth</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
    <dependency>
      <artifactId>aws-core</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
    <dependency>
      <artifactId>sqs</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
   <dependency>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-api</artifactId>
       <version>1.7.5</version>
   </dependency>
   <dependency>
       <groupId>org.slf4j</groupId>
       <artifactId>slf4j-simple</artifactId>
       <version>1.6.4</version>
   </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-dependency-plugin</artifactId>
	<executions>
  	  <execution>
	    <id>copy-dependencies</id>
	    <phase>prepare-package</phase>
	    <goals>
	      <goal>copy-dependencies</goal>
	    </goals>
	    <configuration>
	      <outputDirectory>${project.build.directory}/lib</outputDirectory>
              <overWriteReleases>false</overWriteReleases>
              <overWriteSnapshots>false</overWriteSnapshots>						 
              <overWriteIfNewer>true</overWriteIfNewer>
	    </configuration>
        </execution>
    </executions>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
      <archive>
        <manifest>
	  <addClasspath>true</addClasspath>
	  <classpathPrefix>lib/</classpathPrefix>
	  <mainClass>com.aws.tutorial.sqs.main.SasquatchFinder</mainClass>
	</manifest>
      </archive>
    </configuration>
  </plugin>
</plugins>
</build>
</project>

SasquatchFinder

  • First create a com.aws.tutorial.sqs.main package.
  • Next create a class named SasquatchFinder in the package.
  • Create a main method in the class and have it print a message indicating that it ran.
package com.aws.tutorial.sqs.main;

public class SasquatchFinder {
  public static void main(String[] args) {
    System.out.println("SasquatchFinder running....");
  }
}
  • Build the project.
$ mvn clean compile package
  • After building the project, execute the program from the command-line.
$ java -jar SQSTutorialConsumer-0.0.1-SNAPSHOT.jar 
SasquatchFinder running....

Now that we have the project’s basic outline, we can add code to receive messages.

Receive Message

  • As with the Station in the SQSTutorialProducer project, create member variables.
  • Create a constructor that initializes the SqsClient. Be certain to use the consumer’s credentials and not the producer’s.
  • Create a new method named processMessage and have it use a ReceiveMessageRequest to receive a message.
  • Create a new method named deleteMessage and have it use a DeleteMessageRequest to delete a message.
  • Modify processMessage to call deleteMessage after a delay.
  • Modify main to loop continuously processing messages.
package com.aws.tutorial.sqs.main;

import java.util.List;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.DeleteMessageRequest;
import software.amazon.awssdk.services.sqs.model.Message;
import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;

public class SasquatchFinder {

	private SqsClient sqsClient;
	private String queueUrl;
	public static int finderId = 1;

	public SasquatchFinder(String key, String secretKey, String queueUrl) {
		AwsBasicCredentials awsCreds = AwsBasicCredentials.create(key, secretKey);
		this.sqsClient = SqsClient.builder().credentialsProvider(StaticCredentialsProvider.create(awsCreds))
				.region(Region.US_EAST_1).build();
		this.queueUrl = queueUrl;
	}

	public void processMessage() {

		ReceiveMessageRequest receiveMessageRequest = ReceiveMessageRequest.builder().queueUrl(this.queueUrl)
				.maxNumberOfMessages(1).build();
		List<Message> messages = this.sqsClient.receiveMessage(receiveMessageRequest).messages();
		if(messages == null || messages.size() == 0) return;
		messages.stream().map(s -> s.body()).forEach(System.out::println);
		try {
			System.out.println("sleeping for 10 seconds...");
			Thread.sleep(10000);
		  this.deleteMessage(messages);
		} catch (InterruptedException e) {
			e.printStackTrace();
		}
	}

	public void deleteMessage(List<Message> messages) {
		String receiptHandle = messages.get(0).receiptHandle();
		DeleteMessageRequest deleteRequest = DeleteMessageRequest.builder().queueUrl(this.queueUrl)
				.receiptHandle(receiptHandle).build();
		this.sqsClient.deleteMessage(deleteRequest);
	}

	public static void main(String[] args) {
		if (args.length > 0) {
			SasquatchFinder.finderId = Integer.parseInt(args[0]);
		}
		System.out.println("SasquatchFinder " + SasquatchFinder.finderId + " running....");
		SasquatchFinder finder = new SasquatchFinder("AKIA22EODDUZAMDPWSX7", "805hbufO3Sn18eDsBDrOzCgB/eT5KVPM/AIkIpoZ",
				"https://sqs.us-east-1.amazonaws.com/743327341874/SasquatchImageQueue");
		try {
			while (true) {
				finder.processMessage();
			}
		} catch (Exception e) {
			e.printStackTrace();
		}
		System.out.println("SasquatchFinder " + SasquatchFinder.finderId + " stopped.");
	}
}
  • Compile and run the consumer and, if you ran the producer in the previous section, you should see the following output.
SasquatchFinder 1 running....
    {
      "stationid": 221,
      "date": "2019-03-12",
      "time": "091312",
      "image": "/9j/4AAQ <snip> kf/9k="
    }
sleeping for 10 seconds...
    {
      "stationid": 223,
      "date": "2017-12-22",
      "time": "091312",
      "image": "/9j/4AAQ <snip> kf/9k="
    }
sleeping for 10 seconds...
    {
      "stationid": 222,
      "date": "2016-02-09",
      "time": "091312",
      "image": "/9j/4AAQ <snip> kf/9k="
    }
sleeping for 10 seconds...
  • Navigate to the queue in the AWS Console and you should see no messages, as they were deleted after processing.

In this simple consumer we first create a client for interacting with the queue. We then obtain a single message from the queue. The program pauses to simulate processing and then deletes the message from the queue using its receiptHandle.

Because the program loops, it processes all three messages placed on the queue by the producer.

ReceiveMessageRequest

The ReceiveMessageRequest wraps the request to receive a message from an SQS queue. We use a builder to create the request, specifying the queue URL and the maximum number of messages to fetch. Here we requested a single message; however, you can request multiple messages if desired.

ReceiveMessageRequest receiveMessageRequest = ReceiveMessageRequest.builder().queueUrl(this.queueUrl)
				.maxNumberOfMessages(1).build();

DeleteMessageRequest

After processing a message, you should delete it from the queue. We do this by obtaining the receiptHandle of the received message, which is then used to delete it.

String receiptHandle = messages.get(0).receiptHandle();
DeleteMessageRequest deleteRequest = DeleteMessageRequest.builder()
    .queueUrl(this.queueUrl)
    .receiptHandle(receiptHandle)
    .build();

The program processes all messages on the queue. This is a simple consumer, but you could have multiple consumers consuming messages from the same queue.

Message Visibility

A message might be processed twice when using a standard queue. When a consumer picks up a message for processing, the message becomes invisible to other consumers for a configurable period. When we created the queue, we accepted the default visibility timeout of 30 seconds. If processing takes longer than the visibility timeout, the message becomes visible again and can be processed by another consumer. The following diagram illustrates.
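To make the timeout mechanics concrete without touching AWS, here is a toy in-memory model (invented for illustration; it is not the SQS API): receiving a message hides it until a deadline, and if it is not deleted before the deadline passes, it can be received again.

```java
import java.util.HashMap;
import java.util.Map;

public class VisibilityDemo {

    static class QueueModel {
        private final long visibilityTimeoutMillis;
        private final Map<String, String> messages = new HashMap<>();
        private final Map<String, Long> invisibleUntil = new HashMap<>();

        QueueModel(long visibilityTimeoutMillis) {
            this.visibilityTimeoutMillis = visibilityTimeoutMillis;
        }

        void send(String id, String body) {
            messages.put(id, body);
        }

        // Returns the body if the message exists and is visible at 'now', else null.
        // A successful receive hides the message for the visibility timeout.
        String receive(String id, long now) {
            if (!messages.containsKey(id)) {
                return null;
            }
            Long deadline = invisibleUntil.get(id);
            if (deadline != null && now < deadline) {
                return null; // still invisible to other consumers
            }
            invisibleUntil.put(id, now + visibilityTimeoutMillis);
            return messages.get(id);
        }
    }

    public static void main(String[] args) {
        QueueModel queue = new QueueModel(30_000); // 30-second visibility timeout
        queue.send("m1", "mymessage");
        System.out.println(queue.receive("m1", 0));      // first consumer: "mymessage"
        System.out.println(queue.receive("m1", 10_000)); // second consumer, 10s later: null
        System.out.println(queue.receive("m1", 35_000)); // 35s later, timeout expired: "mymessage"
    }
}
```

The three calls in main mirror the two-terminal experiment below: a slow first consumer, a second consumer that initially sees nothing, and the same message delivered again once the timeout lapses.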

There is a further wrinkle: what happens when the message is deleted from the queue a second time?

  • Open the SQS Console and send a single message to the queue.
  • Modify SasquatchFinder to sleep for 40 seconds between each message.
public void processMessage() {
  ReceiveMessageRequest receiveMessageRequest = ReceiveMessageRequest
    .builder().queueUrl(this.queueUrl).maxNumberOfMessages(1).build();

  List<Message> messages = this.sqsClient
    .receiveMessage(receiveMessageRequest).messages();

  if (messages == null || messages.size() == 0) {
    return;
  }
  messages.stream().map(s -> s.body()).forEach(System.out::println);
  try {
    System.out.println("sleeping for 40 seconds...");
    Thread.sleep(40000);
    this.deleteMessage(messages);
  } catch (InterruptedException e) {
      e.printStackTrace();
    }
}
  • After building the application, open two command-line windows and execute the program in the two different windows at the same time.

One running instance gets the message from the queue, and the message’s 30-second visibility timeout begins. The instance sleeps for 40 seconds to simulate processing.

$ java -jar SQSTutorialConsumer-0.0.1-SNAPSHOT.jar 2
SasquatchFinder 2 running....
mymessage
sleeping for 40 seconds...

The other instance finds no message on the queue, as the message is not visible.

$ java -jar SQSTutorialConsumer-0.0.1-SNAPSHOT.jar 1
SasquatchFinder 1 running....

However, after thirty seconds the message is visible again on the queue and it is picked up and processed by the other instance.

$ java -jar SQSTutorialConsumer-0.0.1-SNAPSHOT.jar 1
SasquatchFinder 1 running....
mymessage
sleeping for 40 seconds...

Meanwhile, the instance that first picked up the message finishes processing and deletes the message. Or rather, it attempts to delete the message: as the other process already received the message and a new receipt handle was issued, the message is not actually deleted.

$ java -jar SQSTutorialConsumer-0.0.1-SNAPSHOT.jar 2
SasquatchFinder 2 running....
mymessage
sleeping for 40 seconds...
Deleted message AQEB3/lhW8cY2cTsl2gd/GOsyPrt1J/SQn+ZR06ngf24aL5C8SqfUSPZfAl4uc2IwuZuLhZ/5BXYLWVU7AvmgSf0kb4zm0owKh01EXC4pGhvtNSsioLnk3nd4KiS5YEUO/EssCnRM1we7rXw0eLyd2LehOpPOZ49893lIJ6opy1vamQxxk6C+7iGcWbY0dMNTvrZqVaZw2JW/eZV5wI99rdUwRP16+RFj7XWsxEI5KJcExgnWY3jDRQv1mXqe5ZgWI9M7mqPH/rrx8afBdV2P53B7OK0uRm3vUGMzmW/xUgbsxsy5UB0+DZGLaccUAbegtC74LQ6BLZs64VlFxc8jAC2sp2gheLAZ849j4JkMrA8nWf+P+xKCjqdALeGrN754DcxnvhZv79R6sOGcp2lBtTOsA== by SasquatchFinder 2

As the message is still being processed by the second instance, the first does not see the message. The second instance then deletes the message.

$ java -jar SQSTutorialConsumer-0.0.1-SNAPSHOT.jar 1
SasquatchFinder 1 running....
mymessage
sleeping for 40 seconds...
Deleted message AQEBgZK7kq12asCcVVNbFQNlQmmgYTXXO8OzgoJzHpAnqdsBtMnaBxSBvjjgyVqO3nqYwuhFoxPWgXhUoUcgDzejHHSG6dM/VNG1Wdv3Q93THsJPj6BSQSH/sLjX7qvdFYT20Es0jdhN4dQTNMPyaA3sA7a2x025cUYLsegKfMlWVfCDThABbn+0evwgkn3hmzwLBvAWZEGIp0mooZvYf6WiLcblbqCnx+Gh5j5/XvmIpWuT9ux3DQSTYH+f+XdfUxclXP6exwAYyyFm7xHJnlF9LXcRcKmv2QitpQjgjK3yQBLrogU6dPf8Zp34K8iwMr1TBXEi5mZnfPSA7Cl3a4N2c+MxB+OupGIGGY6uoy2gFLSiaaunsij/weB0FFaYaE/MFhMsXdMMhNho2o/lrq6SOA== by SasquatchFinder 1

Notice that the two delete requests used different receipt handles for the same message. The queue has an internal mechanism to avoid errors when a message is deleted twice. However, it does not prevent a message from being processed multiple times. If we manipulated the processing time and/or the visibility timeout, we could have the message processed even more times.

To actually delete the underlying message, the most recently issued receipt handle must be provided. In our example above, the first delete attempt came after the second receipt handle had been issued, so the message was not deleted. The second delete attempt used the most recent receipt handle, and so the message was deleted.

You should design your system so it does not depend upon the number of times a message is processed. In other words, message processing should be idempotent. If you need strict once-and-only-once processing, use a FIFO queue.
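The idempotency advice can be sketched in plain Java. Below, a consumer records the IDs of messages it has already handled and skips duplicates; the in-memory set is an illustrative stand-in for a durable store such as a database table keyed by message ID.

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of an idempotent consumer: processing the same message twice has
// the same effect as processing it once. The in-memory set stands in for a
// durable store in a real system.
class IdempotentConsumer {
  private final Set<String> processedIds = new HashSet<>();
  private int sideEffects = 0; // counts real work performed

  // Returns true if the message was processed, false if it was a duplicate.
  boolean process(String messageId, String body) {
    if (!processedIds.add(messageId)) {
      return false; // already handled; skip the side effect
    }
    sideEffects++; // the "real" work, performed at most once per message ID
    return true;
  }

  int sideEffectCount() {
    return sideEffects;
  }
}
```

Delivered twice by a standard queue, a message with the same ID performs its side effect only once, so the duplicate delivery is harmless.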

Message Attributes & Dead Letter Queue

Let’s explore two topics important when working with AWS SQS queues: message attributes and dead letter queues. A message can have associated metadata; however, to receive that metadata, the ReceiveMessageRequest must be explicitly instructed to fetch it in addition to the message itself. A message might not be successfully processed. Rather than leaving the message on the queue to fail indefinitely, a dead letter queue can be configured to receive messages that fail processing a configurable number of times.
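As a rough sketch of the redrive behavior (the class below is an illustrative model, not SDK code): SQS tracks how many times each message has been received, and once that count exceeds the queue's configured maximum receive count, the message is moved to the dead letter queue.

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

// Illustrative model of SQS redrive: a message received more than
// maxReceiveCount times without being deleted is moved to the DLQ.
class RedriveModel {
  private final int maxReceiveCount;
  private final Map<String, Integer> receiveCounts = new HashMap<>();
  private final Queue<String> deadLetterQueue = new ArrayDeque<>();

  RedriveModel(int maxReceiveCount) {
    this.maxReceiveCount = maxReceiveCount;
  }

  // Called each time a consumer receives the message but fails to delete it.
  // Returns true if this receive pushed the message over the limit and it
  // was redriven to the dead letter queue.
  boolean recordFailedReceive(String messageId) {
    int count = receiveCounts.merge(messageId, 1, Integer::sum);
    if (count > maxReceiveCount) {
      deadLetterQueue.add(messageId);
      return true;
    }
    return false;
  }

  int dlqSize() {
    return deadLetterQueue.size();
  }
}
```

With a maximum receive count of three, the message survives three failed receives and lands on the dead letter queue on the fourth, which mirrors the behavior we will observe below.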

Dead Letter Queue

  • Create a new standard queue named DeadLetterQueue.
  • Select SasquatchImageQueue and from the Queue Actions dropdown select Configure Queue.
  • Modify SasquatchImageQueue to use DeadLetterQueue for its Dead Letter Queue.

Message Attributes

  • Select SasquatchImageQueue and send a new message.
  • When creating the message, add two message attributes.
  • Open the SQSTutorialConsumer project and modify the processMessage method in SasquatchFinder. Note that you comment out the call to delete the message.
public void processMessage() {
  ReceiveMessageRequest receiveMessageRequest =
    ReceiveMessageRequest.builder().queueUrl(this.queueUrl)
      .maxNumberOfMessages(1).messageAttributeNames("*").build();

  List<Message> messages =
    this.sqsClient.receiveMessage(receiveMessageRequest)
      .messages();

  if (messages == null || messages.size() == 0) {
    return;
  }

  messages.stream().map(s -> s.body()).forEach(System.out::println);

  for (Message message : messages) {
    System.out.println(message.messageId());
    Map<String, MessageAttributeValue> attributes = message
      .messageAttributes();

    Set<String> keys = attributes.keySet();

    for (String key : keys) {
      System.out.println(key + ":" + attributes.get(key).stringValue());
    }
  }
  try {
    System.out.println("sleeping for 10 seconds...");
    Thread.sleep(10000);
    //this.deleteMessage(messages);
  } catch (InterruptedException e) {
    e.printStackTrace();
  }
}
  • Compile and run the application. The message should process three times.
SasquatchFinder 1 running....
abc
e6ede972-9a6d-4c86-8c00-b16fe18977ff
attribute1:abc
attribute2:ded
sleeping for 10 seconds...
abc
e6ede972-9a6d-4c86-8c00-b16fe18977ff
attribute1:abc
attribute2:ded
sleeping for 10 seconds...
abc
e6ede972-9a6d-4c86-8c00-b16fe18977ff
attribute1:abc
attribute2:ded
sleeping for 10 seconds...
  • Return to the AWS Console and you should see that the message is placed on DeadLetterQueue.

To receive message attributes we were required to build the ReceiveMessageRequest with the explicit instruction to receive the message attributes by specifying messageAttributeNames. That method can take one or more attribute names, or a * to signify all attributes.

The message was sent to DeadLetterQueue, the queue configured as the SasquatchImageQueue dead letter queue.

If you wish to learn more about message attributes, here is a tutorial on Amazon’s website: Sending a Message with Attributes to an Amazon SQS Queue.

If you wish to learn more about dead-letter queues, here is a tutorial on Amazon’s website: Configuring an Amazon SQS Dead-Letter Queue.

maxNumberOfMessages

The ReceiveMessageRequest can receive more than one message at a time if more are available on a queue. Above we set the maximum number of messages to one. Let’s explore what happens when we change the setting to allow more messages.

  • Modify the SasquatchFinder class by creating a new method called deleteMessages.
  • Have the method iterate over all received messages.
public void deleteMessages(List<Message> messages) {
  for (Message message : messages) {
    String receiptHandle = message.receiptHandle();
    DeleteMessageRequest deleteRequest =
      DeleteMessageRequest.builder().queueUrl(this.queueUrl)
        .receiptHandle(receiptHandle).build();

    this.sqsClient.deleteMessage(deleteRequest);
    System.out.println("Deleted message " + receiptHandle
      + " by SasquatchFinder " + SasquatchFinder.finderId);
  }
}
  • Modify processMessage to call deleteMessages rather than deleteMessage.
public void processMessage() {
  ReceiveMessageRequest receiveMessageRequest = 
    ReceiveMessageRequest.builder().queueUrl(this.queueUrl)
      .maxNumberOfMessages(10).messageAttributeNames("*").build();

  List<Message> messages = this.sqsClient
    .receiveMessage(receiveMessageRequest).messages();
  if (messages == null || messages.size() == 0) {
    return;
  }
  messages.stream().map(s -> s.body()).forEach(System.out::println);
  for (Message message : messages) {
    System.out.println(message.messageId());
    Map<String, MessageAttributeValue> attributes = message
      .messageAttributes();

    Set<String> keys = attributes.keySet();
    for (String key : keys) {
      System.out.println(key + ":" + attributes.get(key).stringValue());
    }
  }
  try {
    System.out.println("sleeping for 10 seconds...");
    Thread.sleep(10000);
    this.deleteMessages(messages);
  } catch (InterruptedException e) {
    e.printStackTrace();
  }
}
  • Compile the application.
  • After compiling, navigate to the AWS SQS Console and add five messages to the queue, with the message body of a1, a2, a3, a4, and a5 respectively.
  • Run the application and you should see output similar to the following.
SasquatchFinder 1 running....
a4
98a42736-e4b5-4dfd-9428-3e32d2ea145d
sleeping for 10 seconds...
Deleted message AQEBqmAqpGs85ERM2Y8EnD4zjBPO1KxomlhJgQCPQ+JO3gjYhRcZbflS1gKJT1kas0JId7bX4X+OmFWQfC8r+gZGr02jwBcKlhvSUIv0tx13Q88EPpzMJDNbB9w9oKbgR+hc8c0nZQPPjJ2uHu7KeQfTmIdK/dt49cs/GHFRZeq3pIUWN2jJO8h0UdlpLeFKbB96WjPvakAnXDFd46meejQvBod0x18L1Y1dBt6cZc5+9AbB6eb4bJjV5dKvyDCtIUP2XFZ8iwtZF1lxntzqXxdMGYCjzaQ/oqQ5EmVJ/pFMTgWlUTks+qVFMu7a/sOCfQm7bFwE3AofXQROAK3B0crssZTbzoqQ9oJv+nj0kn596gidN+gygrISvF9vESIG1M5Ll+Lk2ADWQeO+2UA/AJax3A== by SasquatchFinder 1
a1
a5
c167bb7a-f356-4d5b-aa0f-ea90075cef50
f0d79263-05da-485e-bf6a-fa6b3f9fe92a
sleeping for 10 seconds...
Deleted message AQEBGwtlQPM080KnHDAOWUsZKUQ4PWfLP2g/AFn0sr9ERDOJFssjl7rNXl3mL6ryqoH9EgiPEGyGXwPm6n/FSsfbPA9OSMJYLq0Fho9qtpkcoI0mmAqRPQ/7h0J++zAmmf3bflcD9BqJS+hz4a/Di8Eo6GB0oWJUFZEFYcKWnIUGMNgnQfY3xs1DF9UuNZdsu7h3KN9hGGy3vSTuLvJJox7DDHSgY+QU3nisT5dTSfltKc9vJMQq2mPxB/f2EUmgwKQ82f10A6lPlSjVuiyNtGkKVau3BorKINz3dtG+xAHd5wWfALFExyip7zFZl6wVsnzfKox9QBaxRSrukIfx3+w5rIilq1QujPpNqLKItlxOvaXvDvxi/8lWv31S5UNlY7ooEOYSIkh1wnNwXKY7ZP4aQQ== by SasquatchFinder 1
Deleted message AQEBLIUJqmODdigrnQ88hzta9Zr+PaQnctLqmYrQT0iU5ZxvaLPy0PGNTe7eKwLHbBvc+WdDbLXK951WaPYWoY9dbMJZMyRNnjEj3doGoUkmBOm0LzTs1xDkV+QPb3fGH3s+mxh2TFhX3KFOwXrvf4uqkpx9mHdGioMWa86NSsCUUEQ3vXGUXprSdGsSqXUsoAug7v6wBU3QIPzeQm8pRLmjbZPdx+ndeV80FwnFkxDfNx/mtpAibum4ON4CxDUB66jLC7nVRe0XxXBllM2G/brS7jseqbz+Q61qbFjLNWKo96kTBIrYDjvZEmcSQdp37cYMf4rO/vsr+/XCNUtbtcD8h9Xk8Fc+atcIsuQSlrLbYMplVgN3EwogYlXJsB9GSOlVQVpO+gwOLBXonXJ6i3EAbQ== by SasquatchFinder 1
a2
a5
e65fbcc2-2c4a-42f6-8b61-ca97dad4826e
b2bc665c-4c1c-42c7-b3d2-c1d5bf048ee9
sleeping for 10 seconds...
Deleted message AQEB2FZyDGQEOUgLxR9wIxAiJbk++Ktec9RLon3nAZr7bPeQu2QJ8iVxRMNg92ZgvoPY5qsBndcRGEQjI5zKHQ/r62tg4+LMWwFLSDBhDF3d55w6OosgLf+K7AIBICGAeTJanTkhCzQlWYM+HCDFEve+NhPsr5+/zabaeZrkKwSBh8E2jTCmr29LmNR6ld9Bz0NSboj5gi+Gxa3dTu+xPGMLMjANVQ1Qa1BhoYEI0QP8kl9gL8aBpLhkeW1eWXgRaRtRcTAVpjxF73ZlUEFVNyYeE/Mwz9ZT2lWRftj6dv5p2PUG5Z6VtbbBw/9AXQElJUTgfHKGd4iGEjo4A3l6ff6g/NVJzm/LkGq6909txbTIk8PSp5istS4bM318W6VG2ten9jYSU7+pj8H809AHoW3VEw== by SasquatchFinder 1
Deleted message AQEBMdzd33/uz7jNQMnBJu1ne7GRh9g2xHx6X0cPWLsU0emEN0G5SGbr3nF/9QklDrrW42BX1HW6IDWxvhlI4/bOByZobYOfjmv5Cr8rDEJYnNKWxqxBZeQqjArKTy90WeEs0puUw4l6PouEZOv35daHO0h01A8Dpk/oMlVBi/OZFCIM4fetG2tUxwa7eU15WiEF4mklZqqJx2bVTbdiZqwhOucgqXlyXK3IJ5FtBFd6ACtEyX1tQmIBn6njmk/CBuX0v5+LzaxlntHy9Q+FpjuPLEyyE5wGqIk9B8Kcqv469pnaE3UJJaCK7DxgG70rF/7M1kYzaDRbRBYJB9jS3W9b8qZpj1JU4JM4euH9xBP4j59MvdwgIs4lSPvO1F3NtdCuNeOOMF15/n1WvU2U31jSeg== by SasquatchFinder 1

As the example illustrates, you can specify the maximum number of messages to receive, but not the exact number. This should seem reasonable, as the consumer does not know how many messages are in the queue before processing. As an aside, note that the messages were not processed in the same order they were sent.

First In First Out (FIFO) Queue

Let’s modify the project to use a FIFO queue and rerun the two consumer instances simultaneously. Note that neither the consumer nor the producer knows the queue’s type; they only know its URL.

  • Create a new queue named SasquatchImageQueue.fifo of type FIFO Queue.
  • Click Quick-Create Queue.
  • Create a new permission, but let’s be lazy and check the Everybody checkbox and the All SQS Actions checkbox. You would obviously not do this in production.
  • Modify both the consumer and producer to use this queue’s URL.
https://sqs.us-east-1.amazonaws.com/743327341874/SasquatchImageQueue.fifo
  • Modify the sendMessage method in the producer. Note the removal of the delaySeconds and the addition of the messageGroupId. (A FIFO queue also requires a message deduplication ID unless content-based deduplication is enabled on the queue.)
public String sendMessage(String message) {	
  SendMessageRequest request = SendMessageRequest.builder()
      .queueUrl(this.queueUrl).messageBody(message)
      .messageGroupId("mygroup").build();

  SendMessageResponse response = this.sqsClient.sendMessage(request);
  return response.messageId();
}
  • Compile and run the producer application after changing the queue URL; the three messages are sent to the queue.
  • Compile and run the consumer application; the three messages are processed in the same order they were sent.

Message Visibility

  • Modify SasquatchFinder processMessage to simulate processing by sleeping for 40 seconds.
public void processMessage() {
  ReceiveMessageRequest receiveMessageRequest =
    ReceiveMessageRequest.builder().queueUrl(this.queueUrl)
      .maxNumberOfMessages(1).build();

  List<Message> messages = this.sqsClient
    .receiveMessage(receiveMessageRequest).messages();

  if (messages == null || messages.size() == 0) {
    return;
  }

  messages.stream().map(s -> s.body()).forEach(System.out::println);
  try {
    System.out.println("sleeping for 40 seconds...");
    Thread.sleep(40000);
    this.deleteMessage(messages);
  } catch (InterruptedException e) {
    e.printStackTrace();
  }
}
  • Compile and run the application. Note you get an SqsException.
SasquatchFinder 2 running....
messageMine
sleeping for 40 seconds...
software.amazon.awssdk.services.sqs.model.SqsException: Value AQEBBJL+BlwyhRLnQGxaIKDkkrEv1sU6VnHzYM51Q0UFdx2lDyWvKoI/JYcs7MktVJ1Nmyr1mCVX/cpcqS9dMqq7Ual92VLEXDS9hEYM/qg1vdEGHB60OktMzpidyWBenQQyybzXofO+pAdKOYpC/wiEw8GBPsmFDCHpVn1hxHeLSNJyw10SwNv3DTXQXk4Pe+v3yGf23bf8sDk7Rx7ApqWYi8n8z9uijZAQBdwuFpUrZslivMWCzid6AFOXI/k83+/tKnSMyT0/Mx0rng0v1k4WliSgv5YJo5HyEZTt+cOBwfA= for parameter ReceiptHandle is invalid. Reason: The receipt handle has expired. (Service: Sqs, Status Code: 400, Request ID: 845b9538-4104-5428-aa2f-c05092244385)
	at software.amazon.awssdk.core.internal.http.pipeline.stages.HandleResponseStage.handl <snip> at com.aws.tutorial.sqs.main.SasquatchFinder.main(SasquatchFinder.java:58)
SasquatchFinder 2 stopped.

When using a FIFO queue, attempting to delete a message after the visibility timeout window has passed fails.

Conclusions

In this tutorial we created an Amazon SQS Queue. After creating the queue, we created a message producer and a message consumer using the AWS Java 2 SDK. We then explored several topics such as message attributes, dead-letter queues, and message visibility. We also created a FIFO queue.

Amazon’s SQS is an easy-to-use queue service that takes the infrastructure management hassle away from the organization. In this tutorial we only examined SQS basics. For more information, refer to both the Java 2 SDK Developer’s Guide and the SQS Developer’s Guide. Remember, the API changed from version 1 to version 2, so when in doubt, assume you need a builder for an object and that you must configure the object when building it. However, the API is consistent, and once you start working with it, translating version 1 code to version 2 is intuitive.

GitHub Project

The GitHub Project, SQSTutorial is available here.

Using the AWS DynamoDB Java API – Spring Boot Rest Application

Introduction

In this tutorial we use the Amazon Web Services Java 2 Application Programming Interface (API) to create a Rest application using Spring Boot that reads and writes to a DynamoDB database. This tutorial assumes AWS familiarity, Java programming experience, and Spring Boot experience. However, even without this experience, this tutorial should still prove useful, as it provides considerable supplementary resources for you to review. If you want to learn the AWS DynamoDB Java API then this tutorial is for you.

Here we create a simple database consisting of “observation stations” and “observations” gathered via a camera. Whatever…suspend disbelief and just go with it. Now suppose the stations require a means of uploading observations to an associated DynamoDB table. We decide upon a Rest API for stations to upload data, and we implement this API using a Spring Boot Rest application. Again, if this all sounds suspect, suspend disbelief and focus on the AWS code and not the application design.

In this tutorial we,

  • create two database tables using the DynamoDB console,
  • create a couple items using the console,
  • create an IAM programmatic user,
  • create a Spring Boot application that provides Rest endpoints so a client application can,
    • write an observation,
    • read an observation,
    • update an observation,
    • delete an observation,
    • batch write multiple observations,
    • conditionally query for a station’s observations,
    • and conditionally update observations,
  • and test the Rest endpoints using Postman.

This tutorial’s purpose is to explore DynamoDB, not to introduce Spring Boot, Rest, or JSON, and it assumes basic knowledge of all three. However, if new to any of these topics, links are provided to learn them before continuing.

NoSQL Databases

DynamoDB is a key-value and document NoSQL database. If unfamiliar with NoSQL document databases, you should familiarize yourself before continuing. The following is a good introductory video on NoSQL databases:

https://youtu.be/ujWV3-m1pLo

There are also many good written introductory articles covering NoSQL and DynamoDB.

Note that Amazon also offers DocumentDB, which we could use as an alternative to DynamoDB. However, DocumentDB will be covered in a different tutorial.

A DynamoDB database can be described as follows. Tables consist of items, and an item has one or more attributes. For each table you define a partition key and, optionally, a sort key. The partition key not only uniquely identifies an item, it determines how the item is distributed in storage. A sort key not only logically sorts items, it determines how they are stored. Obviously, there is more to NoSQL physical storage and how it achieves its scalability, but that is beyond this tutorial’s scope.
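The placement idea can be illustrated with a toy hash function. DynamoDB's actual hashing is internal to the service; this is only an analogy showing that an item's partition is a pure function of its partition key.

```java
// Toy illustration of hash-based partitioning: the partition an item lands
// on is computed from its partition key alone, so items with the same key
// always map to the same partition.
class PartitionDemo {
  static int partitionFor(String partitionKey, int partitionCount) {
    // floorMod keeps the result non-negative even for negative hash codes
    return Math.floorMod(partitionKey.hashCode(), partitionCount);
  }
}
```

Because placement is deterministic, the service can route a read or write for a given key straight to the node holding that partition.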

Amazon Web Services & DynamoDB

Amazon DynamoDB is a NoSQL key-value and document database offered as a cloud service. It is fully managed and allows users to avoid the administrative tasks associated with hosting an enterprise database. As with almost all Amazon’s offerings, it is accessible via a Rest API.

Amazon offers software development kits (SDKs) to simplify working with the Rest API. The languages offered are Java, C++, C#, Ruby, Python, JavaScript, Node.js, PHP, Objective-C, and Go. In this article we use the Java SDK. There are currently two versions of the API; in this tutorial we use the Java 2 SDK.

The Java 2 AWS SDK is a rewrite of the Java 1.x AWS SDK. It changes from the traditional programming paradigm of instantiating objects using constructors and then setting properties using setters to a fluent interface/builder programming style.

Fluent Interface

The fluent interface is a term coined by Martin Fowler and Eric Evans. It refers to a programming style where the public methods (the API) can be chained together to perform a task. The AWS Java SDK 2.0 uses this style in its builders: builder methods perform a task and then return the builder instance, which allows chaining methods together. For more information on fluent interfaces and builders, refer to this blog post: Another builder pattern for Java.
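As a minimal illustration, unrelated to any AWS class, here is a builder whose setters return the builder itself, enabling the chained style used throughout the SDK:

```java
// Minimal fluent builder: each setter returns the builder, allowing calls
// to be chained, ending with build() to produce the finished object.
class Station {
  private final long id;
  private final String name;

  private Station(Builder b) {
    this.id = b.id;
    this.name = b.name;
  }

  long id() { return id; }
  String name() { return name; }

  static Builder builder() { return new Builder(); }

  static class Builder {
    private long id;
    private String name;

    Builder id(long id) { this.id = id; return this; }
    Builder name(String name) { this.name = name; return this; }
    Station build() { return new Station(this); }
  }
}
```

Usage mirrors the SDK's style: `Station s = Station.builder().id(221).name("Potomac Falls").build();`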

DynamoDB Low-Level API

As with all AWS APIs, DynamoDB is exposed via Rest endpoints, and the AWS SDKs provide an abstraction layer freeing you from calling Rest directly. Above that layer, the Java SDK provides a class named DynamoDBMapper that allows working with DynamoDB similarly to the Java Persistence API (JPA). Although the mapper is useful, using the lower-level API is not that difficult. Moreover, there are many situations where you would not wish to create a dependency in your object model on DynamoDB.

For example, suppose we implemented a system that stored widgets in DynamoDB. If using the DynamoDBMapper, the Widget model class would be dependent upon DynamoDB via annotations mapping the class to the Widgets table.

Alternatively, if we do not wish to use the DynamoDBMapper we can implement something similar to the following diagram. It is a typical DAO pattern, where the only direct dependency upon the AWS SDK is the WidgetDaoImpl class. For more information on the DAO design pattern, refer to the following introductory article: DAO Design Pattern.

In this tutorial on the AWS DynamoDB Java API, we use the SDK’s direct calls to the underlying DynamoDB Rest endpoints. As an aside, note that we do not use the DAO design pattern, instead putting the data access logic directly in the controller class for brevity. We do, however, use the Spring MVC design pattern with Rest.

  • Note: If willing to make your application dependent upon DynamoDBMapper, you should consider using it, as it greatly simplifies working with DynamoDB.
  • Java: DynamoDBMapper documentation

Tutorial Use-Case – Station Observations

Imagine we have stations responsible for taking photo observations. A station has a name, one address, and one coordinate, and can have unlimited observations.

Although this tutorial does not discuss NoSQL database design, from the diagram below it seems reasonable that we need two tables, Station and Observation. Moreover, as the Observation table is very write intensive – stations send observations to the application on a continuous basis – it makes sense not to include observations as a collection within a Station item but to keep them in a separate table. Remember, these are JSON documents, not relational tables. Designing observations as a list of items within a Station would lead to an excessively large and unwieldy database.

If there were enough Stations, for even more efficiency we might create a separate table for each station’s observations. This would allow greater throughput for both writing and reading observations. But, in this tutorial we simply define a stationid to identify an observation’s station and will create an index on this value.

DynamoDB Console

The AWS Management Console provides an easy web-based way of working with Amazon’s cloud services. Although not covered in this tutorial, for those new to AWS, here is a short video by Amazon explaining the Management Console. Note that AWS also offers a command-line interface and Application Programming Interfaces (APIs) for accessing its cloud services.

  • AWS Essentials: How to Navigate the AWS Console by LinuxAcademy.

Before beginning the programming portion of this tutorial we must create the DynamoDB database.

Create Station Table

  • After entering the AWS Management Console, navigate to the DynamoDB console.
  • Click the Create table button.
  • Provide Station as the table’s name and id as the table’s primary key.

Creating Station Items

Remember, DynamoDB is schema-less; we do not define a table’s schema. Instead, we create a couple of items with the desired structure.

  • Click the Items tab and click the Create Item button.
  • Create an id and name attribute, assigning id as a Number datatype and name as a String. Assign the values 221 and “Potomac Falls” respectively.
  • Create an attribute named address and assign it the Map datatype.
  • Add a city, street, and zip attribute as String datatypes to the address map. In the example below, I assigned Potomac, 230 Falls Street, and 22333 as the attribute values.
  • Create coordinate as a Map and assign it a latitude and longitude attribute as Number datatypes. I assigned the values 38.993465 and -77.249247 as the latitude and longitude values.
  • Repeat for one more station.

We created two items in the Station table. Here are the two items as JSON.

{
  "address": {
    "city": "Potomac",
    "street": "230 Falls Street",
    "zip": "22333"
  },
  "coordinate": {
    "latitude": 38.993465,
    "longitude": -77.249247
  },
  "id": 221,
  "name": "Potomac Falls"
}
{
  "address": {
    "city": "Frederick",
    "street": "9871 River Street",
    "zip": "221704"
  },
  "coordinate": {
    "latitude": 39.23576,
    "longitude": -77.4545
  },
  "id": 234,
  "name": "Monocacy River"
}

You can view the JSON after creating an item by selecting the item and then selecting the text view in the popup.

Note that the preceding JSON document is generic JSON. The actual JSON as stored by DynamoDB (including datatypes) is as follows, where M, S, N, SS, etc. represent the element datatypes.

{
  "address": {
    "M": {
      "city": {
        "S": "Frederick"
      },
      "street": {
        "S": "9871 River Street"
      },
      "zip": {
        "S": "221704"
      }
    }
  },
  "coordinate": {
    "M": {
      "latitude": {
        "N": "39.23576"
      },
      "longitude": {
        "N": "-77.4545"
      }
    }
  },
  "id": {
    "N": "234"
  },
  "name": {
    "S": "Monocacy River"
  }
}

The DynamoDB datatypes are:

  • String = S,
  • Numbers = N,
  • Boolean = BOOL,
  • Binary = B,
  • Date = S (stored as a string),
  • String Set = SS,
  • Number Set = NS,
  • Binary Set = BS,
  • Map = M,
  • and List = L.
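The datatype mapping above can be sketched with a small, hypothetical helper (not part of the SDK) that returns the DynamoDB type descriptor for a Java value:

```java
// Hypothetical helper mapping Java values to DynamoDB type descriptors.
// Date is omitted because DynamoDB stores dates as strings ("S").
class DynamoTypeDemo {
  static String typeDescriptor(Object value) {
    if (value instanceof String)         return "S";
    if (value instanceof Number)         return "N";
    if (value instanceof Boolean)        return "BOOL";
    if (value instanceof byte[])         return "B";
    if (value instanceof java.util.Map)  return "M";
    if (value instanceof java.util.List) return "L";
    throw new IllegalArgumentException("unsupported type");
  }
}
```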

For example, in the preceding JSON document a station’s address and coordinate are both Map datatypes; the city, street, and zip are String datatypes; and the latitude and longitude are Number datatypes.

You can toggle between JSON and DynamoDB JSON in the popup window, as the following illustrates (note the DynamoDB JSON checkbox).

Create Observation Table

After creating the Station table we need to create the Observation table.

  • Create a new table named Observation.
  • Assign it a partition key of id and a sort key of stationid.

Composite Key (Partition Key & Sort Key)

The partition key is a table’s primary key and consists of a single attribute. DynamoDB uses this key to create a hash that determines the item’s storage. When used alone, the partition key uniquely identifies an item, as no two items can have the same partition key. However, when also defining a sort key, one or more items can have the same partition key, provided the partition key combined with the sort key is unique. Think of it as a compound key.

The sort key helps DynamoDB store items more effectively, as it stores items with the same partition key together, ordered by the sort key (hence the name sort key).

An observation should have an id that identifies it, and observations should be sorted by station, so we defined stationid as the table’s sort key.
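The composite key behavior can be modeled in plain Java: the (partition key, sort key) pair must be unique, and items sharing a partition key are kept ordered by sort key. A TreeMap per partition key is an illustrative stand-in for DynamoDB's storage, not how the service is implemented.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

// Plain-Java model of a composite key: items are grouped by partition key
// and kept sorted by sort key within each group. Writing the same
// (partition key, sort key) pair overwrites the existing item.
class CompositeKeyDemo {
  // partition key -> (sort key -> item)
  private final Map<Long, TreeMap<String, String>> table = new HashMap<>();

  void put(long partitionKey, String sortKey, String item) {
    table.computeIfAbsent(partitionKey, k -> new TreeMap<>())
         .put(sortKey, item);
  }

  int itemCount(long partitionKey) {
    TreeMap<String, String> items = table.get(partitionKey);
    return items == null ? 0 : items.size();
  }

  // Lowest sort key within a partition; assumes the partition exists.
  String firstSortKey(long partitionKey) {
    return table.get(partitionKey).firstKey();
  }
}
```

Two items may share a partition key as long as their sort keys differ, which is exactly the compound-key behavior described above.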

Create Sample Observations

As with the Station table, create some Observation items rather than define a schema.

  • Find three images, of small size, to use for this project. If you wish, use the three sample images from this tutorial’s Git project.
  • Navigate to Code Beautify’s Convert Your Image to Base64 webpage and convert the three images to a Base64 string.
  • Create a JSON list of observations.
  • Or, if you wish, simply use the JSON sampleData.json file provided in this tutorial’s Git project.

The following is a JSON list of four observations. The image base64 string is truncated so it can be easily displayed here. You can obtain the original file, named observations.json, from this tutorial’s Git project.

{
  "observations": [
    {
      "stationid": 221,
      "date": "1992-03-12",
      "time": "091312",
      "image": "/9j/4AAQSkZJRgABAQAAYABg <snip> kf/9k="
    },
    {
      "stationid": 221,
      "date": "1999-09-22",
      "time": "071244",
      "image": "/9j/4AAQSkZJ <snip> D9KhoA//2Q=="
    },
    {
      "stationid": 234,
      "date": "2001-11-09",
      "time": "111322",
      "image": "/9j/4AAQSkZ <snip> WoGf/9k="
    },
    {
      "stationid": 234,
      "date": "2013-01-12",
      "time": "081232",
      "image": "/9j/4AAQS <snip> q5//2Q=="
    }
  ]
}

Base64 Encoding

Images are binary. However, any binary data can be represented as a string provided it is encoded and decoded correctly. Base64 is an encoding scheme that converts binary data to a string. It is useful because it allows embedding binary data, such as an image, in a textual file, such as a webpage or JSON document. DynamoDB uses Base64 to encode binary data as strings when transporting data and decodes those strings back to binary when storing the data. Therefore, the image sent to the Rest endpoints we create should be Base64 encoded.
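Java's standard library performs this encoding with java.util.Base64; a minimal round-trip looks like this:

```java
import java.util.Base64;

// Round-trip binary data through Base64, as done when embedding an image
// in a JSON document sent to a Rest endpoint.
class Base64Demo {
  static String encode(byte[] binary) {
    return Base64.getEncoder().encodeToString(binary);
  }

  static byte[] decode(String text) {
    return Base64.getDecoder().decode(text);
  }
}
```

For an image file, the bytes passed to `encode` would come from something like `Files.readAllBytes(path)`, and the resulting string is what goes into the JSON document's image field.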

Create IAM Application User

Before beginning the Spring Boot application, we need a user with programmatic access to the AWS DynamoDB API. If you are unfamiliar with IAM, the following introductory video should prove helpful. Otherwise, let’s create a user.

  • Navigate to the IAM Console and click Add user.
  • Create a new user named DynamoDBUser.
  • Assign DynamoDBUser with Programmatic access.
  • Create a new group named dynamo_users with AmazonDynamoDBFullAccess.
  • Assign DynamoDBUser to the dynamo_users group.
  • If you created the user correctly, you should see the following Summary screen.
  • Save the credentials file, credentials.csv, to your local hard-drive.

Spring Boot Application

Now that we have created the two needed tables and created a user we can begin the sample application. We create a Rest API for stations to save, retrieve, update, and delete observations. Not much explanation is devoted to Spring Boot, so if you have never created a Spring Boot Rest application you might consider completing a tutorial or two on Spring Boot and Rest. The following are links to two tutorials; however, there are many more on the web.

Setup Project

  • Create a new project. I used Eclipse and created a new Maven application.
  • Modify the Maven POM file to match the following POM.
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.tutorial.aws</groupId>
  <artifactId>DynamoDbTutorial</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <parent>
    <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-parent</artifactId>
      <version>2.0.5.RELEASE</version>
  </parent>
  <properties>
    <java.version>1.8</java.version>
  </properties>
  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>software.amazon.awssdk</groupId>
        <artifactId>bom</artifactId>
        <version>2.5.25</version>
        <type>pom</type>
        <scope>import</scope>
      </dependency>
    </dependencies>
  </dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-test</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>com.jayway.jsonpath</groupId>
      <artifactId>json-path</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>software.amazon.awssdk</groupId>
      <artifactId>auth</artifactId>
    </dependency>
    <dependency>
      <groupId>software.amazon.awssdk</groupId>
      <artifactId>aws-core</artifactId>
    </dependency>
    <dependency>
      <artifactId>dynamodb</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <version>3.1.1</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>jar</goal>
            </goals>
            <configuration>
              <classifier>client</classifier>
              <includes>
                <include>**/factory/*</include>
              </includes>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

In the POM we define the AWS Bill of Materials (BOM) and the required AWS libraries. Note that when using a BOM it is unnecessary to specify the library versions, as the BOM manages versions. We also define the Spring Boot libraries required.

  • Create an application.properties file in the resources folder. Open credentials.csv and copy the credentials into application.properties using the following property names.
  • NOTE: THIS USER WAS DELETED BEFORE PUBLISHING THIS TUTORIAL.
cloud.aws.credentials.accessKey=AK <snip> WP
cloud.aws.credentials.secretKey=yLJJ <snip> asUR
cloud.aws.region.static=us-east-1

Create Spring Boot Application Class

  • Create a new class named SiteMonitorApplication in the com.tutorial.aws.dynamodb.application package.
  • Annotate the class with @SpringBootApplication annotation.
  • Create the main method and have it launch the Spring Boot application.
package com.tutorial.aws.dynamodb.application;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.ComponentScan;

@SpringBootApplication
@ComponentScan({ "com.tutorial.aws.dynamodb" })
public class SiteMonitorApplication {

    public static void main(String[] args) {
        SpringApplication.run(SiteMonitorApplication.class, args);
    }
}

Create Observation Data Object

  • Create a class named Observation in the com.tutorial.aws.dynamodb.model package.
  • Create variables with the same names and types as in the JSON data created above.
package com.tutorial.aws.dynamodb.model;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class Observation {
	private long stationid;
	private String date;
	private String time;
	private String image;
        private List<String> tags;
	
	public long getStationid() {
		return stationid;
	}

	public void setStationid(long stationid) {
		this.stationid = stationid;
	}

	public String getDate() {
		return date;
	}

	public void setDate(String date) {
		this.date = date;
	}


	public String getTime() {
		return time;
	}


	public void setTime(String time) {
		this.time = time;
	}

	public String getImage() {
		return image;
	}

	public void setImage(String image) {
		this.image = image;
	}

	public void setTags(List<String> tags) {
		this.tags = tags;
	}
	
	public List<String> getTags() {
		return this.tags;
	}
	
	@Override
	public String toString() {
		try {
			ObjectMapper mapper = new ObjectMapper();
			return mapper.writeValueAsString(this);
		} catch (JsonProcessingException e) {
			e.printStackTrace();
			return null;
		}
	}
}

The Observation object’s attributes are the same as in the JSON Observation document. Notice that in the toString method we use an ObjectMapper from the Jackson library. We did not add Jackson to our POM because the spring-boot-starter-web dependency already includes it.

The ObjectMapper maps JSON to Objects and Objects to JSON. It is how Spring Rest accomplishes this task. In the toString method we are telling the ObjectMapper instance to write the Observation object as a JSON string. For more on the ObjectMapper, here is a tutorial that explains the class in more depth: Jackson ObjectMapper.

Create Rest Controller

The Rest Controller provides the external API visible to Stations to send data to our application. Through the API, client applications transmit data to the DynamoDB database. Each station can develop its own client application in any language that supports Rest. The only requirement is that the station’s data follows the expected JSON format.

  • Note: we are violating the MVC design pattern by putting data access directly in the Controller. Suspend disbelief and ignore this anti-pattern.

Let’s create a Rest Controller to define our application’s API.

  • Create a class named ObservationApiController in the com.tutorial.aws.dynamodb.api package and annotate it with the @RestController annotation.
  • Assign it a top-level path of /observations.
  • Create a Rest endpoint for uploading a new Observation. Assign it the /observation mapping and name the method createObservation.
  • Have the method take an Observation as the request’s body.
  • Have the method print the uploaded Observation to the command-line.
package com.tutorial.aws.dynamodb.api;

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import com.tutorial.aws.dynamodb.model.Observation;

@RestController
@RequestMapping(value = "/observations")
public class ObservationApiController {

	@PostMapping("/observation")
	public void createObservation(@RequestBody Observation 
          observation) {
	  System.out.println(observation.toString());
	}	
}
  • Compile the application using Maven and start the application.
  • After the application starts, we can test using Postman.

Test using Postman

Postman is a useful tool for testing JSON endpoints. If you have never used Postman, you might consider completing a couple of tutorials on it first.

  • Create a new request named AddObservation that exercises the Rest endpoint.
http://localhost:8080/observations/observation
  • Place one of the observations from the previously created JSON document in the request’s Body. Assign the type as JSON (application/json).
JSON Request in Postman for saving Observation.
  • Click Send to send the request to the Spring Rest endpoint. If everything is correct, you should see the Observation as JSON printed to the command-line.
{"stationid":221,"date":"1992-03-12", "time":"091312","image":"/9j/4AAQSkZJRgAB <snip> Wxkf/9k=","tags":null}
  • Copy the image base64 string and navigate to the CodeBeautify website’s Convert Your Base64 to Image webpage. Paste the string in the provided textarea and click Generate Image. If the base64 string was sent correctly, you should see the same image you sent to the Rest endpoint.
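The image attribute travels as a Base64 string, which is why an online decoder can turn it back into a picture. For reference, the same encoding and decoding can be done with the Java standard library alone; the byte values below are a stand-in, not the tutorial’s actual image:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64ImageDemo {

    // Encode raw image bytes into the Base64 text carried in the JSON "image" field.
    public static String encode(byte[] imageBytes) {
        return Base64.getEncoder().encodeToString(imageBytes);
    }

    // Decode the Base64 text back into the original bytes.
    public static byte[] decode(String base64) {
        return Base64.getDecoder().decode(base64);
    }

    public static void main(String[] args) {
        byte[] fake = "not-really-a-jpeg".getBytes(StandardCharsets.UTF_8);
        String encoded = encode(fake);
        System.out.println(encoded);
    }
}
```

In a real client, the bytes would come from reading the image file before building the Observation JSON.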

Create DynamoDB Client

Now that we have the basic Spring Boot application in place, we can start building the actual API to DynamoDB. But before working with DynamoDB, we need to create a DynamoDBClient instance.

  • Create a class named ObservationService in the com.tutorial.aws.dynamodb.service package.
  • Add the Spring @Service annotation so Spring recognizes this class as a service.
  • Add the key and secretKey parameters and use the @Value annotation to indicate they are parameters from the application’s application.properties file (Spring Framework documentation).
  • Create @PostConstruct and @PreDestroy methods (or implement a Spring InitializingBean).
  • Create a member variable entitled dynamoDbClient of type DynamoDbClient.
  • Instantiate and load the credentials for dynamoDbClient in the initialize method.
  • Close the dynamoDbClient in the preDestroy method.
package com.tutorial.aws.dynamodb.service;

import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import com.tutorial.aws.dynamodb.model.Observation;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;


@Service
public class ObservationService {

  @Value("${cloud.aws.credentials.accessKey}")
  private String key;

  @Value("${cloud.aws.credentials.secretKey}")
  private String secretKey;

  private DynamoDbClient dynamoDbClient;

  @PostConstruct
  public void initialize() {
    AwsBasicCredentials awsCreds = AwsBasicCredentials.create(key, secretKey);
    DynamoDbClient client = DynamoDbClient.builder()
        .credentialsProvider(StaticCredentialsProvider.create(awsCreds))
        .region(Region.US_EAST_1).build();

    this.dynamoDbClient = client;
  }

  @PreDestroy
  public void preDestroy() {
    this.dynamoDbClient.close();
  }
}

DynamoDbClient

The DynamoDbClient provides access to the DynamoDB API. All interaction with DynamoDB is done through this class. It has methods for reading, writing, updating, and otherwise interacting with DynamoDB tables and items. For more information, refer to the API documentation.

Write Observation

Let’s first write an Observation to DynamoDB. Alternatively, you could say we PUT an item to DynamoDB, as under the covers we are making an HTTP PUT request. We do this using the DynamoDbClient putItem method combined with a PutItemRequest.

Modify Service Class

  • Create a method named writeObservation that takes an Observation as a parameter.
  • Create a HashMap that uses String as the key and AttributeValue as the value.
  • Put each of the Observation variables into the HashMap, being sure the keys are correctly named. The keys should have exactly the same names as in the JSON.
  • When creating the AttributeValueBuilder for each variable, ensure the correct datatype method is used.
  • Build a new PutItemRequest and then have dynamoDbClient call its putItem method to write the observation to the Observation DynamoDB table.
package com.tutorial.aws.dynamodb.service;

import java.util.HashMap;
import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import com.tutorial.aws.dynamodb.model.Observation;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
import software.amazon.awssdk.services.dynamodb.model.PutItemRequest;

<snip>

public void writeObservation(Observation observation) {
   HashMap<String, AttributeValue> observationMap = new HashMap<String,
       AttributeValue>();

    observationMap.put("id", AttributeValue.builder()
      .s(observation.getStationid() + observation.getDate() + 
      observation.getTime()).build());

    observationMap.put("stationid", AttributeValue.builder()
      .n(Long.toString(observation.getStationid())).build());

    observationMap.put("date", AttributeValue.builder()
      .s(observation.getDate()).build());

    observationMap.put("time", AttributeValue.builder()
      .s(observation.getTime()).build());

    observationMap.put("image", AttributeValue.builder()
      .b(SdkBytes.fromUtf8String(observation.getImage())).build());

    if (observation.getTags() != null) {
      observationMap.put("tags", AttributeValue.builder()
        .ss(observation.getTags()).build());
    }

    PutItemRequest request = PutItemRequest.builder()
      .tableName("Observation").item(observationMap).build();

    this.dynamoDbClient.putItem(request);
  }
}

AttributeValue

There are four different AttributeValue classes in the DynamoDB Java API. Here we use the one in the software.amazon.awssdk.services.dynamodb.model package (api documentation). Remember, tables store items. An item is composed of one or more attributes, and an AttributeValue holds the value for an attribute. AttributeValue has a builder (api documentation) used to build an AttributeValue instance. An attribute value can be a string, number, binary data, list, or collection. You use the method corresponding to the datatype to set the AttributeValue object’s value. For instance, for a String use s(String value), for binary data use b(SdkBytes b), and for a collection of strings use ss(Collection<String> ss). For a complete list, refer to the API documentation.

AttributeValue instances are placed in a Map, where the key is the attribute’s name in the database table. The Observation’s attributes are mapped using the appropriate builder methods.

  • Observation.id is a String, so we use,
.s(observation.getStationid() + observation.getDate() + 
       observation.getTime()).build()
  • The image, although encoded, is binary, and so we use,
observationMap.put("image", AttributeValue.builder()
      .b(SdkBytes.fromUtf8String(observation.getImage())).build());
  • The tags are an optional list of strings, so we wrap it in a conditional and use,
if (observation.getTags() != null) {
  observationMap.put("tags", AttributeValue.builder()
     .ss(observation.getTags()).build());
}
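The id attribute above is nothing more than the concatenation of stationid, date, and time, which is unique per station per capture instant. Stripped of the AWS types, the construction looks like this (plain Java sketch, not SDK code):

```java
public class ObservationIdDemo {

    // Build the composite partition key used in the tutorial:
    // stationid + date + time, e.g. 221 + "1992-03-12" + "091312".
    // Java's + operator converts the long to a string automatically.
    public static String compositeId(long stationId, String date, String time) {
        return stationId + date + time;
    }

    public static void main(String[] args) {
        System.out.println(compositeId(221L, "1992-03-12", "091312"));
        // prints 2211992-03-12091312
    }
}
```

This is the same id value ("2211992-03-12091312") used later in the Get, Delete, and Update requests.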

PutItemRequest

The PutItemRequest wraps the JSON request sent to the DynamoDBClient putItem method. A PutItemRequestBuilder builds a PutItemRequest. Above, we first added the table name, followed by the item to put. The item is a key-value map of the observation’s attributes. After building the PutItemRequest instance, the DynamoDBClient instance uses the request to write the observation to the DynamoDB Observation table.

PutItemRequest request = PutItemRequest.builder().tableName("Observation")
   .item(observationMap).build();
this.dynamoDbClient.putItem(request);

For more information, refer to the API documentation.

Create Rest Endpoint

  • Auto-wire the ObservationService to ObservationApiController using the @Autowired annotation.
  • Modify the createObservation method to call the service’s writeObservation method.
package com.tutorial.aws.dynamodb.api;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import com.tutorial.aws.dynamodb.model.Observation;
import com.tutorial.aws.dynamodb.service.ObservationService;

@RestController
@RequestMapping(value = "/observations")
public class ObservationApiController {
  @Autowired
  ObservationService observationService;
	
  @PostMapping("/observation")
  public void createObservation(@RequestBody Observation observation) {
    this.observationService.writeObservation(observation);
  }	
}
  • Build the application and start the Spring Boot application.

Test with Postman

  • Return to the request created previously in Postman and click the Send button.
  • Go to the table in the AWS Console and click Items.
  • If everything was done correctly, you should see two items in the Observation table.
{
  "date": {
    "S": "1999-09-22"
  },
  "id": {
    "S": "2211999-09-22071244"
  },
  "image": {
    "B": "LzlqLzRBQ <snip> JRPT0="
  },
  "stationid": {
    "N": "221"
  },
  "tags": {
    "SS": [
      "observation",
      "river",
      "sample"
    ]
  },
  "time": {
    "S": "071244"
  }
}

Read Observation

Now that we have written an Observation to DynamoDB, let’s create a Rest endpoint that fetches an Observation.

Modify Service Class

  • Modify ObservationService to have a getObservation method that takes an observation’s id as a parameter.
  • Create a HashMap to store the AttributeValue collection. These are used to query the database.
  • Add the observationId to the HashMap.
  • Build a GetItemRequest, assigning the HashMap as the key.
  • Get the item from the response and use its values to create a new Observation.
  • As tags is a list of items, you must loop through them to add them to the Observation object instance.
public Observation getObservation(String observationId) {
  HashMap<String,AttributeValue> key = new 
    HashMap<String,AttributeValue>();

  key.put("id", AttributeValue.builder().s(observationId).build());
  GetItemRequest request = GetItemRequest.builder()
    .tableName("Observation").key(key).build();
  
  Map<String,AttributeValue> responseItem = 
  this.dynamoDbClient.getItem(request).item();

  Observation observation = new Observation();
  observation.setDate(responseItem.get("date").s());
  observation.setTime(responseItem.get("time").s());
  observation.setImage(responseItem.get("image").b().asUtf8String());
  observation.setStationid(Long.parseLong(responseItem
    .get("stationid").n()));
		
  if(responseItem.get("tags") != null && responseItem.get("tags")
    .ss().size() > 0) {

    List<String> vals = new ArrayList<>(responseItem.get("tags").ss());
    observation.setTags(vals);
  }
  return observation;
}

GetItemRequest

The GetItemRequest wraps a JSON Get request to DynamoDB. To fetch a particular Observation we must provide the id to the Get request. The key is a Map of AttributeValue items. In this case we added only one attribute, the id.

GetItemRequest request = GetItemRequest.builder()
    .tableName("Observation").key(key).build();

Create Rest Endpoint

  • Add a method named getObservation that takes the observation’s id as a path variable.
  • Call the ObservationService getObservation method and return the result.
@GetMapping("/observation/{observationid}")
public Observation getObservation(@PathVariable("observationid") String 
  observationId) {

  return this.observationService.getObservation(observationId);

}

Test With Postman

  • Create a new request in Postman with the following URL.
  • Add one of the ids from the observations added earlier.
http://localhost:8080/observations/observation/2211992-03-12091312
  • After creating the request, click Send and the following should appear as the response (if you used 2211992-03-12091312).
{
    "stationid": 221,
    "date": "1992-03-12",
    "time": "091312",
    "image": "/9j/4AA <snip> kf/9k=",
    "tags": [
        "rapids",
        "rocks",
        "observation",
        "cold"
    ]
}

Delete Observation

So far we have added and fetched an Observation to DynamoDB. Now let’s delete an Observation.

Modify Service Class

  • Add a deleteObservation method that takes an observation’s id as a parameter.
  • Create a HashMap to hold the attributes.
  • Build a new DeleteItemRequest and use the HashMap as the key.
  • Use the dynamoDbClient to delete the observation.
public void deleteObservation(String observationId) {
  HashMap<String,AttributeValue> key = new HashMap<>();
  key.put("id", AttributeValue.builder().s(observationId).build());
  DeleteItemRequest deleteRequest = DeleteItemRequest.builder()
    .key(key).tableName("Observation").build();
  this.dynamoDbClient.deleteItem(deleteRequest);
}

DeleteItemRequest

The DeleteItemRequest wraps a JSON Delete HTTP request. As with all requests, we use a builder, supplying the table name and the key of the Observation to delete.

Create Rest Endpoint

  • Create a new Rest endpoint to delete observations.
  • Have the observation’s id passed to the endpoint as a path variable, with /delete added after the variable.
  • Call the ObservationService deleteObservation method.
@DeleteMapping("/observation/{observationid}/delete")
public void deleteObservation(@PathVariable("observationid") String 
  observationId) {
  this.observationService.deleteObservation(observationId);
}

Test with Postman

  • Create a new Request using Postman.
  • Select DELETE from the dropdown to indicate it is an HTTP DELETE request.
http://localhost:8080/observations/observation/2211992-03-12091312/delete
  • Click Send and the record should be deleted. Navigate to the Items in the AWS Console to ensure the Observation was deleted.

Update Observation

An Observation can have one or more tags. This is something that seems likely to be added at a later date and/or modified. Let’s create an endpoint that allows adding/modifying an observation’s tags.

Update Service Class

  • Create a method named updateObservationTags that takes a list of tags and an observation id as parameters.
  • Create a HashMap to hold AttributeValue objects.
  • Use the AttributeValue builder to add the tags to the HashMap with :tagval as the key.
  • Create a second HashMap to hold the observation’s id.
  • Build an UpdateItemRequest that uses an update expression.
public void updateObservationTags(List<String> tags, String observationId) {
  HashMap<String, AttributeValue> tagMap = new HashMap<String, 
    AttributeValue>();

  tagMap.put(":tagval", AttributeValue.builder().ss(tags).build());
  HashMap<String, AttributeValue> key = new HashMap<>();
  key.put("id", AttributeValue.builder().s(observationId).build());

  UpdateItemRequest request = UpdateItemRequest.builder()
    .tableName("Observation").key(key)
    .updateExpression("SET tags = :tagval")
    .expressionAttributeValues(tagMap).build();

  this.dynamoDbClient.updateItem(request);
}

UpdateItemRequest

The DynamoDBClient instance uses the UpdateItemRequest to build the request to update the item. As with fetching and deleting, it needs a key to properly select the correct item. But it also needs the values to update. You provide an update expression and then provide the attributes. Note that the key for the attribute, :tagval, matches the expression. The request then uses the key and the update expression to update the item.
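Conceptually, DynamoDB pairs each expression attribute value with its placeholder in the update expression. The following is a rough stdlib illustration of that substitution, for intuition only; it is not AWS code, and DynamoDB performs the real substitution server-side:

```java
import java.util.HashMap;
import java.util.Map;

public class UpdateExpressionDemo {

    // Replace each ":placeholder" in the expression with its mapped value,
    // mimicking how expressionAttributeValues pair with the update expression.
    public static String substitute(String expression, Map<String, String> values) {
        String result = expression;
        for (Map.Entry<String, String> entry : values.entrySet()) {
            result = result.replace(entry.getKey(), entry.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, String> values = new HashMap<>();
        values.put(":tagval", "[observation, rocks]");
        System.out.println(substitute("SET tags = :tagval", values));
        // prints SET tags = [observation, rocks]
    }
}
```

The key point is that the placeholder name in the map (:tagval) must exactly match the placeholder in the expression string.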

Add Rest Endpoint

  • Add an endpoint that takes the observation id as a path variable and a JSON array of tags as the request body.
  • Call the ObservationService updateObservationTags method.
@PostMapping("/observation/{observationid}/updatetags")
public void updateObservationTags(@PathVariable("observationid") String 
  observationId, @RequestBody List<String> tags) {
  
  this.observationService.updateObservationTags(tags, observationId);

}

Test With Postman

  • Create a new Http Post Request with the following URL.
  • Add the id value as a path parameter directly in the URL.
http://localhost:8080/observations/observation/2211992-03-12091312/updatetags
  • Add the following values as the Body and assign the type as JSON (application/json).
["observation","rocks","rapids","cold"]
  • Click Send and then navigate to the AWS Console to view observations. The Observation should have the tags added.

Batch Write Observations

Sometimes multiple items must be written to a database at once. DynamoDB supports this through batch operations.

Update Service Class

public void batchWriteObservations(List<Observation> observations) {
  ArrayList<WriteRequest> requests = new ArrayList<>();
  for(Observation observation : observations) {
    // Create a fresh map per observation; reusing one map would make
    // every WriteRequest reference the same (last) item.
    HashMap<String, AttributeValue> observationMap = new HashMap<>();
    observationMap.put("id", AttributeValue.builder()
      .s(observation.getStationid() + observation.getDate() + 
      observation.getTime()).build());

    observationMap.put("stationid", AttributeValue.builder()
      .n(Long.toString(observation.getStationid())).build());

    observationMap.put("date", AttributeValue.builder()
      .s(observation.getDate()).build());

    observationMap.put("time", AttributeValue.builder()
      .s(observation.getTime()).build());

    observationMap.put("image", AttributeValue.builder()
      .b(SdkBytes.fromUtf8String(observation.getImage())).build());

    if (observation.getTags() != null) {
      observationMap.put("tags", AttributeValue.builder()
        .ss(observation.getTags()).build());
    }

    WriteRequest writeRequest = WriteRequest.builder()
      .putRequest(PutRequest.builder().item(observationMap)
      .build()).build();

    requests.add(writeRequest);
  }

  HashMap<String,List<WriteRequest>> batchRequests = new HashMap<>();
  batchRequests.put("Observation", requests);
  BatchWriteItemRequest request = BatchWriteItemRequest.builder()
    .requestItems(batchRequests).build();

  this.dynamoDbClient.batchWriteItem(request);
}

The DynamoDbClient batchWriteItem method takes a BatchWriteItemRequest as a parameter. A BatchWriteItem request can write or delete up to 25 items at once and is limited to 16 MB of data. Note that DynamoDB still performs each write as an individual operation; however, it performs those operations in parallel.

You create a List to hold the WriteRequest for each Observation. Each Observation is written to a Map as key-value pairs. The map is added to a WriteRequest, which is then added to the list until all observations are prepared as WriteRequest instances.

 WriteRequest writeRequest = WriteRequest.builder()
      .putRequest(PutRequest.builder().item(observationMap)
      .build()).build();

Each list of WriteRequest instances is added to another map. The table name is the key and the list is the values. In this way a single batch write could write to multiple tables. After creating the map of the lists of WriteRequest instances, the whole thing is used to create a BatchWriteItemRequest which is used by the DynamoDbClient batchWriteItem method.

 HashMap<String,List<WriteRequest>> batchRequests = new HashMap<>();
  batchRequests.put("Observation", requests);
  BatchWriteItemRequest request = BatchWriteItemRequest.builder()
    .requestItems(batchRequests).build();
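Because a single BatchWriteItemRequest accepts at most 25 write requests, a caller with more observations must split the list and send several batches. The tutorial does not show this, so the following is a minimal chunking sketch in plain Java, with the per-chunk AWS call left as a comment:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchChunkDemo {

    // Split a list into sublists of at most `size` elements each,
    // matching DynamoDB's 25-item BatchWriteItem limit.
    public static <T> List<List<T>> chunk(List<T> items, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < items.size(); i += size) {
            chunks.add(new ArrayList<>(items.subList(i, Math.min(i + size, items.size()))));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> observations = new ArrayList<>();
        for (int i = 0; i < 60; i++) observations.add(i);
        for (List<Integer> batch : chunk(observations, 25)) {
            // For each batch: build the WriteRequest list and call
            // dynamoDbClient.batchWriteItem(...) as shown above.
            System.out.println(batch.size());
        }
        // prints 25, 25, 10
    }
}
```

Production code should also check the response’s unprocessedItems and resend any writes DynamoDB throttled.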

Create Rest Endpoint

  • Add a Rest endpoint that takes a list of observations as JSON in the request body.
  • Call the batchWriteObservations method in the ObservationService.
@PostMapping("/observation/batch")
public void batchSaveObservation(@RequestBody List<Observation> 
  observations) {

  this.observationService.batchWriteObservations(observations);

}
  • Build and run the application.

Test With Postman

  • Create a new Post Request in Postman with the following URL.
http://localhost:8080/observations/observation/batch
  • Add the following to the Body and assign it the type JSON (application/json).
[
    {
      "stationid": 221,
      "date": "2007-12-12",
      "time": "180000",
      "image": "/9j/4AAQSkZJRgABAQA <snip> kf/9k="
    },
    {
      "stationid": 221,
      "date": "2009-05-22",
      "time": "043455",
      "image": "/9j/4AAQSkZJRgABAQAA <snip> /8AD9KhoA//2Q=="
    },
    {
      "stationid": 234,
      "date": "2019-10-18",
      "time": "121459",
      "image": "/9j/4AAQSkZJRgABAQA <snip> VWoGf/9k="
    },
    {
      "stationid": 234,
      "date": "2017-09-22",
      "time": "093811",
      "image": "/9j/4AAQSkZJRgAB <snip> 5//2Q=="
    }
  ]
  • Click Send then navigate to the AWS Console Observation table’s items and the observations should be added.

Conditionally Fetch Observations

A common requirement is to fetch records based upon certain criteria. For example, suppose we wish to fetch all observations belonging to a particular station. When using DynamoDB, any attribute used in a query must be indexed. So before writing a query, we first create an index on the Observation table’s stationid attribute.

Create Index

  • Navigate to the Observation table in the AWS Console.
  • Click Create Index.
  • Select stationid as the index’s partition key and be certain to define it as a Number.
  • Click Create Index to create the index.

Secondary Indexes

Secondary Indexes allow retrieving data from a table using an attribute other than the primary key. You retrieve data from the index rather than the table. For more on DynamoDB secondary indexes, refer to the following article by LinuxAcademy: A Quick Guide to DynamoDB Secondary Indexes.
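Conceptually, a secondary index is an extra lookup structure keyed on a non-primary attribute that is maintained alongside the table. A plain-Java analogy (not DynamoDB code, just an illustration of the idea):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SecondaryIndexDemo {
    // "Table": primary key (id) -> item payload.
    private final Map<String, String> table = new HashMap<>();
    // "Index": stationid -> ids of the items with that stationid.
    private final Map<Long, List<String>> stationIndex = new HashMap<>();

    // Writing an item also maintains the index, much as DynamoDB
    // maintains a global secondary index behind the scenes.
    public SecondaryIndexDemo put(String id, long stationId, String payload) {
        table.put(id, payload);
        stationIndex.computeIfAbsent(stationId, k -> new ArrayList<>()).add(id);
        return this;
    }

    // A query by stationid reads the index, avoiding a full table scan.
    public List<String> queryByStation(long stationId) {
        List<String> results = new ArrayList<>();
        for (String id : stationIndex.getOrDefault(stationId, new ArrayList<>())) {
            results.add(table.get(id));
        }
        return results;
    }

    public static void main(String[] args) {
        SecondaryIndexDemo demo = new SecondaryIndexDemo()
            .put("2211992-03-12091312", 221L, "obs-a")
            .put("2341999-09-22071244", 234L, "obs-b");
        System.out.println(demo.queryByStation(221L)); // prints [obs-a]
    }
}
```

This is why the QueryRequest below names the index (stationid-index) rather than the table’s primary key.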

Update Service Class

public List<Observation> getObservationsForStation(String stationId){
  ArrayList<Observation> observations = new ArrayList<>();
  Condition condition = Condition.builder()
    .comparisonOperator(ComparisonOperator.EQ)
    .attributeValueList(AttributeValue.builder()
    .n(stationId).build()).build();

  Map<String, Condition> conditions = new HashMap<String, Condition>();
  conditions.put("stationid",condition);

  QueryRequest request = QueryRequest.builder().tableName("Observation")
    .indexName("stationid-index").keyConditions(conditions).build();

  List<Map<String, AttributeValue>> results = this.dynamoDbClient
    .query(request).items();
		
  for(Map<String,AttributeValue> responseItem: results) {
    Observation observation = new Observation();
    observation.setDate(responseItem.get("date").s());
    observation.setTime(responseItem.get("time").s());
    observation.setImage(responseItem.get("image").b().asUtf8String());
    observation.setStationid(Long.parseLong(
      responseItem.get("stationid").n()));
			
    if(responseItem.get("tags") != null && responseItem.get("tags").ss()
      .size() > 0) {
      List<String> vals = new ArrayList<>(responseItem.get("tags").ss());
      observation.setTags(vals);
    }
    observations.add(observation);
  }		
  return observations;		
}

First we created a Condition using its associated builder. The condition is “= <the station id passed to the function>”.

Condition condition = Condition.builder()
    .comparisonOperator(ComparisonOperator.EQ)
    .attributeValueList(AttributeValue.builder()
    .n(stationId).build()).build();

We then added the Condition to a map and specified stationid as the key and condition as the value. We then built the QueryRequest using its associated builder.

 QueryRequest request = QueryRequest.builder().tableName("Observation")
    .indexName("stationid-index").keyConditions(conditions).build();

Create Rest Endpoint

  • Create a new Rest endpoint that specifies stationid as a path variable.
  • Call the ObservationService getObservationsForStation method.
@GetMapping("/station/{stationid}")
public List<Observation> getObservations(@PathVariable("stationid") String 
  stationId) {
  return this.observationService.getObservationsForStation(stationId);
}
  • Build and run the application.

Test With Postman

  • Create a new Get request in Postman that includes a station’s id in the url.
  • Click Send and the Observation items for station 221 should appear as the response body.
http://localhost:8080/observations/station/221
[
    {
        "stationid": 221,
        "date": "2009-05-22",
        "time": "043455",
        "image": "/9j/4AA <snip> 0/8AD9KhoA//2Q==",
        "tags": null
    },
    {
        "stationid": 221,
        "date": "2007-12-12",
        "time": "180000",
        "image": "/9j/4 <snip> /rn+q07/sHxfyNUK0Wxkf/9k=",
        "tags": null
    },
    {
        "stationid": 221,
        "date": "1992-03-12",
        "time": "091312",
        "image": "/9j/4AAQSkZJRgABAQAAYAB <snip> K0Wxkf/9k=",
        "tags": [
            "rapids",
            "rocks",
            "observation",
            "cold"
        ]
    },
    {
        "stationid": 221,
        "date": "1999-09-22",
        "time": "071244",
        "image": "/9j/4n0g27Qu <snip> A//2Q==",
        "tags": [
            "observation",
            "river",
            "sample"
        ]
    }
]

Further Topics

There are several topics not explored in this tutorial. First, you can scan a database table; a scan returns every item in the table. Second, this tutorial did not discuss conditionally updating or deleting items; however, the principles are the same as conditionally querying a table for items. Also, it is helpful to explore the command-line examples for working with DynamoDB, as they help in understanding the SDK. Finally, we did not cover the Java 1.1 AWS SDK.

From Java 1.1 AWS SDK to Java 2 AWS SDK

There are many more examples and tutorials on the Web using the Java 1.1 API than the Java 2 API. However, the primary difference between the two versions is the builder pattern, so many, if not most, of the Java 1.1 tutorials remain useful. The pattern is the same:

  • create a request type,
  • set up the request with the desired parameters,
  • pass the request to the DynamoDB client,
  • obtain the result.

In the Java 1.1 SDK you perform these steps using constructors, setters, and getters. In the Java 2 SDK you use builders; practically all classes in the Java 2 AWS SDK use builders. Use this as a starting point if you have a particularly good tutorial that uses the Java 1.1 SDK. Although not foolproof, doing this has helped me translate many Java 1.1 examples to the Java 2 SDK.
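The shift can be illustrated with a hypothetical request type (not an actual AWS class): the 1.1 style mutates an object through setters, while the Java 2 style produces an immutable object from a fluent builder.

```java
public class BuilderStyleDemo {

    // A hypothetical request type in the Java 2 SDK style: the object is
    // immutable and is produced by a fluent builder.
    public static final class GetItemStyleRequest {
        private final String tableName;
        private final String key;

        private GetItemStyleRequest(Builder b) {
            this.tableName = b.tableName;
            this.key = b.key;
        }

        public static Builder builder() { return new Builder(); }

        public String tableName() { return tableName; }
        public String key() { return key; }

        public static final class Builder {
            private String tableName;
            private String key;

            public Builder tableName(String tableName) { this.tableName = tableName; return this; }
            public Builder key(String key) { this.key = key; return this; }
            public GetItemStyleRequest build() { return new GetItemStyleRequest(this); }
        }
    }

    public static void main(String[] args) {
        // Java 1.1 style would instead look like:
        //   GetItemStyleRequest request = new GetItemStyleRequest();
        //   request.setTableName("Observation");
        GetItemStyleRequest request = GetItemStyleRequest.builder()
            .tableName("Observation")
            .key("2211992-03-12091312")
            .build();
        System.out.println(request.tableName()); // prints Observation
    }
}
```

Translating a 1.1 example usually amounts to replacing each `new X(); x.setFoo(...)` pair with `X.builder().foo(...).build()`.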

Further Resources

  • More on DynamoDB design patterns.
  • This tutorial, although it uses the Java 1.x AWS API, is a very good introduction covering the same topics as this tutorial. Just remember to think builder: although the techniques in the API are the same, the Java 2 version of the API uses builders extensively.

Conclusion

In this tutorial we explored the lower-level API of the Java 2 SDK by using the AWS DynamoDB Java API. We wrote an item, updated an item, deleted an item, and batch uploaded items. We also explored conditionally querying items.

As with all of the SDK, it is based upon builders, requests, and the client. You build a request and pass it to the DynamoDbClient, which in turn returns a response. You do not create a new instance of a request and set properties via setters; rather, you use a builder to build a request.

DynamoDB is a non-relational database, so you cannot write a conditional query on any arbitrary field. You can only use indexed fields in a query. This seems logical if you consider that DynamoDB is designed for massive amounts of relatively unstructured data.

GitHub Project

Amazon’s AWS S3 Java API 2.0 (Using Spring Boot as Client)

In this tutorial you use the AWS S3 Java API in a Spring Boot application. Amazon’s S3 is an object storage service that offers a low-cost storage solution in the AWS cloud. It provides unlimited storage for organizations regardless of an organization’s size. It should not be confused with a fully-featured database, as it only offers storage for objects identified by a key. The structure of S3 consists of buckets and objects: an account can have up to 100 buckets (by default) and a bucket can hold an unlimited number of objects. Bucket names must be globally unique, while object keys need only be unique within their bucket. If working with S3 is unfamiliar, refer to the Getting Started with Amazon Simple Storage Service guide before attempting to work with the AWS S3 Java API in this tutorial.

In this tutorial we explore creating, reading, updating, listing, and deleting objects and buckets in S3 using the AWS Java SDK 2.0 to access Amazon’s Simple Storage Service (S3).

First we perform the following tasks with objects:

  • write an object to a bucket,
  • update an object in a bucket,
  • read an object in a bucket,
  • list objects in a bucket,
  • and delete an object in a bucket.

After working with objects, we then use the Java SDK to work with buckets, and perform the following tasks:

  • create a bucket,
  • list buckets,
  • and delete a bucket.

This tutorial uses the AWS SDK for Java 2.0. The SDK changed considerably since 1.x, and the code here will not work with older versions of the API. In particular, this tutorial uses version 2.5.25 of the API.

Do not let using Spring Boot deter you from this tutorial. Even if you have no interest in Spring or Spring Boot, this tutorial remains useful. Simply ignore the Spring part of the tutorial and focus on the AWS S3 code. The AWS code is valid regardless of the type of Java program written and the Spring Boot code is minimal and should not be problematic.

And finally, you might question why this tutorial creates a REST API when Amazon already exposes S3 functionality as a REST API (which we will explore in a later tutorial). Suspend disbelief and ignore that we are wrapping a REST API in another REST API. Here the focus is programmatically accessing S3 using the Java SDK. The tutorial should prove useful even if you are a Java developer with no interest in Spring Boot.

  • The AWS Java 2.0 API Developers Guide is available here.

Prerequisites

Before attempting this tutorial on the AWS S3 Java API you should have a basic knowledge of the Amazon AWS S3 service. You need an AWS developer account. You can create a free account on Amazon here. For more information on creating an AWS account refer to Amazon’s website.

The Spring Boot version used in this tutorial is 2.0.5, while the AWS Java SDK version is 2.5.25. In this tutorial we use Eclipse and Maven, so you should have a rudimentary knowledge of using Maven with Eclipse. We also use Postman to make REST calls. But, provided you know how to build using Maven and know REST fundamentals, you should be okay using your own toolset.


Creating A Bucket – Console

Amazon continually improves the AWS console. For convenience, we create a user and bucket here; however, you should consult the AWS documentation if the console appears different than the images and steps presented. These images and steps are valid as of April 2019. For more information on creating a bucket and creating a user, refer to Amazon’s documentation.

Let’s create a bucket to use in this tutorial.

  • Log into your account and go to the S3 Console and create a new bucket.
  • Name the bucket javas3tutorial and assign it to your region. Here, as I am located in Frederick, Maryland, I assigned it to the US East (N. Virginia) region.
  • Accept the default values on the next two screens and click Create bucket to create the bucket.

Note that in this tutorial I direct you to create buckets and objects with certain names. In actuality, create your own names. Bucket names must be globally unique; a name such as mybucket was taken long ago.

Bucket names must be globally unique across all of S3.
Click Create bucket to start creating a bucket.
Assign bucket name and region.

Accept the defaults and click Next.
Accept the defaults and click the Next button.

Click Create bucket if options are correct.

After creating the bucket you should see the bucket listed in your console. Now we must create a user to programmatically access S3 using the Java SDK.

The bucket appears in your S3 buckets screen.

Creating an S3 User – Console

As with creating a bucket, the instructions here are not intended as comprehensive. More detailed instructions are provided on the AWS website. To access S3 from the Java API we must create a user with programmatic access to the S3 Service. That user is then used by our program as the principal performing AWS tasks.

  • Navigate to the Identity and Access Management (IAM) panel.
  • Click on Users and create a new user.
  • Provide the user with Programmatic access.
Creating a user with programmatic access.
  • After creating the user, create a group.
Create a group by clicking Create group.
  • Assign the AmazonS3FullAccess policy to the group.
Assigning AmazonS3FullAccess to the group.
  • Navigate past create tags, accepting the default of no tags.
Accept default and do not assign tags.
  • Review the user’s details and click Create user to create the user.
Review user settings and click Create user.
  • On the success screen note the Download .csv button. You must download the file and store it in a safe place; otherwise you will be required to create new credentials for the user.
After creating user, click Download .csv to save the public and private keys.

The content of the credentials.csv will appear something like the following. Keep this file guarded, as it contains the user’s secret key and provides full programmatic access to your S3 account.

Note: I deleted this user and group prior to publishing this tutorial.

User name,Password,Access key ID,Secret access key,Console login link
java_tutorial_user,,XXXXXXXXXXX,oaUl6jJ3QTdoQ8ikRHVa23wNvEYQh5n0T5lfz1uw,https://xxxxxxxx.signin.aws.amazon.com/console

After creating the bucket and the user, we can now write our Java application.

Java Application – Spring Boot

We use Spring Boot to demonstrate using the AWS Java SDK. If you are unfamiliar with Spring Boot, refer to this tutorial to get started with Spring Boot and Rest.

Project Setup

We set up the project as a Maven project in Eclipse.

Maven Pom

  • Add the Spring Boot dependencies to the pom file.
  • Add the AWS Maven Bill of Materials (BOM) to the pom file.
  <dependencyManagement>
    <dependencies>
      <dependency>
	<groupId>software.amazon.awssdk</groupId>
	<artifactId>bom</artifactId>
	<version>2.5.25</version>
	<type>pom</type>
	<scope>import</scope>
      </dependency>
    </dependencies>
  </dependencyManagement>

A BOM is a POM that manages the project dependencies’ versions. Using a BOM frees developers from worrying whether a library’s dependencies are the correct version. You place a BOM dependency in a dependencyManagement section; then, when you define project dependencies that are also in the BOM, you omit the version tag, as the BOM manages the version.
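For example, with the BOM imported in dependencyManagement, an SDK module dependency omits the version tag and the BOM supplies it:

```xml
<!-- No <version> tag: the BOM in dependencyManagement supplies it. -->
<dependency>
  <groupId>software.amazon.awssdk</groupId>
  <artifactId>s3</artifactId>
</dependency>
```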

To better understand a BOM, navigate to the BOM and review its contents.

  • Navigate to the Maven repository for the BOM.
https://mvnrepository.com/artifact/software.amazon.awssdk/bom
  • Click on the latest version (2.5.25 as of the tutorial).
The AWSSDK BOM.
  • Click on the View All link.
Summary of the AWS Java SDK Bill of Materials 2.5.25.
  • Click the link to the pom and the BOM appears. This is useful, as it lists the AWS modules.
The listing of BOM files. Click on the pom to view the xml pom definition.
Snippet of the AWS SDK BOM contents.
  • Add the auth, aws-core, and s3 artifacts to the pom. Note that we do not need to specify the version, as the BOM selects the correct version for us.
  • Add the spring dependencies to the pom.
  • The complete pom should appear as follows.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.tutorial.aws</groupId>
  <artifactId>tutorial-aws</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>TutorialAWS</name>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.0.5.RELEASE</version>
  </parent>
  <properties>
    <java.version>1.8</java.version>
  </properties>
  <dependencyManagement>
    <dependencies>
      <dependency>
	<groupId>software.amazon.awssdk</groupId>
	<artifactId>bom</artifactId>
	<version>2.5.25</version>
	<type>pom</type>
	<scope>import</scope>
      </dependency>
    </dependencies>
  </dependencyManagement>
  <dependencies>
    <dependency>
      <artifactId>auth</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
    <dependency>
      <artifactId>aws-core</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
    <dependency>
      <artifactId>s3</artifactId>
      <groupId>software.amazon.awssdk</groupId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-test</artifactId>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-maven-plugin</artifactId>
      </plugin>
      <plugin>
	<groupId>org.apache.maven.plugins</groupId>
	<artifactId>maven-jar-plugin</artifactId>
	<version>3.1.1</version>
	<executions>
          <execution>
	  <phase>package</phase>
	  <goals>
	    <goal>jar</goal>
	  </goals>
	  <configuration>
	    <classifier>client</classifier>
	    <includes>
	      <include>**/factory/*</include>
	    </includes>
	  </configuration>
	</execution>
      </executions>
      </plugin>
    </plugins>
  </build>
</project>

After creating the POM you might want to try building the project to ensure the POM is correct and you set up the project correctly. After that, we need to add the AWS user credentials to your project.

AWS Credentials

When your application communicates with AWS, it must authenticate itself by sending a user’s credentials: the access key and secret access key you saved when creating the user. There are several ways to provide these credentials to the SDK. For example, you can put a credentials file in the user’s home directory, as follows, and it will be automatically detected and used by your application.

~/.aws/credentials 
C:\Users\<username>\.aws\credentials
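If you choose the credentials-file route, the file uses a simple INI-style format (the values below are placeholders):

```
[default]
aws_access_key_id = XXXXXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXX
```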

For more information on the alternative ways of setting an application’s user credentials, refer to the Developer’s Guide. Here, however, we load the credentials manually from the Spring Boot application.properties file.

  • If you did not start with a bare-bones Spring Boot project, create a new folder named resources and create an application.properties file in this folder.
  • Refer to the credentials file you saved and create the following two properties, assigning the relevant values. Of course, replace the values with the values you downloaded when creating the programmatic user.
Add the two properties to the application.properties file.
cloud.aws.credentials.accessKey=XXXXXXXXXXXXXXXXXXXX
cloud.aws.credentials.secretKey=XXXXXXXXXXXXXXXXXXXXXXXXXXXXX
cloud.aws.region.static=us-east-1

Binary File

  • Add a small binary file to the resources folder. For example, here we use sample.png, a small image file.

Spring Boot Application

Now that we have the project structure, we can create the Spring Application to demonstrate working with the AWS S3 Java API.

  • Create the com.tutorial.aws.spring.application, com.tutorial.aws.spring.controller, com.tutorial.aws.spring.data, and com.tutorial.aws.spring.service packages.
  • Create a new Spring application class named SimpleAwsClient in the application package.
package com.tutorial.aws.spring.application;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.ComponentScan;

@SpringBootApplication
@ComponentScan({ "com.tutorial.aws.spring" })
public class SimpleAwsClient {
  public static void main(String[] args) {
    SpringApplication.run(SimpleAwsClient.class, args);
  }
}

Data Object (POJO)

  • Create a simple data object named DataObject in the data package.
  • Add the variable name and create the getter and setter for this property.

package com.tutorial.aws.spring.data;

public class DataObject {
	
	String name;
	
	public String getName() {
		return name;
	}

	public void setName(String name) {
		this.name = name;
	}
}
  • Ensure the program compiles.

We now have the project’s structure and can work with S3 using the SDK.

Writing Objects to S3

We implement the example application as a Spring Boot Rest application. The standard architecture of this application consists of a Controller, a Service, and a data access layer. In this tutorial there is no need for a data access layer, and so the application consists of a controller and service. Begin by creating a Service class that interacts with the AWS SDK.

Service

  • Create a new class named SimpleAwsS3Service and annotate it with the @Service annotation.
  • Create the key and secretKey properties and populate them from the application.properties file.
  • Add an S3Client as a private variable.
  • Create a method named initialize and annotate it with the @PostConstruct annotation.
  • Create a method named uploadFile that takes a DataObject and writes the file to S3.
package com.tutorial.aws.spring.service;

import java.io.File;
import java.io.FileNotFoundException;
import java.net.URISyntaxException;

import javax.annotation.PostConstruct;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

import com.tutorial.aws.spring.data.DataObject;

import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.awscore.exception.AwsServiceException;
import software.amazon.awssdk.core.exception.SdkClientException;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.ObjectCannedACL;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.model.S3Exception;

@Service
public class SimpleAwsS3Service {
	
  @Value("${cloud.aws.credentials.accessKey}")
  private String key;

  @Value("${cloud.aws.credentials.secretKey}")
  private String secretKey;
  
  private S3Client s3Client;

  @PostConstruct
  public void initialize() {
     AwsBasicCredentials awsCreds = AwsBasicCredentials.create(key, secretKey);

    s3Client = S3Client.builder().credentialsProvider(StaticCredentialsProvider
            .create(awsCreds)).region(Region.US_EAST_1).build();
  }
	
  public void uploadFile(DataObject dataObject) throws S3Exception, 
    AwsServiceException, SdkClientException, URISyntaxException, 
    FileNotFoundException {

    PutObjectRequest putObjectRequest = PutObjectRequest.builder()
        .bucket("javas3tutorial").key(dataObject.getName())
        .acl(ObjectCannedACL.PUBLIC_READ).build();
			
    File file = new File(getClass().getClassLoader()
        .getResource(dataObject.getName()).getFile());

    s3Client.putObject(putObjectRequest, RequestBody.fromFile(file));
  }
}

Rest Controller

  • Create a new RestController named SimpleAwsController in the com.tutorial.aws.spring.controller package.
  • Map the class to a /javas3tutorialbucket request path (or the name you desire).
  • Create an endpoint named /addobject that takes a POST request.
  • Create an endpoint named /fetchobject/{filename} that takes a GET request.
  • Create an endpoint named /listobjects that takes a GET request.
  • Create an endpoint named /updateobject that takes a PUT request.
  • Create an endpoint named /deleteobject that takes a DELETE request.
  • Create a class variable for the SimpleAwsService and annotate it with the @Autowired annotation.
package com.tutorial.aws.spring.controller;

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import com.tutorial.aws.spring.data.DataObject;
import com.tutorial.aws.spring.service.SimpleAwsS3Service;

@RestController
@RequestMapping(value = "/javas3tutorialbucket")
public class SimpleAwsController {

  @Autowired
  SimpleAwsS3Service simpleAwsS3Service;

  @PostMapping("/addobject")
  public void createObject(@RequestBody DataObject dataObject) throws Exception {
    this.simpleAwsS3Service.uploadFile(dataObject);
  }
	
  @GetMapping("/fetchobject/{filename}")
  public void fetchObject(@PathVariable String filename){
  }

  @GetMapping("/listobjects")
  public List<String> listObjects() throws Exception {
    return null; // implemented later in this tutorial
  }
	
  @PutMapping("/updateobject")
  public void updateObject(@RequestBody DataObject dataObject) {
  }
	
  @DeleteMapping("/deleteobject")
  public void deleteObject(@RequestBody DataObject dataObject) {
  }
}

There are many concepts in the preceding code. Let’s examine each in turn.

Builder Pattern and Fluent Interface

The fluent interface is a term coined by Martin Fowler and Eric Evans. It refers to a programming style where the public methods (the API) can be chained together to perform a task. The AWS S3 Java API 2.x uses this style in its builders: builder methods perform a task and then return the builder instance, which allows chaining methods together. For more information on fluent interfaces and builders, refer to this blog post: Another builder pattern for Java.
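The chaining works because each builder method returns the builder itself. A minimal, self-contained illustration (a made-up class, not an SDK type):

```java
// Minimal fluent interface: each method returns the object itself,
// which is what makes chained calls possible.
public class FluentDemo {
  static class Message {
    private final StringBuilder parts = new StringBuilder();

    Message add(String part) {   // performs a task ...
      parts.append(part);
      return this;               // ... then returns this, enabling chaining
    }

    String render() { return parts.toString(); }
  }

  public static void main(String[] args) {
    String s = new Message().add("Hello").add(", ").add("S3").render();
    System.out.println(s);
  }
}
```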

AwsBasicCredentials

The AwsBasicCredentials class implements the AwsCredentials Interface and takes a key and secret key. These credentials are then used by an S3Client to securely authenticate to AWS.

S3Client

The S3Client class is a client for accessing AWS. As with most of the API, it uses a builder to construct itself. The builder uses the credentials and region to create the S3Client. The S3Client is then used for all communication between a client application and AWS.

PutObjectRequest

The PutObjectRequest is for uploading objects to S3. You create and configure it using its associated builder, the PutObjectRequest.Builder interface. We provide the bucket name and the object key, and, although not required, we pass an access control list specifying that the public has read access to the resource.

PutObjectRequest putObjectRequest = PutObjectRequest.builder()
    .bucket("javas3tutorial").key(dataObject.getName())
    .acl(ObjectCannedACL.PUBLIC_READ).build();

The ObjectCannedACL provides, well, a pre-canned access control list. Valid values are:

AUTHENTICATED_READ,
AWS_EXEC_READ,
BUCKET_OWNER_FULL_CONTROL,
BUCKET_OWNER_READ,
PRIVATE,
PUBLIC_READ,
PUBLIC_READ_WRITE, and
UNKNOWN_TO_SDK_VERSION.

The S3Client then uses the PutObjectRequest to upload the object to S3.

Running The Program

  • Compile, and run the Spring Application.
  • Send the request using Postman or curl and note the error response. S3 denied access.
Uploading the object fails with an Access Denied error.

The failure is because of the ACL we attempted to set. We wished to grant public read access, but when creating the bucket we failed to allow for this. We need to return to the bucket configuration and explicitly allow public access.

By default public access is denied.

Object Visibility

  • Sign into the AWS Console and navigate to the bucket. Note that neither the bucket nor the objects are public.
  • Click on the bucket and the following popup should appear.
  • Click on the Permissions link.
  • Un-check the two checkboxes under the Manage public access… heading. By unchecking them we allow new ACLs and public object uploads.
  • A new popup appears to make sure we really wish to do this. What this is telling you, of course, is that this is generally not a good idea unless you truly wish to make the objects in a bucket public.
  • Type confirm and click the Confirm button.
  • Return to Postman and try again. Postman should receive a 200 Success HTTP Code.
  • Refresh the bucket screen in AWS and the file should appear.
  • Click on the file and, in the resulting popup, click on the object’s URL; the object should load in a browser. If not, copy and paste the URL into a browser.



Downloading Objects From S3

Downloading an object involves creating a GetObjectRequest and then passing it to an S3Client to obtain the object. Here we download it directly to a file, although note you can work with the object as it is downloading.

Service

  • Implement the downloadFile method as follows in the SimpleAwsS3Service class.
  • Create a GetObjectRequest, get the classpath to the resources folder, and then use s3Client to download sample.png and save it as test.png.
public void downloadFile(DataObject dataObject) throws NoSuchKeyException, S3Exception, AwsServiceException, SdkClientException, IOException {

  GetObjectRequest getObjectRequest = GetObjectRequest.builder()
      .bucket("javas3tutorial").key(dataObject.getName()).build();

  Resource resource = new ClassPathResource(".");
  s3Client.getObject(getObjectRequest,Paths.get(resource.getURL()
      .getPath()+"/test.png"));
}

The builder uses the bucket name and the object key to build a GetObjectRequest. We then use the S3Client to get the object, downloading it directly to the file path passed.

Rest Controller

  • Implement the fetchobject endpoint in the SimpleAwsController class.
@GetMapping("/fetchobject/{filename}")
public void fetchObject(@PathVariable String filename) throws Exception {
  DataObject dataObject = new DataObject();
  dataObject.setName(filename);
  this.simpleAwsS3Service.downloadFile(dataObject);
}

Running the Program

  • Create a request in Postman (or curl) and fetch the file.
  • Navigate to the resources folder in the project target folder and you should see the downloaded file.

Listing Objects On S3

The steps to list files in a bucket should prove familiar by now: use a builder to build a request object, pass the request to the S3Client, and the S3Client uses the request to interact with AWS. However, here we work with the response as well.

Add Files

  • Navigate to the bucket on the AWS console.
  • Upload a few files to the bucket.

Service

  • Modify SimpleAwsS3Service to implement a method named listObjects that returns a list of strings.
  • Create a ListObjectsRequest and have the s3Client use the request to fetch the objects.
  • Copy the object keys to the returned list.
public List<String> listObjects() {

  List<String> names = new ArrayList<>();

  ListObjectsRequest listObjectsRequest = ListObjectsRequest.builder()
      .bucket("javas3tutorial").build();

  ListObjectsResponse listObjectsResponse = s3Client
      .listObjects(listObjectsRequest);

  listObjectsResponse.contents().stream()
      .forEach(x -> names.add(x.key()));
  return names;
}

We first use a builder to create a ListObjectsRequest. The S3Client then requests the list of objects in the bucket and returns a ListObjectsResponse. We then iterate through each object in the response and put its key in an ArrayList. Note that a single listObjects call returns at most 1,000 objects; for larger buckets you must page through the results.

Rest Controller

  • Modify SimpleAwsController to implement the listObjects method.
@GetMapping("/listobjects")
public List<String> listObjects() throws Exception {
  return this.simpleAwsS3Service.listObjects();
}

Running the Program

  • Create a new request in Postman and list the objects in the bucket.

Modifying Objects

Technically speaking, you cannot modify an object in an S3 bucket. You can replace the object with a new object, and that is what we do here.

  • Replace the file used in your project with a different file. For instance, I replaced sample.png with a different png file. Now sample.png in the project differs from the sample.png in the AWS bucket.

Rest Controller

  • Modify the SimpleAwsController class so that the updateObject method calls the uploadFile method in the SimpleAwsS3Service class.
@PutMapping("/updateobject")
public void updateObject(@RequestBody DataObject dataObject) throws Exception {
  this.simpleAwsS3Service.uploadFile(dataObject);
}

Running the Application

  • Compile the program and create a new request in Postman.
  • Go to the file in the AWS bucket and click the Object URL and the object should have been replaced.

Deleting Objects

Deleting objects follows the same pattern: build a request, pass that request to the S3Client, and the S3Client uses it to delete the object.

Service

  • Modify SimpleAwsS3Service to implement the deleteFile method.
  • Create a DeleteObjectRequest and have the s3Client use the request to delete the object.
public void deleteFile(DataObject dataObject) {
  DeleteObjectRequest deleteObjectRequest = DeleteObjectRequest.builder()
      .bucket("javas3tutorial").key(dataObject.getName()).build();
  s3Client.deleteObject(deleteObjectRequest);
}

Rest Controller

  • Modify the SimpleAwsController to implement the deleteObject method.
@DeleteMapping("/deleteobject")
public void deleteObject(@RequestBody DataObject dataObject) {
  this.simpleAwsS3Service.deleteFile(dataObject);
}	

Running The Application

  • Compile the program and create a DELETE request in Postman and delete the object.
  • Navigate to the bucket on the AWS Console and the object should no longer exist.

Buckets

By this point, if you have worked through the tutorial, you should be able to guess the workflow and relevant classes needed for creating, listing, and deleting buckets. The CreateBucketRequest, ListBucketsRequest, and DeleteBucketRequest are the relevant request classes, and each request has a corresponding builder to build the request. The S3Client then uses the request to perform the desired action. Let’s examine each in turn.

Creating Buckets

Creating a bucket consists of creating a CreateBucketRequest using a builder. Because bucket names must be globally unique, we append the current time in milliseconds to the bucket name to help ensure it is unique.

Service

  • Create a method named addBucket in the SimpleAwsS3Service class.
public DataObject addBucket(DataObject dataObject) {
  dataObject.setName(dataObject.getName() + System.currentTimeMillis());

  CreateBucketRequest createBucketRequest = CreateBucketRequest.builder()
      .bucket(dataObject.getName()).build();

  s3Client.createBucket(createBucketRequest);
  return dataObject;
}

Rest Controller

  • Create a createBucket method in SimpleAwsController with an /addbucket mapping.
@PostMapping("/addbucket")
public DataObject createBucket(@RequestBody DataObject dataObject) {
  return this.simpleAwsS3Service.addBucket(dataObject);
}	

Listing Buckets

Listing buckets follows the same pattern as listing objects. Build a ListBucketsRequest, pass that to the S3Client, and then get the bucket names by iterating over the ListBucketsResponse.

Service

  • Create a new method named listBuckets, returning a list of strings, in SimpleAwsS3Service.
public List<String> listBuckets() {
  List<String> names = new ArrayList<>();
  ListBucketsRequest listBucketsRequest = ListBucketsRequest
      .builder().build();
  ListBucketsResponse listBucketsResponse = s3Client
      .listBuckets(listBucketsRequest);
  listBucketsResponse.buckets().stream()
      .forEach(x -> names.add(x.name()));
  return names;
}

The ListBucketsResponse contains a List of Bucket objects. A Bucket has a name method that returns the bucket’s name.

Rest Controller

  • Add a /listbuckets endpoint to SimpleAwsController.
@GetMapping("/listbuckets")
public List<String> listBuckets() {
  return this.simpleAwsS3Service.listBuckets();
}

Deleting Buckets

Before you can delete a bucket you must delete its contents. Here we assume non-versioned resources. Now, you might be tempted to try the following, but consider the scalability.

for each item in bucket delete.

This is fine for a few objects in a sample project like this tutorial’s, but it quickly proves untenable, as the program blocks while it makes the HTTP connection to the S3 storage, deletes the object, and waits for the response, one object at a time. Depending upon the number of objects stored, deleting could go from minutes, to hours, to days. Remember, each call makes an HTTP request to an AWS server over the Internet.

Of course, Amazon thought of this, and provides a means of deleting multiple objects at once. The following code will not win any elegance awards for its iteration style, but it demonstrates a scalable way to delete buckets containing many objects.
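Stripped of the S3 calls, the deletion strategy is just partitioning the key list into batches of at most 1,000, the limit for a single DeleteObjects request. A self-contained sketch of that batching logic (helper name and batch contents are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// Partition a list of keys into batches of at most batchSize entries,
// mirroring the chunked-delete loop in the service method that follows.
public class BatchDemo {
  static List<List<String>> partition(List<String> keys, int batchSize) {
    List<List<String>> batches = new ArrayList<>();
    List<String> current = new ArrayList<>();
    for (String key : keys) {
      current.add(key);
      if (current.size() == batchSize) {
        batches.add(current);
        current = new ArrayList<>();
      }
    }
    if (!current.isEmpty()) {
      batches.add(current); // remaining keys after the last full batch
    }
    return batches;
  }

  public static void main(String[] args) {
    List<String> keys = new ArrayList<>();
    for (int i = 0; i < 2500; i++) {
      keys.add("photo-" + i + ".png");
    }
    List<List<String>> batches = partition(keys, 1000);
    // 2500 keys -> batches of 1000, 1000, and 500
    System.out.println(batches.size());
    System.out.println(batches.get(2).size());
  }
}
```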

Service

  • Add a method called deleteBucket that takes a bucket’s name as a String.
  • Get the keys of the objects in the bucket and iterate over the keys.
  • With each iteration, build an ObjectIdentifier and add it to an array of identifiers.
  • Every thousand keys, delete the objects from the bucket.
  • After iterating over all the keys, delete any remaining objects.
  • Delete the bucket.
public void deleteBucket(String bucket) {

  // Note: assumes a listObjects(String bucket) overload of the earlier method.
  List<String> keys = this.listObjects(bucket);
  List<ObjectIdentifier> identifiers = new ArrayList<>();

  int iteration = 0;

  for(String key : keys) {
    ObjectIdentifier objIdentifier = ObjectIdentifier.builder()
        .key(key).build();
    identifiers.add(objIdentifier);
    iteration++;

    if(iteration == 1000){
      iteration = 0;
      DeleteObjectsRequest deleteObjectsRequest = DeleteObjectsRequest.builder()
          .bucket(bucket).delete(Delete.builder()
          .objects(identifiers).build()).build();
      s3Client.deleteObjects(deleteObjectsRequest);
      identifiers.clear();
    }

  }

  if(identifiers.size() > 0) {
    DeleteObjectsRequest deleteObjectsRequest = 
        DeleteObjectsRequest.builder().bucket(bucket)
        .delete(Delete.builder().objects(identifiers)
        .build()).build();		 
    s3Client.deleteObjects(deleteObjectsRequest);
  }

  DeleteBucketRequest deleteBucketRequest = DeleteBucketRequest.builder()
      .bucket(bucket).build();
  s3Client.deleteBucket(deleteBucketRequest);
}
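The batching at the heart of deleteBucket can be sketched without the SDK at all. The KeyBatcher helper below is hypothetical (it is not part of the AWS API); it simply partitions a key list into chunks of at most 1,000, the maximum number of keys a single S3 DeleteObjects call accepts:

```java
import java.util.ArrayList;
import java.util.List;

public class KeyBatcher {

    // Partition keys into sublists of at most batchSize elements.
    // Each resulting batch could back one DeleteObjectsRequest,
    // turning N HTTP round trips into roughly N / 1000.
    public static List<List<String>> partition(List<String> keys, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        for (int start = 0; start < keys.size(); start += batchSize) {
            int end = Math.min(start + batchSize, keys.size());
            batches.add(new ArrayList<>(keys.subList(start, end)));
        }
        return batches;
    }
}
```

With 2,500 keys, partition(keys, 1000) yields three batches of 1,000, 1,000, and 500 keys, so the bucket empties in three requests instead of 2,500.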

Rest Controller

  • Add a deletebucket endpoint to the SimpleAwsController.
@DeleteMapping("/deletebucket") 
public void deleteBucket(@RequestBody DataObject dataObject) {
  this.simpleAwsS3Service.deleteBucket(dataObject.getName());
}

Conclusions

In this tutorial on the AWS S3 Java API we worked with objects and buckets in S3. We created an object, listed objects, downloaded an object, and deleted an object. We also created a bucket, listed buckets, and deleted a bucket. Although we used Spring Boot to implement the sample application, the AWS Java code remains relevant for other Java application types.

We did not upload an object using multiple parts. For a good example on accomplishing this task, refer to the SDK Developer Guide’s sample S3 code. Also, we assumed no versioning to keep the tutorial simple. If you must support versioning then consult the documentation.

The AWS S3 Java API wraps Amazon’s S3 Rest API with convenience classes. Here you used those classes to work with objects and buckets. In a future tutorial we will work with the Rest API directly.

Further Sources

Git Project

https://github.com/jamesabrannan/s3tutorial

Spring Boot 2 REST Exceptions

This tutorial might leave you wanting more. Rather than giving you explicit “if this, then do that” advice, I show you three different techniques for handling Spring Boot 2 REST exceptions. Those of you with experience might ask why bother, as Spring Boot handles exceptions and presents a nice REST response by default. However, there are instances where you might need to customize exception handling, and this tutorial demonstrates three techniques. As with the other tutorials on this site, caveat emptor applies: if you follow this tutorial with a different version of Spring Boot, or worse, Spring without the Boot, then be prepared to do further research, as Spring Boot 2’s primary purpose is to simplify Spring development, and with that simplification many implementation details become hidden.

There are three ways to handle exceptions in a Spring Boot 2 REST application: the default handling, exception handling in the controller, and global exception handling. In this tutorial we explore all three.

Project Setup

Before beginning, create your Spring Boot application. If you are new to Spring Boot then you should refer to one of the tutorials here, or on the web before attempting this tutorial. This tutorial assumes you can create, compile, and run a Spring Boot REST application. It also assumes you know how to call a REST endpoint.

  • Create a new Spring Boot Maven project in Eclipse. I used Spring Initializr to create the project. (Spring Initializr Video, Written Tutorial )
  • Assign the value com.tutorial.exceptions.spring.rest as the group and the value exceptions-tutorial as the artifact.
  • For simplicity, replace the POM with the following.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
    <parent>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-parent</artifactId>
      <version>2.1.3.RELEASE</version>
      <relativePath /> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.tutorial.exceptions.spring.rest</groupId>
      <artifactId>exceptions-tutorial</artifactId>
      <version>0.0.1-SNAPSHOT</version>
      <name>exceptions-tutorial</name>
      <description>Tutorial project demonstrating exceptions in Spring Rest.</description>
      <properties>
        <java.version>1.8</java.version>
      </properties>
      <dependencies>
        <dependency>
          <groupId>org.springframework.boot</groupId>
          <artifactId>spring-boot-starter-web</artifactId>
	</dependency>
	<dependency>
          <groupId>org.springframework.boot</groupId>
          <artifactId>spring-boot-starter-test</artifactId>
          <scope>test</scope>
        </dependency>
      </dependencies>
      <build>
        <plugins>
          <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
          </plugin>
        </plugins>
      </build>
</project>
  • Create a new class named ExceptionsTutorialApplication, annotate it with @SpringBootApplication, and start the Spring application in the main method.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class ExceptionsTutorialApplication {

  public static void main(String[] args) {
    SpringApplication.run(ExceptionsTutorialApplication.class, args);
  }
}
  • Create a new class named HelloGoodbye. Create three properties, greeting, goodbye, and type.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

public class HelloGoodbye {
  private String greeting;
  private String goodbye;
  private String type;
	
  public String getType() {
    return type;
  }
  public void setType(String type) {
    this.type = type;
  }
  public String getGoodbye() {
    return goodbye;
  }
  public void setGoodbye(String goodbye) {
    this.goodbye = goodbye;
  }
  public String getGreeting() {
    return greeting;
  }
  public void setGreeting(String greeting) {
    this.greeting = greeting;
  }	
}
  • Create a new Spring service named GreetingService.
  • Suspend disbelief and implement a method named createGreeting as listed below.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import org.springframework.stereotype.Service;

@Service
public class GreetingService {

  public HelloGoodbye createGreeting(String type) {
    HelloGoodbye helloGoodbye = new HelloGoodbye();
    if(type.equals("hello")) {
      helloGoodbye.setGreeting("Hello there.");
    }
    else if(type.equals("goodbye")) {
      helloGoodbye.setGoodbye("Goodbye for now.");
    }
    helloGoodbye.setType(type);
    return helloGoodbye;
  }
}
  • Create a new Spring REST controller and auto-wire the GreetingService.
  • Create a new method, getGreeting, that takes a request parameter named type and calls the GreetingService createGreeting method.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;


@RestController
@RequestMapping(value = "/greeting")
public class GreetingController {

  @Autowired
  protected GreetingService service;

  @GetMapping("/greet")
  public HelloGoodbye getGreeting(@RequestParam("type") String type) {
    HelloGoodbye goodBye = service.createGreeting(type);
    return goodBye;
  }
}
  • Compile and run the application.
  • Use curl, a web browser, or some other tool such as Postman to call the REST endpoint and assign type the value hello.

http://localhost:8080/greeting/greet?type=hello
  • Note the JSON response.
{
    "greeting": "Hello there.",
    "goodbye": null,
    "type": "hello"
}
  • Change type to goodbye and call the rest endpoint again.
http://localhost:8080/greeting/greet?type=goodbye
{
    "greeting": null,
    "goodbye": "Goodbye for now.",
    "type": "goodbye"
}
  • Change the type to wrong and note the response.
http://localhost:8080/greeting/greet?type=wrong
{
    "greeting": null,
    "goodbye": null,
    "type": "wrong"
}

The response is not very helpful when an incorrect value for type is passed to the rest endpoint. Moreover, the response will likely result in a client application throwing a NullPointerException, as both greeting and goodbye are null. Instead, we should throw an exception when an incorrect value is passed to the endpoint.

As an aside, yes, HelloGoodbye is poorly designed; returning null is bad programming practice. A better option would be something like the following. But creating well-designed POJOs is not this tutorial’s intention, so go with the poorly designed HelloGoodbye implementation above.

public class HelloGoodbye {
  private String message;
  private String type;
	
  public String getType() {
    return type;
  }
  public void setType(String type) {
    this.type = type;
  }
  public String getMessage() {
    return message;
  }
  public void setMessage(String msg) {
    this.message = msg;
  }
}

Default Exception Handling

Spring Boot provides exception handling by default. This makes it much easier for both the service endpoint and client to communicate failures without complex coding.

  • Modify createGreeting to throw an Exception if type is not the value hello or goodbye.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import org.springframework.stereotype.Service;

@Service
public class GreetingService {

  public HelloGoodbye createGreeting(String type) throws Exception {
    HelloGoodbye helloGoodbye = new HelloGoodbye();
    if (type.equals("hello")) {
      helloGoodbye.setGreeting("Hello there.");
    } else if (type.equals("goodbye")) {
      helloGoodbye.setGoodbye("Goodbye for now.");
    } else {
      throw new Exception("Valid types are hello or goodbye.");
    }
    helloGoodbye.setType(type);
    return helloGoodbye;
  }
}
  • Modify GreetingController getGreeting to throw an Exception.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;


@RestController
@RequestMapping(value = "/greeting")
public class GreetingController {

  @Autowired
  protected GreetingService service;

  @GetMapping("/greet")
  public HelloGoodbye getGreeting(@RequestParam("type") String type) throws Exception {
    HelloGoodbye goodBye = service.createGreeting(type);
    return goodBye;
  }
}
  • Compile, run the application, and visit the REST endpoint. Note the response returns the error as JSON.
{
    "timestamp": "2019-04-06T18:07:34.344+0000",
    "status": 500,
    "error": "Internal Server Error",
    "message": "Valid types are hello or goodbye.",
    "path": "/greeting/greet"
}

When changing the createGreeting method we were required to either catch the exception or throw it. This is because Exception is a checked exception (more on checked exceptions). But there were no special requirements for returning that exception to a client application as JSON. This is because Spring Boot provides a default JSON error message for errors. The relevant class is DefaultErrorAttributes which implements the ErrorAttributes interface. This class provides the following attributes when an exception occurs: timestamp, status, error, exception, message, errors, trace, and path. You can easily override the default with your own error attributes class; however, this technique is not illustrated here. Refer to this tutorial for more information on writing a custom implementation of the ErrorAttributes interface (Customize error JSON response with ErrorAttributes).

Usually, business logic errors warrant a specific business exception rather than a generic exception. Let’s modify the code to throw a custom exception.

  • Create a class named GreetingTypeException that extends Exception.
  • Assign it a bad request status through the @ResponseStatus annotation.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.http.HttpStatus;

@ResponseStatus(value = HttpStatus.BAD_REQUEST)
public class GreetingTypeException extends Exception {

  private static final long serialVersionUID = -189365452227508599L;

  public GreetingTypeException(String message) {
    super(message);
  }

  public GreetingTypeException(Throwable cause) {
    super(cause);
  }

  public GreetingTypeException(String message, Throwable cause) 
  {
    super(message, cause);
  }
}
  • Modify createGreeting to throw a GreetingTypeException rather than an Exception.
public HelloGoodbye createGreeting(String type) throws GreetingTypeException {

  HelloGoodbye helloGoodbye = new HelloGoodbye();

  if (type.equals("hello")) {
    helloGoodbye.setGreeting("Hello there.");
  } else if (type.equals("goodbye")) {
    helloGoodbye.setGoodbye("Goodbye for now.");
  } else {
    throw new GreetingTypeException("Valid types are hello or goodbye.");
  }

  helloGoodbye.setType(type);
  return helloGoodbye;
}
  • Compile, run the application, and visit the REST endpoint. Assign an incorrect value to the type parameter.
http://localhost:8080/greeting/greet?type=cc
{
    "timestamp": "2019-03-29T01:54:40.114+0000",
    "status": 400,
    "error": "Bad Request",
    "message": "Valid types are hello or goodbye.",
    "path": "/greeting/greet"
}
  • Create an exception named NameNotFoundException. Have the exception extend RuntimeException rather than Exception.
  • Assign it a response status of not found.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.http.HttpStatus;

@ResponseStatus(value = HttpStatus.NOT_FOUND)
public class NameNotFoundException extends RuntimeException {
  public NameNotFoundException(String message) {
    super("The id: " + message + " could not be found.");
  }
}

  • Modify GreetingService createGreeting method to take id as an integer.
  • Create a new method called getPersonName. Suspend disbelief and implement it as below. Obviously in a real-world project you would get user information from a database, LDAP server, or some other datastore.
  • Modify createGreeting to use the getPersonName method to personalize the greeting.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import org.springframework.stereotype.Service;

@Service
public class GreetingService {
  public HelloGoodbye createGreeting(String type, int id) throws GreetingTypeException {
    HelloGoodbye helloGoodbye = new HelloGoodbye();
    if (type.equals("hello")) {
      helloGoodbye.setGreeting("Hello there " + 
        this.getPersonName(id));
    } else if (type.equals("goodbye")) {				 
      helloGoodbye.setGoodbye("Goodbye for now " + 
        this.getPersonName(id));
    } else {
      throw new GreetingTypeException("Valid types are hello or goodbye.");
    }
    helloGoodbye.setType(type);
    return helloGoodbye;
  }
	
  public String getPersonName(int id) {
    if(id==1) {
      return "Tom";
    } else if(id==2) {
      return "Sue";
    } else {
      throw new NameNotFoundException(Integer.toString(id));
    }
  }	
}
  • Modify GreetingController to take id as a request parameter and modify its call to the GreetingService createGreeting method to also pass id to the service.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping(value = "/greeting")
public class GreetingController {

  @Autowired
  protected GreetingService service;
	
  @GetMapping("/greet")
  public HelloGoodbye getGreeting(@RequestParam("type") String type, @RequestParam("id") int id) throws GreetingTypeException {
    HelloGoodbye goodBye = service.createGreeting(type, id);
    return goodBye;
  }
}
  • Compile, run the application, and visit the endpoint.
http://localhost:8080/greeting/greet?type=hello&id=2
{
    "greeting": "Hello there Sue",
    "goodbye": null,
    "type": "hello"
}
  • Change the id query parameter’s value to six and note the exception.
http://localhost:8080/greeting/greet?type=hello&id=6
{
    "timestamp": "2019-03-31T20:30:18.727+0000",
    "status": 404,
    "error": "Not Found",
    "message": "The id: 6 could not be found.",
    "path": "/greeting/greet"
}

As an aside, notice that we had NameNotFoundException extend RuntimeException rather than Exception. By doing this we made NameNotFoundException an unchecked exception (more on unchecked exceptions) and so were not required to declare or catch it.
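The checked/unchecked distinction can be seen in miniature with two toy exception classes (illustrative only; these are stand-ins, not the tutorial’s Spring classes):

```java
// Illustrative only: toy stand-ins showing the checked vs. unchecked distinction.
public class ExceptionKinds {

    // Checked: callers must catch it or declare it.
    static class CheckedGreetingException extends Exception {
        CheckedGreetingException(String message) { super(message); }
    }

    // Unchecked: extends RuntimeException, so no declaration is required.
    static class UncheckedNameException extends RuntimeException {
        UncheckedNameException(String message) { super(message); }
    }

    // The throws clause here is mandatory; removing it is a compile error.
    static String greet(String type) throws CheckedGreetingException {
        if (!"hello".equals(type)) {
            throw new CheckedGreetingException("Valid type is hello.");
        }
        return "Hello there.";
    }

    // No throws clause needed, even though this method can throw.
    static String name(int id) {
        if (id != 1) {
            throw new UncheckedNameException("The id: " + id + " could not be found.");
        }
        return "Tom";
    }
}
```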

Controller Error Handlers

Although Spring Boot’s default exception handling is robust, there are times an application might require more customized error handling. One technique is to declare an exception-handling method in a REST controller. This is accomplished using Spring’s @ExceptionHandler annotation (javadoc).

  • Create a new simple class named GreetingError. Note that it is a POJO and does not extend Exception.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import java.util.Date;

public class GreetingError {
  private Date timestamp;
  private String message;
    
  public Date getTimestamp() {
    return timestamp;
  }
  public void setTimestamp(Date timestamp) {
    this.timestamp = timestamp;
  }
  public String getMessage() {
    return message;
  }
  public void setMessage(String message) {
    this.message = message;
  }
}
  • Modify GreetingController to have a method named nameNotFoundException that is annotated with an @ExceptionHandler annotation.
  • Implement nameNotFoundException to return a ResponseEntity<>.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import java.util.Date;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.context.request.WebRequest;


@RestController
@RequestMapping(value = "/greeting")
public class GreetingController {

  @Autowired
  protected GreetingService service;
	
  @GetMapping("/greet")
  public HelloGoodbye getGreeting(@RequestParam("type") String type, @RequestParam("id") int id) throws Exception {
    HelloGoodbye goodBye = service.createGreeting(type, id);
    return goodBye;
  }
	
  @ExceptionHandler(NameNotFoundException.class)
  public ResponseEntity<?> nameNotFoundException(NameNotFoundException ex, WebRequest request) {
    GreetingError errorDetails = new GreetingError();
    errorDetails.setTimestamp(new Date());
    errorDetails.setMessage("This is an overriding of the standard exception: " + ex.getMessage());
    return new ResponseEntity<>(errorDetails, HttpStatus.NOT_FOUND);
  }
}
  • Compile, run the application, and visit the endpoint.
http://localhost:8080/greeting/greet?type=hello&id=33

{
    "timestamp": "2019-04-01T02:14:51.744+0000",
    "message": "This is an overriding of the standard exception: The id: 33 could not be found."
}

The default error handling for NameNotFoundException is overridden in the controller. But you are not limited to one error handler per controller; you can define multiple error handlers, as in the code below.

  • Modify GreetingController to throw an arithmetic exception in getGreeting.
  • Create a new exception handler for ArithmeticException.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import java.util.Date;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.context.request.WebRequest;


@RestController
@RequestMapping(value = "/greeting")
public class GreetingController {
  @Autowired
  protected GreetingService service;
	
  @GetMapping("/greet")
  public HelloGoodbye getGreeting(@RequestParam("type") String type, @RequestParam("id") int id) throws Exception {
    int i = 0;
    int k = 22/i;
    HelloGoodbye goodBye = service.createGreeting(type, id);
    return goodBye;
  }
	
  @ExceptionHandler(NameNotFoundException.class)
  public ResponseEntity<?> nameNotFoundException(NameNotFoundException ex, WebRequest request) {
    GreetingError errorDetails = new GreetingError();
    errorDetails.setTimestamp(new Date());
    errorDetails.setMessage("This is an overriding of the standard exception: " + ex.getMessage()); 
    return new ResponseEntity<>(errorDetails, HttpStatus.NOT_FOUND);
  }
	 
  @ExceptionHandler(ArithmeticException.class)
  public ResponseEntity<?> arithmeticException(ArithmeticException ex, WebRequest request) {
    GreetingError errorDetails = new GreetingError();
    errorDetails.setTimestamp(new Date());
    errorDetails.setMessage("This is an overriding of the standard exception: " + ex.getMessage()); 
    return new ResponseEntity<>(errorDetails, HttpStatus.INTERNAL_SERVER_ERROR);
  }

}
  • Compile, run the application, and visit the REST endpoint.
{
    "timestamp": "2019-04-01T02:40:53.527+0000",
    "message": "This is an overriding of the standard exception: / by zero"
}
  • Before continuing, do not forget to remove the code that divides by zero.

The @ExceptionHandler annotation is useful for handling exceptions within a class; we used it in our controller. The handler method returned a ResponseEntity<T> (javadoc), which is a subclass of HttpEntity (javadoc). The HttpEntity wraps the actual request or response (here, the response), while the ResponseEntity adds the HttpStatus code. This allows you to return a custom response from your REST endpoint.
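Conceptually, a ResponseEntity<T> is little more than a body paired with a status code. As a rough pure-Java analogy (this sketch is not the Spring class, and it omits the headers a real HttpEntity carries):

```java
// Not the Spring class: a stripped-down analogy showing what
// ResponseEntity adds over a bare body, namely an HTTP status code.
public class SimpleResponseEntity<T> {

    private final T body;     // the payload, as in HttpEntity
    private final int status; // the status code ResponseEntity contributes

    public SimpleResponseEntity(T body, int status) {
        this.body = body;
        this.status = status;
    }

    public T getBody() { return body; }
    public int getStatus() { return status; }
}
```

The real ResponseEntity uses the HttpStatus enum rather than a raw int and also carries HTTP headers.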

Global Error Handler

The @ControllerAdvice annotation provides a way to handle exceptions across Spring controllers. It allows a method annotated with @ExceptionHandler to handle exceptions thrown anywhere in an application.

  • Create a new class named GreetingExceptionHandler.
  • Annotate it with the @ControllerAdvice annotation.
  • Copy and paste the nameNotFoundException method from the GreetingController class. Change the message text to be certain it is, in fact, being called.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import java.util.Date;

import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.context.request.WebRequest;

@ControllerAdvice
public class GreetingExceptionHandler {
  @ExceptionHandler(NameNotFoundException.class)
  public ResponseEntity<?> nameNotFoundException(NameNotFoundException ex, WebRequest request) {
    GreetingError errorDetails = new GreetingError();
    errorDetails.setTimestamp(new Date());
    errorDetails.setMessage("This is a global exception handler: " + ex.getMessage());
    return new ResponseEntity<>(errorDetails, HttpStatus.NOT_FOUND);
    }
}
  • Remove the NameNotFoundException exception handler from the GreetingController class.
package com.tutorial.exceptions.spring.rest.exceptionstutorial;

import java.util.Date;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.context.request.WebRequest;


@RestController
@RequestMapping(value = "/greeting")
public class GreetingController {
	
  @Autowired
  protected GreetingService service;
	
  @GetMapping("/greet")
  public HelloGoodbye getGreeting(@RequestParam("type") String type, @RequestParam("id") int id) throws Exception {
    HelloGoodbye goodBye = service.createGreeting(type, id);
    return goodBye;
  }
	 
  @ExceptionHandler(ArithmeticException.class)
  public ResponseEntity<?> arithmeticException(ArithmeticException ex, WebRequest request) {
    GreetingError errorDetails = new GreetingError();
    errorDetails.setTimestamp(new Date());
    errorDetails.setMessage("This is an overriding of the standard exception: " + ex.getMessage());
    return new ResponseEntity<>(errorDetails, HttpStatus.INTERNAL_SERVER_ERROR);
  }	 
}
  • Compile, run the application, and visit the REST endpoint. You receive the error created in the global handler.
http://localhost:8080/greeting/greet?type=hello&id=33

{
    "timestamp": "2019-04-06T21:21:17.258+0000",
    "message": "This is a global exception handler: The id: 33 could not be found."
}

The @ControllerAdvice annotation (Javadoc) allows an exception handler to be shared across controllers. It is useful if you wish to create uniform exception handling across multiple controllers. You can also limit @ControllerAdvice exception handling to apply only to certain controllers; for more information, refer to the Javadoc.

Conclusion

Handling Spring Boot 2 REST exceptions is both easy and difficult. It is easy because there are concrete ways to implement exception handling; moreover, even if you provide no exception handling, Spring Boot provides it by default. It is difficult because there are many different ways to implement exception handling. Spring provides so much customization, and so many different techniques, that it is sometimes easy to become lost in the details.

In this tutorial we explored three different techniques for dealing with Spring Boot 2 REST exceptions. You should refer to other tutorials before deciding that any one technique is what you should use. In the interest of full disclosure, I personally feel the @ControllerAdvice technique is the most robust, as it allows creating a unified exception-handling framework.

For more information on programming REST using Spring Boot, refer to the tutorial Rest Using Spring Boot.

Github Source

https://github.com/jamesabrannan/spring-rest-exception-tutorial

REST Using Spring Boot

In the next few posts I will be exploring implementing microservices using REST, messaging, and finally Amazon Web Services (AWS) Lambda. Although the tutorials are largely written in a step-by-step format, we also explore the underlying theory of microservice architecture. In this tutorial we explore REST using Spring Boot.

In this first tutorial, I assume Eclipse, Maven, and Postman. If you are new to Java, you are strongly encouraged to begin by first going through the book Think Java, along with my accompanying tutorials. There are also many links to excellent YouTube tutorials that accompany the step-by-step tutorials provided. If you are new to Maven and/or Eclipse, here are two introductory tutorials on Maven and Eclipse.

Let’s begin by building a simple Hello World REST endpoint using Spring Boot. Spring Boot is an easy way to create Spring applications without requiring web server installation, Spring configuration files, and the other necessities of standing up a Spring application; you can quickly create and run a Spring application. Although Spring Boot is useful for tutorials, and is used in production, if you do not understand more traditional Spring applications you should also learn the details of traditional Spring application configuration before going to a job interview. But in this and the next several tutorials, I assume Spring Boot.

  1. Create a new Maven project named rest-tutorial. If you have never created a Maven application in Eclipse, refer to this tutorial.
  2. Replace the content of pom.xml with this content.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.1.3.RELEASE</version>
    <relativePath /> <!-- lookup parent from repository -->
  </parent>
  <groupId>com.tutorial.spring</groupId>
  <artifactId>rest-tutorial</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>rest-tutorial</name>
  <properties>
    <java.version>1.8</java.version>
  </properties>
  <dependencies>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
      </plugin>
    </plugins>
  </build>
</project>
  3. Create the following two packages: com.tutorial.spring.application and com.tutorial.spring.rest in the rest-tutorial project. (how to create a package)
  4. Create the class Hello.java in the com.tutorial.spring.rest package. (how to create a class)
package com.tutorial.spring.rest;

public class Hello {

  private String greeting;
  
  public String getGreeting() {
    return greeting;
  }

  public void setGreeting(String greeting) {
    this.greeting = greeting;
  }
}
  5. Create a class with a main method named TutorialApplication in the com.tutorial.spring.application package.
  6. Add the @SpringBootApplication and @ComponentScan annotations to the class.
package com.tutorial.spring.application;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.ComponentScan;

@SpringBootApplication
@ComponentScan({ "com.tutorial.spring.rest" })
public class TutorialApplication {
  
  public static void main(String[] args) {
    SpringApplication.run(TutorialApplication.class, args);
  }
}

An annotation is metadata that provides more information to the compiler and that can be interpreted at runtime (introduction to annotations). Spring Boot uses the @SpringBootApplication annotation to signify that a class is the starting point for a Spring Boot application. It also provides default values for the @Configuration, @EnableAutoConfiguration, and @ComponentScan Spring annotations. However, notice we also use the @ComponentScan annotation explicitly because we do not wish to use the provided default value. Instead, we instruct Spring to scan for Spring classes in the com.tutorial.spring.rest package.
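As a minimal, Spring-free illustration of how annotation metadata can be interpreted at runtime, the sketch below declares a custom annotation and reads it through reflection (the Endpoint annotation and HelloHandler class are invented for this example):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class AnnotationDemo {

    // A custom annotation, retained at runtime so reflection can see it.
    // Endpoint and HelloHandler are invented for this illustration.
    @Retention(RetentionPolicy.RUNTIME)
    @interface Endpoint {
        String path();
    }

    @Endpoint(path = "/hello")
    static class HelloHandler { }

    // Read the annotation's path element reflectively, as a framework would.
    static String pathOf(Class<?> cls) {
        Endpoint endpoint = cls.getAnnotation(Endpoint.class);
        return endpoint == null ? null : endpoint.path();
    }
}
```

Spring’s component scanning works on a similar principle: it inspects classes for annotations such as @RestController and registers what it finds.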

  7. Create another class named HelloController in the com.tutorial.spring.rest package.
  8. Add the @RestController and @RequestMapping annotations to the class.
    package com.tutorial.spring.rest;
    
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RestController;
    import org.springframework.web.bind.annotation.RequestMethod;
    
    @RestController
    @RequestMapping(value="/hello")
    public class HelloController {
    
      @RequestMapping(value = "/greeting", method = RequestMethod.GET)
      public Hello greeting() {
        Hello hello = new Hello();
        hello.setGreeting("Hello there.");
        return hello;
      }
    }

The @RestController annotation serves two purposes: it defines the class as a Spring controller and as a REST endpoint (more on @RestController and @Controller). The first @RequestMapping use defines the hello endpoint, or http://localhost:8080/hello. The second defines the greeting endpoint and combines with the previous endpoint to form http://localhost:8080/hello/greeting. It defines the HTTP method as GET (more on GET and POST).

  1. Build and run the application in Eclipse. You should see log messages from the Spring Boot embedded web server in the Eclipse Console. Note the line that confirms the greeting endpoint was mapped to the method developed.
2019-02-24 16:11:52.675 INFO 3491 --- [ main] 
s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped 
"{[/hello/greeting],methods=[GET]}" onto public com.tutorial.spring.rest.Hello
com.tutorial.spring.rest.HelloController.greeting()
  1. Open a web browser and enter http://localhost:8080/hello/greeting as the address to navigate to the endpoint. You should see the following.

But a web browser is intended for results consumed by an end-user. For instance, an HTML page renders in a web browser and is visible to the viewer. A REST service, in contrast, is intended to be used by another application. Moreover, the endpoint’s purpose is to provide data, not to display it. Or,

REST, or REpresentational State Transfer, is an architectural style for providing standards between computer systems on the web, making it easier for systems to communicate with each other. REST-compliant systems, often called RESTful systems, are characterized by how they are stateless and separate the concerns of client and server. We will go into what these terms mean and why they are beneficial characteristics for services on the Web (Codecademy).

Moreover, note the displayed results. The results are displayed using JavaScript Object Notation (JSON). This notation is for easy communication between programs rather than human end-user consumption (more on JSON).

Instead of using a browser, let’s use a free tool named Postman. It is designed specifically for REST developers. Download and install Postman if you do not already have it installed. You can obtain it here. For more information on Postman refer to the many tutorials provided.

  1. Start Postman and create a new Collection named SpringMicroservicesTutorial.

  1. Create a new Request named Greeting. Ensure the method is GET and has the following URL http://localhost:8080/hello/greeting (creating a GET request).

  1. Click Send and the following should appear as the response.
{
"greeting": "Hello there."
}

Congratulations, you created your first REST web service.

API Design

REST is designed like webpages, except with computers as the consumers. Just as you would almost never navigate to a page conceptually represented by a verb, you should never represent a REST endpoint with one. Use nouns. Endpoints, like webpages, are resources.

REST endpoints are also designed to be layered into hierarchies, the same as webpages. Suspend disbelief and let’s assume we have a simple endpoint to a dogfood and catfood database. Obviously, the example is greatly simplified.

The endpoint is food. The API provides information on food for two species: dogs and cats. Each species has thousands of possible breeds. And each food has three possible sizes: small, medium, and large.

For purposes of illustration, we choose to make dog and cat two separate endpoints. And because there are thousands of potential breeds, we make the breed what is called a path parameter. Finally, we choose a query parameter to represent the food size.

Path and Query Parameters

Parameters that are part of an endpoint path are termed path parameters. They are distinguished by curly braces. For example,

http://www.president.com/{president}

represents an endpoint where the client replaces {president} with the name of a specific president.

A query string parameter is represented after the endpoint path by using a question mark to offset the query string.

http://ww.president.com/{president}?age=<age>
  1. Create a new com.tutorial.spring.rest.petfood package. Create three new classes named Food, DogFood, and CatFood. Create enumerations for the species and size properties.
package com.tutorial.spring.rest.petfood;

public class Food {

  public enum Species {DOG, CAT}
  public enum Size {SMALL, MEDIUM, LARGE}

  private String brand;
  private Double price;
  private Size size;
  private Species species;
  private String enteredBreed;

  public String getEnteredBreed() {
    return enteredBreed;
  }
  public void setEnteredBreed(String enteredBreed) {
    this.enteredBreed = enteredBreed;
  }
  public Species getSpecies() {
    return species;
  }
  public void setSpecies(Species species) {
    this.species = species;
  }
  public Size getSize() {
    return size;
  }
  public void setSize(Size size) {
    this.size = size;
  }
  public String getBrand() {
    return brand;
  }
  public void setBrand(String brand) {
    this.brand = brand;
  }
  public Double getPrice() {
    return price;
  }
  public void setPrice(Double price) {
    this.price = price;
 }
}
package com.tutorial.spring.rest.petfood;

public class DogFood extends Food {
}
package com.tutorial.spring.rest.petfood;

public class CatFood extends Food {
}
  1. Create a class named FoodController. Provide it with a @RestController annotation and a top-level @RequestMapping for the /food endpoint.
  2. Create two endpoint methods, one for the /dog endpoint and one for the /cat endpoint.
  3. Add the {breed} path parameter to both methods.
package com.tutorial.spring.rest.petfood;

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping(value = "/food")
public class FoodController {

  @RequestMapping(value = "/dog/{breed}", method = RequestMethod.GET)
  public List<DogFood> dogFoodrecommendation(@PathVariable String breed, 
       @RequestParam("size") Food.Size size){
    return null; // stub so the class compiles; replaced once FoodService exists
  }

  @RequestMapping(value = "/cat/{breed}", method = RequestMethod.GET)
  public List<CatFood> catFoodrecommendation(@PathVariable String breed,
       @RequestParam("size") Food.Size size){
    return null; // stub so the class compiles; replaced once FoodService exists
  }
}

The {breed} combined with the @PathVariable annotation is how you define a path parameter. The @RequestParam annotation is how you define a parameter bound to a query parameter.

  1. Create a new class named FoodService and add simple methods to create a list containing cat food and a list containing dog food.
  2. Annotate the class with the @Service annotation.
package com.tutorial.spring.rest.petfood;

import java.util.ArrayList;
import java.util.List;
import org.springframework.stereotype.Service;
import com.tutorial.spring.rest.petfood.Food.Size;
import com.tutorial.spring.rest.petfood.Food.Species;

@Service
public class FoodService {

  public List<CatFood> createCatFood(String breed, Size size)
  {
    List<CatFood> food = new ArrayList<CatFood>();
    CatFood a = new CatFood();
    a.setBrand("Purina");
    a.setPrice(13.12);
    a.setSize(size);
    a.setSpecies(Species.CAT);
    a.setEnteredBreed(breed);
    food.add(a);

    CatFood b = new CatFood();
    b.setBrand("Science Diet");
    b.setPrice(10.00);
    b.setSize(size);
    b.setSpecies(Species.CAT);
    b.setEnteredBreed(breed);
    food.add(b);

    return food;
  }

  public List<DogFood> createDogFood(String breed, Size size)
  {
    List<DogFood> food = new ArrayList<DogFood>();

    DogFood a = new DogFood();
    a.setBrand("Purina");
    a.setPrice(33.22);
    a.setSize(size);
    a.setSpecies(Species.DOG);
    a.setEnteredBreed(breed);
    food.add(a);

    DogFood b = new DogFood();
    b.setBrand("Science Diet");
    b.setPrice(12.22);
    b.setSize(size);
    b.setSpecies(Species.DOG);
    b.setEnteredBreed(breed);
    food.add(b);

    return food;
  }
}

The @Service annotation causes the class to be detected when Spring performs classpath scanning. Spring registers the class as a service. An instance of the class is then available to other classes without explicit instantiation through a constructor. You then auto-wire it into other classes using the @Autowired annotation.

  1. Modify the FoodController class to auto-wire the FoodService class.
  2. Modify the two methods to call the service classes createDogFood and createCatFood methods respectively.
package com.tutorial.spring.rest.petfood;

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping(value = "/food")
public class FoodController {

  @Autowired
  private FoodService foodService;

  @RequestMapping(value = "/dog/{breed}", method = RequestMethod.GET)
  public List<DogFood> dogFoodrecommendation(@PathVariable String breed, 
                                   @RequestParam("size") Food.Size size){
    return foodService.createDogFood(breed, size);
  }

  @RequestMapping(value = "/cat/{breed}", method = RequestMethod.GET)
  public List<CatFood> catFoodrecommendation(@PathVariable String breed, 
                                   @RequestParam("size") Food.Size size){
    return foodService.createCatFood(breed, size);
  }
}

The @Autowired annotation marks an object as automatically injected using Spring’s dependency injection. The FoodService class is annotated with @Service, and so is loaded by Spring and subsequently injected into the FoodController instance. The FoodController instance can then use the FoodService instance as if it had explicitly created it using a constructor.

  1. Create two new GET requests in Postman. One for cat food and one for dog food.
  2. For the cat endpoint enter tiger as breed and LARGE as size.

  1. For the dog endpoint enter chihuahua as breed and SMALL as size.

What is returned is an array of cat foods and an array of dog foods (JSON arrays).

In this tutorial we explored REST using Spring Boot. We used Spring Boot to create several REST endpoints. You learned a little of the reasoning behind the idea of a RESTful API.

Java Streams – A Simple MapReduce Example

In this tutorial, you convert a List of Strings to a List of Integers using the MapReduce programming paradigm inherent in Java Streams. Every class that implements the java.util.Collection interface has a stream method. This method converts the collection into a Stream.

Java Streams are a much more convenient and efficient way to work with collections using functional programming. Rather than beginning by describing functional programming, I begin by showing you why you might consider incorporating it into your everyday coding, here through a simple example using the MapReduce programming paradigm. Let’s jump in with the example, and then return to the theory of Java Streams and MapReduce after completing it.

A good overview of Java Streams on YouTube that I would recommend watching prior to completing this tutorial is Java Streams Filter, Map, Reduce by Joe James.

  1. Open Eclipse and create a new Java project. Name the project functional.
  2. Create a top-level package by right-clicking on the src folder, selecting New, and then Package from the menu.
  3. Name the package com.functional and note the created package structure.
  4. Create a new class named MapReduceExample in the com.functional package. Do not forget to add a main method to the class.
  5. Create a static method named oldWay that takes a List of Strings and returns an Integer List. Be certain to import the java.util.List package.
public static List<Integer> oldWay(List<String> stringValues){
}
  1. Create a List variable named convertedList and initialize it as an ArrayList. Import the java.util.ArrayList package.
List<Integer> convertedList = new ArrayList<>();
  1. Create a for loop that iterates over the stringValues List, converts each element, adds the converted element to the convertedList variable and then returns the converted list.
public static List<Integer> oldWay(List<String> stringValues){
     List<Integer> convertedList = new ArrayList<>();
     for(String theString:stringValues) {
          Integer val = Integer.parseInt(theString);
          convertedList.add(val);
     }
     return convertedList;
}
  1. Create another static method named sumOldWay that takes an Integer List, sums them, and returns the result.
public static Integer sumOldWay(List<Integer> values) {
     Integer retVal = 0;
     for(Integer val:values) {
          retVal+=val;
     }
     return retVal;
}
  1. In the main method:
    1. create a list of Strings using the Arrays.asList method,
    2. assign the list to a variable named inValues,
    3. convert them to an Integer List using the oldWay static method,
    4. sum the Integer List,
    5. and print the value.
public static void main(String[] args) {
     List<String> inValues = Arrays.asList("1","2","3","4","5","6","7");
     List<Integer> outValues = MapReduceExample.oldWay(inValues);
     Integer finalValue = MapReduceExample.sumOldWay(outValues);
     System.out.println(finalValue);
}
  1. Run the program and 28 is printed to the console.  The following is the complete program.
package com.functional;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class MapReduceExample {
     public static List<Integer> oldWay(List<String> stringValues){
          List<Integer> convertedList = new ArrayList<>();
          for(String theString:stringValues) {
               Integer val = Integer.parseInt(theString);
               convertedList.add(val);
          }
          return convertedList;
     }

     public static Integer sumOldWay(List<Integer> values) {
          Integer retVal = 0;
          for(Integer val:values) {
               retVal+=val;
          }
          return retVal;
     }

     public static void main(String[] args) {
          List<String> inValues = Arrays.asList("1","2","3","4","5","6","7");
          List<Integer> outValues = MapReduceExample.oldWay(inValues);
          Integer finalValue = MapReduceExample.sumOldWay(outValues);
          System.out.println(finalValue);
     }
}

  1. Now let’s rewrite this program using the MapReduce programming paradigm. Specifically, we use the Java Stream interface. The complete code follows.
package com.functional;

import java.util.Arrays;
import java.util.List;

public class MapReduceExample {
  public static void main(String[] args) {
    List<String> inValues = Arrays.asList("1","2","3","4","5","6","7");
    Integer finalValue2 = inValues.stream().mapToInt(num->Integer.parseInt(num)).sum();
    System.out.println(finalValue2);
  }
}
  1. Run the program and 28 is printed to the console.

The mapToInt method takes a lambda expression as the function it applies to the list elements. The mapToInt method returns an IntStream.  The IntStream’s sum method is a reducer, as it reduces the elements to a single Integer value.
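Because mapToInt yields an IntStream, other numeric terminal reducers are available besides sum. A minimal sketch under stated assumptions — the class and method names here are mine, not part of the tutorial code:

```java
import java.util.Arrays;
import java.util.IntSummaryStatistics;
import java.util.List;

public class IntStreamReducers {

  // Reduce a list of numeric strings in one pass over an IntStream.
  public static IntSummaryStatistics stats(List<String> values) {
    return values.stream()
        .mapToInt(Integer::parseInt)   // String -> int, yielding an IntStream
        .summaryStatistics();          // terminal reducer: count, sum, min, max, average
  }

  public static void main(String[] args) {
    IntSummaryStatistics s = stats(Arrays.asList("1", "2", "3", "4", "5", "6", "7"));
    System.out.println(s.getSum() + " " + s.getMax() + " " + s.getAverage()); // prints 28 7 4.0
  }
}
```

IntStream also exposes sum, max, min, and average individually; summaryStatistics simply computes them all in a single pass.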

  1. Replace the List creation with the Stream.of method.
  2. Rewrite the function using the Stream’s map function and reduce function.
public static void main(String[] args) {
  Stream<String> myStream =  Stream.of("1","2","3","4","5","6","7");
  Integer finalValue = myStream.map(num->Integer.parseInt(num)).reduce(0, 
       Integer::sum);
  System.out.println(finalValue);
}
  1. Run the program and 28 is printed to the console.

The map function takes a lambda function that is applied to the stream’s elements. The result is a Stream of Integers. The reduce method then applies the provided lambda function to reduce the stream to a single value, here an Integer containing the sum of the values. Integer::sum is called an accumulator because it accumulates the values. Note that :: is a method reference telling the compiler to use the sum method from Integer.

  1. Rewrite the function, but instead of using the :: method reference, provide a different lambda expression to the map method.
  2. Change the sum method to the reduce method as follows.
public static void main(String[] args) {
  Stream<String> myStream =  Stream.of("1","2","3","4","5","6","7");
  Integer finalValue = myStream.map(num->Integer.parseInt(num)).reduce(0, 
        (x,y) -> x+y);
  System.out.println(finalValue);
}
  1. Run the program and 28 is printed to the console.

Note that in the above code we used Stream.of rather than creating a data structure and then streaming it. Remember, a Stream is not a data structure and does not modify the underlying data source; the Stream streams the elements of the underlying collection. We could have also used the Stream.builder method to create a stream.

Mapping

The mapToInt and map Stream methods are mapping operations. The map function applies the supplied function to a stream’s elements to convert into a stream of a different type. For instance,

myStream.map(num->Integer.parseInt(num))

converts the stream, myStream, that contains Strings to a stream containing Integers. It does this using the mapper. A mapper is a stateless lambda expression applied to each of a stream’s elements.

num->Integer.parseInt(num)

The mapToInt method returns an IntStream. Other mapping methods include mapToLong and mapToDouble, as well as flatMap, flatMapToInt, flatMapToLong, and flatMapToDouble. The flatMap method is covered in another post and is not discussed here.

Lambda Expressions

A lambda expression is a function that is not tied to a class. It can be passed to methods as if it were an object, and it can be executed upon demand. A lambda expression’s syntax is as follows,

lambda operator -> body

The lambda operator can contain zero or more parameters. Lambda expressions are covered in a later tutorial. However, note here that the following two expressions are lambda expressions.

num->Integer.parseInt(num)   // apply the parseInt method to num and return result.
(x,y) -> x+y    // supply x, and y and return the result.

The first expression parses the integer value of the supplied element. The second expression takes two elements and sums them. Note how the reduce method applies it repeatedly: the first argument, x, holds the running sum, while the second argument, y, is the next element of the stream. So with each iteration x accumulates the total while the value of y varies according to the current element.
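The first argument to reduce, the identity, seeds the running value, which matters when you swap in a different accumulator. A quick sketch (the class and method names are mine): a product reduction must seed with 1, since seeding with 0 would always yield 0.

```java
import java.util.Arrays;
import java.util.List;

public class ReduceIdentity {

  // reduce(identity, accumulator): the identity seeds the running value,
  // so a product must start at 1 - starting at 0 would always yield 0.
  public static Integer product(List<String> values) {
    return values.stream()
        .map(Integer::parseInt)
        .reduce(1, (x, y) -> x * y);
  }

  public static void main(String[] args) {
    System.out.println(product(Arrays.asList("1", "2", "3", "4"))); // prints 24
  }
}
```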

Filters

Filters are a convenient way to remove unwanted values. The Stream interface declares a filter method that applies a predicate to a Stream and returns a Stream of only the elements that match the predicate.  A predicate is a functional method that returns true or false.

  1. Add a new element to the Strings with the value “Ralph.”
    public static void main(String[] args) {
      Stream<String> myStream = Stream.of("1","2","3","4","5","6","7","Ralph");
      Integer finalValue = myStream.map(num->Integer.parseInt(num))
         .reduce(0, (x,y) -> x+y);
      System.out.println(finalValue);
    }
  2. Run the program and note the exception. This is obviously because “Ralph” cannot be parsed into an integer.
Exception in thread "main" java.lang.NumberFormatException: For input string: "Ralph"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:580)
at java.lang.Integer.parseInt(Integer.java:615)
at com.functional.MapReduceExample.lambda$0(MapReduceExample.java:33)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:474)
at com.functional.MapReduceExample.main(MapReduceExample.java:33)
  1. Add the filter method to myStream before the map method to filter any non-numeric strings from the resulting Stream.
public static void main(String[] args) {
  Stream<String> myStream = Stream.of("1","2","3","4","5","6","7","Ralph");
  Integer finalValue = myStream.filter(x->x.matches("-?\\d+(\\.\\d+)?"))
        .map(num->Integer.parseInt(num)).reduce(0, (x,y) -> x+y);
  System.out.println(finalValue);
}
  1. Run the program and 28 is printed to the console.

PipeLining

A Stream is immutable and cannot be modified. For intermediate methods, the result of each processing step is a new Stream with the transformation applied. This allows convenient transformation “pipelining.”

Each function applied to a Stream returns a new Stream. This allows chaining the operations together into a series of processing steps. There are two types of transformations when processing a Stream, intermediate and terminal operations. An intermediate operation returns another Stream. A terminal operation returns a final value, terminating the pipeline.

  1. Modify the program by adding another map transformation.
public static void main(String[] args) {
  Stream<String> myStream = Stream.of("1","2","3","4","5","6","7","Ralph");
  Integer finalValue = myStream.filter(x->x.matches("-?\\d+(\\.\\d+)?"))
    .map(num->Integer.parseInt(num)).map(x->x*2).reduce(0, (x,y) -> x+y);
  System.out.println(finalValue);
}
  1. Run the program and 56 is printed to the console.

You can chain as many intermediate methods together as needed to form a processing pipeline. Note that intermediate operations that reduce a stream’s size should be executed before operations applied to each element. For instance, it makes little sense to perform the following,

myStream.map(x->x*2).filter(x->x%2==0)

as you would multiply every number in the stream by 2 only to take the resultant stream and halve its size by discarding odd numbers.
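You can see the difference by counting how often the mapper runs under each ordering. A small sketch — the class and method names are mine, chosen only for illustration:

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.IntStream;

public class FilterFirst {

  // Filtering first halves the stream, so the mapper runs only 5 times.
  public static int mapCallsFilterFirst() {
    AtomicInteger calls = new AtomicInteger();
    IntStream.rangeClosed(1, 10)
        .filter(x -> x % 2 == 0)
        .map(x -> { calls.incrementAndGet(); return x * 2; })
        .sum();
    return calls.get();
  }

  // Mapping first runs the mapper on all 10 elements before any are discarded.
  public static int mapCallsMapFirst() {
    AtomicInteger calls = new AtomicInteger();
    IntStream.rangeClosed(1, 10)
        .map(x -> { calls.incrementAndGet(); return x * 2; })
        .filter(x -> x % 2 == 0)
        .sum();
    return calls.get();
  }

  public static void main(String[] args) {
    System.out.println(mapCallsFilterFirst() + " vs " + mapCallsMapFirst()); // prints 5 vs 10
  }
}
```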

Collectors

Sometimes you do not wish to reduce a stream to a single variable. Instead, you might wish to transform a collection to another collection, performing processing steps along the way. An easy way to collect a stream into a collection is through Collectors. Let’s consider a typical data processing task developers face daily.

  1. Create a new class named Widget and provide an id and a color property of the enum type Color.
  2. Create an enumeration for Color.
    package com.functional;
    enum Color {red,blue,green,yellow,orange};
    public class Widget {
      private int id;
      private Color color;
    
      public int getId() { return this.id;}
      public Color getColor() {return this.color;}
    }
  3. Create a constructor that takes an int and Color as parameters.
package com.functional;

enum Color {red,blue,green,yellow,orange};

public class Widget {
  private int id;
  private Color color;

  public int getId() { return this.id;}
  public Color getColor() {return this.color;}

  public Widget(int id, Color color) {
    this.id = id;
    this.color = color;
  }
}

Suspend disbelief and assume the Widget class represents a business entity in your software. In a typical program, much code is written dedicated to storing multiple instances of an object in a collection, iterating over the collection’s elements, transforming them, and aggregating the results into another collection.

  1. Add a method to Widget named getRedIds that returns a list of ids for red widgets. The code should look familiar; certainly, you have written code like this countless times.
public static List<Integer> getRedIds(List<Widget> widgets){
  List<Integer> ids = new ArrayList<>();
  for(Widget aWidget:widgets) {
    if(aWidget.color == Color.red) {
      ids.add(aWidget.id);
    }
  }
  return ids;
}
  1. Create a main method with five Widget instances added to an ArrayList. Pass the list to the getRedIds method, and print the results.
public static void main(String[] args) {
  List<Widget> widgets = new ArrayList<>();
  widgets.add(new Widget(1, Color.red));
  widgets.add(new Widget(2, Color.blue));
  widgets.add(new Widget(3, Color.green));
  widgets.add(new Widget(4, Color.red));
  widgets.add(new Widget(5, Color.red));
  List<Integer> ids = Widget.getRedIds(widgets);
  System.out.println(ids);
}
  1. Run the program and the string [1, 4, 5] is printed to the console.

The above is typical boilerplate code, familiar to most developers. Again, suspend disbelief and focus on the processing and not the reality of the business object. But armed with our acquired functional programming knowledge we can discard the getRedIds method and replace it with a single line of code.

  1. Add the following two lines to the end of the main method.
ids = widgets.stream().filter(x->x.getColor()==Color.red).map(x->x.getId())
     .collect(Collectors.toList());
System.out.println(ids);
  1. Run the program and the following two lines are printed to the console.
[1, 4, 5]

[1, 4, 5]

The complete class follows.

package com.functional;

import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

enum Color {red,blue,green,yellow,orange};

public class Widget {

  private int id;
  private Color color;

  public Widget(int id, Color color)
  {
    this.id = id;
    this.color = color;
  }

  public int getId() { return this.id;}
  public Color getColor() {return this.color;}

  public static List<Integer> getRedIds(List<Widget> widgets){
    List<Integer> ids = new ArrayList<>();
    for(Widget aWidget:widgets){
      if(aWidget.color == Color.red) {
        ids.add(aWidget.id);
      }
    }
    return ids;
  }

  public static void main(String[] args) {
    List<Widget> widgets = new ArrayList<>();
    widgets.add(new Widget(1, Color.red));
    widgets.add(new Widget(2, Color.blue));
    widgets.add(new Widget(3, Color.green));
    widgets.add(new Widget(4, Color.red));
    widgets.add(new Widget(5, Color.red));
    List<Integer> ids = Widget.getRedIds(widgets);
    System.out.println(ids);

    ids = widgets.stream().filter(x->x.getColor()==Color.red).map(x->x.getId())
            .collect(Collectors.toList());
    System.out.println(ids);
  }
}

The terminal method is the stream’s collect method. We provide this method the Collectors.toList collector, which gathers the stream’s elements into a new list.
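Collectors offers more than toList. As an aside, here is a sketch using Collectors.groupingBy to collect a stream into a Map; the example class and data are mine, not part of the tutorial:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupingExample {

  // Collectors.groupingBy buckets elements by a key; here, a word's first letter.
  public static Map<Character, List<String>> byFirstLetter(List<String> words) {
    return words.stream()
        .collect(Collectors.groupingBy(w -> w.charAt(0)));
  }

  public static void main(String[] args) {
    Map<Character, List<String>> grouped =
        byFirstLetter(Arrays.asList("red", "blue", "green", "ruby"));
    System.out.println(grouped.get('r')); // prints [red, ruby]
  }
}
```

The same idea applied to the Widget example would let you group widgets by Color in one statement instead of writing a loop per color.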

forEach and Consumer

The forEach method is a useful terminal operation that you can use to apply a lambda function to all elements in a stream.

  1. Create a new class named ForEachExample, be certain to add a main method.
  2. Add a new method named addTen to the class that takes an Integer, adds ten to it, and returns the result.
package com.functional;

import java.util.Arrays;
import java.util.List;

class ForEachExample {
  public Integer addTen(Integer val) {
    val+=10; 
    return val;
  }
  public static void main(String[] args) {
  }
}
  1. In main, create a ForEachExample instance, and a list of Integers.
  2. Stream the list and create a forEach statement and supply it with a lambda expression that calls the addTen method and then prints the results.
  3. Stream the list again and print each element, just to prove that the integers in values are truly immutable.
package com.functional;

import java.util.Arrays;
import java.util.List;

class ForEachExample {

  public Integer addTen(Integer val) {
    val+=10;
    return val;
  }

  public static void main(String[] args) {
    ForEachExample example = new ForEachExample();
    List<Integer> values = Arrays.asList(1, 2, 3, 4, 5);
    values.stream().forEach((x)->{System.out.println(example.addTen(x));});
    values.stream().forEach(System.out::println);
  }
}
  1. Run the program and the following is printed to the console.
11
12
13
14
15
1
2
3
4
5

The code,

(x)->{System.out.println(example.addTen(x));}

is a lambda expression. The actual argument for forEach is a Consumer. A consumer is a functional interface that allows you to define a lambda expression to apply to the input but returns no value.

  1. Modify the main method by removing the lambda function from forEach and creating a new Consumer instance.
  2. Supply the forEach with the consumer instance.
  3. Run the program and the results are the same as before.
public static void main(String[] args) {
  ForEachExample example = new ForEachExample();
  List<Integer> values = Arrays.asList(1, 2, 3, 4, 5);
  Consumer<Integer> action = x->{System.out.println(example.addTen(x));};
  values.stream().forEach(action);
  values.stream().forEach(System.out::println);
}

In practice, you rarely need to create a Consumer and then pass it to the forEach method. But you could if you had a complex lambda expression, although in that situation I would personally create a separate method.
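If you do build a Consumer explicitly, the interface’s andThen default method lets you chain a second action onto it. A minimal sketch — the class and method names are mine:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.function.Consumer;

public class ConsumerChain {

  // andThen chains two Consumers: the first records each value, the second
  // prints it. Each accepts a value and returns nothing.
  public static List<Integer> applyChained(List<Integer> values) {
    List<Integer> seen = new ArrayList<>();
    Consumer<Integer> record = seen::add;
    Consumer<Integer> print = System.out::println;
    values.stream().forEach(record.andThen(print));
    return seen;
  }

  public static void main(String[] args) {
    System.out.println(applyChained(Arrays.asList(1, 2, 3)));
  }
}
```

Both consumers run once per element, in order, which is occasionally handy for logging alongside another side effect.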

Conclusion

In this tutorial, we explored how Java Streams simplify working with Collections. Be aware that lambdas, Streams, and functional programming are a rich and complex topic. However, like Java generics, integrating these concepts into your everyday coding does not require deep mastery of the topic. As this tutorial demonstrates, integrating Java Streams into your everyday coding allows you to write more concise code that is easier to read and test. The Stream’s MapReduce programming paradigm allows you to replace entire methods of boilerplate code with a single line of code. That line can contain as many intermediate transformations as necessary.

If new to Java, consider the book Think Java by Allen Downey and Chris Mayfield. Also consider supplementary material presented on this website (Java tutorial).

Algorithms and Functional Decomposition With Honey Boo Boo

 

In a previous post I presented algorithms and functional decomposition. Here, we explore how to write an algorithm and then create working Java code from that algorithm. This post’s purpose is not to present a realistic computer program, and it omits many fundamental programming topics. These omissions are by design. Instead, suspend disbelief and focus on the simple process of turning a task into a computer program’s skeletal structure. Here we use functional decomposition to go from a general real-world task to the beginnings of a computer program.*see below

Sketti – Redneck Bolognese

A recipe I remember from my childhood, sketti, went viral when Mama June made sketti on The Learning Channel’s show Here Comes Honey Boo Boo. The recipe is simple: spaghetti noodles, hamburger, ketchup, and butter. But let Mama June demonstrate.

Mama June makes sketti by frying hamburger, boiling spaghetti noodles, microwaving a mixture of ketchup and margarine in a bowl, and then combining the ingredients in the pot used to boil the noodles (after draining the water first).

Observation to Algorithm

When you have a process you wish to turn into a computer program, do not begin by writing code. You should first formalize the tasks involved. Start by writing the steps involved in making sketti. Focus on the broad steps, not the details.

  1. gather ingredients
  2. boil noodles
  3. fry hamburger
  4. mix ingredients
  5. microwave sauce mixture
  6. combine ingredients
  7. serve and enjoy

These are the seven steps I get from observing Mama June make sketti. But these steps are much too broad, so let’s decompose them into more detailed sub-steps. As I’m a redneck, I wanna keep it realistic – so let’s get the dishes from a dish-drainer rack rather than the cupboard. Oh, it’s also a fridge, not a refrigerator – details are important.

  1. gather ingredients
    1. get hamburger from fridge
    2. get noodles from cabinet
    3. get pot from rack
    4. get pan from rack
    5. get microwavable bowl from rack
    6. get ketchup from fridge
    7. get butter from fridge
  2. boil noodles
    1. put water in pot
    2. bring water to a boil
    3. add noodles
    4. cook noodles
    5. throw three noodles against the cabinet
    6. drain noodles
  3. fry hamburger
    1. put meat in pan
    2. fry meat
    3. drain fat (optional)
  4. make sauce
    1. put margarine in bowl
    2. put ketchup in bowl
    3. mix
    4. microwave sauce
  5. combine ingredients
    1. add meat to noodles
    2. add sauce to noodles
    3. stir

And there you have a top-level outline detailing making sketti. We can now use this outline to create the beginnings of a working Java program.

Algorithm to Java Code

Now let’s use the outline above to write the skeleton of a simple Java program built from simple methods that do not return values. Again, suspend disbelief; the purpose here is to demonstrate translating an outline into a simple program.

First I create a simple class named Sketti with a main method.

public class Sketti {
	public static void main(String[] args) {
	}
}

I then create methods from the top-level outline steps. Note that I am adding no logic to perform these tasks yet.

public class Sketti {
	public static void main(String[] args) {
	}
	public static void gatherIngredients(){	
	}
	public static void boilNoodles(){
	}
	public static void fryHamburger(){
	}
	public static void makeSauce(){
	}
	public static void combineIngredients(){
	}
}

Now, one of my pet peeves is placing logic in the main method. The main method is the program’s entry point; it should not be used for any logic. Therefore, I add a top-level method named makeSketti. This method will orchestrate the sketti-making tasks.

public class Sketti {
	public static void main(String[] args) {
	}
	public static void gatherIngredients(){	
	}
	public static void boilNoodles(){
	}
	public static void fryHamburger(){
	}
	public static void makeSauce(){
	}
	public static void combineIngredients(){
	}
	public static void makeSketti(){
	}
}

Now that I have the primary tasks, I can orchestrate these tasks into making sketti.

public class Sketti {
	public static void main(String[] args) {
		makeSketti();
	}
	public static void gatherIngredients(){
	}
	public static void boilNoodles(){
	}
	public static void fryHamburger(){
	}
	public static void makeSauce(){
	}
	public static void combineIngredients(){
	}
	public static void makeSketti(){
		gatherIngredients();
		boilNoodles();
		fryHamburger();
		makeSauce();
		combineIngredients();
	}
}

I like to start with a running program, no matter how simplistic, therefore let’s add println statements to each function.

public class Sketti {
	public static void main(String[] args) {
		makeSketti();
	}
	public static void gatherIngredients(){
		System.out.println("gatherIngredients");
	}
	public static void boilNoodles(){
		System.out.println("boilNoodles");
	}
	public static void fryHamburger(){
		System.out.println("fryHamburger");
	}
	public static void makeSauce(){
		System.out.println("makeSauce");
	}
	public static void combineIngredients(){
		System.out.println("combineIngredients");
	}
	public static void makeSketti(){
		System.out.println("making sketti");
		gatherIngredients();
		boilNoodles();
		fryHamburger();
		makeSauce();
		combineIngredients();
		System.out.println("done making sketti");
	}
}

When run, the program writes the following output.

making sketti
gatherIngredients
boilNoodles
fryHamburger
makeSauce
combineIngredients
done making sketti

Now create methods for each sub-step involved in gathering ingredients. Then have the gatherIngredients method orchestrate its sub-steps. The following code illustrates.

	public static void gatherIngredients(){
		System.out.println("gatherIngredients");
		getBurger();
		getNoodles();
		getPot();
		getPan();
		getBowl();
		getKetchup();
		getButter();
	}

	public static void getBurger(){
		System.out.println("\tgetBurger");
	}

	public static void getNoodles(){
		System.out.println("\tgetNoodles");
	}

	public static void getPot(){
		System.out.println("\tgetPot");
	}

	public static void getPan(){
		System.out.println("\tgetPan");
	}

	public static void getBowl(){
		System.out.println("\tgetBowl");
	}

	public static void getKetchup(){
		System.out.println("\tgetKetchup");
	}

	public static void getButter(){
		System.out.println("\tgetButter");
	}

Repeat creating sub-steps and orchestration for each major step.

public class Sketti {
	public static void main(String[] args) {
		makeSketti();
	}
	public static void gatherIngredients(){
		System.out.println("gatherIngredients");
		getBurger();
		getNoodles();
		getPot();
		getPan();
		getBowl();
		getKetchup();
		getButter();
	}
	public static void getBurger(){
		System.out.println("\tgetBurger");
	}
	public static void getNoodles(){
		System.out.println("\tgetNoodles");
	}
	public static void getPot(){
		System.out.println("\tgetPot");
	}
	public static void getPan(){
		System.out.println("\tgetPan");
	}
	public static void getBowl(){
		System.out.println("\tgetBowl");
	}
	public static void getKetchup(){
		System.out.println("\tgetKetchup");
	}
	public static void getButter(){
		System.out.println("\tgetButter");
	}
	public static void boilNoodles(){
		System.out.println("boilNoodles");
		pourWater();
		boilWater();
		addNoodles();
		cookNoodles();
		throwNoodles();
		drainNoodles();
	}
	public static void pourWater(){
		System.out.println("\tpourWater");
	}
	public static void boilWater(){
		System.out.println("\tboilWater");
	}
	public static void addNoodles(){
		System.out.println("\taddNoodles");
	}
	public static void cookNoodles(){
		System.out.println("\tcookNoodles");
	}
	public static void throwNoodles(){
		System.out.println("\tthrowNoodles");
	}
	public static void drainNoodles(){
		System.out.println("\tdrainNoodles");
	}
	public static void fryHamburger(){
		System.out.println("fryHamburger");
		placeMeat();
		fryMeat();
		drainFat();
	}
	public static void placeMeat(){
		System.out.println("\tplaceMeat");
	}
	public static void fryMeat(){
		System.out.println("\tfryMeat");
	}
	public static void drainFat(){
		System.out.println("\tdrainFat");
	}
	public static void makeSauce(){
		System.out.println("makeSauce");
		addMargarine();
		addKetchup();
		mix();
		microwaveSauce();
	}
	public static void addMargarine(){
		System.out.println("\taddMargarine");
	}
	public static void addKetchup(){
		System.out.println("\taddKetchup");
	}
	public static void mix(){
		System.out.println("\tmix");
	}
	public static void microwaveSauce(){
		System.out.println("\tmicrowave");
	}
	
	public static void combineIngredients(){
		System.out.println("combineIngredients");
		addMeat();
		addSauce();
		stir();
	}
	
	public static void addMeat(){
		System.out.println("\taddMeat");
	}
	public static void addSauce(){
		System.out.println("\taddSauce");
	}
	
	public static void stir(){
		System.out.println("\tstir");
	}
	
	public static void makeSketti(){
		System.out.println("Mama June's making sketti");
		System.out.println("=========================");
		gatherIngredients();
		boilNoodles();
		fryHamburger();
		makeSauce();
		combineIngredients();
		System.out.println("done making sketti");
	}
}

When you run the program, you obtain the following output.

Mama June's making sketti
=========================
gatherIngredients
	getBurger
	getNoodles
	getPot
	getPan
	getBowl
	getKetchup
	getButter
boilNoodles
	pourWater
	boilWater
	addNoodles
	cookNoodles
	throwNoodles
	drainNoodles
fryHamburger
	placeMeat
	fryMeat
	drainFat
makeSauce
	addMargarine
	addKetchup
	mix
	microwave
combineIngredients
	addMeat
	addSauce
	stir
done making sketti

And there you have it: you have used functional decomposition to write a simple program that makes sketti. Admittedly, this is not a realistic program, but it illustrates functional decomposition.

Conclusion

Computer programming is problem solving. Computers require instructions to do anything interesting, and those instructions must be detailed. Think of writing a computer program to perform some task as similar to writing a term paper. Before writing, you decompose your paper’s topics into an outline, and that outline becomes your term paper’s structure. The same is true for programming: your outline becomes your program’s skeleton.

If you strip functional decomposition from all the other tasks involved in writing a computer program, writing a reasonably detailed algorithm suddenly becomes remarkably simple. Writing algorithms was once considered a fundamental programming skill. Somehow, this skill has gotten lost in the newer generation of programmers and the plethora of coding boot-camps that teach syntax but not programming. But if you learn this one simple skill, it will place you in the top tier of computer programmers. Your problem-solving ability and systematic thinking in other endeavors will also improve. You just might start applying algorithmic thinking to your everyday life.

  • This post is geared towards students reading the book Think Java through chapter four. It purposely avoids Object-Oriented design, variable passing, and other topics so as to focus solely on functional decomposition.

Algorithms – Computer Programming’s Foundation

To many, computer programming seems an arcane topic, particularly when first introduced. Students learn variables, constants, conditionals, control flow, and numerous other topics. What is often lost – especially in these neophyte-to-guru-in-twelve-weeks-or-less boot-camps – is what actually underlies all the arcane symbols comprising a computer language.*see note below

“Computer science is no more about computers than astronomy is about telescopes.”
Edsger W. Dijkstra


Computer science, and by extension computer programming, is about problem solving. By itself, a computer is a particularly dumb conglomeration of silicon, wires, and aluminum. It does nothing but store and transport electrical charges internally from one location to another. It is not until a human, through his or her intellectual labors, harnesses the flow of those charges to perform some task that a computer does anything particularly interesting. But getting computers to do something interesting requires a systematic approach: every step must be specified in detail, and you must ensure no steps are overlooked. You must devise a detailed procedure for the computer to follow to perform a task. That is computer programming’s essence: writing procedures that dictate how a computer solves a particular task.

So if computer science is not necessarily computer programming, then what exactly is computer science? Computer programming is an element of computer science, but it is only one element. The following video by “Art of the Problem” on YouTube explains.

Computer science is the science of computation: how to get computers to solve problems. The math behind pushing the boundaries of what computers can and cannot do is staggering, but that is the theoretical side of computer science. Consider the practical side. A computer language is a set of instructions that humans can write to have a computer perform tasks. But as computers are dumb, those instructions must be explicit and accurate. Writing accurate instructions to perform a task requires an accurate underlying algorithm.

Algorithms

Webster’s dictionary defines an algorithm as “a step-by-step procedure for solving a problem or accomplishing some end especially by a computer.” Here’s a humorous, yet insightful, video clip from “The Big Bang Theory” illustrating an algorithm.

A clip from “Terminator 2: Judgment Day” provides another, more graphic, yet still humorous, example of an algorithm.

In the above clip, the Terminator uses one or more algorithms to obtain clothing. From visual input, he calculates attributes such as height, weight, and foot size to determine the probability that a person is a match. If not a match, he moves to the next person. If a match, he persuades the person to remove his or her clothes.

Computer programming is formalizing algorithms into a form usable by a computer. Algorithms are the solution to a task/problem. Code is the instructions performing the algorithm. The following video by “Art of the Problem” on YouTube introduces algorithms.

The above video assumes some knowledge of looping and conditional logic. However, if you are reading this as a supplement to “Think Java,” this post assumes familiarity only through chapter four of the book. That chapter introduces methods that do not return a value. You have not yet been introduced to methods that return values, looping, or conditional logic. So let’s consider another algorithm, purposely ignoring concepts not yet covered in “Think Java.”

Consider the task of boiling water. The following steps outline a basic algorithm.

  1. Get pot from cabinet.
  2. Take pot to the sink.
  3. Turn on water faucet.
  4. Place water in pot.
  5. Turn off water faucet.
  6. Place pot on stove.
  7. Turn on stove burner.
  8. Heat water until boiling.

The steps seem easy enough, but remember, computers are very dumb. These instructions are not explicit enough. For instance, consider the sub-steps involved in obtaining a pot from the cabinet.

  1. Walk over to cabinet.
  2. Open the cabinet.
  3. Reach into the cabinet.
  4. Grasp pot handle.
  5. Lift pot and take it out of the cabinet.
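In the same stub-method style as the Sketti program earlier, the two levels of outline above could be sketched as nested method stubs. This is only an illustrative sketch: the method names are my own, and the faucet steps are collapsed into a single fillPot stub for brevity.

```java
public class BoilWater {
	public static void main(String[] args) {
		boilWater();
	}
	// orchestrates the top-level steps of the boiling-water outline
	public static void boilWater() {
		System.out.println("boilWater");
		getPot();
		takePotToSink();
		fillPot();
		placePotOnStove();
		heatUntilBoiling();
	}
	// getPot orchestrates its own sub-steps from the second outline
	public static void getPot() {
		System.out.println("getPot");
		walkToCabinet();
		openCabinet();
		reachIntoCabinet();
		graspPotHandle();
		liftPotOut();
	}
	public static void walkToCabinet() {
		System.out.println("\twalkToCabinet");
	}
	public static void openCabinet() {
		System.out.println("\topenCabinet");
	}
	public static void reachIntoCabinet() {
		System.out.println("\treachIntoCabinet");
	}
	public static void graspPotHandle() {
		System.out.println("\tgraspPotHandle");
	}
	public static void liftPotOut() {
		System.out.println("\tliftPotOut");
	}
	public static void takePotToSink() {
		System.out.println("takePotToSink");
	}
	public static void fillPot() {
		System.out.println("fillPot");
	}
	public static void placePotOnStove() {
		System.out.println("placePotOnStove");
	}
	public static void heatUntilBoiling() {
		System.out.println("heatUntilBoiling");
	}
}
```

Notice the outline maps one-to-one onto the program’s structure: each outline step becomes a method, and each level of decomposition becomes a layer of method calls.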

But of course, computers are too dumb to understand even these instructions. For instance, consider the sub-steps involved in physically performing the first sub-step, walking over to the cabinet.

  1. Turn body in direction of cabinet.
  2. Take alternating right and left steps until cabinet is reached.

But even still, the computer is clueless, as these sub-sub-steps remain ambiguous. We could continue breaking the steps down to the most fundamental level – for instance, instructions on how to move each muscle when walking to the cabinet.

Understanding muscular contractions obviously takes our algorithm to a level too fundamental to be useful. Thankfully, algorithms build upon previously solved algorithms, which can be reused as “black boxes.” But before considering algorithm reuse, first consider how to decompose an algorithm into more manageable sub-tasks through a process called functional decomposition.
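To make the “black box” idea concrete, here is a small sketch in the same stub-method style used throughout these posts. The names are my own, chosen for illustration: once boilWater is written and works, other recipes can simply call it without knowing or caring about its internal steps.

```java
public class BlackBoxReuse {
	public static void main(String[] args) {
		makeTea();
		makeSketti();
	}
	// boilWater is a solved algorithm; callers treat it as a black box
	public static void boilWater() {
		System.out.println("\tboilWater (details hidden)");
	}
	public static void makeTea() {
		System.out.println("makeTea");
		boilWater();          // reused, not rewritten
		System.out.println("\tsteepTeaBag");
	}
	public static void makeSketti() {
		System.out.println("makeSketti");
		boilWater();          // the same black box, reused again
		System.out.println("\tcookNoodles");
	}
}
```

Neither makeTea nor makeSketti repeats the boiling-water sub-steps; each delegates to the already-solved boilWater.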

Functional Decomposition

Functional decomposition is explained well by the following.

Societal Impact

There is a downside to our rush to formalize all problems into formal algorithms. We use computers to decide everything from product

Sometimes a computer’s formalization leads us to eschew common sense. And think about this last clip. Politics aside, I leave you with this vision of our near future. It’s not science fiction, it will be reality within a few decades at most.

* This post is particularly directed towards students who have completed chapters one through four of the book “Think Java” by Allen B. Downey & Chris Mayfield. Note that in this post I specifically avoid discussing repetition and conditional algorithms, preferring to keep things simple. I also avoid discussing Object-Oriented analysis and design. Here, I focus on simple algorithms and functions. It is my personal belief that understanding simple functional programming helps the transition to Object-Oriented programming.