Book description
With microservices taking the software industry by storm, traditional enterprises running large, monolithic Java EE applications have been forced to rethink what they’ve been doing for nearly two decades. But how can microservices built upon reactive principles make a difference?
In this O’Reilly report, author Markus Eisele walks Java developers through the creation of a complete reactive microservices-based system. You’ll learn that while microservices are not new, the way in which these independent services can be distributed and connected back together certainly is. The result? A system that’s easier to deploy, manage, and scale than a typical Java EE-based infrastructure.
With this report, you will:
- Get an overview of the Reactive Programming model and basic requirements for developing reactive microservices
- Learn how to create base services, expose endpoints, and then connect them with a simple, web-based user interface
- Understand how to deal with persistence, state, and clients
- Use integration technologies to start a successful migration away from legacy systems
The detailed example in this report is based on Lagom, a new framework that helps you meet the requirements of building distributed, reactive systems. The example is freely available on GitHub as an Apache-licensed open source project.
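To give a flavor of the approach, here is a minimal sketch of the kind of service interface Lagom's Java API uses: a service descriptor that maps an asynchronous call onto a REST path. The `HelloService` name and endpoint are illustrative assumptions, not taken from the report's example, and details may vary across Lagom versions.

```java
import akka.NotUsed;
import com.lightbend.lagom.javadsl.api.Descriptor;
import com.lightbend.lagom.javadsl.api.Service;
import com.lightbend.lagom.javadsl.api.ServiceCall;

import static com.lightbend.lagom.javadsl.api.Service.named;
import static com.lightbend.lagom.javadsl.api.Service.pathCall;

// Illustrative Lagom service descriptor: one asynchronous,
// non-blocking endpoint exposed over HTTP.
public interface HelloService extends Service {

  // Handles GET /api/hello/:id and returns the response asynchronously.
  ServiceCall<NotUsed, String> hello(String id);

  @Override
  default Descriptor descriptor() {
    // Map the service call to a REST path.
    return named("hello").withCalls(
        pathCall("/api/hello/:id", this::hello)
    );
  }
}
```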
Product information
- Title: Developing Reactive Microservices
- Author(s): Markus Eisele
- Release date: July 2016
- Publisher(s): O'Reilly Media, Inc.
- ISBN: 9781491960639