Spring Cloud Stream makes it easier to distribute computation across different microservices using a message-driven approach.
Using a pattern based on source, processor and sink, we can build pipelines that solve common problems such as log processing. That is what I'm going to do in this article.
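To give an idea of the processor stage, here is a minimal, dependency-free sketch of a log-normalizing transform. In Spring Cloud Stream's functional style, exposing such a `Function` as a bean is enough for the framework to bind it between an input and an output destination; the class and the normalization rule below are assumptions for illustration.

```java
import java.util.function.Function;

// Hypothetical processor stage for a log pipeline: trims the raw line and
// upper-cases the level prefix before the colon. In a real Spring Cloud
// Stream application this Function would be registered as a @Bean and the
// framework would wire it to the message channels.
public class LogProcessor {

    public static final Function<String, String> normalize =
            raw -> {
                String line = raw.trim();
                int colon = line.indexOf(':');
                if (colon < 0) {
                    return line;
                }
                return line.substring(0, colon).toUpperCase() + line.substring(colon);
            };
}
```

The same function can be unit-tested in isolation and only later bound to the messaging middleware, which is one of the appeals of the functional model.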
I learnt that modeling a schema for a NoSQL database (Mongo, Cassandra, …) is not the same as designing a schema for a SQL one (Oracle, MySQL, …).
In this article, I'm going to share what I've learnt by reading those who know more than me!
Sometimes a single object instance is not enough; especially when you use stateful classes in a concurrent environment, an object pool is a good way to support multithreaded execution.
In this article, I’ll show you how to set up this configuration with Spring.
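As a taste of the idea, here is a dependency-free sketch of a pool: each thread borrows an instance, uses it, and returns it, so a stateful object is never shared by two threads at once. The class is an assumption for illustration; in Spring a comparable effect can be obtained with a pooling target source, but that is left to the article.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

// Minimal thread-safe object pool backed by a blocking queue.
public class SimplePool<T> {
    private final BlockingQueue<T> idle;

    public SimplePool(int size, Supplier<T> factory) {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            idle.add(factory.get());
        }
    }

    // Blocks until an instance is free, so callers never share state.
    public T borrow() {
        try {
            return idle.take();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException("interrupted while waiting for a pooled instance", e);
        }
    }

    // Makes the instance available to other threads again.
    public void release(T instance) {
        idle.offer(instance);
    }
}
```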
Distributed applications are an evergreen topic in articles and forum discussions; microservices answer the need for application scalability and pair well with cloud solutions (AWS) or containers (Docker).
In this post I'd like to give a short introduction to the concepts behind this kind of solution.
In my previous post on the topic (Junit testing with Spring), I skipped a very interesting part: testing attributes that are set inside methods.
In these cases, Mockito is very helpful for solving the problem.
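To show the shape of the problem, here is what Mockito automates, written by hand: the collaborator is replaced with a stub so that a value computed inside the method can be checked in isolation. All the names here (`PriceService`, `Checkout`) are made up for the sketch; the article uses Mockito itself.

```java
// A collaborator we want to fake during the test.
interface PriceService {
    double priceOf(String item);
}

// The class under test: the total is computed inside the method,
// so we need to control what the collaborator returns.
class Checkout {
    private final PriceService prices;

    Checkout(PriceService prices) {
        this.prices = prices;
    }

    double total(String... items) {
        double total = 0;
        for (String item : items) {
            total += prices.priceOf(item);
        }
        return total;
    }
}
```

With Mockito, the lambda stub below would become a mock whose behavior is declared with `when(...).thenReturn(...)`, without writing the fake class yourself.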
A very common scenario nowadays is the need to transform data by processing it through a pipeline that, at the end, delivers the data ready to use.
In this article, I'll show how to achieve this with Spring Integration through a simple example.
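The pipeline idea can be previewed without the framework: a chain of small transformers composed one after the other, which is the same shape a Spring Integration flow has (transformer, filter, handler). The stage names below are illustrative assumptions, kept runnable with plain `Function` composition.

```java
import java.util.function.Function;

// A data pipeline as composed transformers: raw input in, ready-to-use data out.
public class DataPipeline {
    static final Function<String, String> trim = String::trim;
    static final Function<String, String> upper = String::toUpperCase;
    static final Function<String, String> tag = s -> "[READY] " + s;

    // The fully assembled pipeline; in Spring Integration each stage would
    // be an endpoint connected by message channels instead.
    static final Function<String, String> pipeline = trim.andThen(upper).andThen(tag);
}
```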
How to run a job with Spring Batch, reusing the same Spring technology adopted for a web solution.
It's easy to achieve, and we'll add an administration monitor that helps keep the job execution under control.
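The heart of a Spring Batch step is a chunk-oriented loop (reader, processor, writer). Here is that control flow reduced to plain Java so it stays self-contained; in a real job these would be the `ItemReader`/`ItemProcessor`/`ItemWriter` interfaces and the chunk size would come from the step configuration, so treat the names and the doubling rule as assumptions.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Sketch of a chunk-oriented step: read items one by one, process each,
// and write a whole chunk at a time.
public class ChunkStep {

    static List<String> run(List<Integer> source, int chunkSize) {
        List<String> written = new ArrayList<>();
        Iterator<Integer> reader = source.iterator();
        while (reader.hasNext()) {
            List<String> chunk = new ArrayList<>();
            while (reader.hasNext() && chunk.size() < chunkSize) {
                Integer item = reader.next();        // read
                chunk.add("item-" + (item * 2));     // process
            }
            written.addAll(chunk);                   // write the whole chunk
        }
        return written;
    }
}
```

Writing per chunk rather than per item is what lets Spring Batch commit, restart, and monitor a job at chunk boundaries, which is exactly what the administration monitor surfaces.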