ABOUT THE WEBINAR
Microservices are the future, and we're going to use this webinar to share what we've learned from implementing microservices with Mendix customers. Microservices can have a major positive impact on productivity, and with the right knowledge you can implement them in your own projects. Join the webinar to learn how to leverage the Mendix Platform to implement a microservices architecture, learn about use cases, and apply best practices.
During this webinar we will cover the following topics:
- How to provide a seamless user experience
- Patterns to transfer data between apps
- Best practices and pitfalls
- BI patterns
QUESTIONS & ANSWERS
What is the Deep link module and is it available in the Mendix App Store?
The deep link module allows you to navigate to a form or call a microflow in your application through a link. And yes, it’s in the Mendix App Store - https://appstore.home.mendix.com/link/app/43/.
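For illustration, a deep link is just a URL that names a configured link and passes an argument; the exact path format depends on how the module is configured in your app, so the segments and names below are a hypothetical sketch:

```python
# Hypothetical example of composing a deep link URL for a configured
# link named "ShowInvoice" that takes one argument. The "/link/" path
# segment and the link name depend on your deep link module configuration.
def build_deep_link(base_url: str, link_name: str, argument: str) -> str:
    return f"{base_url.rstrip('/')}/link/{link_name}/{argument}"

url = build_deep_link("https://invoices.example.com", "ShowInvoice", "INV-2021-001")
```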
All the static content, such as JavaScript files, is cached by your browser and not transmitted over the network again. The JavaScript code does need some time to start; that is unavoidable.
How did you automatically generate these items? Via REST services and so on?
This is explained in more detail in the webinar; the following documentation pages will also help:
https://docs.mendix.com/refguide/published-rest-services
https://docs.mendix.com/howto/integration/publish-rest-service
Why are you using a REST service to transfer data instead of an App Service?
Our goal in the webinar was to use standards that you can also use to integrate with non-Mendix microservices. However, this is not an indication of what you should use. You should use whatever makes the most sense in your case.
Do you support other identity providers, such as Keycloak?
We support many identity providers through our SAML and OAuth integrations. It is also possible to add custom connectors for any other protocol, such as OpenID Connect.
How could I pass the authentication token from my UI session to a downstream REST call?
When you are using SSO to log in, you can capture and store tokens during the login process. When calling a REST service you can then use the token stored in the session and pass it along as an HTTP header or in the body of the message, depending on the requirements of your service.
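As a generic sketch of that idea (outside Mendix, in plain Python): a token captured during SSO login is kept with the session and attached to the downstream request. The session shape, the `access_token` key, and the `Bearer` scheme are assumptions for illustration; your service may expect something different.

```python
import urllib.request

def build_authorized_request(url: str, session: dict) -> urllib.request.Request:
    """Attach a token stored during SSO login to a downstream REST call.
    The session dict and "Bearer" scheme are hypothetical examples."""
    token = session["access_token"]  # captured and stored at login time
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {token}"},
    )

req = build_authorized_request(
    "https://orders.example.com/rest/orders/42",
    {"access_token": "abc123"},
)
```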
Is there documentation or a sample project available using this navigation method with deep links and SSO?
There are no ready-to-use sample apps. However, the modules (AppCloudServices, deep link and other SSO modules) come with extensive documentation, just not geared towards this specific application.
What if screens/processes use information from both orders and invoices (in your example)?
It depends on exactly what the use case is, but that might be a good case for sharing data between microservices. That way, each microservice has almost everything it needs available locally.
It can be tricky to explain or establish a business case for developing microservices. Especially because the payback really happens in later projects. What advice do you have for explaining the benefit to the business who are paying (not the IT guys)?
Usually, microservices come from the need to solve a problem (or set of problems). So, if they’re a good solution to the problem, it shouldn’t be that tricky to convince the business to go with them. It does involve higher costs up front and a mindset shift from the business, but to make the case we explain the benefits of DevOps, automation, and autonomy.
When using a message broker, doesn't this introduce a single point of failure (when the broker is down)?
Usually these systems (if you use something like Kafka) are built for high availability and reliability (for example, multiple instances across multiple availability zones), so it’s not easy for them to fail, depending on how they’re set up. You can still have network failures, but those affect synchronous mechanisms too, and asynchronous mechanisms will most likely handle retries for you.
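The retry behaviour mentioned above can be sketched generically. This is not Kafka-specific and not a Mendix API, just the pattern an asynchronous mechanism typically gives you: retry with backoff, and hand the message back to the caller if the broker stays unreachable.

```python
import time

def publish_with_retry(publish, message, attempts=3, base_delay=0.1):
    """Try to publish a message, retrying with exponential backoff.
    `publish` is any callable that raises ConnectionError on failure
    (for example, a broker client's send function)."""
    for attempt in range(attempts):
        try:
            publish(message)
            return True
        except ConnectionError:
            if attempt == attempts - 1:
                return False  # caller can park the message for a later retry
            time.sleep(base_delay * 2 ** attempt)
```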
Are there any limitations to the Kafka payload? If not technical, any best practice limitations?
We have never run into payload limitations on any of our projects. Kafka is also used around the world by some very big companies (LinkedIn, Uber, etc.), so whatever limitations exist, they will probably be the ones to find them, not us.
What problem do microservices solve? One Mendix application vs. more Mendix applications?
Microservices allow you to cleanly separate different business domains into different apps, reducing the internal complexity and the coupling between functionalities in each app. This allows you to develop each domain as an autonomous service with its own release cycle. For instance:
• You might see that a particular piece of functionality changes more often than the others and needs to be deployed more often. But as it’s still part of the monolith, you need to deploy the whole monolith, so you’ll have to wait for the other developers to finish their work on other functionality. That piece of functionality might be a good microservice candidate.
• You might have load problems on specific functionality and want to spin it off into a microservice so you can scale that functionality at will.
Do we have sample templates for REST integration to use existing microservices with Mendix?
For many common products that use REST integrations there are modules available in the Mendix App Store. Many other microservices are created as part of a specific architecture and are not directly reusable outside of that for other customers.
Because you need message definitions, I would say you are less flexible if you need to change them. Isn't that a big disadvantage of using microservices?
It’s important to make sure integrations between microservices are stable to prevent this. Each service should be responsible for its own translation to the integration data model, so that the integration itself changes as little as possible. However, this is no different than having internal dependencies in a monolith, where a change in data definition in one part of the app requires changes in dependent parts to accommodate it.
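The "own translation to the integration data model" idea can be sketched as a mapping function at the service boundary: internal field names can change freely, and only this one function must be updated to keep the published contract stable. All entity and field names here are hypothetical.

```python
# Sketch: the internal model can change freely; only this translation
# function has to keep the stable contract shared with other microservices.
def to_integration_order(internal_order: dict) -> dict:
    """Map the service's internal order record to the stable
    integration data model consumed by other microservices."""
    return {
        "orderId": internal_order["id"],
        "customerName": internal_order["cust"]["display_name"],
        "totalAmount": internal_order["total_cents"] / 100,
    }

msg = to_integration_order(
    {"id": 42, "cust": {"display_name": "ACME"}, "total_cents": 1999}
)
```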
If you wanted to expose several microservice UIs in a single view, what would be the approach you would recommend?
Combining microservice UIs is best done using the method we showed in the webinar: creating autonomous UI apps that are linked together through a common consistent UI and deep links. Requirements to combine data often come from a need to have a dashboard-like functionality, which typically warrant a microservice of its own.
Which Event Broker do you recommend and why?
The recommendation is to use anything already in place within your IT landscape that fits the requirements. This way you can profit from any experience that’s already there in hosting and managing it. If you do not have anything that fits, we cannot provide a specific recommendation as we do not offer a service like this ourselves.
How did you accomplish the toolbar/main nav that links from service to service? How can we do this without using Mendix App Cloud Services? We use a different IDP for SSO.
By using the same template on the pages of the different applications. You can use any IDP you want using modules from the App Store. Most notably, the SAML module supports most major IDPs.
When should you not use microservices?
There’s no direct answer for this, because it depends. You should always start with a monolith and start breaking it up when you see there’s a need for that. So, you should only use microservices where you have a problem that microservices can solve.
Can we have a seamless integration at UI level with an existing non-Mendix microservice?
Yes, as long as you can provide the same UX/UI so that the apps can work seamlessly together for your users, there is no reason this would not be possible.
What questions would you ask to help think about integration requirements. For example: how often does data need to be exchanged? What is the volume?
There are many aspects to consider – too many to discuss fully here. The most important one is the required consistency (does the data need to be up to date immediately?). Other important factors are robustness, data volume, and performance.
Can we use microservices to accomplish most complex business logic? Or does this require going one level down to the Java actions and code the logic by extending them ? Have you faced any such scenarios in your customer implementations?
Most complex business logic can be created in Mendix and included in microservices.
In your example of order and invoice microservices, which integration pattern would you recommend if the invoice needs to be generated for the order created in the orders application by the time the user opens the invoice page using a deep link?
If the invoice always needs to be there before the user opens the link, a synchronous request/reply integration is your only option, as that is the only way to guarantee the logic is executed before transferring the user. However, this has the downside that the user will always have to wait for the document to be generated before being transferred to the invoice page. This trade-off in user experience has to be made for each solution.
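That trade-off can be sketched as a handler behind the deep link that synchronously ensures the invoice exists before the page is shown; the user waits on the request/reply call when the invoice is not there yet. All names here are hypothetical, not Mendix APIs.

```python
def open_invoice_page(order_id, invoice_store, generate_invoice):
    """Deep-link handler sketch: block until the invoice for the order
    exists, then return the page to show. `generate_invoice` stands for
    the synchronous request/reply call to the orders service."""
    invoice = invoice_store.get(order_id)
    if invoice is None:
        invoice = generate_invoice(order_id)  # synchronous: the user waits here
        invoice_store[order_id] = invoice
    return {"page": "Invoice_Detail", "invoice": invoice}
```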
You do have a module to work with Kafka, but do you have one to work with RabbitMQ?
Yes, we do.
What would be the best way to avoid too much dependency between microservices? For example, should one fail, can the others still operate?
Each microservice should store the data it needs to operate. This makes your microservices more autonomous and less dependent on others: if another service is unavailable, the rest can keep functioning much better than if they depended on the failing service for their data.
Why are you using a deep link and not a Mendix native page URL?
Because we’re using single sign-on, and the deep link module handles the redirection to the IDP when a user is not logged in to the target Mendix application. Additionally, the deep link module also allows us to link to microflows.
Let's say I have three microservices that are associated with each other. If one of the microservices is down, what happens to my published data?
The data in the app that is down will of course be retained in that app's database. The other microservices will not be able to access it through their integrations while it is down. This is a scenario that a microservices architecture should take into account, for instance through queuing and retry mechanisms.
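A minimal sketch of the queuing idea: messages destined for an unavailable service are parked in order and drained once the peer is reachable again. The class and callable names are hypothetical illustrations, not a Mendix or broker API.

```python
from collections import deque

class OutboundQueue:
    """Park messages for a peer microservice while it is down,
    then flush them in order when it becomes reachable again."""
    def __init__(self, send):
        self.send = send      # callable that raises ConnectionError while the peer is down
        self.pending = deque()

    def publish(self, message):
        self.pending.append(message)
        self.flush()

    def flush(self):
        while self.pending:
            try:
                self.send(self.pending[0])
            except ConnectionError:
                return        # peer still down; keep messages for the next flush
            self.pending.popleft()
```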
The hospital scenario looks a lot like Business Process Management (getting/receiving data from multiple systems). Why would you build all those dependencies in Mendix and not use other services (like BPM in SAP PO for example), which are a little bit broader?
This is just an example app. However, I don’t agree that this is a good example for BPM as it’s a reactive system, not a well-defined process. Each microservice/application has its own purpose and there’s data shared between various apps.
How would you version and govern microservices using Mendix?
A microservice by itself only has one active version as it’s a full-stack application (Data/Logic/UI) that’s deployed into production only once. The integration points to other microservices are typically REST or SOAP and can have different versions available at the same time. Mendix provides tools in the Desktop Modeler to manage available service versions.
Filtering and sorting options on non-persistent entities are limited in Mendix 6. Would you make persistent copies on the consuming side?
It would depend on the requirements, but most likely yes, to have the microservice as autonomous as possible. Having a local copy of the data increases autonomy significantly as the service can continue to operate when other services are not available.
Do you need to keep certain design patterns in mind when you start building a monolith, so the decoupling of the app will get easier to do afterwards?
Yes, you should always keep the functionality and domains grouped in a logical way (Mendix makes this easy with the modules), and make sure the modules are decoupled and the dependencies kept to a minimum. This is a general best practice in app development, and it will help you if you need to split your monolith into microservices.
How would you create a long-polling mechanism to update data on a UI based on an asynchronous data change triggered by Kafka?
This could be created in Mendix through a JavaScript widget that initiates the long-poll and talks to a microflow that checks for new events. An alternative using existing functionality could easily be constructed using the microflow timer widget from the Mendix App Store.
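The long-poll loop itself is simple; here is the logic as a language-agnostic sketch in Python (in Mendix this would live in the widget's JavaScript, with `check_for_events` standing in for the microflow call):

```python
import time

def long_poll(check_for_events, on_update, interval=1.0, max_polls=10):
    """Repeatedly ask the server-side check (a microflow in the Mendix
    case) whether new events arrived, and push any update to the UI
    callback. Returns the first non-empty batch, or [] on timeout."""
    for _ in range(max_polls):
        events = check_for_events()
        if events:
            on_update(events)
            return events
        time.sleep(interval)
    return []
```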
Can we use Tibco as event broker to implement microservices?
Yes, we have existing customers that use Tibco as a messaging layer.
View more webinars on-demand at learn.mendix.com.