Audio analytics has shifted car companies' focus toward products that improve customer satisfaction. Voice and speech recognition have become integral to the industry.
Remote Desktop Protocol (RDP) is a network communications protocol developed by Microsoft, mainly for remote access. Learn how to keep your RDP sessions safe from MitM attacks.
How to install KubeSphere, a container platform running on top of Kubernetes with streamlined DevOps workflows, unified multi-cluster management, and more.
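For context, KubeSphere's documented minimal installation on an existing Kubernetes cluster comes down to applying two manifests with kubectl. The release tag below is only an example; substitute the version you actually want from the ks-installer releases page:

```shell
# Deploy the ks-installer operator, then the default cluster configuration.
# v3.4.1 is an example tag; pick the current release for a real install.
kubectl apply -f https://github.com/kubesphere/ks-installer/releases/download/v3.4.1/kubesphere-installer.yaml
kubectl apply -f https://github.com/kubesphere/ks-installer/releases/download/v3.4.1/cluster-configuration.yaml

# The installer runs in the kubesphere-system namespace; follow its pod logs
# until the web console URL and default credentials are printed.
```

Once the installer finishes, the KubeSphere console is exposed on the cluster (by default via a NodePort on port 30880).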
If your DevOps team is planning process automation, here are a few key things to know about the flexibility of BPM engines and how to implement them properly.
Hello Muleys! Here's another interesting article on how to restrict additional queryParams and headers beyond those defined in the RAML.

We all test whether the required parameters work correctly, but we often forget to test what happens when unnecessary parameters are sent along with the required ones. Why does this matter? Attackers might send thousands of queryParams and headers with large content, and under that load your application can crash.

So what can you do? Here's the solution. Start by designing a basic RAML that declares a resource together with its expected query parameters and headers. When you download the RAML and generate flows, the APIkit Router module configuration offers an option to restrict additional parameters and headers. By default this option is disabled, so you have to enable the strict validation settings yourself. Once enabled, requests carrying additional fields are rejected, while removing the unnecessary fields gives a successful response. A sketch of the RAML and the corresponding configuration follows below.
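For illustration, here is a minimal RAML of the kind described above. The resource path, parameter names, and example payload are assumptions made up for this sketch, not the exact RAML from the original post:

```raml
#%RAML 1.0
title: strict-validation-demo
version: v1

# Hypothetical resource: anything not declared under queryParameters or
# headers should be rejected once strict validation is enabled on the router.
/orders:
  get:
    queryParameters:
      orderId:
        type: string
        required: true
    headers:
      Client-Id:
        type: string
        required: true
    responses:
      200:
        body:
          application/json:
            example: |
              { "status": "success" }
```

And this is roughly how the strict validation settings appear on the APIkit Router's global configuration in the generated Mule 4 application. The config and file names here are placeholders; double-check the attribute names against the APIkit module version you are running:

```xml
<!-- Sits inside the generated Mule configuration file, alongside the
     HTTP Listener config and the apikit:router in the main flow. -->
<apikit:config name="strict-validation-demo-config"
               api="strict-validation-demo.raml"
               outboundHeadersMapName="outboundHeaders"
               httpStatusVarName="httpStatus"
               queryParamsStrictValidation="true"
               headersStrictValidation="true" />
```

With both flags set to true, a request that includes an undeclared queryParam or header is rejected with a bad-request error instead of reaching your flow logic.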
All APIs developed using Mule must be built with a health check endpoint. This template will help you maintain a uniform structure across all the APIs and their layers.
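As a sketch of what such a uniform health-check contract could look like, here is a hypothetical RAML fragment; the path, field names, and payload are assumptions for illustration rather than the template's actual definition:

```raml
#%RAML 1.0
title: example-api
version: v1

# Hypothetical health-check resource, kept identical across experience,
# process, and system APIs so that monitoring stays uniform.
/health:
  get:
    description: Reports whether the API and its downstream dependencies are reachable.
    responses:
      200:
        body:
          application/json:
            example: |
              {
                "status": "UP",
                "apiName": "example-api",
                "dependencies": [
                  { "name": "backend-db", "status": "UP" }
                ]
              }
```

Keeping the same endpoint shape in every layer means a single monitoring probe works across the whole application network.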
A DevOps toolchain is a collection of tools that operate as an integrated unit to design, build, test, manage, measure, and operate software and systems.
I describe how I created the API definition, then how I generated the server and client code from it. Finally, I discuss some of the problems I faced.
Right now, Apache Kafka uses Apache ZooKeeper to store its metadata, and managing a ZooKeeper cluster places an additional burden on the infrastructure and its admins.