
Step up your data driven delivery game!

Working with data and delivering data-driven products is a key success factor in business these days. At our Tech Friday event we demoed a secure and scalable data science CI/CD process implemented on OpenShift, along with a cloud-native development setup in which machine learning models are deployed as microservices and can deliver data-driven value to many more people than ever before.

When you are working with data and want your product to add value to the rest of the organization, this can be a real challenge. You have to find the right tools, meet the security requirements, and finally get users to actually adopt your product. This situation typically boils down to one of two scenarios.

  1. You are very technical and know how to find your way around, but you bump into a wall of IT administrators who stall your “go live to production” moment for as long as possible, and deployment becomes a long and draining process.
  2. You have your data-driven product running on your machine (classic) and have to find someone else to move it to production. To find that someone, you have to explain how your data product works, and because the people who can help you are usually not working with data, the explanation becomes a kind of nightmare. Either way, you want to tackle these situations as early as possible by managing your own CI/CD process.

The CI/CD process implementation consists of a people-first approach, a microservice framework, the OpenShift container platform, GitLab, and a set of security frameworks. Let’s focus on the microservice framework first. It enables non-software developers to create simple APIs: each microservice is built up from multiple layers, each meeting a different need. This is where every stakeholder gets their wishes granted.

From the inside out, you have the actual machine learning model developed by the data scientist, then the input validation layer, then the API security layer, and finally the API web server layer. All of this is packaged in a container image that is built by an automated pipeline to avoid repetitive manual work.
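The layering above can be sketched as nested function wrappers, inside out: model, input validation, then API security. This is only an illustrative sketch, not the actual framework; the `predict` stub, the layer names, and the hard-coded token are all assumptions made for the example.

```python
def predict(features):
    # Innermost layer: the data scientist's model (a stub for this sketch).
    return sum(features) / len(features)

def validate_input(handler):
    # Input validation layer: reject malformed requests before they
    # reach the model.
    def wrapper(request):
        features = request.get("features")
        if not isinstance(features, list) or not features:
            return {"status": 400, "error": "features must be a non-empty list"}
        return handler(request)
    return wrapper

def require_token(handler):
    # API security layer: check a bearer token (hard-coded here purely
    # for illustration).
    def wrapper(request):
        if request.get("token") != "secret-token":
            return {"status": 401, "error": "unauthorized"}
        return handler(request)
    return wrapper

# Decorators apply inside out: the security layer wraps validation,
# which wraps the model. A real API web server layer would route HTTP
# requests into serve().
@require_token
@validate_input
def serve(request):
    return {"status": 200, "prediction": predict(request["features"])}

print(serve({"token": "secret-token", "features": [1.0, 2.0, 3.0]}))
# → {'status': 200, 'prediction': 2.0}
```

Because each layer only wraps the one beneath it, different stakeholders can own different layers without touching the model code itself.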

The container image is deployed as a pod on the OpenShift container platform, with all the security measures, monitoring, and scalability in place to take the application to production. Each layer can be maintained and developed by a different stakeholder from a different discipline, and they can work together in a controlled cloud-native environment where everybody does what they do best without sacrificing time to market or quality.

Speed is everything

What I love to do is bring people together with tech and help them speed up their process. Along the way, that also means listening very carefully to people and connecting with them on a human level. When you do, you will find that time is an incredible resource that can be spent in many ways, but it does not scale. In DevOps you can buy people time by automating and scaling their processes.

Working at Devoteam

We, as Devoteam, are simply the sum of 9,000 Tech Enthusiasts full of innovative energy. To increase our tech stronghold & density, we are constantly looking for talent and potential, from young professionals to experienced legends. Have a look at our vibrant culture and discover our company!