The past few years in IT have been fascinating, with new opportunities and techniques arriving ever faster, especially for IT professionals, and next year will only become more interesting. Developments move quickly, as said, and change is the order of the day. My expectation is that geopolitics will turn the playing field upside down.
Will company datacenters be slowly wound down? Companies will increasingly switch to the public cloud and use its many advantages. In addition, I implement Software-Defined Datacenter (SDDC) solutions that replace traditional hardware, for example VxRail (Dell EMC/VMware), Nutanix, and SimpliVity (HPE). Remarkably, I have not done any Azure Stack implementations yet, which I personally would like to do at some point.
With an SDDC, the advantages of virtualization, long established for servers, can now also be achieved for the other parts of the data center: the network, security, and storage. Because everything is software-controlled, the data center can also be managed as a single whole, naturally with software. Add a self-service portal for (authorized) end users, and you soon have your own private cloud.
Solutions such as VMware on AWS, with which you can easily transfer your VMs (workloads) to the public cloud via a lift-and-shift approach, are an example of meeting market demand. VMware also recognizes that, for many companies, investments in their own data centers no longer make a positive business case. The challenge for companies that still run their own (traditional) datacenters is therefore to define their new, future role.
As I do every year, I attended VMworld Europe last November: a fantastic conference from the market leader in virtualization, and one with a clear vision. I am always very excited to be there, because it inspires me and provides answers I am looking for on behalf of my clients. This year VMworld gave me the feeling that the 'core infra' layer of information technology is 'done'. Change and innovation now take place higher up the stack. Of course, much can still be improved and made cheaper, smarter, and more creative, but in essence the basic virtualization and cloud technology is present, for sale, effective, and safe to apply.
In 2019, companies will adopt cloud services faster; they want to purchase services higher up the cloud stack and distance themselves further from infrastructure services. Multi-cloud increasingly becomes both a fact and a challenge for these companies. Adoption of serverless architectures will keep growing, and I expect to encounter AWS Lambda more and more often. Container adoption will also grow: companies will make more use of container services such as Kubernetes and Docker, where in my opinion Kubernetes has won, or will win, the battle as the container management and orchestration solution. VMware did not acquire Heptio, with Joe Beda, one of the inventors of Kubernetes, on board, for nothing. Docker makes it possible to package an application in a lightweight, portable container, which makes installing an application on a server as easy as installing a mobile app on your tablet or smartphone.
In 2019, Docker will become part of the standard IT toolbox for making applications run lighter and faster and for easily moving them from one environment to another. Among my clients I mainly see Docker growing on the AWS platform.
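The packaging idea above can be sketched with a minimal, hypothetical Dockerfile for a small Python web service; the file names and entry point are assumptions for illustration, not a prescription:

```dockerfile
# Hypothetical example: package a small Python web service as an image
FROM python:3.7-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
# Assumed entry point; the same image runs unchanged on a laptop,
# on-premises, or in the cloud (e.g. on AWS ECS/EKS)
CMD ["python", "app.py"]
```

Build the image once with `docker build -t myapp .` and run it anywhere Docker is available with `docker run myapp`; that portability is exactly the "install like a mobile app" effect described above.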
The fact that Microsoft, VMware, and AWS have all opted to embrace Kubernetes and Docker, positioning them alongside their own containerization technology, tells me that virtualization as we knew it until recently has had its day.
Multi-Cloud vs Hybrid Cloud
You might think that “multi-cloud” and “hybrid cloud” mean the same thing, but they are in fact different stages in the evolution of cloud computing.
"Multi-cloud" means that more than one public cloud is used. That usage pattern arose when companies tried to avoid becoming dependent on a single public cloud provider. It is more a best-of-both-worlds combination: combining specific services from public cloud A, cloud B, and cloud C grew into a "multi-cloud". This does bring the necessary complexity in managing, controlling, and automating the different flavors.
"Hybrid cloud" means that components of the infrastructure run both in an on-premises (private cloud) environment and in a public cloud environment. Often an organization has its own datacenter in which its infrastructure is located and managed. The infrastructure may also be hosted at a provider while management and maintenance remain in-house. In short, a "private" cloud in which the organization itself is responsible for its infrastructure environment.
In addition, many companies have already placed part of the infrastructure with a Public Cloud provider such as Microsoft Azure, Amazon Web Services (AWS) or Google Cloud.
To make multi-cloud work best for an organization, I use a multi-cloud management solution, such as a CMP (cloud management platform) or a CSB (cloud services broker), between on-premises and the various public clouds. This hides the complexity of each (public) cloud provider's native services. Instead, you work with a (self-service) abstraction layer, sometimes called a "single pane of glass", offering a management interface and sometimes a set of APIs to perform common tasks across the cloud providers you use.
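A minimal sketch of such an abstraction layer, in Python. The provider classes, method names, and the returned identifiers are all hypothetical and for illustration only; a real CMP would call each provider's native API (e.g. EC2 or Azure Compute) behind the same interface:

```python
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Common interface that hides each provider's native API."""
    @abstractmethod
    def create_vm(self, name: str, size: str) -> str: ...

class AwsProvider(CloudProvider):
    def create_vm(self, name, size):
        # In reality this would call the EC2 API (e.g. via boto3)
        return f"aws:ec2:{name}:{size}"

class AzureProvider(CloudProvider):
    def create_vm(self, name, size):
        # In reality this would call the Azure Compute API
        return f"azure:vm:{name}:{size}"

class MultiCloudManager:
    """The 'single pane of glass': one call, any registered cloud."""
    def __init__(self, providers: dict):
        self.providers = providers

    def create_vm(self, cloud: str, name: str, size: str) -> str:
        return self.providers[cloud].create_vm(name, size)

manager = MultiCloudManager({"aws": AwsProvider(), "azure": AzureProvider()})
print(manager.create_vm("aws", "web-01", "small"))
print(manager.create_vm("azure", "web-02", "small"))
```

The point of the design is that day-to-day tasks are expressed once, against the abstraction, rather than per provider, which is what keeps the multi-cloud complexity manageable.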
What’s in IT for you?
Advice that I always give: focus on what cloud technologies do and why you need them, not only on forms such as public cloud, private cloud, hybrid cloud, or multi-cloud. Semantic overload? Indeed. Do not get entangled in naming; focus instead on what they do. It is a fact that cloud architectures will evolve in the coming years and new patterns will emerge. I expect new names will come with them.
Adoption of AWS and Azure is growing enormously, and the right guidance and management for this is and will remain a concern in 2019. I also see Google Cloud Platform come into the picture at companies more often.
In the past period, in which I have supported several companies on their journey to the cloud, my support and advice have increasingly shifted toward the management and deployment of the cloud for the core business. Whether the cloud can add value for companies is no longer the question; instead I am asked how the cloud should be applied. What strikes me is that companies have often already chosen their preferred cloud provider, but when I ask why they opted for it, they often cannot give me a good, well-considered answer. In short: why AWS and not Azure, or vice versa? As a basis for that choice I often take IAM and, just as important, the organization's application landscape.
Furthermore, IAM in combination with the cloud always brings many challenges, especially around SSO, which many companies want to offer their end users. Choices at the protocol level and the attribute level are decisive.
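To illustrate why attribute-level choices matter, here is a small sketch of mapping identity-provider attributes (e.g. group claims from a SAML assertion or OIDC token) to cloud roles. The claim names and role names are assumptions for this example, not any real provider's schema:

```python
# Hypothetical mapping from IdP group attributes to cloud roles.
# Which attribute carries group membership, and how it is named,
# is exactly the kind of decision made during SSO design.
ATTRIBUTE_TO_ROLE = {
    "group:cloud-admins": "Administrator",
    "group:developers": "Developer",
    "group:auditors": "ReadOnly",
}

def roles_for(claims: dict) -> list:
    """Derive cloud roles from the 'groups' attribute in a token."""
    return sorted(ATTRIBUTE_TO_ROLE[g]
                  for g in claims.get("groups", [])
                  if g in ATTRIBUTE_TO_ROLE)

print(roles_for({"sub": "alice", "groups": ["group:developers"]}))
```

If the IdP and the cloud provider disagree on the attribute name or format, SSO silently grants no roles, which is why these choices have to be made deliberately up front.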
What will 2019 bring?
More and more hybrid and multi-cloud environments, for the following reasons:
There are no additional costs for using multi-cloud other than the added management complexity. A cloud strategy is very important here and must be decisive: it provides the frameworks, with the business always central, for innovation toward the cloud. Smart companies will move quickly, including adopting advanced multi-cloud management platforms.
Employees increasingly demand convenience and flexibility, which means employers will have to give them access to data from anywhere, at any time, while still working reliably, simply, and quickly according to the new standard, with data always and everywhere available. The best way for companies to meet this requirement is to switch to a multi-cloud model, leading to an environment in which data management evolves from policy-based to behavior-based. A multi-cloud strategy will become the standard for companies that want to compete in 2019, both at the business level and at the staff level (attracting and retaining talent).
Serverless computing will become standard for most cloud development services and databases. It is much easier to use: pre-provisioning and scaling of infrastructure resources are no longer necessary. In addition, cloud providers will rapidly improve and expand their serverless ecosystems, which in my opinion puts traditional PaaS systems under more pressure. Companies, and this is also my advice, will be wise to consume services higher up in the cloud stack, which also ties in with the serverless architecture evolution. Using functions is ultimately what matters; the underlying infrastructure and middleware are only needed to run those functions. So why purchase a platform of OS, databases, runtime, and so on, on which you then have to implement your functionality, if you can purchase the functionality as a pay-per-use service and be relieved of, for example, licensing, capacity, lifecycle management, security, and updates?
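To make the "functions first" idea concrete: a minimal AWS Lambda handler in Python. The event field used here is an assumption for the example; the point is that you write only the function, and the platform handles provisioning, scaling, and billing per invocation:

```python
import json

def lambda_handler(event, context):
    """Entry point that AWS Lambda invokes; there is no server to
    provision or patch. The 'name' field in the event is an
    assumption for this illustration."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local simulation of an invocation; in production, Lambda itself
# supplies the event and context objects.
print(lambda_handler({"name": "cloud"}, None))
```

Everything below the function, the OS, runtime, and capacity, is the provider's problem, which is precisely the relief from lifecycle management described above.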
Containers will grow even more, and Kubernetes with them. The two are, of course, inextricably linked if you want to use containers at scale and manage them well.
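What "managing containers at scale" looks like in practice can be sketched with a minimal, hypothetical Kubernetes Deployment; the app and image names are placeholders for illustration:

```yaml
# Hypothetical Deployment: Kubernetes keeps three replicas of a
# containerized app running, restarting or rescheduling them as needed.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 3
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
      - name: myapp
        image: myapp:1.0   # assumed image name
        ports:
        - containerPort: 8080
```

Applied with `kubectl apply -f deployment.yaml`, this declares the desired state, and the orchestrator, not the operator, keeps reality matching it; that is what makes large-scale container use manageable.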
Companies will make even more use of IoT, machine learning, and artificial intelligence, especially to give structure to their data, so that the right data is available at the right time. In my view, this will bring major changes in all industries, but especially in healthcare.
Cloud integration will increasingly become both a necessity and a specialization: how do you smartly connect businesses with each other in a safe, fast, and structured way? Personally, I have been working on cloud integration for a number of years, covering data, applications, IAM, and infrastructure. Cloud integration will truly become a specialization in 2019.
Security, security, and once again security. More companies will have to take a structured, architectural approach to security. This will become a distinguishing feature for companies, one that certainly gives a competitive advantage. As an architect, I have been working for years with the 'security by design' principle, which means that security is built into the solution right from the design phase. This sounds logical, and yet security is often only tested after a new solution has been delivered. It is, however, cheaper and more efficient to address it in the design.
If you want to know more about cloud computing or any of the above topics, please contact me via my LinkedIn profile or my Twitter, or send me a message. I dare to give an opinion and share it, but I also like to hear other opinions in order to gain new insights.