Our articles.

January 28th, 2020

From DevSecOps to SecDevOps: for even better, integrated security

Integration, integration, integration. From now on, IT and business teams need to be aligned, meaning they need to work closely together. That is the price of agility. If the point of DevSecOps was to design integrated security, why not choose security that is not only integrated into both operations and development, but that lies at the very foundation of any initiative?

Bruno van Marsenille

December 18th, 2019

CIO: Good intentions for 2020

At the end of the year, the time is right to draw up a balance sheet, but also to set a number of new goals. And no, there will be no rest for the Chief Information Officer in the coming year. Automation, the war for talent and business alignment remain on the agenda in 2020, not to forget the operational side of IT.

Bruno van Marsenille

December 10th, 2019

Artificial intelligence. Go for a pragmatic approach

If there is one term in vogue in ICT these days, it has to be AI, or artificial intelligence. According to Gartner, AI’s adoption rate increased from 4% in 2018 to 14% in 2019. Still, AI covers many different realities. It is therefore important to focus on the technologies that offer the biggest potential or return on investment.

Bruno van Marsenille

November 19th, 2019

DevSecOps, the grail of agility

These days, a company’s success is measured by its agility, its ability to react to technological change, improve its competitiveness and conquer new markets. In the wake of the DevOps and NoOps methodologies, DevSecOps or the integration of IT security in the complete life cycle of applications seems to be a must.

Bruno van Marsenille

October 7th, 2019

ElasticSearch: transforming data into information

In modern business, data is the new oil. Consequently, a company must be capable of collecting, storing and processing large volumes of structured and unstructured data to realize a competitive advantage. An engine like ElasticSearch may help find the needle in the proverbial haystack…

Bruno van Marsenille

September 2nd, 2019

Replacing virtualisation with containers?

The IT department of an organisation must be aligned with the business needs. It can no longer be run in a traditional way, given that the company must be proactive and flexible. Hence the emergence of DevOps and agile development, or even of NoOps. This agility is hardly conceivable without containers...

Bruno van Marsenille

Will container technology become as widespread as application virtualisation? Although the concept of the container has existed since the beginning of this century (in particular on Linux), the success of Docker (the most popular and most widely used open source containerisation platform, based on Linux) since 2013 has reinvigorated the use of containers.

The advantages

In fact, a container is a complete execution environment integrating the application and all of its dependencies, libraries, binaries and runtime files in one single 'package'. Consequently, it is a virtual envelope integrating everything an application needs in order to function. Remark: unlike server virtualisation or virtual machines, containers don’t embed their own operating system, but rely on the OS of the server on which they are deployed.

Another advantage of containers is their size: they 'weigh' only a few tens of megabytes, versus several gigabytes for virtual machines and their complete operating systems, which increases portability. Moreover, virtual machines may need several minutes to start and run their applications, while containers start almost immediately.

Moreover, containerisation is more flexible and modular, since a complex application can be split into modules (database, interface…) rather than stored in one single container. This is known as the microservices approach (please refer to a previous blog). Since these modules are lightweight, they can be managed easily and activated 'on the fly' whenever a need arises. Remark: there are many container management systems on the market, whether for Windows or Linux, but also among the main cloud providers.

Finally, a technology such as Docker makes it possible to deploy a locally tested application into production on almost any cloud, whereas that operation can be complicated in the case of virtualisation.

In order to demonstrate the importance of containerisation, IBM published a performance comparison between Docker and KVM in 2014, concluding that containers equal or exceed the performance of virtualisation. In 2017, this conclusion was shared by the University of Lund in Sweden, which compared containers to VMware virtualisation and reached the same result. The only constraint: containers created under Linux aren’t compatible with Windows hosts and vice versa, a limitation that traditional virtualisation doesn’t have.

Deployment

As you can see, containerisation appears to be a solution for increasing the elasticity of an application and improving its performance, since each module is optimised for its specific use. Furthermore, the development of applications is faster and their continuous deployment is simplified, because in case of a modification only the code of the module concerned must be adapted, not the entire application. For the same reason, operational maintenance will be easier.

Additional remark: with containers, developers enjoy more autonomy and freedom of action, because they can work inside the container without having to request the creation of a virtual machine. Moreover, they benefit from an application stack closer to that of the production environment, which should, in principle, lead to a smoother production launch.

Container security, sometimes considered weaker than that of virtual machines (whose isolation is intrinsic to the VM technology), has improved significantly in recent years, especially at Docker, which now integrates an image-signing platform.

Assistance

However, we should not conclude too soon that virtualisation has had its day and that the container is a magic solution. As a matter of fact, developers will need adequate assistance, for instance when their container has to be deployed on a more classic production infrastructure (Software-as-a-Service platform, virtual machine…).

In practice, the IT service provider can assist the development team with a whole series of value-added services: advice on the choice of architecture and deployment methods, provisioning of processing or storage capacity, partitioning, making monitoring tools available, configuration according to specific needs…

In short, the development team will have to be closer to the infrastructure (this is exactly the challenge of the DevOps approach) as well as to its IT service provider to increase efficiency and agility, and thus be even more tuned to the business needs.

Partner

To help the internal IT department with the deployment of its IT platforms, the Aprico Enterprise Architecture entity relies on proven methodologies, as well as on tools and referential frameworks. Moreover, our specialists will strive to facilitate the dialogue between the IT department and the business entities in order to implement the solutions most adapted to the business needs. Overall, Aprico's experts rely on referential frameworks and best practices in enterprise architecture. They aim to identify and to measure the real added value of any new project in order to formulate implementable and relevant recommendations for the organisation. Moreover, Aprico's architecture specialists are part of the group's global strategy in terms of integrity, privileged contact with the customers, operational excellence and transparency. More information: marketing@aprico-consult.com

August 12th, 2019

69% of companies regard automation and process transformation as a top priority on their digital agenda

Full automation of IT operations - in other words: NoOps - is driven by the intensification of IT automation and the emergence of cloud computing. The goal is to limit - or even eliminate - the intervention of IT specialists when deploying and maintaining applications. The resources thus freed can then be allocated to new assignments.

Bruno van Marsenille

It’s a well-known figure: about 70% of an organization's total IT budget is spent on keeping its critical applications running properly. It comes as no surprise, then, that according to Deloitte's “2018 Global CIO Survey”, 69% of companies regard automation and process transformation as a top priority on their digital agenda.

Fully automated

In a NoOps environment as envisioned by Forrester, "the deployment, monitoring and management of applications and the infrastructure on which they run are fully automated," says Glenn O'Donnell, Senior Analyst and co-author of the report "Augment DevOps with NoOps". In practice, rather than having the development team test their program in an isolated environment before entrusting it (provided it meets the requirements, of course) to the operations team, the tasks assigned to that operations team are fully automated, whether they are implementation, management or even maintenance tasks. This move towards NoOps is driven by the ever-increasing automation of operations as well as by the cloud - which explains why so many Platform-as-a-Service (PaaS) providers offer this type of solution.
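To make the idea of fully automated operations concrete, here is a minimal, purely hypothetical sketch of the kind of remediation rule such a pipeline might apply without any human in the loop. The metric names, thresholds and actions are assumptions for illustration, not any vendor's API:

```python
# Hypothetical NoOps remediation rule: the platform observes service metrics
# and decides on an action itself, with no operator intervention.
# All field names and thresholds below are illustrative assumptions.
def remediation_action(metrics: dict) -> str:
    """Map service health metrics to an automated action."""
    if not metrics.get("healthy", True):
        return "restart"        # failed health check: redeploy the instance
    if metrics.get("cpu_pct", 0) > 80:
        return "scale_out"      # sustained load: add an instance
    if metrics.get("cpu_pct", 0) < 10 and metrics.get("instances", 1) > 1:
        return "scale_in"       # idle capacity: remove an instance
    return "none"               # nominal: no intervention needed

print(remediation_action({"healthy": False}))                # restart
print(remediation_action({"healthy": True, "cpu_pct": 95}))  # scale_out
```

In a real platform, rules like this would of course be evaluated continuously against live telemetry, and the resulting action executed by the provisioning layer.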

In DevOps, on the other hand, the development and operations teams share tasks. And they are closely involved in the whole chain (with an agile rather than a classic waterfall approach), both for code generation and functional changes as well as for the production and the life cycle of the applications.

Serverless

In reality, there are no longer any barriers between development, updates and modifications, testing, deployment, integration and maintenance: all of these are automated. In this context, the use of containers obviously improves efficiency, agility and security, as applications integrate their complete execution environment. Indeed, these containers are built from a set of isolated microservices (see also our previous blog), and they can be created and provisioned dynamically quite simply, the challenge lying in the use of a provisioning engine smart enough to understand the needs and characteristics of the workload associated with the container. In addition, these containers reduce dependency on traditional virtualization technologies, such as VMware or Hyper-V.
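Such a "smart" provisioning engine can be pictured, in a deliberately naive way, as a function that translates a declared workload profile into container resources. The profile fields, baseline values and output shape are purely illustrative assumptions:

```python
# Naive illustration of a provisioning decision: translate a declared
# workload profile into container resource limits and a replica count.
# All names and values are illustrative assumptions.
def provision(workload: dict) -> dict:
    base_cpu, base_mem = 0.25, 128                    # baseline: vCPUs and MiB
    cpu = base_cpu * workload.get("cpu_weight", 1)    # scale with declared need
    mem = base_mem * workload.get("mem_weight", 1)
    replicas = 2 if workload.get("critical") else 1   # redundancy for critical modules
    return {"cpu": cpu, "memory_mib": mem, "replicas": replicas}

print(provision({"cpu_weight": 4, "mem_weight": 8, "critical": True}))
# {'cpu': 1.0, 'memory_mib': 1024, 'replicas': 2}
```

A production engine would of course learn these characteristics from observed workload behaviour rather than from a static declaration.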

It should be noted that the microservices architecture and the containers are closely linked to the serverless concept. That concept puts the cloud service provider, such as Amazon Web Services, Microsoft Azure, Google Cloud Platform or IBM Cloud, in charge of the execution of some of the code by dynamically allocating resources. Indeed, this code is usually executed in a container and sent as a function.
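As an illustration of "code sent as a function", a serverless function can be as small as the following Python sketch, written in the AWS Lambda handler style; the event shape used here is an assumption for the example, not a documented contract:

```python
import json

# Minimal serverless-style function: the provider runs it in a container it
# provisions on demand and bills per invocation. The event shape is assumed.
def handler(event, context=None):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

print(handler({"name": "Aprico"}))
```

The developer ships only this function; allocating, scaling and recycling the underlying containers is entirely the provider's job.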

According to a recent survey among 600 IT decision makers, conducted by the vendor Cloud Foundry, 19% of respondents already use a serverless architecture, while 42% intend to use it in the next two years. In fact, research by MarketsandMarkets estimates that the serverless market will represent 14.93 billion USD in 2023, compared to just 4.25 billion USD in 2018.

Cloud in the centre

To succeed with a NoOps approach, or at least tend towards that goal, the cloud appears to be the road to take. It allows your organization to get rid of a number of tasks related to IT operations, whereas an internal data centre imposes constraints such as the provisioning of machines and the management of the network or storage in particular. That said, containers can also contribute part of the answer on this level (see above).

In fact, the cloud can take different forms, depending on the degree of maturity of your organization. At one end of the spectrum there is the Platform-as-a-Service, where the customer buys access to his applications; at the other end, the Function-as-a-Service, where the customer pays only for the code he wants to run.

Eventually, NoOps should allow IT to free up resources currently assigned to management tasks, the so-called IT operations. It should also spare developers from having to worry about 'minor' tasks related to the underlying infrastructure, the operating system, the middleware or the runtime language. Overall, it's about getting IT teams to shift from a reactive to a proactive approach.

For now, Amazon, Google and Microsoft offer serverless platforms that allow you to get closer and closer to the NoOps concept, while players such as IBM, Alibaba or Oracle offer their own approach. But it is clear that this transition will not happen overnight and requires a higher level of maturity from your IT department.

With this in mind, Aprico has been helping companies transform their business for more than two decades. Therefore, we can share best practices, technologies and organizational models that will allow you to quickly adapt to the current rapid evolutions of your business and IT environment. More information: marketing@aprico-consult.com

June 5th, 2019

Shadow IT, turning threats into opportunities

With the advent of the cloud and Software as a Service (SaaS), shadow IT is gaining ever more ground within today’s companies. Should your IT department regard this rising new trend as a threat, at the risk of losing credibility with their colleagues and maybe even some customers? Or does it offer them an opportunity to better manage their IT infrastructure and provide better IT services to end users?

Bruno van Marsenille

According to a recent study by security specialist McAfee (“Cloud Adoption and Risk Report 2019”), 21% of files hosted in the cloud contain sensitive data, while the actual sharing of such files has increased by no less than 53% in a single year. Even more striking - and worrying perhaps - is the finding from another recent study, sponsored by McAfee but conducted by Frost & Sullivan, that 80% of employees admit to already having practiced shadow IT. Either because their department has chosen and adopted a solution without consulting their IT staff, or - just as easily - because employees have bought software on their own initiative.

Not only is the use of shadow IT often just a mouse click and a credit card away, the millennials in your workforce also tend to consider BYOD (Bring Your Own Device) an acquired right, allowing them access to any number of professional applications from their personal devices. Consequently, many applications and devices nowadays escape the view and therefore the control or management of IT departments - which is precisely the definition of 'shadow IT'.

Threats aplenty

The dangers of shadow IT are obvious, whether in terms of security, management costs or the lack of coherence of your overall IT infrastructure. And while aspects of cost savings and greater flexibility can be put forward as clear benefits by the business or even a particular user, it is not so much the actual existence of shadow IT that seems problematic, but its sheer scale. Indeed, if shadow IT becomes too widespread, it can quickly become uncontrollable, even though it’s meant to meet needs for efficiency and agility.

In addition, the role of the IT department is precisely to translate the demands of business users into powerful IT solutions. However, the hurried choice of an external application may not fully meet expectations and, above all, may not be part of your company's overall IT strategy. Worse yet, it may not even integrate sufficiently with your company’s IT infrastructure.

Opportunities abound

Should you just surrender to shadow IT then? And give those external applications free rein within your carefully built and managed IT infrastructure? Certainly, some will evoke the ease and speed of implementation of such applications: a key argument in the context of digital transformation.

As a first step, while defining a framework for good governance, your IT department should inform your users and raise their awareness of shadow IT. In addition, it should insist on its own skills and those of its trusted partners, without necessarily closing the door on any external solutions. It should also insist on the necessary coherence of your IT environment. Not to mention important aspects of that environment, such as security and compliance - especially in view of the famous General Data Protection Regulation (GDPR).

Ultimately, your IT department should seek to position itself as a trusted partner rather than simply a service provider, by searching with its business colleagues for the most relevant solution that can be deployed in a timely manner. In other words: it needs to become a privileged and open interlocutor, capable of offering informed choices in line with the priorities and objectives of each business entity. In addition, it will have to offer a catalogue of services in the form of an à la carte menu rather than a set menu.

More opportunities than threats

"The cloud offers more opportunities than threats", concludes the McAfee report. "Especially for organizations that are able to manage the risks and equip themselves with the necessary skills and tools to secure their IaaS, PaaS and SaaS."

As a consulting firm specializing in information systems architecture and transformation, we at Aprico Consultants help you strengthen your position in the market by providing you with the necessary flexibility, performance and competitiveness to accelerate your digital transformation processes. As a privileged partner of your IT department, we help you identify the elements of shadow IT as part of an in-depth study of information exchanges within your company as well as with your external partners. Finally, once we understand the reason(s) for these shadow IT practices, we help you restore confidence in your business users and implement application governance policies in line with your global strategy.

May 6th, 2019

Cybersecurity: everyone’s concern

Cybersecurity isn’t merely the concern of your IT department: it is everyone's business. Not only does it concern all levels of your decision-making, up to the highest level, it also concerns each and every one of your employees. Not to mention your external partners, since your company is increasingly opening up to the outside world.

Bruno van Marsenille

In a recent study entitled ‘Risk Value 2018’, NTT Security assessed 1,800 companies’ cybersecurity policy. According to that global survey, nearly one third of Benelux companies are not well prepared for a cyber attack. Also, they most often fail to advance their prevention and preparedness policy. In fact, only 45% of Benelux respondents say they have established an IT security policy, i.e. 12% below the international average. And those same companies spend only 12% of their ICT budget on cybersecurity. And there’s more: 34% of companies in the Benelux say they would be prepared to pay a ransom if they fell victim to a cyber attack, such as a ransomware infection. In addition, the survey shows that the distribution sector is the least prepared for a cyber attack, followed by the transportation sector, the wholesale trade and the services industry. The telecoms, pharma, chemistry and technology industries are better protected.

Leadership required

At the global level, the study holds some other surprises. To begin with, only 19% of companies regard the Chief Information Security Officer (CISO) as the person ultimately responsible for IT security, whereas 22% assign this responsibility to the CIO and 20% to the CEO. This shows a great dilution of responsibilities and skills in IT security. And the new General Data Protection Regulation (GDPR) does not help matters, since a Data Protection Officer (DPO) has now also been added to the list of functions. Here’s another disturbing survey result: barely 57% of organizations have a well-established security policy, while 26% are still working on it. Finally, only 39% of managers believe that their employees fully understand the security measures defined by their company.

Within your company, clear and effective leadership must therefore be established, especially since digital transformation requires a solid and secure foundation. It will be up to your CISO to define your IT security policy and to raise awareness for it among all your stakeholders: management, staff, business partners, etc. All the more so as with the emergence of the Internet of Things (IoT), the cloud, social networks and mobile devices, new attack vectors are equally emerging. It is imperative that your security policies are embedded in your daily business and that data-centric incident management solutions are deployed throughout the life cycle of your data.

Cloud to the rescue

In addition, the cloud requires a clear view of the movement of your data, wherever it is, as well as measures to protect that data and specific procedures for incident management. However, along with the cloud we also see the emergence of managed security solutions that exploit the potential of artificial intelligence and machine learning to identify threats as quickly as possible and counter them with maximum effectiveness. This could well be an interesting solution to the glaring shortage of specialized security profiles.

Moreover, the cloud can present an immense potential for computational power and flexibility, enabling ultra-sophisticated algorithms that are capable of analyzing threats in real time, of modelling risks and of providing a quick response in case of attack. Similarly, the cloud can enable closer collaboration between security actors by sharing threat information (especially details on attack life cycles) and information on cyber criminals (including their most commonly used tactics, techniques and procedures).  This is notably the mission of the Cyber Threat Alliance (CTA) and its project Adversary PlayBooks.

Collaboration: the way to go

More than ever, collaboration seems the right way to fight cybercrime effectively. Collaboration not only between departments within your company (transparency is without a doubt the best approach in case of attack), but also collaboration with your technology partners.

As an IT specialist, Aprico Consultants helps organizations establish their ICT strategy and assists them in their digital transformation, in order to improve the performance, productivity and competitiveness of their business. We combine in-depth knowledge of various aspects of ICT with technology expertise and an end-to-end understanding of our clients' business processes.

Given that the cybersecurity market is particularly fragmented (even though a certain consolidation of the ecosystem is under way), the choice of a reliable and trusted technology partner is essential in your search for a sustainable and global ICT platform. In other words: a partner who is capable of not only selecting the most relevant offer, but also of deploying and maintaining it.

Aprico aims to help companies innovate and rethink their business processes by putting security at the centre of their strategic thinking. We share best practices, technologies and organizational models that allow your organization to open up to the outside world and to share information securely.

April 1st, 2019

Putting simplicity and agility first with microservices

While the cloud clearly remains one of the cornerstones of your company's IT flexibility, a new style of architecture is now increasingly needed: microservices. It promises greater ease of development and deployment, better scalability, easier maintenance and more flexibility in the way you engage with technology.

Bruno van Marsenille

Your company has always faced many IT challenges: whether to ensure the availability and performance of your applications, as well as their quick and inexpensive upgrade, or to develop and deploy new solutions more quickly.

Advantages

In short, your organization, if it wants to keep up, must be synonymous with responsiveness and flexibility in order to fully meet the demands of your business. Yet all too often your IT department inhibits this dynamic process, because of your installed base (i.e. your historical applications), the inflexibility of your infrastructure and the lack of flexibility of your development teams. Of course the cloud - and especially the hybrid multi-cloud (see our previous blog) - can provide an answer to these challenges. But nowadays, it’s possible to go beyond that answer, thanks to microservices.

Schematically, a microservices-based architecture aims to develop an application as a suite of small services, each running in its own process and communicating through lightweight mechanisms. These services are built around business functionality and can be deployed independently as part of an automated process, while centralized management is kept to a minimum.
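As a toy illustration of "small services communicating through lightweight mechanisms", the sketch below starts a hypothetical inventory service over plain HTTP/JSON using only the Python standard library, and then queries it the way another service would. The service name, endpoint and data are invented for the example:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical inventory microservice: one narrow, business-oriented endpoint,
# communicating over plain HTTP/JSON (a "lightweight mechanism").
STOCK = {"widget": 3, "gadget": 0}

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        item = self.path.strip("/")
        body = json.dumps({"item": item, "in_stock": STOCK.get(item, 0)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):        # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), InventoryHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Another service (here reduced to its client side) consumes the API over HTTP.
with urlopen(f"http://127.0.0.1:{server.server_port}/widget") as resp:
    result = json.load(resp)
print(result)                            # {'item': 'widget', 'in_stock': 3}
server.shutdown()
```

Because each such service runs in its own process behind a narrow interface, it can be developed, deployed and scaled independently of the others.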

In other words, a microservices architecture offers you several advantages: shorter development cycles, scalability built in from the start of your development, the possibility of deployment on-premise or in the cloud, the ability to handle complex requirements, less vendor lock-in thanks to the many products available in open source, the possibility to choose the best implementation technology to solve your specific problem, as well as ease of maintenance and upgrade.

Constraints

While the microservices approach has many advantages, adopting it requires that you respect a number of conditions. For one thing, as a complex technology and architecture, it requires certain specialized skills. In addition, a DevOps and automation culture is absolutely essential. Finally, the boundaries of each service must be clearly defined, while organizational changes are also needed to ensure the success of such a project.

In short, you definitely shouldn’t look at microservices as the new Eldorado for your IT department: this type of architecture is only suitable if your company needs scalability, has to manage complexity and requires speed of implementation. And you certainly shouldn’t underestimate the organizational challenges it brings. It effectively requires you to set up cross-functional teams, multidisciplinary and autonomous, with clear boundaries between them (and therefore between your microservices). Ideally those teams have a double dimension: a vertical business aspect and a technological communication structure between teams, to create the knowledge network and establish governance of the boundaries between services.

DDD approach

To meet these challenges, we at Aprico propose a type of structure, based on the idea of DDD or Domain-Driven Design. It is neither a framework nor a methodology, but rather an approach (as described in the book of the same name by Eric Evans) that aims to define a common vision and language shared by all people involved in the development of an application.

In practice, DDD offers tools capable of establishing service boundaries, the upper boundary being the bounded context and the lower boundary the aggregate. Domain events are powerful building blocks for service orchestration, while domains constitute natural boundaries for the business-oriented teams. In addition, context integration and team relationships are governed by strategic design.
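To make these notions tangible, here is a small, hypothetical Python sketch of an aggregate that guards its own invariants and records a domain event other contexts could react to. The names Order and OrderPlaced are illustrative choices, not taken from the book:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical DDD sketch: an aggregate enforces its invariants and records
# domain events. Order and OrderPlaced are illustrative names only.
@dataclass(frozen=True)
class OrderPlaced:            # domain event: an immutable fact about the past
    order_id: str
    total: float

@dataclass
class Order:                  # aggregate root: the consistency boundary
    order_id: str
    lines: List[float] = field(default_factory=list)
    events: List[object] = field(default_factory=list)

    def add_line(self, price: float) -> None:
        if price <= 0:
            raise ValueError("a line must have a positive price")  # invariant
        self.lines.append(price)

    def place(self) -> None:
        if not self.lines:
            raise ValueError("cannot place an empty order")        # invariant
        self.events.append(OrderPlaced(self.order_id, sum(self.lines)))

order = Order("A-1")
order.add_line(19.99)
order.place()
print(order.events[0])        # OrderPlaced(order_id='A-1', total=19.99)
```

In a full design, the recorded events would be published to other bounded contexts, which is exactly the orchestration role the text attributes to domain events.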

Partnership

Microservices represent the logical evolution of distributed systems architecture. They are intended to meet your needs for complexity, scalability and speed of delivery. But to succeed, your organization must be aligned. DDD offers powerful tools for structuring a microservices architecture through aggregates, bounded contexts, domain events and the strategic design that governs relationships between contexts and teams.

As a consulting firm specializing in information systems architecture and transformation, Aprico Consultants allows you to resolutely accelerate your digital transformation processes. We provide you with the flexibility, performance and competitiveness needed to strengthen your position in the market. Aprico Consultants works with you to translate your company's strategy, objectives and constraints into pragmatic transformation programs that deliver real added value and proven return on investment.

Career opportunities

We’re always looking for talented people.
Are you one of them?