Latest news

How to start cloud adoption in less than a week

In this increasingly technological age, we are witnessing a boom in digital modernisation and transformation (across all areas of business). The current pandemic situation further accelerates this trend and a successful digital transformation will, for many firms, make the difference between success and failure in the coming era.

We cannot address this topic without bringing cloud computing to the table. An increasing number of companies are studying the possibility of shifting their applications to the cloud and reaping the benefits of no longer having to maintain an on-premises infrastructure, or to retain the skills needed to do so (this study suggests that “more than $1.3 trillion in IT spending will be affected by the shift to the cloud by 2022”).

Cloud computing usually has characteristics that your business can immediately take advantage of such as high availability, scalability, fault tolerance, security and many more.

Besides the technical aspect, migrating your applications to the cloud can have a meaningful impact on company costs and productivity.

With this change, it is possible to predict costs (with zero upfront expenditure as no on-premises infrastructure is needed) because you only pay for what you use. You should also consider the fact that you can use and pay for additional resources when needed and stop paying for them at any time. Also, with the rapid uptake of cloud computing, cloud providers can take advantage of economies of scale which then translates into lower prices for their customers.

All the technical skills needed to build and maintain data centre infrastructure are ensured by the cloud provider so you can boost the productivity of your business by letting your IT teams focus on application development.

When discussing the cloud migration process we are talking about planning how an organisation should move its data and applications out of the office and into the cloud.

The first step should be an assessment of the environment, to decide which resources should be migrated and what migration strategy should be used for each individual resource. The next steps should be selecting the cloud provider and executing the migration itself.

In this article, we will cover just the first step by presenting the most common strategies used for cloud migration (with some practical examples for migrating real-world applications).

We will focus particularly on two cloud service models: IaaS – “Infrastructure as a Service” – and PaaS – “Platform as a Service”.

The main difference between these two is in responsibility distribution:

With IaaS, the cloud provider manages everything to do with hardware but other concerns such as maintaining the OS and network and application configuration remain up to you.

With PaaS, you get less responsibility as you are not in charge of network maintenance or the OS, and only need to worry about your application data and configuration.

The commonest migration strategies are:

  • Rehost (AKA “lift and shift”) – migrate current application “as-is” to IaaS
  • Refactor – migrate application “as-is” with minimal configuration changes to take advantage of cloud services (e.g. PaaS, DBaaS)
  • Rearchitect – Modify application architecture/code to optimise it for new cloud functionalities
  • Rebuild/New – (Re)build application adopting cloud-native methodologies and technologies

The following table presents a comparison of the different strategies and the objective factors that affect the choice of which strategy to apply:

So, how do we do this?

With one week to carry out our adoption, we are just going to look at the first two possible strategies, Rehost and Refactor.

Rehost, also known as “Lift and Shift”, is one of the most popular and well-used strategies. As the name suggests, this strategy consists of “lifting” your application out of the current host, for example, where it is hosted on your premises, and “shifting” it to your chosen cloud provider. This is one of the easiest ways to adopt the cloud model when there is no time to re-engineer your applications.

Your infrastructure costs are now being redirected to your cloud provider, who takes care of all the resources needed to run your application and sells that as a service (IaaS). The risk is low for this strategy and you receive immediate benefits and return on investment. If it is no longer needed, you can free up your on-premises infrastructure.

There are tools available to carry out this migration, but some companies prefer to do it manually. This strategy is a good first option for cloud adoption but, depending on your application, the next strategies we present can take better advantage of essential cloud-native features.

To simplify this concept, here’s an example of an application that runs in a Java environment, uses a database management system and runs it all on a Linux system.
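Purely as an illustration of what the “lift” part can look like in practice, here is a minimal sketch of provisioning an equivalent IaaS virtual machine, assuming AWS EC2 and the boto3 library; the image ID, instance type and key pair below are placeholders, and any other cloud provider offers a very similar workflow:

```python
# Minimal sketch: provision an IaaS VM to rehost the existing Java/Linux stack "as-is".
# Assumes AWS EC2 via boto3; the AMI ID, instance type and key pair are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical Linux image matching the on-premises OS
    InstanceType="t3.large",           # sized to roughly match the current on-premises server
    MinCount=1,
    MaxCount=1,
    KeyName="migration-key",           # hypothetical key pair for SSH access
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "rehosted-java-app"}],
    }],
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}; next: install the JVM, copy the application and restore the database.")
```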

With Rehost, you just move all the layers of your application to a cloud provider infrastructure, but the cloud has much more to offer.

With Refactoring you are not altering the core of your application, you are just moving different layers of your application to cloud managed services. Think about all the time and resources spent managing databases, backups, updates and many other concerns. Most cloud providers offer database-as-a-service where all of those concerns are managed by them. Some services offer fully managed platforms where you can run your application layer and all of the capacity provisioning, scaling, load balancing and other features are managed by your cloud provider.
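To make this concrete, here is a hypothetical sketch of the kind of change a refactor usually involves: the application logic stays the same, but the database connection now points at a managed database service instead of a server you administer yourself (the host names and credentials below are placeholders):

```python
# Minimal sketch: the application code is untouched; only the connection target changes.
# Host names, credentials and database names are placeholders for illustration.
import os

# Before: self-managed database running alongside the application on-premises.
ON_PREM_DB_URL = "postgresql://app_user:secret@localhost:5432/appdb"

# After: a managed database service (DBaaS) from the cloud provider.
# Backups, patching and failover become the provider's responsibility.
MANAGED_DB_URL = os.environ.get(
    "DATABASE_URL",
    "postgresql://app_user:secret@myapp-db.managed.example-cloud.com:5432/appdb",
)

def get_connection_url() -> str:
    """Return the database URL the application should use (managed service by default)."""
    return MANAGED_DB_URL
```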

This strategy may need more code changes and additional configuration than the Rehost option, but you’ll benefit from it.

While this is a big step up from the Rehost strategy, it is still far from leveraging all the cloud-native capabilities that you can benefit from, but with one week to migrate to the cloud, it is a good one.

If you want to know more or if you want more information/help with your cloud migration strategy plan, please do get in touch with us.

António Correia, João Gouveia Gonçalves

Jira Service Management: a new service desk experience

Jira Service Management from Atlassian is the new generation of Jira Service Desk. From November 2020, features that until now were only available separately in other Atlassian products, such as Opsgenie, Mindville Insight and Halp, will gradually be added to it. This will make Atlassian’s service desk solution a much more complete and modern IT Service Management tool.

By ending the possible hold-ups between teams working separately, solutions, problem-solving and support requests (internal and external) can be carried out faster, based on richer and more detailed information. Jira Service Management empowers organisations with greater visibility and responsiveness, fostering and promoting efficient collaboration between teams.

Previously available only as a separate product, Opsgenie’s integration with Jira Service Management will allow you to manage alerts for incidents and disruptions or plan service interventions. Support requests for incidents in Jira Service Management can be tested and escalated within the product, which will automatically generate, for example, alerts for the team responsible for the services. These features will be available in the Cloud version. For Server and Data Center customers, integration with Opsgenie can only be achieved by acquiring this service separately.

Mindville‘s Insight will also be part of Jira Service Management. A flexible asset & configuration management tool, it allows you to catalogue your organisation’s assets, such as hardware, software, car fleet or office supplies, and offers a more detailed and interlinked view when managing their lifecycle. All types of requests within Jira Service Management can quickly be enriched with detailed information from Insight. For example, a request related to laptop assistance or a licence that needs renewing can now automatically include all its associated technical information. Insight features will be available in Jira Service Management in the second half of 2021.

Halp allows you to create, sort and manage tickets using conversational tools, and will also be included in Jira Service Management in the second half of 2021. Teams will be able to integrate Slack (already available in Halp’s standalone version) or Microsoft Teams (in testing) with Jira Service Management and manage requests directly in these applications.

Source: Atlassian

In addition to the new capabilities included in the product, Jira Service Management will also provide new templates for support projects. These templates cover typical Incident & Change Management, Service Request and Problem Management scenarios, speeding up the tool’s adoption process. They will also integrate naturally with Bitbucket, offering new automated controls, deployment approvals and better traceability in problem analysis.

The new Jira Service Management is now available as a Cloud version with several plans available – Free, Standard, Premium and Enterprise – with added features of Insight and Halp being made available gradually over the coming months. The Server and Data Center versions will arrive via update later this year.

Are you ready to amplify service management in your organisation?

Tiago Almeida

And the winner is… modern BI!

Why is ‘modern Business Intelligence‘ so important for helping your organisation become more competitive?

Modern BI gives you the flexibility to build a data-driven culture where you can make decisions based on facts rather than guesses or assumptions. Instead of relying solely on EDWs (enterprise data warehouses), like traditional BI, it comes with a set of new visual interactive tools to tackle all kinds of tasks from IT to workflow. BI is no longer just a project; the goal is now a data-driven company in which BI connects data with its end users through ETL (extract, transform and load) and visualisation tools. What we want here is to demonstrate how modern BI can bring value to your company and we’ll do that via the following trilogy.

1. Accessing data

Connecting to data has become much easier. Nowadays you can connect to every type of DB (database), to sets of files such as PDF or Excel, and to APIs or data from the web with web data connectors. You can do this from a BI tool such as Tableau Desktop or Power BI, which let you connect directly to these data sources. Think about it: many companies rely on Excel as the output for their departmental information. Imagine that you prepare monthly report files on sales figures. How can you best compare your sales performance for 2008 against that of 2018? Are you going to open all those Excel files? How can you build a sales-by-region map by year? And drill down to the data for every month? Fortunately, modern BI tools can give you all of this. You can quickly connect an Excel file, or a whole set of Excel files (via joins or relationships), to Tableau or Power BI, which means a real upgrade in analysis efficiency: in a matter of minutes you can clean, normalise and use all the data to build dynamic reports and dashboards.
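As a rough Python analogy of what these tools do for you behind the scenes (the folder, file pattern and column names below are hypothetical), combining a pile of monthly Excel reports into a sales-by-region-and-year view could look like this:

```python
# Minimal sketch: combine monthly Excel sales reports and summarise by region and year.
# The file pattern and the "date", "region" and "sales" columns are hypothetical.
from pathlib import Path
import pandas as pd

files = sorted(Path("monthly_reports").glob("sales_*.xlsx"))
frames = [pd.read_excel(f) for f in files]          # reading .xlsx requires the openpyxl package
sales = pd.concat(frames, ignore_index=True)

sales["date"] = pd.to_datetime(sales["date"])
sales["year"] = sales["date"].dt.year

summary = (
    sales.dropna(subset=["region", "sales"])        # basic cleaning
         .groupby(["year", "region"], as_index=False)["sales"]
         .sum()
)
print(summary.head())
```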

Bear in mind that sometimes data isn’t completely error-free, so a cleaning process must be carried out, which you can do easily in the tools mentioned above with just a few clicks. However, if you have many complex files on which you need to perform complicated calculations, or you want to join many different data sources, you’ll need an ETL tool like Pentaho, Tableau Prep or SSIS (SQL Server Integration Services) and a database into which your ETL can drop the data from all those sources.

2. Building visualisations

All right, connecting to data has new, fancier options and is now much faster, but the coolest thing is building the visualisations. Forget those old static charts, tables and maps. Nowadays, BI solutions give you the ability to construct every type of graph to display information exactly as you want. Using a single graph, in a few minutes, you can create a visualisation by year with the possibility of drilling down to individual days; or you can drill down by hierarchies, for example, showing profit first by product group and then further until you reach profit by individual product. You can incorporate filters into your visualisations to make your analysis more precise. You can build parameters that let you see different KPIs (key performance indicators) by groups of products, meaning that in the same visualisation you can explore profit, number of sales or margin by product. You can also explore raw data by creating tables with all the information you need.
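Conceptually, that drill-down is just grouping at progressively finer levels of a hierarchy, as this small illustrative pandas sketch shows (the orders data and its columns are invented for the example):

```python
# Conceptual sketch of hierarchy drill-down: profit by product group, then by individual product.
# The `orders` DataFrame and its columns are invented for illustration.
import pandas as pd

orders = pd.DataFrame({
    "product_group": ["Office", "Office", "Furniture", "Furniture"],
    "product":       ["Paper", "Binders", "Chairs", "Tables"],
    "profit":        [120.0, 80.0, 310.0, -40.0],
})

# Top level of the hierarchy: profit by product group.
by_group = orders.groupby("product_group")["profit"].sum()

# Drill down one level: profit by individual product within each group.
by_product = orders.groupby(["product_group", "product"])["profit"].sum()

print(by_group, by_product, sep="\n\n")
```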

Alongside these cool attributes, there’s another important functionality: you can explore and analyse all the underlying data. By choosing a group or clicking on a specific point on your graph, you can promptly see the information behind the values. With this, you’re never working blind, and, guess what, you can export these pieces of data to a file. Using this functionality with the visualisations, you can do your own exploration and analysis, finding trends or outliers and drawing interesting conclusions.

You arrive at a place where you’re able to create views that you can group into a super-dynamic dashboard full of filters and parameters that can be applied to all your graphs, tables and maps. With dashboards, you can create data stories, where you use snapshots of a specific dashboard to communicate data much more easily and consistently.

It gets even better because after constructing visualisations you can then share them with everyone in your company and decide who sees what.

3. Governance

If the visualisations are cool, imagine how awesome it is to easily share content with anyone in your company. Tools like Tableau or Power BI give you the opportunity to bring all your content together in one place, divided by project or department. We’re talking about your data sources and dashboards in which you can create groups of users and decide their permissions. Imagine that you build a marketing dashboard and publish it via the online service. You have users in all your company’s different departments but you only want those in marketing to be able to see this dashboard. To achieve this you create a group called Marketing, add all your marketing users to it and grant them permission to see your content, while for other groups, such as Finance, for example, you deny permission.
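Reduced to code, the permission model is essentially a mapping from groups to the content they may see, along these lines (the groups and dashboard names are hypothetical; in practice this is configured in the Tableau or Power BI service itself):

```python
# Toy sketch of group-based permissions; the real configuration lives in the BI service.
PERMISSIONS = {
    "Marketing": {"marketing-dashboard"},
    "Finance":   {"finance-dashboard"},
}

def can_view(user_groups: set, dashboard: str) -> bool:
    """A user may view a dashboard if any of their groups has been granted access to it."""
    return any(dashboard in PERMISSIONS.get(group, set()) for group in user_groups)

print(can_view({"Marketing"}, "marketing-dashboard"))  # True
print(can_view({"Finance"}, "marketing-dashboard"))    # False
```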

The online service is managed either through a physical server or in the cloud, with a high level of security. To connect, you can use login directories such as Active Directory, or you can create accounts locally. You can embed your interactive visualisations into webpages, and they will refresh whenever the underlying data changes. Defining schedules and alerts is something you’ll find really useful: schedules for refreshing data, for example, and alerts for when a KPI reaches a specific limit. On top of this, you can monitor everything: who has accessed what, performance, space usage, tasks, etc.

The best part of all of this is the fact that your IT personnel are still an important part of managing access to data, but with the right software, every user is empowered to view, customise, create and edit dashboards and reports or analyse data in far less time.

Final thoughts

As we said, BI is no longer simply a project. BI is a living being in your company. Imagine your company as a city and BI as a data speed highway flowing freely between every corner of that city. Scalability becomes very effective, as you can bring your departments into the digital culture in no time or upgrade your old BI structures, which will then become even faster.

The value of monitoring, making decisions, analysing information quickly and sharing insights between people and departments is just one of the many strengths that BI can offer you, while business users become more autonomous in doing so.

For sure, modern BI is a winner that has caught up with the speed of our fast, competitive, changing world and earned its place right at the heart of the business fabric, and this is how Xpand IT DIaaS (Data Innovation as a Service) became real. By defining your strategy, deploying, maintaining and supporting a robust data solution, and helping you build your speed highway, we can help you position yourself as a competitive, data-driven company and, of course, a winner, because when you win, we win too!

José Miranda

Managing Marketing teams with Atlassian tools

Organising work is a daily topic for all companies. Who delegates what? How is that project that can’t really fail going? How do we evaluate the team’s workload?

All of these are legitimate questions, which is why we will help you take advantage of Atlassian tools to enhance your team and organisation. We will also present a use case, using the Xpand IT Marketing team as an example, so that you can better understand the true potential of these tools in teams outside the scope of software development.

Confluence

Let’s start with Confluence. Regardless of the organisation’s area of activity, this knowledge management and sharing platform allows teams to display the projects in which they are involved and share results, initiatives and even personal content that may be useful to the organisation.

Thanks to its structure, we can almost call it an internal social network: it lets information flow effectively, so that everyone in the company can stay up to date on each other’s daily work.

Let’s look at the case of the Xpand IT Marketing team. With access to Confluence, it is possible to get visibility over all the marketing initiatives (and not just marketing!) taking place in our company. Easy access to information makes all our employees feel involved in the same projects, fostering a teamwork culture. With Confluence, everyone has a voice and can share and give feedback.

For example, we can see the impact of Confluence in an interaction between the Marketing team and the Sales team. Even though they work at different levels, the platform allows both teams to give their input in support of Xpand IT’s participation in an external event.

Through quick and easy editing of the event’s dedicated page, it is possible to increase the agility of the entire organisational process and simplify bureaucratic work that could easily become time-consuming if carried out over the usual email exchanges.

This is just one example of the benefits that Confluence can bring to your team. It is also important to understand that the platform requires some attention from your team, because it can easily be forgotten and become outdated. In another article, we explore the 10 reasons to update your version of Confluence and make sure that everyone contributes to your team’s success.

Through Confluence it is also possible to create meeting notes, project plans and service specifications, among others. You can learn more about Confluence here.

Jira Software

The name may seem complex, but any issues end there. This work management platform allows your teams to view all the tasks for the current work period in a practical and intuitive way.

There are 4 phases linked to this platform’s success, and they have a positive impact on teamwork:

Plan: task distribution across the team;

Track: task prioritisation and monitoring the success of each project where your team is involved;

Release: execution time of various tasks. Throughout the work period, every team member can change their status to bring the rest of the team up to date;

Report: at the end of each work period, it is possible to prepare a report that identifies where your team has been successful and the improvement points.

It’s based on these four phases that we can give you another example from the Xpand IT Marketing team. Within a team there may be several work methodologies to adopt (in this article you can read about 5 agile methodologies). For this case, we will explore the SCRUM method. This working method is characterised by cycles or stages of development, defined as sprints, and by making the most of the development time for a series of tasks.

There should be daily 15-minute meetings, the daily scrum, which synchronise activities and act as a way to plan the working day. With 15-day work periods, called ‘sprints’, the team start their work cycle with a meeting where each team member uploads their tasks to the platform and assigns several metrics to them: story points (from 1 to 8), which help to convey the degree of difficulty of each task; hours, providing a more reliable count of the time to be spent on each task; priority, allowing the entire team to perceive the urgency of each project; and assignee, the person responsible for fulfilling it. It is also possible to create several labels using tags, which allow a better visualisation of where the team spent their time in the final report for each sprint.

Then, the SCRUM Master – the person responsible for the team – starts the sprint.

At the end of the sprint, the team meets again for another briefing to learn about areas for improvement in their working methods and task difficulties, amongst other things.


Let’s take a look at the following example:

In this case, an issue was created in Jira Software for the creation of this blog post. It was assigned estimated story points, the number of hours needed to write the article, and a priority.

As you can see in this image, a total of 12 hours were allocated for the development of this content (1 day and 4 hours). The difficulty was maximum (remember, the scale is from 1 to 8), and 3 tags were created: Blog because it is an article for our blog; CS, because it is our business area associated with the technologies referred to; and Atlassian, because it is the theme that this use case portrays.

After uploading, this task will be displayed in the “To do” column and will then change status, depending on the progress of the task. In this case, if you are reading this article, the task is now in “Done”, but before this, it went through “In progress” as it was being developed.
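For teams that prefer to automate this step, the same issue can be created through Jira’s REST API. The sketch below is illustrative only: the base URL, credentials, project key and the story-points custom field ID all vary from one Jira instance to another.

```python
# Illustrative sketch: create a Jira issue with priority, labels and an estimate via the REST API.
# The base URL, credentials, project key and the story-points custom field ID are placeholders
# (custom field IDs differ between Jira instances; time tracking must be enabled for the estimate).
import requests

JIRA_URL = "https://your-company.atlassian.net"
AUTH = ("marketing@your-company.com", "api-token")   # hypothetical credentials

payload = {
    "fields": {
        "project": {"key": "MKT"},
        "summary": "Write blog post: Managing Marketing teams with Atlassian tools",
        "issuetype": {"name": "Task"},
        "priority": {"name": "Highest"},
        "labels": ["Blog", "CS", "Atlassian"],
        "timetracking": {"originalEstimate": "1d 4h"},
        "customfield_10016": 8,                      # story points (field ID is instance-specific)
    }
}

resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH)
resp.raise_for_status()
print("Created issue", resp.json()["key"])
```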

With this, as a team, we are able to understand the workload necessary for each project and consequently optimise the work and the effectiveness of the human resources allocated.

Jira Service Desk


This service request management platform is the ideal solution for resolving issues that arise in your daily work. The resolution team’s mission is to respond to each request with the best and highest level of quality support, while simultaneously optimising the management of their own timelines.

The process of opening a support ticket/request is quite simple. Let’s look at a case from our marketing and design teams where a ticket has been opened to produce communication pieces for this blog post. On the Jira Service Desk platform, you can select a category for your request. In this case, we will open a ticket as a “Design Request”.

Next, we provide the most accurate briefing possible so that the request has the highest possible success rate. Then you can allocate the ticket to a project (for budget or workload management purposes) and select one or more approvers. Finally, you select the ticket deadline and can even attach reference files.
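This request can also be raised programmatically via the Jira Service Desk REST API; in the sketch below, the base URL, credentials, service desk ID and request type ID are placeholders that you would look up in your own instance:

```python
# Illustrative sketch: open a "Design Request" ticket via the Jira Service Desk REST API.
# The base URL, credentials, serviceDeskId and requestTypeId are placeholders; look them up
# in your own instance before using anything like this.
import requests

JIRA_URL = "https://your-company.atlassian.net"
AUTH = ("marketing@your-company.com", "api-token")

payload = {
    "serviceDeskId": "3",          # hypothetical: the design team's service desk
    "requestTypeId": "25",         # hypothetical: the "Design Request" request type
    "requestFieldValues": {
        "summary": "Communication pieces for the blog post",
        "description": "Briefing: banners and social media images for the Atlassian blog post.",
    },
}

resp = requests.post(f"{JIRA_URL}/rest/servicedeskapi/request", json=payload, auth=AUTH)
resp.raise_for_status()
print("Opened request", resp.json()["issueKey"])
```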

Final Thoughts

In this article we explained, in a simple way, the tools that our marketing team uses on a daily basis. However, Atlassian tools do not stop there. From Jira Service Desk to Bitbucket, the solutions that improve collaboration between teams are endless.

Xpand IT is a global company and the only Atlassian Platinum Solution Partner in Portugal, developing top services and products in the IT area that cover ALM and the SDLC.

Our know-how, combined with our experience in Atlassian technology, allows us to provide various specialised consulting services according to client needs, including installation, customisation, app development and monitoring, and support services for the entire range of products. Find out everything here. Our team of experts will help you.

Pedro Rosa

5 questions you need to answer before creating your Azure solution

There are countless articles, blog posts and other types of content that talk about the cloud: from the benefits that this technology offers to companies to demystifying popular myths, and from how to make the most of this technology to how to optimise existing applications. The relevancy of the role that cloud technology currently has in the reality of companies is undeniable.

The topic remains so relevant that, according to a recent survey conducted by IDC, organisations show a strong inclination to increase investments in cloud services in 2020. On the other hand, the pandemic caused by the COVID-19 virus has resulted in an increasing digitalisation of companies where the cloud element has proved to be essential, given the benefits it presents for companies that have been forced to adapt digitally overnight. Infrastructure requirements have increased significantly, and the number of workers who have started to work remotely has also grown. Additionally, it has become clear that companies need to offer digital versions of their services to consumers.

However, considering the number of different available options, it can be challenging to understand where to start. Whether your company already adopts this technology in some parts of the business and wants to optimise it, or even if your company has yet to adopt cloud technologies, it’s important to do an analysis of the business to clarify the objectives and purposes your organisation needs to achieve by using this technology.

5 questions you need to ask to create an Azure solution

There are 5 elements you need to clarify to understand where to start on your digital transformation journey with the cloud with as much information as possible:

1) What are your needs?

Before moving on to developing solutions that take advantage of cloud technology, start by asking yourself what your needs are. Do you want to modernise an existing solution, build a native cloud solution, create a data platform, or are you still unsure of what your business needs are? Clarifying your business need is the first step you need to take before you even start thinking about embarking on a cloud migration project. Without the answer to this question, the purpose of the investment will not be straightforward – nor how it should be done.

2) What type of solution are you looking for?

Secondly, you need to understand what kind of solution you are looking for. Are you looking for a web app, a mobile app, a business backend, a streaming app, or something else? You may eventually conclude that what your company needs right now is a combination of these elements. Having a clearer sense of your business needs and of the type of solution that can help you meet them is an essential step on your migration journey to the cloud.

3) What is your business objective?

Once you know what you need and what type of solution you are looking for within the universe of cloud technology, it’s crucial to reflect on what your business objectives are. This is the moment when you can finally crystallise your vision for the company and how this technology can help you achieve it. Clearly and unambiguously communicating this vision will help specialists align with you in making that vision a reality.

4) What is the technological stack you’re using/that you want to use?

Regardless of your needs, be they modernising an existing solution, creating a data platform or even building a native cloud solution, it is crucial that you consider the different technologies you already use or want to use. In this way, you can analyse the impact these technologies will have when starting on your cloud journey. The sooner you analyse the possible limitations or obstacles to achieving your goals, the more efficiently you will be able to find strategies to overcome them.

5) What are your main drivers?

Last but not least, it’s also essential to clarify the company’s drivers. What kind of elements are crucial for the company’s activity and which ones do you want to prioritise? Is it costs, performance, scalability or something else? To ensure the solution built is highly personalised and focused on the individual needs of the company, you need some visibility over what type of indicators guide it. Only in this way will it be possible to arrive at a solution that addresses all the challenges and opportunities to which the company wishes to respond.

What happens next? Next steps

Even after you’ve answered all these questions, it can still be challenging to understand what the best cloud solution will be, or the best combination of resources for your specific needs. To help, Xpand IT has designed a Cloud Assessment tool. Its goal is to make a brief assessment so that our specialists can help your organisation understand what you’re already doing right, which areas you can improve and, finally, gain visibility into the best combination of resources for your business’s specific context and the challenges you need to address.

At Xpand IT, the Digital Xperience business unit has, for many years, been focused on the development of solutions based on cloud technology, namely Microsoft Azure. The solutions we have been developing over the years take advantage of a variety of components that are combined to build the best possible solution: App Services, API Management, Cosmos DB, Cognitive services, among others. Our experience with these technologies and particularly with Microsoft Azure derives not only from the development of native cloud solutions but also from reengineering existing solutions, where we help our customers make the most of this technology and all its components.

Complete our Cloud Assessment and take a step forward in your digital transformation journey.

Filipa Moreno

Churn Rate – Why you should give Machine Learning a chance

Data Science, Machine Learning & Churn Rate

Monetising data, within clear privacy and compliance regulations, is an essential activity for every company that wants to be, or keep being, relevant in its market. Analysing the data generated by everyday company activities might be one of the least expensive ways to probe for inefficiencies and check for performance issues. In fact, leveraging data and acting upon it should be a procedure companies are familiar with if they want to keep ahead of the competition. All of this is most likely a given to anyone working with data. It might not be as clear to someone outside the field, but a quick count of how many articles Forbes or Harvard Business Review have produced on the topic of monetising data should be proof of how established the concept has become within the corporate landscape.

In this blog post, we will take a look at a specific use case of how data, combined with Data Science (DS) techniques and Machine Learning (ML) algorithms, can help a company’s management team better understand their business and their customers’ behaviour, by equipping them with more and better information for their decision-making process. As the reader may already have worked out from the title, the scope of this blog post is customer churn and how data produced in-house can be used to evaluate and detect it.

Let’s first define what customer churn is, how it is measured, why it may occur and its impact on a company.

According to Investopedia, the customer churn rate is the rate at which customers stop doing business with an entity. It is most commonly expressed as the percentage of service subscribers who discontinue their subscriptions within a given time period. Most obviously, we can associate customer churn with subscription-based models (SaaS), where the customer stops paying the recurring subscription. However, the concept can also be applied to one-off payment-based models, i.e. when a regular customer stops purchasing from a particular shop.

One can list a number of obvious reasons why clients churn: a drastic change in a client’s financial situation, better competing products, poor customer experience (CX) or unfulfilled client expectations. Taking the current pandemic into consideration, another easily identifiable cause may be the lack of online services during the lockdowns which most countries experienced this year.

This said, it is obvious that a lower customer churn rate will probably benefit every party involved in a transaction: the company will see its profits grow and customer satisfaction will be higher. It is also well known how expensive it can be to get new clients, which is another strong reason to keep churn as low as possible.

Let’s consider a simple example. The company Xyz, which has a monthly subscription-based business model, has 5000 recurring customers. Xyz considers a client to be recurring if they make at least one transaction in consecutive months. Over the past month, the company registered 125 cancelled subscriptions, i.e. a 2.5% churn rate.
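The arithmetic is as simple as it sounds; here is a one-line sanity check of the example above:

```python
# Churn rate = customers lost in the period / recurring customers at the start of the period.
churned, customers = 125, 5000
print(f"Churn rate: {churned / customers:.1%}")   # 2.5%
```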

Looking into high churn rates, there are a few questions that management teams may want to answer. Some of the most obvious would be: What was the trigger event? And what is the typical churning-client profile? The use of advanced analytics in this field has produced interesting insights. As Bain & Company state, churn results from a series of episodes over time, not just one or two specific triggers. This conclusion makes us want to add one more question: What are the specific root causes? Or, what is the archetypal series of events that results in customer churn?

Data collected over time, such as revenue information, transactions, contract state and even demographic information about the client, most likely holds clues to the reasons behind a churn event. By analysing these data in combination, ML algorithms provide an unbiased interpretation and may be able to distinguish behavioural patterns related to such events that are undetectable to product owners and even people with savvy business acumen.

Machine Learning is a field of computer science that uses advanced mathematical models trained to identify patterns and predict events. These models learn to do such tasks according to the data they have seen.

Specific algorithmic approaches are able to translate these patterns into interpretable scores or insights, in order to answer relevant questions such as those mentioned above. For instance, it would be possible to compute the churn probability for every client based on data about their respective customer journey.

This type of metric ends up being a proxy for a measure of similarity between an active client X and the typical profile of churned/inactive clients. Another approach, more focused on answering the last two questions, would be to build an algorithm able to show the commonest series of events in a customer journey that result in churn. It is important to state that, most of the time, methods like these are not developed or used exclusively. This means that it might be useful to have more than one churn-related metric, like the two mentioned above, or others specific to each company’s market, available data and organisation type.
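As a hedged illustration of the first approach (a per-client churn probability), a minimal scikit-learn sketch might look like the following; the data file and feature columns are hypothetical, and a real project would add feature engineering, class-imbalance handling and far more careful validation:

```python
# Minimal sketch: score every client with a churn probability learned from historical data.
# The CSV file and feature columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

data = pd.read_csv("customer_history.csv")          # one row per client (hypothetical file)
features = ["monthly_revenue", "months_active", "support_tickets", "days_since_last_purchase"]
X, y = data[features], data["churned"]              # churned: 1 = inactive, 0 = active

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("Test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# The score acts as a proxy for similarity to the typical churned-client profile.
data["churn_probability"] = model.predict_proba(X)[:, 1]
print(data["churn_probability"].describe())
```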

The end goal of every ML methodology used within this context is to generate actionable knowledge: knowledge that is more specific, more focused, has well-defined target populations, and is financially efficient and as effective as possible. How this is rendered in action is outside the scope of this blog post, but there are a couple of simple ventures worth mentioning that should not require any major structural company changes, such as targeted marketing campaigns or simple website/UI tests and updates. Their effect should be noticeable in the overall customer churn rate in the short to medium term. It is worth mentioning that the success of this kind of experiment should also be measured using A/B testing.

Direct influence on customers through campaigns will be reflected in their behaviour and the overall CX. This, combined with macro- and microeconomic trends, competitors’ strategies and activities and other extrinsic factors, means the churn patterns detected are not static over time. In practical terms, a pattern detected at time t will be different from one detected at a later time, t + Δt. Consequently, an ML algorithm will need to be updated and maintained continuously in order to remain relevant business-wise. This phenomenon is called concept drift and it is a well-known nemesis of the predictive analytics field.
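In practice, this means tracking the model’s performance window by window and retraining when it degrades. A rough sketch of that monitoring loop, reusing the hypothetical data, features and model from the previous sketch and assuming a “month” column marking when each observation was recorded:

```python
# Rough sketch: evaluate the deployed model month by month to spot concept drift.
# Assumes the hypothetical `data`, `features` and `model` from the previous sketch,
# plus a "month" column marking when each observation was recorded.
from sklearn.metrics import roc_auc_score

AUC_ALERT_THRESHOLD = 0.70   # arbitrary example threshold

for month, month_data in data.groupby("month"):
    if month_data["churned"].nunique() < 2:
        continue                                  # AUC needs both churned and active clients
    scores = model.predict_proba(month_data[features])[:, 1]
    auc = roc_auc_score(month_data["churned"], scores)
    if auc < AUC_ALERT_THRESHOLD:
        print(f"{month}: AUC dropped to {auc:.2f}, consider retraining the model")
```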

Data Science at Xpand IT

The Data Science unit at Xpand IT has developed a process based on industry references and standards. This guide helps us minimise the natural uncertainty of Data Science projects by following a structured approach based on agile methodologies.

A simple yet interesting exercise is to map the DS process onto some of the key points of the use case discussed in this blog post:

  1. Viability Analysis:
    • Define a business objective and a question to be answered, for instance: What is the archetypal series of events that results in customer churn?;
    • Determine which churn-related metrics are appropriate according to the market, available data and company organisation;
    • Study the impact of concept drift on the problem and how to mitigate it in the solution;
    • Where required, design A/B testing to evaluate the performance of the actions deployed.
  2. Modelling 
    • Build algorithms capable of detecting churn patterns.
  3. Deployment
    • Automate model updates and retraining.

Small Conclusion

The use case discussed here is probably familiar to companies whose activities are based on retail consumption or client subscriptions. We hope we have given the reader a small glimpse of how ML and DS can add serious value to the problem at hand.

Our DS unit is ready to help you with this use case, with similar ones, such as Lifetime Value Prediction, or with something completely different, such as Predictive Maintenance, Fraud Detection and many other scenarios.

Our goal is to deliver value throughout the project life cycle, while focusing on understanding your business and helping you to create and deploy the required technology.

Gonçalo Costa

Green Deal: 5 reasons to become a data-driven company

Let’s face the truth, sooner or later every industry will have to comply with ecological directives, whether they come from the European Commission or another organisation with a similar kind of influence.

In fact, as many of you know, in December 2019 the Commission published a white paper called “The European Green Deal”. This deal had the purpose of changing the way the European Union (EU) and all of its citizens care for our climate and environment.

It isn’t hard to understand that European companies will have to be the earliest pioneers, and data-driven companies will adopt such directives most easily. Therefore, if you want to stay competitive and quickly absorb the coming changes to production or daily operational processes, you should give a data-driven business model a try.

Why? Because you’ll be able to analyse and take action knowing exactly what’s happening with your day-to-day operations instead of just making assumptions. You’ll have data organised in digital infrastructures allied to Business Intelligence, Data Science and Big Data solutions.

To help you understand this better, we’ll analyse 5 points, with the warning that the deal contains an incredible wealth of information and here we can only focus on the parts that matter for the purposes of this article. We do, however, advise strategic managers to read the Green Deal itself.

1 – Clean, efficient energy

According to the Deal, 75% of greenhouse gas emissions in the EU come from the production and consumption of energy across economic sectors; so, in order to reduce those numbers, the power sector must be built around renewables and must work towards being 100 percent digitised and structurally interconnected throughout.

As a data-driven company you’ll know precisely what is being consumed and exactly how much. For example, you will be able to understand whether you’re using renewable resources or not, how much electricity is spent, how much water is being used or how much heating and air conditioning is wasted. Better strategies can be built by evaluating the consumption required by day-to-day operations, whether they are running efficiently and what resources they require.

2 – Clean, circular economy

Only 12% of the materials used by EU industry are recycled, and the extraction of new materials, which tripled between 1970 and 2017, is still rising and represents a global risk. Because of this, in order to achieve the Green Deal objectives, the Commission wants companies to embrace ecological and digital transformation and become less dependent on new materials, which are converted into products and then disposed of as waste or emissions. In fact, the Deal reveals that the Commission’s priority is to reduce the use of new materials and improve their reutilisation, leaving recycling as the third option.

As a data-driven company, you’ll be capable of monitoring production lines and retrieving data in order to analyse the amount of resources used to produce any item, measure (in tonnes, for example) which materials are being used, and see how much waste is being generated. With this information, you’ll be able to redirect production lines towards ecological purposes, using materials more efficiently and producing less waste or optimising recycling. Besides this, with a digital view, you can become a role model in your industry by developing efficient models and innovative, disruptive ways of producing your products.

3 – Construction and buildings renovation

The rate of annual building renovation in the EU is between 0.4 and 1.2%. The Commission states that this rate must at least double in order to achieve climate objectives. On the other hand, millions of consumers struggle to keep their homes warm. To solve this problem, the Commission will encourage the renovation of private and public buildings, tighten energy efficiency legislation around buildings and follow circular economy logic by increasing the digitalisation of housing stocks. The Commission will also review the Construction Products Regulation.

As a data-driven company, you’ll have the data analysis tools to give you insights into what materials are used in every project and where to use better materials if needed, so that buildings have lower energy consumption. Getting an intimate view of how construction is carried out and having data on the effectiveness of every material used, you will easily comply with coming changes on energy efficiency legislation.

You’ll know how to organise stocks efficiently in order to reduce waste.

As a company linked to construction, you will be able to design solutions that retrieve data about consumption and energy efficiency for house or building owners. You will also be able to build intelligent houses that manage their climate efficiently, using smart windows and temperature-balancing systems with air purity and circulation, or houses with vertical gardens for air renewal, humidity prevention and acoustic insulation. This applies both to new construction and to building renovation.

4 – Sustainable mobility

According to the Deal, to achieve climate objectives, transport emissions should be decreased by 90%; and by transport, they mean all types (cars, planes and boats, etc.). To address this, the Commission will create a strategy for this challenge that strikes at all these sources of emissions.

As a data-driven company you will be able to control your fleet with devices that provide information on consumption and emissions for every vehicle. With this data, you can rethink strategies and change your fleet if necessary (from diesel to electric power, for example) or redesign routes to reduce mileage and emissions.

If you are a transport industry manufacturer, you will be able to get consumption, emissions and efficiency data from your engines and other parts of your products, which will tell you precisely where something isn't so effective. You'll be able to understand the performance of your products and what to do to reduce emissions, even if it means less power.

The same applies to battery manufacturers. You’ll be able to retrieve data on the relative performance of your batteries and the exact materials that go into producing them. This will give you in turn ideas on how to extend your battery lifetimes or develop efficient production lines producing lower emissions, which could encourage your customers to switch from fossil fuels to electric fleets.

5 – Healthy, environmental-friendly food systems

Producing food pollutes the air, water and soil, affects climate change, contributes to the degradation of biodiversity and consumes too many natural resources when that food is wasted. Nowadays, new technologies, discoveries and public awareness can present new opportunities to producers and value to stakeholders. To change the way in which food production pollutes our planet, the Commission has developed a “Farm to Fork” strategy. Furthermore, its proposal defines that “at least 40% of the Common Agricultural Policy’s overall budget and at least 30% of the Maritime Fisheries Fund would contribute to climate action”.

As a data-driven company you will know exactly how much water is used on your farms, how much water and grass your livestock require, and the amount of chemical pesticides, fertilisers and antibiotics used, as well as many other important facts. This allows you to devise new ways of operating that consume fewer resources and put fewer chemicals in food.

The future has to be different, so it is just a matter of time before your company is forced to change its strategies and find new ways of producing clean, sustainable products. The data you get from your farms, vineyards, greenhouses, aquaculture, etc., married to data on your customers’ needs and choices, will become essential to helping you achieve the Green Deal objectives.

Final thoughts

We’re speaking about a massive transformation over the next 30 years, and we believe that companies averse to change will face hard times. Environmentally friendly, sustainable companies will be ever more valuable in the coming years. We cannot forget that new generations are growing increasingly aware of environmental and inequality problems. They are the ones who will force their employers and their parents to change their behaviours. They are our up-and-coming consumers and the ones to listen to.

Europe knows the future will bring new environmentally friendly paradigms and wants to pioneer them, not only for future competitive and economic purposes but because it is mandatory.

A good way to discover where to go next is to know where you came from, which means that one way to understand where your company needs to go in the future is to analyse your daily data and understand exactly what you’re doing at present. You’ll need to be strategically well informed and capable of combining valuable information to help build future strategies and policies.

A great way to become a successful data-driven company is to get help from a data specialist partner who understands Business Intelligence, Data Science and Big Data services, as we do at Xpand IT, and who can provide you with a “Data Innovation as a Service” (DIaaS) solution to guide you on your journey to becoming a data-driven organisation.

Your solution evolves over time, from the first steps of building and setting out your strategy, through the implementation phase with its analytics and data science components, to maintenance, training and support services. You will end up with quite a set of automated processes that will enable you to get more from your data and, in the end, be much more efficient. This is in fact a noble cause because, in the end, you will also be fighting climate change, and we are here to stand by your side.

José Miranda

WSO2: a new approach to middleware

The integration of systems and information in organisations has been a core requirement for improving the efficiency and quality of processes, fostering and leveraging development and innovation. In the increasingly predominant context of digital transformation, technological evolution is essential for providing a rapid response to market demands and thus remains a key player in the infinite game of business. Such agility requires ever more decentralisation and autonomy, from teams, which have to become multi-disciplinary, and from increasingly objective and efficient processes, creating a body of independent communicating cells whose vitality depends strongly on the mechanisms of integration among each other.

The role that middleware plays in organisations’ digital strategy has been fundamental to their success for several decades. The first need arose with a growing number of systems and the need to share information among them. Then, widespread access to the world wide web offered a new integration challenge, along with the growth in mobility and the proliferation of devices with access to information. This was followed by infrastructure abstraction, with SaaS (Software-as-a-Service) offerings and the latest cloud trend. Today we are living through a period of architectural reformulation, with more scalable and flexible properties and a focus on serverless environments and microservice-oriented architectures. Middleware has been, is, and will continue to be a constant presence and the cornerstone of this evolution, where the organic disintegration of architectures increasingly requires dedicated platforms for their management.

Virtue is in the middle

Integration has been a growing and increasingly complex problem for many decades. In the early days of the use of information systems in organisations (the 1970s and 1980s), the integration problem was modest:

  • Organisations had just one central system (mainframe) to handle the automated execution of typical operational activities; and
  • Where a system existed to assist communication needs between them, it was rare and of a very limited domain, justifying point-to-point integrations.

However, since the 1990s we have seen an explosion of organisational systems, with an increasing decentralisation of information into different silos and departments in order to better meet their needs. The globalisation of Internet access and the respective evolution of communication networks have both contributed to this. Integration became a paradigm to be addressed and the concept of EAI (Enterprise Application Integration) was born, with the aim of removing this burden from business application development and of defining and systematising good practices in these implementations.

The following decade – the 2000s – was characterised by the exploitation of connectivity through the web, so we witnessed the birth of SaaS (Software-as-a-Service) offerings and the adoption of communication standards (e.g. SOAP). This multiplicity of service offerings required a rethink of good design practices for these types of architecture, as their implementation began to be characterised by a high volume of heterogeneous interfaces facilitating the exchange of messages between them. And so the ESB (enterprise service bus) was born, which is basically the instantiation of the architectural model of an EAI implementation, with the aim of giving organisations relevant properties such as abstraction, loose coupling and reuse.

Following the proliferation of the mobile phone between 2000 and 2010, the decade after 2010 was characterised by the diversity and processing capacity of mobile devices, which now play an active role in organisations as tools for communication, work and even leisure. In this era of multi-connectivity, the importance of being connected anywhere has become central to the competitive business market, which has increasingly led organisations to promote services through the Internet and create new channels of value. Each and every service or system started to make APIs (Application Programming Interfaces) available for access and consumption, whether by the organisation itself or by its customers and partners, allowing the creation of new channels of value for the business. This proliferation of APIs then led to a growing need for management and governance, culminating in a complementary approach to integration called API management. API management complements system integration as it feeds the API lifecycle (from design and testing, through publication and operation, to its deprecation), providing a central point of security and control, offering metrics and usage trends, and accelerating consumer adoption with collaborative and self-service functionality.

We are currently moving towards a paradigm of flexibility, where applications must respond to requests for information in a distributed and independent manner, with response times approaching real time. This is a disruptive change, where monolithic architectures are converted into models based on microservices and containerisation, agnostic in terms of location. Just as applications evolve in this direction, integration architectures are also mutating, with the aim of speeding up and better responding to the growing need for integration resulting from the dispersion of applications. We are thus witnessing the adoption of iPaaS (integration platform-as-a-service) – platforms that facilitate and speed up the creation of integrations between applications – as well as hybrid integration platforms, i.e. platforms that work agnostically both on-premises and in the cloud as a set of communicating cells, to better respond to the needs of information exchange.


The WSO2 solution: a new approach to middleware

WSO2 was founded in 2005 and develops an open-source middleware offering of the same name. The suite is composed of different products and services that facilitate a decentralised API-first approach, enabling organisations to achieve fast, agile implementation of their digital solutions. The platform addresses the middleware problem along three vectors:

  • API manager – leveraging the promotion and use of APIs to streamline and exploit business capabilities;
  • Enterprise integrator – facilitating the development of enterprise integration and promoting the revitalisation of legacy systems; and
  • Identity server – promoting trust and security in information access management.

Each of the products addresses a specific integration need. The API manager empowers organisations with full life cycle API management, i.e. the ability to manage APIs through 360º, from their planning and design, through their operationalisation and monetisation, to their deprecation. In this way, organisations can respond effectively to business trends and the establishment of value partnerships. Using Enterprise integrator, it is possible to interconnect all the dispersed information of the organisation speedily and with ease, allowing organisations to explore operational efficiencies and new offers for their business. To ensure secure access to information and integrated identity management in organisations, Identity server presents itself as the tool of choice.
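From a consumer’s point of view, an API published through the API manager is consumed like any other OAuth2-protected endpoint. The sketch below is a generic illustration of that client-credentials flow with placeholder URLs and credentials, not a definitive WSO2 recipe:

```python
# Generic sketch: obtain an OAuth2 token (client-credentials flow) and call an API exposed
# through the gateway. Gateway/token URLs, credentials and the API path are placeholders.
import requests

TOKEN_URL = "https://gateway.example.com/token"
API_URL = "https://gateway.example.com/orders/v1/orders"
CLIENT_ID, CLIENT_SECRET = "consumer-key", "consumer-secret"   # issued when subscribing to the API

token_resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials"},
    auth=(CLIENT_ID, CLIENT_SECRET),
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

api_resp = requests.get(API_URL, headers={"Authorization": f"Bearer {access_token}"})
api_resp.raise_for_status()
print(api_resp.json())
```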

The differentiating factors

WSO2 presents a set of differentiating factors that distinguish it from similar offerings:

  • 100% open source – completely open source, with no anomalies in its distribution, unlike other offers based on separate Community and Enterprise editions. This feature guarantees customers a unique opportunity to test and validate their final solutions at no cost. WSO2’s own development is transparent and open, allowing its customers to have full visibility and to participate actively;
  • Cloud-native – prepared and developed from scratch, to adapt to today’s decentralised IT architectures based on containers and microservices;
  • Modular – runs on a common basis of functionality, with a high level of internal cohesion, and is easily integrated into its various components;
  • Light – promotes a rationalised use of the necessary features, thus ensuring the best efficiency of the solution;
  • Flexible – facilitates integration into the organisation’s architecture, exploring the decoupling of its components so that they can be sized according to solution requirements;
  • Extensible – allows you to include customised code, both for the extension of the functionality and in the development of proprietary protocol-specific integration connectors (there are over 200 available in the connector store).

Our vision for the future

Throughout the evolution of the different integration models, what we have noticed is not a substitution of concepts. Rather, we have witnessed increasingly complex problems and challenges that require new solutions for a better response. In our vision:

  • Any integration architecture is potentially valid, depending on the challenge it is meeting. It makes sense to adopt models that bring the greatest value to the organisation;
  • Contrary to many claims, the ESB is not falling into disuse. It remains a highly valid and current paradigm in what we consider to be a good integration architecture. Such claims arise because the ESB concept is often confused with the notion of a centralised integration backbone, which genuinely is falling into disuse in the new model of distributed architectures;
  • Integration is hybrid, consisting of a mix of on-premises and multi-cloud integration scenarios, offering flexibility and sustainability for organisational growth;
  • APIs continue to be the agents that promote integration, as they facilitate and promote access to information in a simple and fast way.

WSO2 offers a modern, versatile solution to address the pressing integration needs of organisations and to support the entire transformation and innovation process. WSO2 products put in organisations' hands the tools to implement an integration platform that matches the complexity of their current objectives, and to shape it according to their future evolution and the context and requirements of the markets in which they operate.

Nuno Santos

Guide to a successful Marketing Automation strategy (with Salesforce and Pardot)

The question may be simple, but creating an effective Marketing Automation strategy is not always easy. To get answers, and to understand how to create a successful Marketing Automation strategy with Salesforce and Pardot, let's break the topic down. First, what is Marketing Automation all about?

As its name implies, Marketing Automation refers to the use of software to automate marketing processes. Marketing teams often focus their efforts on email campaigns, social networks and ad campaign management, and rightly so: these activities allow companies to deliver a more personalised service and experience to their customers. However, the process can be somewhat complex and time-consuming, sidelining teams into operational rather than strategic work. Fortunately, there are plenty of tools that help reduce the time spent on these fundamental marketing activities while increasing the effectiveness of your campaigns.

In this article we will explore two tools that can help your organisation structure and implement Marketing Automation campaigns.

Marketing Automation strategy with Salesforce

First, Salesforce. This is the most widely used CRM application in the world, delivering solutions that help companies and their customers to be completely connected, within a 360º vision.

The tool provides a wide variety of solutions, allowing the platform to be adapted to each company, implementing its unique processes and simplifying the daily lives of marketers and sales reps (something we addressed in this content).

Salesforce builds a database of customer records, with all their contacts, and stores the information on the leads generated. After being nurtured and qualified in Pardot, the leads transfer back to Salesforce, in order to generate real business opportunities, giving the sales team a good working base to convert into sales.
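
For illustration only, here is a minimal sketch of how the Salesforce side of this flow can be inspected programmatically, assuming the community simple-salesforce Python library; the credentials are placeholders and the query is just an example of pulling back the leads that Pardot has handed over.

```python
from simple_salesforce import Salesforce

# Placeholder credentials: in a real integration use your own Salesforce
# username, password and security token (or an OAuth flow).
sf = Salesforce(
    username="user@example.com",
    password="password",
    security_token="security-token",
)

# List leads that have not yet been converted into opportunities, so the
# sales team can see what is waiting for follow-up.
results = sf.query(
    "SELECT Id, Name, Company, Status FROM Lead WHERE IsConverted = false"
)
for lead in results["records"]:
    print(lead["Name"], "-", lead["Company"], "-", lead["Status"])
```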

Currently, the platform has over 10 million users worldwide.

B2B communication tool: Pardot

Pardot is a B2B marketing communication tool and part of the Salesforce CRM universe. It is an intuitive marketing automation platform that offers the ability to create, implement and manage segmented campaigns, in which leads and Salesforce contacts (called Prospects in Pardot) are segmented into several personas, in order to make each Prospect's experience unique and differentiated. The reason we say unique and differentiated is that we can make campaigns ever more targeted using data feedback. That's right, targeted at specific people!

Pardot is particularly useful in email marketing campaigns, offering the possibility to link directly to landing pages created in the tool itself – fast to develop even without technical know-how, and with the choice of using a form or not – and in social campaigns, where it is possible to see how Prospects have been interacting with company activity on social networks and to publish to several platforms at once, encouraging an increase in the quantity and quality of the leads that will later filter through to the sales teams.

So we set out to answer our initial question: how do we create a successful Marketing Automation campaign using Salesforce and Pardot?

There are 4 main characteristics that enhance the strength of campaigns and are associated with both Pardot and Salesforce:

  1. Effective email marketing
  2. Lead generation
  3. Lead management
  4. Reports

1 – Effective email marketing: The most attractive email campaigns are those that are highly targeted at our audience and get the message across in the best way, thus generating more business leads. With these two tools it is possible to create emails and pages faster and more efficiently, to manage automation flows that personalise content for recipients and segment leads, and to ensure correct delivery and consistent rendering across devices.

2 – Lead generation: To keep the sales pipeline filled, you must constantly nurture the leads that reach the company. By building landing pages and forms with drag-and-drop tools, creating campaigns and measuring their actual ROI through the integration with Google AdWords, and using A/B tests to track the performance of landing pages, it is possible to keep the pipeline fully supplied with a quality flow of leads.

3 – Lead management: Less time spent by the marketing and sales teams on repetitive tasks gives way to greater focus on acquiring and retaining important leads. To that end, it is possible to prioritise and qualify leads using Pardot's automated scoring system.

4 – Reports: Both Salesforce and Pardot offer the possibility to extract complete reports: a log of all the contact points with your customers, tracing their journey from first click to conversion and the closing of the deal; reports showing the sales lifecycle, so you know where potential customers are stagnating in the funnel; and search reports to understand how Google AdWords campaigns impact the organisation's results, among others.

We can then conclude that using Salesforce and Pardot gives us the ability to take a 360º look at our customers, from marketing strategy to sales. This allows us to manage lead acquisition and quality more effectively, track opportunities and improve the results of ongoing marketing campaigns.

Here is an example of how to implement Marketing Automation:
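
One minimal, hypothetical sketch of such an automation is a scheduled script that uses the Pardot (Account Engagement) API v4 to pull prospects whose score has crossed a threshold, ready for sales follow-up. The endpoint, headers and parameters reflect the v4 API as we understand it; the access token, business unit ID and threshold are placeholders.

```python
import requests

# Assumptions: a Salesforce OAuth access token with Pardot scope and the
# Account Engagement (Pardot) business unit ID of your org.
ACCESS_TOKEN = "<salesforce-oauth-access-token>"
BUSINESS_UNIT_ID = "<pardot-business-unit-id>"

# Pardot API v4 prospect query, filtered to prospects scoring above 100,
# i.e. those the automated scoring has already qualified.
response = requests.get(
    "https://pi.pardot.com/api/prospect/version/4/do/query",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Pardot-Business-Unit-Id": BUSINESS_UNIT_ID,
    },
    params={"format": "json", "score_greater_than": 100},
    timeout=10,
)
response.raise_for_status()
print(response.json())
```

Run on a schedule, a script like this keeps the sales team looking only at prospects the scoring model considers ready, which is exactly the hand-off between Pardot and Salesforce described above.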

Pedro Rosa

When Excel isn’t enough anymore (5 reasons why you should use Tableau)

Excel vs Tableau: two tools used for data analysis, each taking a different approach to exploring data and finding key insights. If you store your data in Excel and you are tired of writing formulas, Tableau can bring it all together and make exploring data faster and easier. Its drag-and-drop analytics let you answer deeper questions and, rather than sending reports around, with Tableau you can centralise, collaborate on and explore shared data and dashboards.

Excel is used to add, collect, store, track and sort data, and to perform various financial, mathematical and statistical operations. But while Excel comes equipped with many features that help you get started with data, Tableau takes your analysis to the next level with flexible, responsive visuals. Best of all, Tableau natively connects to Excel spreadsheets, making data analysis simple and fast.

You can combine and integrate data

In Excel you can end up handling millions of rows of data scattered across different workbooks, which makes working out your data story overwhelming. You might find yourself lost, spending hours searching for the data you want to focus on. While Excel is great for storing your data, Tableau can easily pull together all the data that you have, not only from Excel but also by connecting live to many other sources such as Salesforce, LinkedIn, Google Sheets, Google Analytics, Spark or any relational database. You can easily analyse data from a combination of data sources, tables, worksheets and workbooks all at once. With Tableau you save time because you can explore a complete view of your data. With drill-down and data blending features built in, you are able to spot new patterns, trends and correlations, and then understand what caused them.
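
To see why consolidating scattered workbooks by hand is tedious, here is a minimal pandas sketch of the kind of stitching Tableau does for you through its connectors; the file, sheet and column names are hypothetical.

```python
import pandas as pd

# Hypothetical workbook with one sheet per region; sheet_name=None loads
# every sheet into a dict of DataFrames keyed by sheet name.
sheets = pd.read_excel("sales_2020.xlsx", sheet_name=None)

# Stack the sheets into a single table, keeping the sheet (region) name
# as a column so it can still be used for grouping.
combined = (
    pd.concat(sheets, names=["region", "row"])
    .reset_index(level="region")
    .reset_index(drop=True)
)

# Example aggregation over a hypothetical "amount" column.
print(combined.groupby("region")["amount"].sum())
```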

Your data is up to date in no time

Both Excel and Tableau can work with static and live data from multiple data sources. However, automatically refreshing data in Excel involves programming processes manually or creating steps that update your sheet's data when you open the file. When a user lacks experience with live connections, it typically becomes a copy-and-paste process that can lead to errors. With Tableau you can easily set up a live connection to your data, so your visualisations reflect it in real time.
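
If you use extracts rather than a live connection, refreshes can also be automated. Below is a minimal sketch using the tableauserverclient library against Tableau Server's REST API; the server URL, site, credentials and workbook name are placeholders, and a reasonably recent version of the library is assumed.

```python
import tableauserverclient as TSC

# Placeholder server, site and credentials.
auth = TSC.TableauAuth("user@example.com", "password", site_id="marketing")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Find the workbook by name (hypothetical) and queue an extract refresh.
    all_workbooks, _ = server.workbooks.get()
    sales = next(wb for wb in all_workbooks if wb.name == "Sales Overview")
    job = server.workbooks.refresh(sales)
    print(f"Refresh queued as job {job.id}")
```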

You can build charts and even more insightful visualisations

In Excel, you need to have a reasonable idea of the answers to your questions before deciding which chart best displays the results of your analysis. Even after you think you have made the right choice of chart, the data may change, additional information may become available to help with your analysis, or the results may not look the way you expect or answer the question properly. In such cases, you often have to start over and update your file. Tableau is more intuitive here; for example, when you create a calculation, the formula is applied to all rows referencing that source, which makes it easier to create and apply recurring processes. And Tableau visualises data from the start, allowing you to see its significance right away. It differentiates correlations using colour, size, labels and shapes, giving you context as you drill down and explore at a granular level.

You can have even richer analytics

In addition to a smart calculation language, Tableau has a number of built-in features to help you get your data into a structure and format that you and others can use. These analytics let you and others explore data at will: quickly reveal and isolate outliers, discover hidden patterns, see trends, show geographical locations and model the future so you can anticipate results. With Tableau you can analyse your data without worrying about it becoming corrupted, or about losing hours of work because someone accidentally deleted one of your formulas. Tableau never writes back to your original data source, and it lets you ask questions and allows the answers to lead you to new insights.

You can share and collaborate

When it comes to analysing data and sharing insights, credibility and trust in what is being reported are the two most important factors to consider. Using Excel, users often find themselves duplicating reports, losing track of their location and versioning, or having different business users report on the same KPI with different formulas. Tableau can help you govern your data and eliminate these problems, so your company can share a single version of the truth.

With Tableau Server or Tableau Online, your data is centralised and available on the web, giving it the power to make an impact. Simply publish and share visualisations and dashboards with the people you want to collaborate with, allowing them to answer questions, collaborate with others and further share your data analyses. Sharing in Tableau lets everyone in the organisation use the same metrics and the same version of reports, avoiding typical governance problems.
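
Publishing itself can also be scripted. Here is a minimal sketch with the same tableauserverclient library; the credentials, project name and workbook file are placeholders, and in practice a personal access token would be preferable to a password.

```python
import tableauserverclient as TSC

# Placeholder credentials and site.
auth = TSC.TableauAuth("user@example.com", "password", site_id="marketing")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Publish (or overwrite) a local workbook into a project so colleagues
    # can explore the same dashboards online.
    all_projects, _ = server.projects.get()
    project = next(p for p in all_projects if p.name == "Marketing")
    workbook = TSC.WorkbookItem(project_id=project.id)
    workbook = server.workbooks.publish(
        workbook, "sales_overview.twbx", mode=TSC.Server.PublishMode.Overwrite
    )
    print(f"Published {workbook.name} ({workbook.id})")
```

Because everyone then consumes the published copy rather than local files, the "single version of the truth" described above is enforced by the platform rather than by discipline alone.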

Excel vs Tableau: in summary

If you’re working with lots of data, it’s better to work from a centralised database; if you’re not, you can use Excel as a simple data repository and use Tableau for visual analyses. Whenever you need to insert new data, you can insert it manually in Excel or use our Write-Back extension for Tableau, which allows any user to add new data to any report directly from Tableau.

To know more about how to combine Excel and Tableau, please click here.

Sílvia Martins