Latest news

Web content management

What it is for, what the advantages are, and which technologies are currently trending

A web content management system (WCMS) is a content management system (CMS) – a set of tools for managing digital information stored on a website – that allows users to create and manage content without any knowledge of programming or markup languages such as XML. In short, a WCMS is a program that helps users to maintain, control, change and adjust the content on a webpage.

WCMS behaves similarly to a traditional content management system – managing the integrity, editing and information lifecycle – but is specifically designed for handling web content.

The typical functionality of a WCMS includes the ability to create and store personalised content on the website, with editors able to review and approve content before it is published and to configure an automated publication process. There is an increasingly greater need for such platforms to provide both creative options and accessibility, not just for content but across the entire user experience – solutions that manage the uploaded content and facilitate the monitoring of the entire user journey, regardless of the channel being used.

Pros and cons

There are several elements to consider when using a WCMS.

On the one hand, WCMS platforms are usually inexpensive and intuitive to use, as they don’t require technical programming expertise in order to manage and create content. The WCMS workflow can also be personalised by creating several accounts to manage different profiles.

On the other hand, WCMS implementations can sometimes be extremely costly, demanding specific training or certification. Maintenance can also incur extra expense, for licence upgrades or updates. Security can be a concern too: in the event of a threat, hackers may exploit vulnerabilities, potentially damaging user perceptions of the brand.

Choosing the right WCMS solution

With a WCMS, the content is predominantly stored in a database and assembled using a flexible technology such as XML or .NET.

There are several open-source WCMS options, such as WordPress, Drupal and Joomla, for more generic functions. But there are also solutions that cater to specific needs, such as the Marketing 360 platform, Filestack and CleanPix.

And there are the commercial solutions currently on the market, such as Sitecore, a single platform comprising several WCMS components: Content Personalization, Content Marketing, Digital Asset Management and E-Commerce. This is one of the platform's major advantages: instead of acquiring and integrating separate components that consume content and information from an adjacent system, in Sitecore's case the contact data and the interactions performed through the different channels are already available in the platform, ready to be used and processed by different functions and for different purposes: creating campaigns, sending emails, creating marketing workflows and customisation rules, among others.

WCMS solutions provide different functionalities, with several levels of depth and specific purposes. Before selecting the platform, consider the following functionalities:

  • Configuration: ability to activate and deactivate functionality using specific parameters.
  • Access management: managing users, permissions and groups.
  • Extension: the capacity to install and configure new functionalities and/or connectors.
  • Templates: the ability to install templates that provide new functionality.
  • Customisation: ability to change specifications to customise some features, through toolkits or interfaces.
  • WYSIWYG: capacity to provide a “What you see is what you get” mechanism, allowing content managers to know, while making alterations, what users will see after a new version of the content is launched. A good example of this is provided by Sitecore’s “Experience Editor”.
  • Integration: ability to integrate the WCMS solution with other previously installed solutions, or with external solutions, in order to gather information from both ends; for example, integration with Microsoft Dynamics 365 or Microsoft SharePoint.
  • Flows: capacity to incorporate a flow configuration mechanism for content approval and alteration, from different content creators with different profiles, plus content publishing.
  • User experience: editing is less complex, with built-in templates that add a predetermined functionality to the page, with no additional training needed.
  • Technical assistance and updates: consider the degree of technical support you will receive, as well as the level of accessibility for making system updates.

The advantages of WCMS

A major advantage of a WCMS is that the software gives you consistent control over the look and feel of the website – brand, wireframes, navigation – while simultaneously granting the functionality to create, edit and publish content – articles, photo galleries, video, etc. A WCMS can be the best solution for companies looking for a rich content repository focused on brand consistency.

Other advantages:

  • Automated templates;
  • Controlled access to the page;
  • Scalability;
  • Tools that allow simple editing, via WYSIWYG solutions;
  • Regular software updates;
  • Workflow management;
  • Collaboration tools that provide users with permission to modify the content;
  • Document management;
  • The ability to publish content in several languages;
  • The ability to retrieve older versions;
  • The ability to analyse content across devices (desktop, mobile, tablet, watch);
  • Omnichannel content availability.

Our vision

Content management is a relevant topic, although not a recent one. However, a topic that has gained a lot of traction in recent years is the capacity to use customised content, offering a relevant experience to every user. To achieve this goal, Xpand IT decided to partner with Sitecore, because we believe it to be the best platform for addressing customisation challenges, benefiting from the aforementioned advantages and also exploiting the fact that Sitecore allows headless implementations (separating the entire content from the presentation layer), as well as integration with mobile platforms (producing true omnichannel solutions). We are certain that this technology has a lot to offer, and we look forward to implementing new functionalities, available soon and launched with the intent of fulfilling our vision: offering relevant and personalised content for everyone, at any time, in any place.

Sílvia Raposo

Xpand IT enters the FT1000 ranking: Europe’s Fastest Growing Companies

Xpand IT proudly announces our entry into Europe’s Fastest Growing Companies, the ranking compiled by the renowned international newspaper the Financial Times! With sustained growth surpassing 45% in 2018, Xpand IT secured a place among the 1000 fastest-growing European companies, ranked according to their consolidated results between 2014 and 2017.

Revenue of 10 million and a team of 195 employees were the figures that guaranteed our place on this list. Our revenue has since leapt to 15 million, and we can now count on the tireless work of more than 245 employees. Of the three Portuguese tech companies awarded a spot in the ranking, Xpand IT boasts the best results in terms of revenue and the acquisition of new talent.

Paulo Lopes, CEO & Senior Partner of Xpand IT, said: “Having a place on the FT 1000 European ranking is the ultimate recognition for all the work we have undertaken over the last few years. We are renowned for our know-how and expertise within the technology arena, and now also for our unique team and business culture, focused on excellence and innovation, which makes it far easier to achieve these kinds of results.”

This year’s goal is to maintain our growth trend, not just by expanding into new markets, but also by increasing our workforce. In 2019, we expect to reach the beautifully rounded number of 300 Xpanders!

Sílvia Raposo

7 steps to implement a data science project

Data science is a set of methods and procedures applied to a very complex, concrete problem in order to solve it. It can use data inference, algorithm development and technology to analyse collected data, understand certain phenomena and identify patterns. Data scientists must possess mathematical and technological knowledge, along with the right mindset, to achieve the expected results.

Through the unification of various concepts, such as statistics, data analysis and machine learning, the main objective is to unravel behaviours, trends or inferences in specific data that would be impossible to identify via simple analysis. The discovery of valuable insights allows companies to make better business decisions and leverage important investments.

In this blog post, we unveil 7 important steps to facilitate the implementation of data science.

1. Defining the topic of interest / business pain-points

In order to initiate a data science project, it is vital for the company to understand what it is trying to discover. What is the problem presented to the company, or what kind of objectives does it seek to achieve? How much time can the company allocate to working on this project? How should success be measured?

For example, Netflix uses advanced data analysis techniques to discover viewing patterns among its clients, in order to make better decisions about which shows to offer next; meanwhile, Google uses data science algorithms to optimise the placement and display of banners, whether for advertising or re-targeting.

2. Obtaining the necessary data

After defining the topic of interest, the focus shifts to collecting the data needed for the project, sourced from the available databases. There are innumerable data sources; the most common are relational databases, but there are also various semi-structured sources of data. Another way to collect the necessary data is to establish connections to web APIs, or to collect data directly from relevant websites with potential for future analysis (web scraping).
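
To make this step concrete, here is a minimal TypeScript sketch of collecting records from a web API. The endpoint, field names and response shape are hypothetical placeholders, not a real service:

```typescript
// Step 2 sketch: collecting raw records from a (hypothetical) web API.
// fetch() is built into Node.js 18+; any HTTP client would work equally well.
interface RawOrder {
  id: string;
  amount?: string;  // may arrive as text, or be missing entirely
  country?: string;
}

async function fetchOrders(since: string): Promise<RawOrder[]> {
  const res = await fetch(`https://api.example.com/v1/orders?since=${since}`);
  if (!res.ok) throw new Error(`API returned HTTP ${res.status}`);
  return (await res.json()) as RawOrder[];
}

fetchOrders('2019-01-01').then(orders =>
  console.log(`${orders.length} raw records collected`),
);
```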

3. “Polishing” the collected data

This is the next step – and the one that feels most natural – because after extracting the data from its original sources, we need to filter it. This process is absolutely essential, as analysing unvetted data can lead to distorted results.

In some cases, data and columns will need to be modified to confirm that no variables are missing. One of the most important steps to consider is therefore the combination of information originating from various sources, establishing an adequate foundation to work on and creating an efficient workflow.

It is also extremely convenient for data scientists to possess experience and know-how in certain tools, such as Python or R, which allow them to “polish” data much more efficiently.
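
A minimal sketch of this “polishing” step, continuing the hypothetical order records from the previous example: incomplete rows are dropped and values are normalised so that different sources can be combined consistently:

```typescript
// Step 3 sketch: filter incomplete rows and normalise values before analysis.
interface RawOrder { id: string; amount?: string; country?: string }
interface CleanOrder { id: string; amount: number; country: string }

function polish(rows: RawOrder[]): CleanOrder[] {
  return rows
    // drop rows with missing variables
    .filter(r => r.amount !== undefined && r.country !== undefined)
    // normalise types and formats
    .map(r => ({
      id: r.id,
      amount: Number(r.amount),
      country: (r.country as string).trim().toUpperCase(),
    }))
    // discard rows whose amount could not be parsed as a number
    .filter(r => !Number.isNaN(r.amount));
}
```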

4. Exploring the data

When the extracted data is ready and “polished”, we can proceed with its analysis. Each data source has different characteristics, implying equally different treatments. At this point, it is crucial to compute descriptive statistics and test several hypotheses about significant variables.
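
For illustration, a small, library-free sketch of the kind of descriptive statistics computed at this stage for a single numeric variable:

```typescript
// Step 4 sketch: descriptive statistics for one numeric variable.
function describe(values: number[]) {
  const n = values.length;
  const mean = values.reduce((sum, v) => sum + v, 0) / n;
  // sample variance: divide by n - 1
  const variance = values.reduce((sum, v) => sum + (v - mean) ** 2, 0) / (n - 1);
  return {
    n,
    mean,
    std: Math.sqrt(variance),
    min: Math.min(...values),
    max: Math.max(...values),
  };
}

console.log(describe([12, 15, 11, 18, 14])); // { n: 5, mean: 14, std: ~2.74, min: 11, max: 18 }
```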

After testing some variables, the next step will be to transfer the obtained data into data visualisation software, in order to unveil any pattern or tendency. It is at this stage that we can include the implementation of artificial intelligence and machine learning.

5. Creating advanced analytical models

This is where the collected data is modelled, treated and analysed. It is the ideal moment to create models that, for example, predict future results. Basically, it is during this stage that data scientists use regression formulas and algorithms to generate predictive models, foreseeing future values and patterns in order to generalise occurrences and improve the efficiency of decisions.
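
As a toy illustration of the idea (real projects would use dedicated libraries and far richer models), an ordinary least-squares fit of a straight line, used to extrapolate a future value:

```typescript
// Step 5 sketch: ordinary least-squares fit of y = slope * x + intercept.
function linearRegression(x: number[], y: number[]) {
  const n = x.length;
  const meanX = x.reduce((s, v) => s + v, 0) / n;
  const meanY = y.reduce((s, v) => s + v, 0) / n;
  let num = 0;
  let den = 0;
  for (let i = 0; i < n; i++) {
    num += (x[i] - meanX) * (y[i] - meanY);
    den += (x[i] - meanX) ** 2;
  }
  const slope = num / den;
  const intercept = meanY - slope * meanX;
  return { slope, intercept, predict: (xNew: number) => slope * xNew + intercept };
}

// e.g. monthly sales indexed by month number (illustrative data)
const model = linearRegression([1, 2, 3, 4], [10, 14, 15, 21]);
console.log(model.predict(5)); // extrapolated value for month 5 (23.5)
```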

6. Interpreting data / gathering insights

We are nearly at the last level of implementing a data science project. In this phase, it is necessary to interpret the defined models, discover important business insights – finding generalisations to apply to future data – and respond to or address all the questions asked at the beginning of the project.

Specifically, the purpose of a project like this is to find patterns that can help companies in their decision-making processes: whether to avoid a certain detrimental outcome or repeat actions that have reproduced manifestly positive results in the past.

7. Communicating the results

Presentation is also extremely important, as project results should be clearly outlined for stakeholders (who, in the vast majority of instances, lack technical knowledge). The data scientist has to possess the “gift” of storytelling so that the entire process makes sense, meeting the requirements necessary to solve the company’s problem.

If you want to know more about data science projects or if you’d like a bit of advice, don’t hesitate to get in touch.

Sílvia Raposo

Node.js: the JavaScript platform used by Netflix and Uber

The progressive, noticeable growth of JavaScript is hard to ignore. Over the years, this programming language has provided hundreds – if not thousands – of frameworks and libraries, helping developers and companies to create websites, portals and interactive, agile applications with modern interfaces. Add the facts that JavaScript is platform-independent, easy to learn and supported by an ever-growing community, among many other advantages, and it is easy to understand why.

However, for a long time, JavaScript was a language exclusively oriented towards client-side development and never managed to establish itself for backend purposes – at least until 2009, when the first version of Node.js was launched. For the first time in history, JavaScript became a viable alternative for backend solutions.

It is important to demystify the fear that many companies have about this alternative to more traditional backend solutions (Java, .NET, etc.) in the world of Enterprise applications, even though companies including Netflix, Trello, PayPal, LinkedIn, Uber, Walmart, NASA, Intel and Twitter have already successfully implemented Node.js in their infrastructures – and this list continues to grow each day.

For those who are not familiar with Node.js, it is important to highlight some of its biggest advantages:

  • Ideal for the construction of real-time applications;
  • Facilitates the programmer’s full stack vision in JavaScript (as both backend and frontend languages are the same);
  • Decreases development time, thanks to its full stack view;
  • Supported by a gigantic community that contributes new libraries and updates at an astonishing rate;
  • Extremely fast code execution;
  • Ideal for architectures oriented towards microservices.

We can now go back to what we really want to discuss: why should companies adopt Node.js for their applications? In a nutshell, because it was designed for large-scale applications, offering a modern perspective on how to develop applications with complex architectures.

How those capacities actually come to fruition is the most important aspect.

Scalability is essential for the vast majority of current corporate applications, and Node.js responds to that need by offering a built-in cluster module with load balancing across multiple CPU cores. Combining that clustering power with a single-threaded, non-blocking design based on events and callbacks allows Node.js to handle multiple connections simultaneously, processing millions of concurrent connections.
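
A minimal sketch of that cluster module in use (TypeScript, using the `node:` import prefix and the `isPrimary` naming of Node.js 16+): the primary process forks one worker per CPU core, and incoming connections are distributed among them:

```typescript
// Clustering sketch: one worker per CPU core, connections balanced by the primary.
import cluster from 'node:cluster';
import http from 'node:http';
import os from 'node:os';

if (cluster.isPrimary) {
  // fork one worker per available CPU core
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  // replace any worker that dies, keeping the service available
  cluster.on('exit', worker => {
    console.log(`worker ${worker.process.pid} exited; forking a replacement`);
    cluster.fork();
  });
} else {
  // each worker runs its own server instance on the shared port
  http
    .createServer((_req, res) => {
      res.end(`handled by worker ${process.pid}\n`);
    })
    .listen(3000);
}
```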

Being single-threaded is often regarded as a limitation because, in theory, it can slow down the performance of the application, but that is nothing more than a myth. In solutions that are not event-oriented, where multiple threads are needed to deal with multiple requests, the number of parallel threads is limited. Node.js is completely free from this limitation: as long as there is available memory and the kernel allows it, we can effortlessly process any number of simultaneous requests.
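
A small illustration of that non-blocking model: each request awaits a simulated I/O operation, and while it waits, the single JavaScript thread is free to accept and serve any number of other requests:

```typescript
// Non-blocking sketch: awaiting I/O never blocks the event loop.
import http from 'node:http';
import { setTimeout as sleep } from 'node:timers/promises';

const server = http.createServer(async (_req, res) => {
  // stands in for a non-blocking call such as a database query;
  // other requests are accepted and processed during this wait
  await sleep(100);
  res.end(`answered by pid ${process.pid}\n`);
});

server.listen(3000);
```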

Companies are also often reluctant to place their code in the cloud, which would prevent use of the public npm (Node Package Manager) registry. To address this issue, there is an Enterprise version of the registry that can be installed and maintained on a company's own infrastructure, preserving an internal module registry and complying with the strictest security requirements.

We also need to touch on the subject of long-term support. This will always be a priority for Enterprise solutions, but the truth is that Node.js also assures that very same support.

Each major version of Node.js receives active support for 18 months from the point it becomes eligible for LTS (Long-Term Support), after which it transitions to a maintenance regime lasting an additional 12 months. During this period, the version in use will receive security updates and bug fixes, but no new functionality. This addresses the concern that solutions developed with Node.js might lack long-term support.

Based on all this information, the aforementioned companies decided to make their transition to this technology. What have they accomplished?

  • Netflix: a reduction of over one minute in buffering times.
  • LinkedIn: rebuilt the core of their mobile services with Node.js. Their application is currently running 20 times faster and benefits from a substantially better integration between backend and frontend. This was achieved while Node.js was just in its first year of development.
  • PayPal: migrated all their web applications from Java to JavaScript and Node.js; their programmers wrote 33% fewer lines of code, used more than 40% fewer files and halved the time needed to build their applications (while also requiring fewer people). Response times decreased by roughly 35%, which translates to an improvement of 200 ms in page creation times.
  • Uber: built their driver and passenger matching system with Node.js, due to its fast response capabilities and massive request-processing power, along with the welcome ease of running a distributed architecture.

I don’t want to plant the idea that Node.js is a “silver bullet”. It might not be the best solution for all cases, but it is always wise to evaluate your possibilities and understand the potential benefits of this technology.

Francisco Costa

Enterprise Solutions Lead


The impact of Big Data on Social Media Marketing

Social media was born with the intent to create remote connections between colleagues, friends and others who wanted to share knowledge and information. Even though this purpose is still prevalent in its genesis, the truth is that social media has been evolving exponentially throughout the years, becoming a powerful bi-directional communication tool between companies and clients.

Nowadays, social media allows companies to publicise their brand and products, facilitating the rapid growth of their client base while also allowing the ceaseless collection of inputs from their users, whether they are clients or not.

For that reason, each like, comment or share gives companies a better understanding of their clients and their respective behaviours, through the way in which they interact with specific types of content. This behavioural analysis and exchange of information generates a massive volume of data, which can only be stored and processed using “Big Data” technologies.

In reality, Big Data has had an impact on almost every sector of our daily lives, shaping the way people communicate, work and even have fun.

In recent decades, the quantity of data generated has grown exponentially, doubling in size every two years and potentially reaching 44 trillion gigabytes by 2020. The massification of the World Wide Web and the Internet of Things abruptly increased the amount of data generated, equally intensifying the need to reduce the time it takes to transform and access that data.

Big Data is the technological concept that encompasses a particular set of practices and tools, tackling this problem using 5 fundamental principles:

  • Volume (storing, processing and accessing vast amounts of data)
  • Variety (cross-referencing data from various sources)
  • Speed (the speed of accessing, treating and processing data)
  • Veracity (guaranteeing the veracity of the information)
  • Value (the usefulness of the processed information)

This “new” data access method and processing power has established a new paradigm within the marketing sector. Now it’s easier to analyse and identify trends, as well as possible cause and effect relationships to apply to marketing strategies. These types of analyses have become indispensable to companies for increasing the percentage of messages that actually reach the target, resulting in the growth of their ROI (return on investment).

How do we take advantage of Big Data in a marketing strategy?

The first step is to establish a relationship between the unstructured data provided by social media and data that is already available, such as your clients’ details. After completing this step, it will be easier to observe and analyse your clients’ actions, collecting important insights that will form a solid base for your future campaigns.
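
In practice, that first step is essentially a join between social media records and the client details you already hold. A minimal TypeScript sketch, where every field name is a hypothetical placeholder:

```typescript
// Sketch: relating social media activity to known client records by a shared key.
interface CrmClient { email: string; segment: string }
interface SocialMention { authorEmail: string; sentiment: number } // e.g. -1 to 1

function joinMentionsToClients(clients: CrmClient[], mentions: SocialMention[]) {
  const byEmail = new Map(clients.map(c => [c.email, c] as [string, CrmClient]));
  // keep only mentions matched to a known client, tagged with that client's segment
  return mentions.flatMap(m => {
    const client = byEmail.get(m.authorEmail);
    return client ? [{ segment: client.segment, sentiment: m.sentiment }] : [];
  });
}
```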

Now you can outline marketing strategies focused on all the insights you’ve gathered. In other words, you are now able to design marketing campaigns anchored by content that fulfils the needs of your clients, or segmented groups of clients.

Execution time has arrived! Now that you have the most actionable content, based on your analyses, it is time to discover how effective your strategy really is.

You’ve almost certainly worked out that this is a fundamental formula for success, but reaching that sweet spot will require constant “fine-tuning”. In other words, from this point forward, your digital marketing strategy will work in a cycle: more insights about your clients improve the reach and suitability of your strategies and content, which in turn generate even more insights.

Social media marketing is a tool that allows a company of any size, in any market, to better understand its clients and work out the most effective strategies to shape its offers to satisfy their needs.

The truth: without Big Data, none of this would have been possible!

Sílvia Raposo

8 reasons why you should choose Atlassian solutions on the data center version

A data center is an environment that aggregates the infrastructure needed to keep all of an organisation’s systems functioning properly – or, in this case, to deploy products and/or apps. Data centers are designed to handle the traffic, processing and storage of great amounts of data. Atlassian’s data center option was specifically designed as one more deployment choice for customers, who can run the products they use – such as Jira Software, Confluence or Jira Service Desk – on cloud, server or data center. Its five fundamentals are: high availability, scalability, performance, security and, of course, cost.

If you use Atlassian tools, the data center option is now a possibility. But is it the best solution for your company? Find out the 8 reasons why you should choose Atlassian on the data center version, and the best moment to upgrade.

What are the advantages of Atlassian on data center?

  1. Scalability: Atlassian on data center was specifically designed to grow according to companies’ needs. You can add nodes to your cluster without ever needing to worry about performance losses or downtime (when you plan upgrades, for example, you can activate the read-only feature, which allows your users to keep viewing pages and searching while maintenance runs in the background).
  2. High availability: storing all information in active clusters gives teams constant, uninterrupted access, minimising the impact of any failures in the company’s applications.
  3. SAML 2.0: Atlassian on data center uses this protocol to ensure compliance and simplify the login experience. This way, Atlassian ensures that the authentication system is secure (through specific tokens).
  4. Choosing an infrastructure: in the data center model, you can choose to implement the applications on-premises or with IaaS providers such as Azure or Amazon Web Services.
  5. Disaster Recovery: Atlassian ensures that your business can keep functioning without problems, with a complete Disaster Recovery strategy, whether a system interruption is total or partial.
  6. Verified ecosystem: all applications developed for the data center environment are verified for response times, scalability and database support.
  7. Performance guarantee: as your organisation grows, so does the need to maintain the quality and performance of the work being done. Atlassian on data center can eliminate everything that sets your team back (for example, through project archiving, which lets you find the information you are truly looking for).
  8. Control: you have full control over compliance, security and regulatory needs.

When should you upgrade?

Atlassian on data center grants you access to the same functionalities, applications and products as the server version. Moreover, both options provide almost full control over your data and infrastructure. In comparison, however, data center has a few advantages when your company reaches a new growth level – for example, the server runs on only one node, while data center runs on multiple nodes.

Data center was designed precisely to accompany server customers as they grow and their organisation reaches maturity. So, when should you upgrade to data center? Keep in mind the following variables:

Number of users: how many users access your apps on a daily basis? According to Atlassian, “apps like Jira Software, Confluence or Bitbucket need more stability when they reach 500 or 1000 users. In the case of Jira Service Desk, upgrades usually take place when 50 users are reached”.

Performance: as your company grows, performance needs to grow proportionally. To ensure that the performance of your systems is maintained, you should assess whether the number of users still allows the same quality.

Downtime: assessing the cost of downtime for your company is essential. If you conclude that the costs are too high to live with, the data center model may be your solution.

Management: do you think you spend too much time managing requests and taking care of issues that should be simple? Data center gives administrators the ability to simplify countless tasks, such as granting and revoking access or managing password-change requests.

If you need help assessing the best option for your company, do not hesitate to contact us.

Ana Lamelas

10 best international big data events of 2019

Big data. A trend, a buzzword or a necessity? At Xpand IT, we believe it can be a mix of all these concepts – and a lot more. Themes such as artificial intelligence, the Internet of Things, machine and deep learning and data science are now part of the technological landscape and are becoming clear choices for company investment. Take part in the data universe with us and find out about the 10 best big data events of 2019.

1. Data Fest 2019

Data Fest 2019 is something of a festival: it consists of 6 different events that happen all over Scotland. It focuses on essential themes such as data science and artificial intelligence (AI), and the goal of the festival is to rethink the data revolution and its innovation.

These are some of the 6 events from Data Fest 2019:

  • Data Summit (an international conference on 21 and 22 March, in The Assembly Rooms, Edinburgh);
  • Data Talent (this seeks to gather engineering students and IT professionals in order to create a networking event that discusses various technological subjects. It will take place on 19 March, in the Hilton Hotel, Glasgow);
  • Fringe Events (a series of events such as meet-ups, hackathons, debates and workshops, that seeks to gather professionals from various industries. It will take place between 11 and 22 March, all over Scotland);
  • Data Tech (seeks to gather industry, public and academic members in order to share knowledge and technical presentations. This event will take place on 14 March, in the National Museum of Scotland, Edinburgh).
Date: throughout March 2019
Location: Scotland, Great Britain
Website: https://www.datafest.global

2. AI Tech World

It is called AI Tech World, but not all of it will be about artificial intelligence. Themes such as big data, cloud security, DevOps and data centres will also be on the agenda for this conference, as well as the importance of ethics in the development of AI-powered solutions.

Across both days you can attend inspiring lectures from elite IT professionals, in partnership with companies such as Hitachi or MariaDB.

Date: 12 and 13 March 2019
Location: ExCeL, London, Great Britain
Website: https://www.bigdataworld.com/ai-tech-world

3. Data Innovation Summit

This year marks the 4th edition of the Data Innovation Summit, covering specific themes in big data such as data engineering, machine learning, deep learning and data management. With 6 stages and more than 100 speakers, this will be the biggest technology and innovation event in Scandinavia and one of the biggest in Europe.

The goal is to gather the most successful companies around the world and the best professionals in their fields to discuss new business models and ways of improving profitability and customer satisfaction.

Date: 14 and 15 March
Location: Kistamässan, Stockholm, Sweden
Website: https://datainnovationsummit.com

4. DataWorks Summit

Ideas, insights and innovation. These are the three concepts that set the mood of the DataWorks Summit, happening in March in Barcelona. This event aims to discuss the latest developments in technologies such as artificial intelligence, machine learning, IoT and cloud, with the primary goal of gathering pioneers from these areas to answer the most critical questions.

In addition, the focus will be on open source technologies, and how they can help organisations leverage all of their digital transformation processes.

Date: 18 and 19 March 2019
Location: Centre de Convencions Internacional de Barcelona, Barcelona, Spain
Website: https://dataworkssummit.com/barcelona-2019/

5. Spark & AI Summit 2019

Even with no date confirmed for the 2019 edition yet, the Spark & AI Summit is an event you can’t miss this year, as it is the most significant event worldwide about Apache Spark. This year, one of its main focuses will be artificial intelligence, with emphasis on autonomous cars, voice and image recognition, intelligent chatbots and even new deep learning frameworks.

As the world’s largest open-source community in the Big Data universe, Apache Spark is used by some of the world’s biggest companies, such as eBay and Netflix – definitely something to keep on your radar.

Date: TBC – registrations open 8 April 2019
Location: TBC, Amsterdam, Netherlands
Website: https://databricks.com/sparkaisummit

6. Strata Data Conference

The Strata Data Conference is held in partnership with O’Reilly and Cloudera, and this year it seeks to be the epicentre for data and business, providing talks and training sessions in countless areas. In 2019, the conference will focus on artificial intelligence, since it is a trending theme, but it also encompasses ethics, privacy and data security.

This event promises to put participants in contact with the best professionals across the technological world, such as data scientists, engineers, analysts, developers and even researchers in areas such as AI and IoT.

Date: 29 April to 2 May 2019
Location: ExCeL London, Great Britain
Website: https://conferences.oreilly.com/strata/strata-eu 

7. AI & Big Data Expo

London and Amsterdam are the two European cities in which the AI & Big Data Expo will be taking place, and you can expect themes such as artificial intelligence, IoT, blockchain, digital transformation and even cybersecurity.

Around 36,000 participants are expected to attend this conference, as well as more than 1,500 speakers from some of the biggest companies in the world, such as Google, Amazon, Coca-Cola, Adidas, Uber, Twitter and Hewlett Packard. If you want access to an international showcase discussing best practice for IT departments, choose this event. You have two locations and two dates to choose from:

Date: 25 and 26 April 2019
Location: Olympia, London, Great Britain
Website: https://www.ai-expo.net/global/
Date: 19 and 20 June 2019
Location: RAI, Amsterdam, Netherlands
Website: https://www.ai-expo.net/europe/

8. Kafka Summit London

If you are a Developer, an Operator or a Data Scientist, this is the event for you. Kafka Summit promises talks from some of the best professionals at leading companies in the streaming technology universe, sharing knowledge and fostering networking. As the name implies, this is the appropriate place to contribute to, but also to learn from, the community dedicated to the Apache Kafka platform.

Confluent will also provide a training session to introduce Apache Kafka to new users, exploring the fundamentals and application development.

Date: 13 and 14 May 2019
Location: Park Plaza Westminster Bridge, London – UK
Website: https://kafka-summit.org/events/kafka-summit-london-2019/

9. J on the Beach

Are you a developer or a DevOps engineer? Do you work with Big Data technologies? Do you enjoy the beach? If your answer was yes, this event is for you. J on the Beach (JOTB) is a conference that seeks to encourage the sharing of experiences and tricks related to the data universe, covering topics such as Data Visualisation, IoT & Embedded and Functional Programming, among others. You will also be able to participate in a hackathon to develop a distributed data science solution. And all at the beautiful Marbella beach.

Date: 15 to 17 May 2019
Location: Palacio de Congresos de Marbella, Marbella, Spain
Website: https://jonthebeach.com/

10. Gartner Data & Analytics Summit

The Gartner Data & Analytics Summit seeks to bring clarity to much-discussed issues such as digital transformation, business intelligence and data. The goal is to share new strategies and dissect best practices in order to make your company a winner in the digital economy.

You can count on a big networking component, as well as the opportunity to learn hands-on – which always gets the best results – based on research from Gartner.

Date: 19 and 20 November 2019
Location: Kap Europa Hotel, Frankfurt, Germany
Website: https://www.gartner.com/en/conferences/emea/data-analytics-germany

If you want to know more about how our Big Data solutions can help your business, contact us here.

Ana Lamelas

Building the Future: together we activated Portugal!

The first edition of the event Building the Future: Ativar Portugal (Activate Portugal) took place on 29 and 30 January and was organised by Microsoft, with the help of the agency imatch.

More than 3000 participants, 100 speakers, 60 sessions and 50 partners. These were the numbers of one of the most anticipated technology events, which ended up revealing that investment in digital transformation is, in fact, one of the main priorities for a significant number of companies. Xpand IT had the privilege of taking part in this huge success and can confirm that, together with Building the Future, we activated Portugal!

For Paula Braz, Marketing Manager of Xpand IT, “Building the Future was an extremely interesting event, since it allowed us to create (or recreate) a vision of a not-so-distant future through the experiences provided by all the partners in different sessions – from the most technical ones, such as our ‘Cognitive Lab’, in which we offered the chance to learn to develop a bot, to the most conceptual ones, such as the talks from Gerd Leonhard (writer and founder of The Futures Agency) or Jim Stolze (active leader of the TEDx community and co-founder of Aigency)”.

As a Microsoft partner, Xpand IT had the opportunity to promote some of the event’s topics, applying artificial intelligence to gamification in the Sentiment Meter, showing a Retail Bot in the Intelligent Day area, and presenting sessions by Jorge Borralho, Project Manager, and Sérgio Viana, Digital Xperience Lead, both from Xpand IT.

For Sérgio Viana, Digital Xperience Lead of Xpand IT, “Embracing technology and empowering it brings value to the business, as well as to our human abilities; it is the path to building solutions that make a difference. There is no need to fear innovation, but one should use it with the right purpose, based on fundamental and structural ethical values”.

Ana Lamelas

Xpand IT at WSO2 Con 2018

WSO2 Con, the official technology conference of WSO2, took place in three locations around the world this year: the USA (San Francisco) in July, Asia (Colombo) in August and Europe (London) in November. As a certified WSO2 partner and reseller, Xpand IT took part in the European event, from 13 to 15 November at the Hilton London Bankside hotel.

The European WSO2 Con of 2018 focused mostly on WSO2’s perspective on Agile Integration and API-oriented business contexts, in a world where integration needs keep increasing with the proliferation of systems and apps.

The three days of presentations covered WSO2’s strategic vision on integration, their definitions of architecture and applicability, the technical capabilities and business applications of their products, and case studies of successful WSO2 implementations, presented by the customers themselves. The morning, lunch and afternoon breaks also contributed to a networking environment between partners and customers.

Moreover, an “Oxygen Bar” was available, where WSO2 experts from all technological areas were constantly on hand to provide additional information on the products or their use.

Day 1 – Digital Transformation

The first morning was filled with keynotes, the first presented by the CEO of WSO2, Tyler Jewell. This keynote set out WSO2’s strategic vision for the next few years, reinforcing the trend towards integration and the reasons that increasingly justify an “API first” approach by organisations. This vision was supported by Massimo Pezzini, Vice-President at Gartner, with a view of the HIP (Hybrid Integration Platform) as a digital enabler for organisations. In the afternoon, there were parallel streams with three different contexts: the red room hosted Integration and Architecture, the yellow room Stream Processing and Identity Management, and the green room Open Banking and success stories.

Day 2 – Agility in Integration

The second day had agility as its main theme, and it started with two very interesting keynotes from Arie van Bennekum, co-author of the Agile Manifesto, and Paul Fremantle, CTO and Co-Founder of WSO2. In the afternoon, there were again three parallel sessions: the red room focused on API Management, the green room on success stories centred on the API ecosystem, and the yellow room on complete WSO2 product demonstrations.

Day 3 – Ballerina

The last day of the conference was integrated with the event Ballerina Day 2018.

Although it was complemented by exclusive sessions for debating subjects related to the new partnership programme, this day was totally focused on Ballerina, an open-source, cloud-native programming language that WSO2 has been developing for the last three years. The language aims to address flaws created by “non-agile” middleware products and by current programming languages, which are too complex to deal with in integration scenarios. Ballerina promises to reduce this complexity and promote agility, providing middleware capabilities with the minimum possible amount of code.

It is also oriented towards cloud and DevOps environments, and therefore allows for integration with Docker and Kubernetes.

Ana Lamelas

Advantages of implementing Big Data in your company

Big Data is not a ‘trend’. It is a necessity for most large – and even medium or small – companies that can no longer extract sufficient value from their data using more traditional Business Intelligence tools. Big Data plays an important role in boosting business, and many companies are already aware of that. According to Forbes, the global market for Big Data (software and services) will grow from 42 billion dollars in 2018 to 103 billion dollars in 2027.

There are many advantages to implementing Big Data in your company, and having a well-defined strategy is half the battle when it comes to making well-informed decisions, which can be key to the success of your business.

What is Big Data?

Big Data is the ability to analyse and/or process very large amounts of data, based either on its volume or on the number of ‘data points’ generated. The concept of Big Data comes to the fore when companies face such a great flow of data that conventional processing and analysis tools cannot handle it effectively.

Data can be structured, semi-structured or unstructured. Structured data are, for example, data from purchases or sales from an organisation or information from forms or operational tables. Unstructured or semi-structured data are information generated without an established order and from sources such as, for example, social media, user logs in web or mobile apps, sharing of opinions or files.

According to data from Harvard Business Review, only 20% of the data that reaches companies is structured, while the other 80% is semi-structured or unstructured. Moreover, less than 50% of that structured data is used to support decision-making and extract insights; for semi-structured or unstructured data, that percentage falls to 1%.

The Big Data concept can be characterised by five Vs:

  • Volume: massive amounts of data are generated and need to be stored and processed. According to the website Statista, cloud data centres already generated 10.6 zettabytes worldwide in 2018.
  • Velocity: the velocity of generating, processing and analysing data can be more important than volume, since real-time or near-real-time information provides great agility to companies with a Big Data strategy in place.
  • Variety: data can originate from various sources, such as conventional databases, social media, web pages, financial transactions, emails, sensors (IoT), audio, text or video files, archives, forums, etc.
  • Veracity: is the generated data reliable, given its source or origin?
  • Value: does the generated data have true value for the company? It is necessary to assess whether that data will, in fact, generate new opportunities, increase income or optimise costs, for example.

Advantages of implementing a well-defined strategy

So, we know that implementing a Big Data strategy has become a necessity for large organisations, and the focus has changed from “whether to use Big Data” to “how to use Big Data more efficiently”.

We also know that Big Data opens doors to better-informed decision-making, based on extremely complex analysis, and that it allows the collection of important insights to optimise the information gathered. Consequently, the decision to implement a Big Data strategy must come from business teams, not from the IT departments that must ensure the technical execution of the project in the most efficient way. Basically, it is those business teams that will get value from the gathered data, for their daily work and for the definition of strategy.

However, what are the true advantages of implementing a Big Data project? What will they mean for the competitiveness of your business? We identify three of the main advantages of implementing Big Data in your company:

Advantage 1: Informed decision-making

With the data analysis carried out by Big Data technologies, it is possible to find purchase or behaviour patterns that support decision-making by business departments. For example, if a marketing team knows that a certain family buys the same product every single month, it can send discounts for that product through digital or physical mailing, helping to keep those customers loyal.

Advantage 2: Reduced costs

Data generated by or for a company is stored, processed and analysed, uncovering important business insights or identifying gaps and errors. Working on previously analysed data and having access, for example, to consistent behaviour or purchase trends allows companies to launch more efficient campaigns that reach the desired target directly and therefore register a better ROI. Optimising the use of a budget in this way makes teams more efficient – also increasing their productivity.

Advantage 3: Possibility to predict future situations

Usually, in Big Data, there are three types of analysis that can be carried out and complement each other:

  • Descriptive analytics, the type of analysis that describes what is happening, often in real time. Using data aggregation and data mining, it is possible to access a picture of the past and understand the reason for a deviation or a change – or simply to summarise a certain aspect.
  • Predictive analytics, the type of analysis that predicts what might happen in the future, relying on statistics and algorithms and providing scenarios of statistically probable situations.
  • Prescriptive analytics, based on optimisation, simulation algorithms, machine learning and computational models; this is quite a complex type of analysis, which seeks to answer the question “what should we do in a given situation?” Basically, the scenarios created will work as specifications of different actions and their expected outcomes, allowing the company to choose the scenario that represents least risk, for example.
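
As a toy illustration of the prescriptive idea (all names and numbers here are hypothetical): given simulated scenarios for each candidate action, choose the best-performing action whose risk stays within tolerance:

```typescript
// Prescriptive-analytics sketch: pick the admissible action with the best outcome.
interface Scenario {
  action: string;
  expectedProfit: number; // simulated outcome of taking this action
  risk: number;           // e.g. estimated probability of a detrimental result
}

function chooseAction(scenarios: Scenario[], maxRisk: number): Scenario | undefined {
  return scenarios
    .filter(s => s.risk <= maxRisk)                          // discard overly risky actions
    .sort((a, b) => b.expectedProfit - a.expectedProfit)[0]; // best expected outcome first
}

const best = chooseAction(
  [
    { action: 'discount campaign', expectedProfit: 120, risk: 0.1 },
    { action: 'new market entry', expectedProfit: 300, risk: 0.45 },
  ],
  0.2,
);
console.log(best?.action); // "discount campaign": highest profit within the risk limit
```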

Practical examples

Now that you know the advantages of implementing Big Data in your company and how to establish specific and measurable goals, the question is: how can you benefit from the data generated by the organisation, based on the area it impacts?

Here are a few practical examples:

  • Data from sensors in transportation systems;
  • Analysis of financial data to prevent fraud (for example, by detecting the use of a credit card by an unusual user; see the sketch after this list);
  • Analysis of network traffic;
  • Monitoring mentions on social media to assess if the emotions towards a brand/company are positive or negative;
  • Information on traffic flows to predict which times will be more problematic.
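
To make the fraud example concrete, here is a minimal sketch; the 3-standard-deviation threshold and the data are illustrative only, not a production fraud model:

```typescript
// Fraud-detection sketch: flag a transaction that deviates more than
// 3 standard deviations from the card's own spending history.
function isSuspicious(history: number[], amount: number): boolean {
  const n = history.length;
  const mean = history.reduce((s, v) => s + v, 0) / n;
  const std = Math.sqrt(history.reduce((s, v) => s + (v - mean) ** 2, 0) / n);
  return Math.abs(amount - mean) > 3 * std;
}

console.log(isSuspicious([20, 25, 22, 30, 24], 500)); // true: far outside usual spending
```
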
Ana Lamelas