Ana Lamelas

5 Business Intelligence books you have to read

At Xpand IT, we believe that business intelligence goes way beyond reports and dashboards. We are expert providers of BI solutions, developing projects with the ever-present goal of adding value to any business. Many companies have already placed their bets on data analysis software, recognising the huge potential that such insights represent for progress. However, a small percentage of companies still fail to recognise the true value of internal data analysis and therefore choose not to make it available to their clients. So we’ve picked 5 great business intelligence books for you to read, to help you discover more about adopting a complete BI strategy suited to your own situation. In this digital era, we’ve chosen physical formats to help you understand modern BI strategies that you can implement, going way beyond the standard pattern.

As stated by John Owen: “Data is what you need to do analytics. Information is what you need to do business.”

1. Business Intelligence Guidebook: From Data Integration to Analytics

Rick Sherman – 1st Edition, November 2014

This is one of the most comprehensive books about business intelligence and data integration, touching on simple topics as well as vastly more complex architecture. The author guarantees that after reading this book you will be able to develop a BI project, launch it, manage it, and deliver it on time and on budget. You will also be able to implement a complete strategy for your company – supported by the tools he introduces.

If you’re looking for a reliable source of information, capable of explaining the best practices, the best approaches, and presenting a complete overview of the entire life cycle of a BI project, adaptable for companies of any size, don’t look any further: this is the right book for you.

2. Data Strategy: How to Profit from a World of Big Data, Analytics and the Internet of Things

Bernard Marr – 1st Edition, April 2017

The author starts from the premise that less than 0.5% of all generated data is currently being analysed and used, building a compelling narrative to convince company leaders to invest in business intelligence strategies, focusing on the benefits for business growth.

Complemented with case studies and real examples, this book explains how to translate the data generated by companies into insights to support the strategic decision-making process. This aims to improve companies’ business practices and performance, with a vital combination of Big Data, Analytics and Internet of Things.

3. Agile Data Warehouse Design: Collaborative Dimensional Modeling, from Whiteboard to Star Schema

Lawrence Corr and Jim Stagnitto – 1st Edition, November 2011

This is a book for professionals looking to implement data warehousing and business intelligence requirements, turning them into dimensional models, with the help of BEAM (Business Event Analysis & Modeling) – an agile methodology for dimensional models that aims to improve communication between data warehouse designers, BI stakeholders and their development teams.

If you want to implement this methodology in your company, or if you’re just curious about this approach, we strongly recommend exploring this book, which covers, amongst other topics, data modelling, visual modelling and data stories, using the 7 Ws (who, what, when, where, how many, why and how).

4. Successful Business Intelligence: Unlock the Value of BI & Big Data

Cindi Howson – 2nd Edition, November 2013

This is not the most recent edition, but the wealth of information it contains still makes it one of the must-read business intelligence books out there. The author, a Research Vice President at Gartner and BI analyst, conducted a study to identify the analytics strategies implemented by some of the biggest players in the market.

This book provides much more than just theory. It is a valuable manual that tells stories and lays out successful BI approaches, explaining why the strategies implemented cannot be the same for every company. Additionally, the book includes tips on how to achieve an adequate alignment between a company’s BI strategy and its commercial objectives.

5. Business Intelligence – Da Informação ao Conhecimento

Maribel Yasmina Santos and Isabel Ramos – 3rd Edition, September 2017

This is the only Portuguese book on our list, and it’s very comprehensive, explaining the basic concepts of data analysis and demonstrating how BI technologies can be implemented – from the data warehouse storage process to the analysis of the data (online analytical processing and data mining), outlining how the resulting knowledge can be used by companies to support decision-making.

An essential book, whether you’re a professional searching for a complementary source of information or you’re simply looking for reasons to implement a business intelligence strategy in your company.

If you would like to know more about some of the topics mentioned above, or if you want to implement your own BI strategy, get in touch with us today!


ITIL: sound practices to improve your IT service management

ITIL is an acronym for Information Technology Infrastructure Library, a set of good practices designed to facilitate a significant improvement to the operation and management of all the IT services within a company. When implemented by an organisation, this set of practices becomes an unequivocally beneficial asset, as it comes with several advantages, such as the improvement of risk management, the strengthening of client relationships, an increase in productivity and reduced costs.

Developed in the 1980s by the Central Computer and Telecommunications Agency (CCTA) – a British government agency – it is the primary framework for sound IT Service Management (ITSM). It began with more than 30 books comprising numerous sources of information and describing good practices to follow in relation to IT services. Currently, ITIL runs to 5 books covering its various processes and functions (and a total of 26 processes that can be adopted by companies).

In 2005 the framework received formal recognition with ISO/IEC 20000, the Service Management standard confirming compliance with desired standards and alignment with Information Technology best practice.

ITIL has gone through various revisions and there are now 4 different versions, the most recent released at the start of 2019. This updated version maintains a strong focus on automating processes, in order to make better use of professionals’ time, and on the business integration of IT departments, in order to improve communication between teams and between technical and non-technical staff. Version 4 features new ways to tackle the challenges of modern technology, and its main goal is to become ever more agile and cooperative.

Reading current books on the subject simply won’t give you enough background to implement ITIL effectively in your company, however. You need to engage professionals dedicated specifically to the field, and guarantee adequate training and certifications for both the company and those professionals. Current certification, in accordance with the 4th version of ITIL, is divided into two levels: ITIL Foundation and ITIL Master – each with its own examinations and programme content. There are two options after the ITIL Foundation module: the ITIL Managing Professional certification (which certifies an ITIL specialist) and the ITIL Strategic Leader certification (encompassing both the ITIL Strategist and ITIL Leader certificates). After completing these, you can then leap to Master level – the highest certification available in ITIL 4. You can review the full scheme in the diagram below:

[Diagram: ITIL 4 certification scheme]

ITIL is divided into five major areas – Service Strategy, Service Design, Service Transition, Service Operation and Continual Service Improvement – and each area has its own processes. Although the framework defines 26 processes in total, companies are not obliged to adopt them all. It is up to the IT professionals and, ultimately, the CTO to define the appropriate procedures to integrate into their teams. Below you can find some examples of the most commonly used processes:

[Diagram: examples of commonly used ITIL processes]

8 reasons why you should choose Atlassian solutions in the Data Center version

A data center is an environment that aggregates the infrastructure necessary to keep all of an organisation’s systems functioning correctly – or, in this case, to deploy products and/or apps. Data centers are designed to handle the traffic, processing and storage of large amounts of data. Atlassian Data Center was specifically designed as one more deployment option for customers, who can choose to deploy the products they use – such as Jira Software, Confluence or Jira Service Desk – on Cloud, Server or Data Center. Its five fundamentals are: high availability, scalability, performance, security and, of course, cost.

If you use Atlassian tools, running them in the Data Center version is now a possibility. But is this the best solution for your company? Discover the 8 reasons why you should choose Atlassian Data Center and the best moment to upgrade.

What are the advantages of Atlassian Data Center?

  1. Scalability: Atlassian Data Center was specifically designed to grow according to a company’s needs. You can add nodes to your cluster without ever needing to worry about performance losses or downtime (when you plan upgrades, for example, you can activate the read-only feature, which allows your users to keep viewing and searching pages while maintenance runs in the background).
  2. High availability: storing all information in active clusters gives teams constant, uninterrupted access, therefore minimising the impact of any failures in the company’s applications.
  3. SAML 2.0: Atlassian Data Center uses this protocol to ensure compliance and simplify the login experience. This way, Atlassian ensures that the authentication system is secure (through specific tokens).
  4. Choice of infrastructure: in the Data Center model, you can choose to deploy the applications on-premises or with IaaS providers such as Azure or Amazon Web Services.
  5. Disaster Recovery: Atlassian ensures that your business can keep functioning without any problems, with a complete Disaster Recovery strategy, whether a system interruption is total or partial.
  6. Verified ecosystem: all applications developed for the Data Center environment are verified for response times, scalability and database support.
  7. Guaranteed performance: as your organisation grows, the need to maintain the quality of work and performance also grows. Atlassian Data Center can eliminate everything that sets your team back (for example, through project archiving, which allows you to find the information you are truly looking for).
  8. Control: you have full control over your compliance, security and regulatory needs.

When should you upgrade?

Atlassian Data Center grants you access to the same functionalities, applications and products as the Server version. Moreover, both options provide almost full control over your data and infrastructure. However, Data Center has a few advantages in comparison – which matter when your company reaches a new level of growth – because, for example, Server runs on only one node while Data Center runs on multiple nodes.

Data Center was designed precisely to accompany Server customers as they grow and their organisation reaches maturity. So, when should you upgrade to Data Center? Bear in mind the following variables:

Number of users: how many users access your apps on a daily basis? According to Atlassian, “apps like Jira Software, Confluence or Bitbucket need more stability when they reach 500 or 1,000 users. In the case of Jira Service Desk, the upgrades usually take place when 50 users are reached”.

Performance: as your company grows, its systems’ performance needs to grow proportionally. To ensure that performance is maintained, you should assess whether the current number of users still allows the same quality of service.

Downtime: assessing the cost of downtime for your company is essential. If you think those costs are too high to live with, the solution may involve moving to the Data Center model.

Management: do you think you spend too much time managing requests and taking care of issues that should be simple? Data Center gives administrators the ability to simplify countless tasks, such as granting and revoking access or managing password-change requests.

If you need to assess which option is best for your company, do not hesitate to contact us.


10 best international big data events of 2019

Big data. A trend, a buzzword, or a necessity? At Xpand IT, we believe it can be a mix of all these concepts – and a lot more. Themes such as artificial intelligence, the Internet of Things, machine and deep learning and data science are part of the technological landscape and are starting to become clear choices for company investment. Take part in the data universe with us and find out about the 10 best big data events of 2019.

1. Data Fest 2019

Data Fest 2019 is more of a festival: it consists of 6 different events that happen all over Scotland. It focusses on essential themes such as data science and artificial intelligence (AI), and the goal of the festival is to rethink the data revolution and its innovation.

These are some of the 6 events from Data Fest 2019:

  • Data Summit (an international conference on 21 and 22 March, in The Assembly Rooms, Edinburgh);
  • Data Talent (this seeks to gather engineering students and IT professionals in order to create a networking event that discusses various technological subjects. It will take place on 19 March, in the Hilton Hotel, Glasgow);
  • Fringe Events (a series of events such as meet-ups, hackathons, debates and workshops, that seeks to gather professionals from various industries. It will take place between 11 and 22 March, all over Scotland);
  • Data Tech (seeks to gather industry, public and academic members in order to share knowledge and technical presentations. This event will take place on 14 March, in the National Museum of Scotland, Edinburgh).
Date: throughout March 2019
Location: Scotland, Great Britain
Website: https://www.datafest.global

2. AI Tech World

It is called AI Tech World, but not all of it will be about artificial intelligence. Themes such as big data, cloud safety, DevOps and data centres will also be on the agenda for this conference, as well as the importance of ethics in the development of AI-powered solutions.

Throughout both days you can attend inspiring lectures from some of the best professionals in elite IT teams, in partnership with companies such as Hitachi or MariaDB.

Date: 12 and 13 March 2019
Location: ExCeL, London, Great Britain
Website: https://www.bigdataworld.com/ai-tech-world

3. Data Innovation Summit

This year sees the 4th edition of the Data Innovation Summit, which will cover specific themes in big data, such as data engineering, machine learning, deep learning and data management. It has 6 stages and more than 100 speakers, and will be the biggest technology and innovation event to take place in Scandinavia and one of the biggest in Europe.

The goal is to gather the most successful companies around the world and the best professionals in their fields to discuss new business models and ways of improving profitability and customer satisfaction.

Date: 14 and 15 March 2019
Location: Kistamässan, Stockholm, Sweden
Website: https://datainnovationsummit.com

4. DataWorks Summit

Ideas, insights and innovation. These are the three concepts that compose the DataWorks Summit mood, happening in March, in Barcelona. This event aims to discuss the latest developments in technology such as Artificial Intelligence, Machine Learning, IoT and Cloud, with the primary goal of gathering pioneers from these areas to answer the most critical questions.

In addition, the focus will be on open source technologies, and how they can help organisations leverage all of their digital transformation processes.

Date: 18 to 21 March 2019
Location: Centre de Convencions Internacional de Barcelona, Barcelona, Spain
Website: https://dataworkssummit.com/barcelona-2019/

5. Spark & AI Summit 2019

Even with no date yet for the 2019 event, the Spark & AI Summit is one you can’t miss this year, as it is the most significant event worldwide about Apache Spark. This year, one of its main focuses will be Artificial Intelligence, with attention on autonomous cars, voice and image recognition, intelligent chatbots and even new deep learning frameworks.

As the largest open-source community in the Big Data universe, Apache Spark is used by some of the world’s biggest companies, such as eBay or Netflix – definitely something to keep on your radar.

Date: TBC – registrations open 8 April 2019
Location: TBC, Amsterdam, Netherlands
Website: https://databricks.com/sparkaisummit

6. Strata Data Conference

The Strata Data Conference is held in partnership with O’Reilly and Cloudera, and this year it seeks to be the epicentre for data and business, providing talks and training in countless areas. In 2019 the conference will focus on artificial intelligence, since it is a trending theme, but it also encompasses ethics, privacy and data security.

This event promises to put participants in contact with the best professionals in all different areas of the technological world, such as data scientists, engineers, analysts, developers and even researchers in areas such as AI or IoT.

Date: 29 April to 2 May 2019
Location: ExCeL London, Great Britain
Website: https://conferences.oreilly.com/strata/strata-eu 

7. AI & Big Data Expo

London and Amsterdam are the two European cities in which the AI & Big Data Expo will be taking place, and you can expect themes such as artificial intelligence, IoT, Blockchain, digital transformation and even cyber safety.

Around 36,000 participants are expected to attend this conference, as well as more than 1,500 speakers from some of the biggest companies in the world, such as Google, Amazon, Coca-Cola, Adidas, Uber, Twitter and Hewlett Packard. If you want access to an international showroom discussing best practice for IT departments, choose this event. You have two locations and two dates to choose from:

Date: 25 and 26 April 2019
Location: Olympia, London, Great Britain
Website: https://www.ai-expo.net/global/
Date: 19 and 20 June 2019
Location: RAI, Amsterdam, Netherlands
Website: https://www.ai-expo.net/europe/

8. Kafka Summit London

If you are a Developer, an Operator or a Data Scientist, this is the event for you. Kafka Summit promises to bring talks from some of the best professionals at leading companies in the streaming-technology universe, to share knowledge and foster networking. As the name implies, this is the right place to contribute to – but also to learn from – the community dedicated to the Apache Kafka platform.

Confluent will also provide a training session introducing Apache Kafka to new users, exploring the fundamentals and application development.

Date: 13 and 14 May 2019
Location: Park Plaza Westminster Bridge, London – UK
Website: https://kafka-summit.org/events/kafka-summit-london-2019/

9. J on the Beach

Are you a developer or a DevOps professional? Do you work with Big Data technologies? Do you enjoy the beach? If you answered yes, this event is for you. J on the Beach (JOTB) is a conference that seeks to increase the sharing of experiences and tips related to the data universe, covering topics such as Data Visualisation, IoT & Embedded and Functional Programming, amongst others. You will also be able to participate in a hackathon to develop a distributed Data Science solution. And all of this on the beautiful beach of Marbella.

Date: 15 to 17 May 2019
Location: Palacio de Congresos de Marbella, Marbella, Spain
Website: https://jonthebeach.com/

10. Gartner Data & Analytics Summit

The Gartner Data & Analytics Summit seeks to bring clarity to widely discussed issues such as digital transformation, business intelligence and data itself. The goal is to share new strategies and dissect best practices in order to make your company a winner in the digital economy.

You can count on a big networking component, as well as the opportunity for hands-on learning – which always gets the best results – based on research from Gartner.

Date: 19 and 20 November 2019
Location: Kap Europa Hotel, Frankfurt, Germany
Website: https://www.gartner.com/en/conferences/emea/data-analytics-germany

If you want to know more about how our Big Data solutions can help your business, contact us here.


Building the Future: together we activated Portugal!

The first edition of the event Building the Future: Ativar Portugal (Activate Portugal) took place in January (between the 29th and the 30th) and was organised by Microsoft with the help of the agency imatch.

More than 3,000 participants, 100 speakers, 60 sessions and 50 partners. These were the numbers of one of the most anticipated technology events, which ended up revealing that investment in digital transformation is, in fact, one of the main priorities for a significant number of companies. Xpand IT had the privilege of taking part in this huge success and can confirm that, along with Building the Future: together we activated Portugal!

For Paula Braz, Marketing Manager at Xpand IT, “Building the Future was an extremely interesting event, since it allowed us to create (or recreate) a vision of a not-so-distant future through the experiences provided by all the partners in different sessions – from the most technical ones, such as our “Cognitive Lab”, in which we offered the chance to learn to develop a bot, to the most conceptual ones, such as the talks from Gerd Leonhard (writer and founder of The Futures Agency) or Jim Stolze (active leader of the TEDx community and co-founder of Aigency)”.

As a Microsoft partner, Xpand IT had the opportunity to promote some of the topics of the event: using artificial intelligence applied to gamification in the Sentiment Meter, showing a Retail Bot in the Intelligent Day area, and presenting sessions by Jorge Borralho, Project Manager, and Sérgio Viana, Digital Xperience Lead, from Xpand IT.

For Sérgio Viana, Digital Xperience Lead of Xpand IT, “Embracing technology and empowering it brings value to the business, as well as to our human abilities; it is the path to build solutions that make a difference. There is no need to fear innovation, but one should use it with the right purpose, based on fundamental and structural ethical values”.

Ana LamelasBuilding the Future: together we activated Portugal!
read more

Xpand IT at WSO2 Con 2018

WSO2 Con, the official technology conference of WSO2, took place in three locations around the world this year: the USA (San Francisco) in July, Asia (Colombo) in August and Europe (London) in November. As a certified WSO2 partner and reseller, Xpand IT took part in the European event, from 13 to 15 November, at the Hilton London Bankside hotel.

The European WSO2 Con of 2018 focused mostly on WSO2’s view of Agile Integration and API-oriented business contexts, in a world where integration needs keep increasing with the proliferation of systems and apps.

The three days of presentations comprised the strategic vision of WSO2 on integration, their architecture and applicability definitions, technical abilities and business applications of their products, and case studies of successful WSO2 implementations, presented by the customers themselves. The morning, lunch and afternoon breaks also contributed to creating a networking environment between partners and customers.

Moreover, an “Oxygen Bar” was available, where experts from all of WSO2’s technology areas were constantly on hand to provide any additional information on the products or their use.

Day 1 – Digital Transformation

The first morning was filled with various keynotes, the first presented by the CEO of WSO2, Tyler Jewell. This keynote laid out WSO2’s strategic vision for the next few years, reinforcing the trend towards integration and the reasons that increasingly justify an “API first” approach by organisations. This vision was supported by Massimo Pezzini, Vice-President at Gartner, with his view of the HIP (Hybrid Integration Platform) as a digital facilitator for organisations. In the afternoon, there were parallel streams in three different contexts: the red room had Integration and Architecture, the yellow room had Stream Processing and Identity Management, and the green room had Open Banking and success stories.

Day 2 – Agility in Integration

The second day had Agility as its main theme, and it started with two very interesting keynotes from Arie van Bennekum, co-author of the Agile Manifesto, and Paul Fremantle, CTO and Co-Founder of WSO2. In the afternoon there were again three parallel sessions: the red room focused on API Management, the green room on success stories centred on the API ecosystem, and the yellow room on complete WSO2 product demonstrations.

Day 3 – Ballerina

The last day of the conference was integrated with the event Ballerina Day 2018.

Although it was complemented by exclusive sessions for debates on subjects related to the new partnership programme, this day was totally focused on Ballerina, an open-source and cloud-native programming language that WSO2 has been developing for the last three years. This programming language aims to address flaws created by “non-agile” middleware products and by current programming languages, which are too complex to deal with in integration scenarios. Ballerina promises to simplify this level of complexity and promote agility, providing middleware abilities with the minimal possible amount of code.

It is also oriented towards cloud and DevOps environments; therefore, it allows for integration with Docker and Kubernetes.


Advantages of implementing Big Data in your company

Big Data is not a ‘trend’. It is a necessity for most large – and even medium or small – companies that can no longer extract sufficient value from their data with more traditional Business Intelligence tools. Big Data plays an important role in boosting business, and many companies are already aware of that. According to Forbes, the global market for Big Data (software and services) will grow from 42 billion dollars in 2018 to 103 billion dollars in 2027.

There are many advantages to implementing Big Data in your company, and having a well-defined strategy is half the battle in making well-informed decisions – which can be a key to success for your business.

What is Big Data?

Big Data is the ability to analyse and/or process very large amounts of data, based either on its volume or on the number of ‘data points’ generated. The concept of Big Data comes to the fore when companies face such a great flow of data that conventional processing and analysis tools cannot handle it effectively.

Data can be structured, semi-structured or unstructured. Structured data are, for example, data from purchases or sales from an organisation or information from forms or operational tables. Unstructured or semi-structured data are information generated without an established order and from sources such as, for example, social media, user logs in web or mobile apps, sharing of opinions or files.

According to data from Harvard Business Review, only 20% of the data that gets to companies is structured, while the other 80% is semi-structured or unstructured. Moreover, the percentage of that structured data that is used to support decision-making and to extract insights is less than 50%; however, for semi-structured or unstructured data, that percentage falls to 1%.

The Big Data concept can be characterised by five Vs:

  • Volume: massive amounts of data are generated and need to be stored and processed. According to the website Statista, in 2018 alone, 10.6 zettabytes were generated worldwide by cloud data centres.
  • Velocity: the velocity of generating, processing and analysing data can be more important than volume, since real-time or near-real time information provides great agility to companies that have a Big Data strategy implemented.
  • Variety: data can originate from various sources, such as conventional databases, social media, web pages, financial transactions, emails, sensors (IoT), audio, text or video files, archives, forums, etc.
  • Veracity: is the generated data reliable, according to its source or origin?
  • Value: do the generated data have true value for a company? It is necessary to assess if those data will, in fact, generate new opportunities, increase income or optimise costs, for example.

Advantages of implementing a well-defined strategy

So, we know that implementing a Big Data strategy has become a necessity for large organisations, and the focus has changed from “whether to use Big Data” to “how to use Big Data more efficiently”.

We also know that Big Data opens doors to better-informed decision-making, based on extremely complex analysis, and that it allows the collection of important insights to optimise the information gathered. Consequently, the decision to implement a Big Data strategy must come from business teams, not from the IT departments, which must then ensure the technical execution of the project in the most efficient way. Basically, it is those business teams that will extract value from the gathered data for their daily work and for defining strategy.

However, what are the true advantages of implementing a Big Data project? What will be the advantages to the competitiveness of your business? We identify three of the main advantages of implementing Big Data in your company:

Advantage 1: Informed decision-making

With the data analysis carried out by Big Data technologies, it is possible to find purchase or behaviour patterns that support decision-making in business departments. For example, if a marketing team knows that a certain family buys the same product every single month, it can send discounts for that same product through digital or physical mailing, helping to ensure that the customer remains loyal.
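As a minimal sketch of this idea (with invented customer and product names – a real project would read from the company’s transactional data), the following Python snippet scans purchase records for customer–product pairs bought in several distinct months, exactly the kind of pattern a marketing team could target with discounts:

```python
from collections import defaultdict

# Hypothetical purchase records: (customer, product, month) tuples.
purchases = [("fam_silva", "coffee", m) for m in range(1, 7)] + [
    ("fam_silva", "cereal", 1),
    ("fam_costa", "coffee", 2),
    ("fam_costa", "coffee", 3),
]

def monthly_repeat_buyers(purchases, min_months=6):
    """Return (customer, product) pairs bought in at least `min_months` distinct months."""
    months_seen = defaultdict(set)
    for customer, product, month in purchases:
        months_seen[(customer, product)].add(month)
    return [pair for pair, months in months_seen.items() if len(months) >= min_months]

# Here only ("fam_silva", "coffee") qualifies: it appears in six distinct months.
print(monthly_repeat_buyers(purchases))
```

On real data the same grouping logic would typically run in a distributed engine rather than in-memory Python, but the pattern being detected is the same.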

Advantage 2: Reduced costs

Data generated by or for a company are stored, processed and analysed, resulting in important business insights or the identification of gaps and errors. Working on previously analysed data, and having access, for example, to consistent behaviour or purchase trends, allows companies to launch more efficient campaigns that reach the desired target directly and can therefore register a better ROI. Optimising the use of a budget in this way will make teams more efficient – also increasing their productivity.

Advantage 3: Possibility to predict future situations

Usually, in Big Data, there are three types of analysis that can be carried out and complement each other:

  • Descriptive analytics, the type of analysis that describes what is happening, often in real time. Using data aggregation and data mining, it is possible to access a picture of the past and understand the reason for a deviation or a change – or simply summarise a certain aspect.
  • Predictive analytics, the type of analysis that predicts what might happen in the future, relying on statistics and algorithms and providing scenarios of statistically probable situations.
  • Prescriptive analytics, based on optimisation, simulation algorithms, machine learning and computational models; this is quite a complex type of analysis, which seeks to answer the question “what should we do in a given situation?” Basically, the scenarios created work as specifications of different actions and their expected outcomes, allowing the company to choose the scenario that represents the least risk, for example.
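To make the distinction concrete, the three types of analysis can be sketched with a toy monthly sales series. This is a minimal illustration, not a real analytics pipeline: the sales figures, the naive linear trend and the risk estimates are all made up.

```python
import statistics

sales = [120, 135, 150, 160, 178]  # illustrative monthly sales figures

# Descriptive: summarise what has happened
mean = statistics.mean(sales)
spread = statistics.pstdev(sales)

# Predictive: naive linear-trend extrapolation for the next month
slope = (sales[-1] - sales[0]) / (len(sales) - 1)
forecast = sales[-1] + slope

# Prescriptive: choose the action whose (estimated) outcome carries the least risk
scenarios = {"discount campaign": 0.15, "do nothing": 0.40, "product bundle": 0.25}
best_action = min(scenarios, key=scenarios.get)

print(mean, forecast, best_action)
```

Real prescriptive analytics would replace the hard-coded risk numbers with simulated or modelled outcomes, but the shape of the three questions – what happened, what will happen, what should we do – is the same.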

Practical examples

Now that you know the advantages of implementing Big Data in your company and how to establish specific, measurable goals, the question is: how can you benefit from the data your organisation generates, depending on the area it impacts?

Here are a few practical examples:

  • Data from sensors in transportation systems;
  • Analysis of financial data to prevent fraud (for example, by detecting the use of a credit card from an unusual user);
  • Analysis of network traffic;
  • Monitoring mentions on social media to assess if the emotions towards a brand/company are positive or negative;
  • Information on traffic flows to predict which times will be more problematic.
Ana Lamelas

Tableau & Jira: A new way to look at your projects

Tableau is a self-service BI platform that identifies valuable insights and provides advanced analysis, visualisations and the capability to share information quickly. In the digital transformation era, with information surrounding every part of our daily lives, the need to make decisions based on facts increases – which implies the ability to look at data, analyse it and decide accordingly. Decisions taken by operational teams are no different: in a service desk management team (and in bug fixing), having access to the numbers is the path to optimising teams and getting better results.

So, integrating Tableau and Jira offers a new way to look at your projects: even though Jira provides some options to create reports or obtain essential metrics, only with a tool such as Tableau will your team be able to cross-reference data with other data sources and create great-looking dashboards or advanced analytics. This blog post explains in detail how it is done.

Tableau is a visualisation tool divided into three modules: Tableau Desktop (which connects to all types of databases and enables the creation of business rules, the renaming of fields and an overview of all data); Tableau Server (where you can publish views and share information with other team members – granting and removing access, writing comments and editing views); and Tableau Prep (an ETL tool that helps users extract data from a variety of sources, transform it and output it, saving precious time).

Jira, as a project management tool, is not intended for detailed data analysis or for extracting insights. It does have some features to create reports and obtain information, such as widgets; but if your team has different needs – for example, if it truly needs to cross-reference data – you will need Tableau, because Jira can only access its own data. For instance, to compare data from Jira with a timesheet application and check whether the time registered in one app matches the time logged in the other, you would need to install the All-In-One Tableau connector.
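The kind of cross-check described above can be illustrated with a minimal Python sketch, using plain dictionaries in place of the two data extracts (the issue keys, hours and tolerance are all made up for illustration):

```python
def time_mismatches(jira_hours, timesheet_hours, tolerance=0.25):
    """Compare hours logged per issue in Jira against a timesheet app.

    Returns the issues where the two systems disagree by more than `tolerance`.
    """
    mismatches = {}
    for issue, logged in jira_hours.items():
        booked = timesheet_hours.get(issue, 0.0)
        if abs(logged - booked) > tolerance:
            mismatches[issue] = (logged, booked)
    return mismatches

jira = {"PROJ-1": 6.0, "PROJ-2": 3.5}
sheet = {"PROJ-1": 6.0, "PROJ-2": 2.0}
print(time_mismatches(jira, sheet))  # → {'PROJ-2': (3.5, 2.0)}
```

In practice you would not write this join by hand: Tableau performs exactly this kind of cross-source comparison visually once both extracts are available to it.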

The All-In-One (AIO) Tableau connector is an app for Jira that implements a Web Data Connector (WDC) for Tableau. The WDC enables connections to data through HTTP when the data source does not have a Tableau native connector. The data is obtained and placed in an extract that becomes available to either Tableau Desktop or Server.

Setup

For an app to connect to Tableau through the WDC, you need to whitelist the corresponding URL on the server. Make sure your sysadmin performs this step, and be aware that multiple keys will be generated, so the URL pattern and command need to be something along these lines:

tabadmin whitelist_webdataconnector -a https://yourJIRAdomain/plugins/servlet/aio-tb/public/tableauconnect(.*)

Connecting to data

  1. In Jira:
    1. Obtain your AIO connector URL. Each URL provides access on behalf of a particular user, ensuring you will only have access to your own projects.
  2. In Tableau Desktop:
    1. Select the Web Data Connector connection type
    2. Paste the URL and press Enter
    3. Define a name for this connection – perhaps the project name – and a JQL query (to get all issues from a project, use: project = PROJECTID)
    4. Choose the fields or subsets of fields to retrieve, and click continue
    5. Depending on your selection, the WDC makes multiple tables available that you can now join as normal Tableau data sources
    6. Start your analysis
    7. Publish if you want to share

The result

You can now explore Jira data, create powerful dashboards, extract the most valuable insights and increase your team's performance – all with a fantastic tool called Tableau!


Sentiment Meter: the perfect mix between Gamification and Artificial Intelligence

The Sentiment Meter is an artificial intelligence (AI) solution conceived, designed and developed by Xpand IT that combines two components: gamification and artificial intelligence. It is a game of emotions: after the user fills in a small form, the software randomly selects an emotion that the user has to try to express to the best of their ability. After photographing this moment, the Sentiment Meter evaluates the user's performance and gives them a score. The game has turned out to be a real success – the perfect mix between gamification and artificial intelligence.

This game was born from the need to create something that would not only show Xpand IT's technical abilities – from our new AI Solutions Centre – but would also entertain people visiting our stands at the countless events in which we take part. IDC Directions 2018 and Web Summit were the first conferences we took the Sentiment Meter to, and we can say that it did not go unnoticed.

The technology

The logic is very simple: the player spins the wheel of emotions, the computer selects an emotion/facial expression and the player simply has to express that emotion with his or her face. Finally, the interface scores the player, who wins a prize. It seems quite simple; but what is actually behind this analysis of emotions is an intelligent algorithm from Microsoft: Azure Cognitive Services. In this case, we use the Face API, which processes and recognises faces and identifies which emotion a person is expressing. This algorithm is fed by each use and by the pictures taken all over the world.
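The scoring step can be sketched as follows. This is a hypothetical illustration, not the game's actual code: it assumes the Face API has already returned a dictionary of per-emotion confidence values for the photographed face, and the function name, response shape and numbers are all made up.

```python
def score_attempt(target_emotion, emotion_confidences, max_score=100):
    """Score a player by how confidently the detected face matches the target emotion.

    `emotion_confidences` stands in for the per-emotion confidence values
    a face-analysis service might return for a single detected face.
    """
    confidence = emotion_confidences.get(target_emotion, 0.0)
    return round(confidence * max_score)

# Illustrative detection result for a player asked to express "happiness"
detected = {"happiness": 0.92, "surprise": 0.05, "neutral": 0.03}
print(score_attempt("happiness", detected))  # → 92
```

The real solution wraps this idea in the full game loop – spin the wheel, take the photo, call the cloud service, then display the score on screen.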

The whole infrastructure is based in the cloud, on Azure. Some other Microsoft tools were also used, such as SignalR, which manages real-time interactions between the screen that presents the game and the tablet that gives commands to the person playing. Moreover, .NET Core, an open-source framework, was used as the basis for this project, allowing for the development of web and cloud apps. The project's front end was developed for the web with HTML, CSS and JavaScript, also relying on some extensions and libraries, such as jQuery, Ajax and p5.js.

The team

This whole project would not have been possible without the teamwork of members from our Digital Xperience, UX/UI and Marketing teams, who were able to design and mastermind this solution: Francisco Correia, Senior Project Manager, and Ricardo Duarte, Developer, both from the Digital Xperience department; Marina Mendes, UX/UI Designer, and all the other members of the Marketing team who ensure its proper functioning during events.


The Sentiment Meter points the way to what is possible today using artificial intelligence and good ideas born in simple conversations!

Ricardo Duarte, Developer at Xpand IT

With this project we were able to show that integrating Artificial Intelligence services can, even today, set the bar high, whether it concerns interactivity or decision-making.

Francisco Correia, Senior Project Manager at Xpand IT

We were able to bring the user closer to the interface by thinking in emotions and conveying them to technology in a fun and relaxed way.

Marina Mendes, UX/UI Designer at Xpand IT

Xpand IT’s new Artificial Intelligence Center

The US Merriam-Webster dictionary defines Artificial Intelligence (AI) as “a field of computer science that works on the simulation of intelligent behaviour in computers; the ability of machines to copy intelligent human behaviour”. In fact, this is exactly what Xpand IT seeks to achieve with the new Artificial Intelligence centre: to incorporate an intelligence component into all areas of our society.

AI is a trend that is here to stay, and it will not be long before every company has incorporated at least one solution simulating human intelligence into its systems to accomplish basic daily tasks. However, let's take it easy – we are still not at the level of The Matrix or Ex Machina! Xpand IT's main focus is to find real use cases and build prototype solutions that can ultimately be presented to the end customer. There are plenty of examples of uses of AI meant to reduce effort in certain tasks, improve performance and speed in solving certain problems, or simply gather valuable information that can be used by departments inside a company. Here are a few examples of the use of AI:

Development of a conversational interface (chatbots)

Chatbots are increasingly important in the global technological scene, and it does not take much thought to name several websites that have conversational interfaces trained to help visitors. This type of bot can be a big help to perform simple tasks, such as setting an appointment or buying a movie ticket, and can be applied to countless industries: banking, education, health, retail, and others. The main goal is to have a chatbot that is truly useful for users.

Text analysis and emotion analysis

Currently, we exchange a huge amount of information in text format, so the ability to analyse text will only matter more as the amount of information grows. However, human beings have limits to how much information they can process and analyse – and that is where artificial intelligence comes in. By taking advantage of specific techniques and more advanced technologies, it is possible to process all of this information in record time and simultaneously gather other types of information, such as the mood of the person who wrote a certain message.
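As a toy illustration of extracting mood from text, here is a minimal lexicon-based sketch. Real sentiment analysis uses trained models rather than hand-picked word lists; the tiny lexicons below are purely illustrative.

```python
POSITIVE = {"great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "hate", "terrible", "angry"}

def message_mood(text):
    """Classify a message's mood using a tiny word lexicon."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(message_mood("I love this product, it is excellent!"))  # → positive
```

A production system would also handle negation (“not great”), sarcasm and context, which is precisely where machine-learning approaches earn their keep.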

Image or video processing

Another case in which human limits are an opportunity to introduce AI is image (or video) processing. By learning from a large dataset, an artificial intelligence solution can be taught to identify elements in an image or video, a task that would take far longer if done by human eyes. Examples include facial recognition for app authentication, or finding specific people or products in a live video. An AI solution can be the answer to these challenges.

In essence, Xpand IT has gathered specialists from various teams – such as Digital Xperience, Big Data and Data Science – to form a unit of true experts in Artificial Intelligence experiments and solutions, capable of developing projects completely out-of-the-box!
