Smart Data: when quality and reliability take precedence over volume

Trustpair & Altares white paper - Data Management & Smart Data

Last modified on July 20th, 2021

Unlike Big Data, which is based on a large volume of data, the Smart Data approach is based on the quality of the available data sets. With this in mind, finance departments seek to identify which data they need to process and for what purpose.

At the same time, they pay particular attention to the internal and external processes and technologies at their disposal to ensure the highest possible level of reliability. 

Redefining the use of data for greater reliability

The limits of the Big Data model

Big Data is based on three main principles: 

  • Volume: traditionally, enterprise CIOs managed data volumes measured in terabytes. They now have to deal with petabytes (10¹⁵ bytes) or even exabytes (10¹⁸ bytes) because of the constantly increasing volume of data.
  • Variety: the usual structured data are now supplemented by numerous semi-structured and unstructured data from internal sources (ERP, CRM, etc.) but also external sources (open data, social networks, etc.).
  • Speed: the frequency with which data is generated and updated is continually increasing. Companies need to take the right steps, both in terms of IT infrastructure and software solutions, to take advantage of this speed.

The benefits of Big Data are multiple, at least on the surface: operational efficiency, cost reduction, fraud detection and prevention, risk analysis, regulatory compliance, etc. 

What are the reasons for the limitations of Big Data projects?

  • Fragmentation and siloed data: in many organisations, the data collected is still too often siloed, making it impossible to achieve overall coherence.
  • Data quality: the same data can sometimes be duplicated in several places in the information system, with very different levels of quality (see the sketch after this list).
  • Weak data culture: many Big Data projects fail because of the weakness of the data culture within companies, which has repercussions on internal and external processes but also on the capacity of the entire organisation to accommodate such projects (resistance to change, lack of skills, etc.).
  • Necessary investments: a Big Data project requires significant investment in IT infrastructure, software and skills. Data scientists are a scarce resource, and therefore difficult to recruit and retain.
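
The data-quality point above can be made concrete with a small example. The sketch below is illustrative and not from the white paper: it detects the same supplier record duplicated across two systems (here a CRM and an ERP) by normalising identifying fields before comparing them. All field names and normalisation rules are assumptions.

```python
import re

def normalise(record: dict) -> tuple:
    """Build a comparison key from fields that should identify a supplier."""
    name = re.sub(r"[^a-z0-9]", "", record.get("name", "").lower())
    iban = record.get("iban", "").replace(" ", "").upper()
    return (name, iban)

def find_duplicates(records: list) -> dict:
    """Group records sharing a key; groups of two or more are duplicate candidates."""
    groups = {}
    for rec in records:
        groups.setdefault(normalise(rec), []).append(rec)
    return {key: recs for key, recs in groups.items() if len(recs) > 1}

# The same supplier entered twice, with different formatting quality.
crm = {"name": "Acme S.A.", "iban": "FR76 3000 6000 0112 3456 7890 189", "source": "CRM"}
erp = {"name": "ACME SA", "iban": "FR7630006000011234567890189", "source": "ERP"}

for dupes in find_duplicates([crm, erp]).values():
    print(f"Duplicate supplier found in: {[d['source'] for d in dupes]}")
```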

Data lakes as evidence of the model’s exhaustion

Closely related to Big Data, data lakes are intended to provide a storage space for all the information available to a company, regardless of its origin and format.

But, like Big Data projects, data lake deployments come up against several pitfalls:

  • Lack of data structuring and prioritisation: these shortcomings can hamper decision-making within the business units and senior management of an organisation.
  • Poor quality: of the main areas of data governance, quality is certainly the most important. Too often, data lakes show weaknesses in data consistency, accuracy and reliability, undermining the overall quality of the data.
  • Lack of involvement of business units: too often, the business value of a data lake project is eroded over time, in favour of technical considerations or cost issues. 

“Although data lake implementation projects have exploded in recent years, many have failed because the use of the data was not sufficiently well defined and the expected quality was not always achieved” – Frédéric Malagoli, PwC 

Towards Smart Data for a successful strategy 

Faced with the failure of Big Data and its associated data lakes, many companies are turning to a different approach: Smart Data. Deciding to implement a Smart Data strategy means giving priority – through the notion of “intelligence” – to processing less data but in a more refined way. 

“When we talk about Smart Data, we combine the notions of linking and history. The value of data is revealed when it is linked. Historical data allows us to determine a trend and a forecast.” – Frédéric Paresy, Altares

Linking is perfectly in line with the notion of reference data. When data is linked, it is also important to connect it to reference data and to its history in order to understand how it has evolved; once trust in the information has been established and sustained, it becomes possible to perform analysis and prediction using data science.
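
As a minimal sketch of what "linking" can mean in practice, assuming a reference registry keyed by a legal company identifier (such as a SIREN or DUNS-style number): internal records are attached to the registry so that each data point carries a trusted identity, which can then accumulate a history. All registry entries and field names below are invented for illustration.

```python
# Hypothetical reference registry keyed by a legal company identifier.
reference_registry = {
    "123 456 789": {"legal_name": "Example Industries SA", "status": "active"},
    "987 654 321": {"legal_name": "Sample Logistics SARL", "status": "ceased"},
}

internal_records = [
    {"supplier": "Example Industries", "company_id": "123 456 789", "amount": 12_400},
    {"supplier": "Unknown Trading", "company_id": "000 000 000", "amount": 9_800},
]

for rec in internal_records:
    ref = reference_registry.get(rec["company_id"])
    if ref is None:
        print(f"{rec['supplier']}: no match in the reference registry -> investigate")
    elif ref["status"] != "active":
        print(f"{rec['supplier']}: linked, but the registry reports status '{ref['status']}'")
    else:
        print(f"{rec['supplier']}: linked to {ref['legal_name']} (trusted)")
```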

Technology for an effective Data Management strategy

Technology at the service of data

Today, the technological solutions managed by companies’ IT departments make it possible to simplify and secure all processes. This translates into:

  • Improved reporting
  • More detailed monitoring of actions in real time
  • More effective and relevant decision support
  • Better management of skills. 

The growth of companies is better supported and sustained as a result. 

Integration solutions are strategic in this respect. They address the need to reduce complexity within companies by securing transactions and facilitating employees’ work. Thanks to them, companies gain in performance and benefit from continuous updates and monitoring, sparing teams repetitive and tedious tasks.

Essential tools for prediction and decision-making

Companies are very mature when it comes to creating a “rear view”, i.e. analysing the past. But their goal is to move towards prediction, in order to detect trends and insights that support their decision-making. This requires reliable, high-quality, clean data across all functions.

The job of any data team within a company is first and foremost to define the use of the data in close collaboration with the business departments. The more history you have on a piece of data, the better you can predict the future. 
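
As a toy illustration of this point (this is not the Altares model, and the series values are invented): a simple least-squares slope over a supplier’s payment-delay history already yields a trend and a naive one-period-ahead forecast, and the longer the history, the more robust the estimate.

```python
def trend(values: list) -> float:
    """Least-squares slope of values against their index (change per period)."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Average payment delay (days) of a supplier over the last six quarters.
history = [4.0, 5.0, 7.0, 8.0, 11.0, 14.0]
slope = trend(history)
forecast = history[-1] + slope  # naive one-quarter-ahead extrapolation
print(f"Trend: {slope:+.1f} days/quarter, next quarter ~{forecast:.1f} days")
```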

“One of our jobs is to predict the failure of companies. But while history is important, it needs to be re-evaluated! It’s letting data sleep that’s dangerous.” – Michael Lisch, Altares

Transforming processes to ensure reliable and secure data

Challenge internal processes for better data protection

Ensuring data security and protection throughout the company is a long-term task. In addition to the legal measures that need to be put in place (review of contracts with important or at-risk subcontractors and deployment of internal charters aimed at raising awareness of cybersecurity risks among employees), organisational measures must be taken.

These measures aim to develop strict internal processes:

  • Mapping of internal data flows in order to make all data exchanges fluid and secure
  • Implementation of an Identity and Access Management (IAM) strategy defining access rights and specific authorisations for each employee
  • Implementation of alert thresholds according to the amount or sensitivity of a transaction (sketched below).
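
The alert-threshold item can be sketched as a simple rule set. The sketch below is hypothetical: the threshold value and the notion of a "sensitive" beneficiary are assumptions for illustration, not the white paper’s specification.

```python
AMOUNT_THRESHOLD = 50_000  # euros; above this, require a second approval (assumed value)
SENSITIVE_BENEFICIARIES = {"new-supplier", "recently-modified-iban"}

def check_transaction(amount: float, beneficiary_tags: set) -> list:
    """Return the alerts a transaction triggers (an empty list means no alert)."""
    alerts = []
    if amount > AMOUNT_THRESHOLD:
        alerts.append(f"amount {amount:,.0f} EUR exceeds the approval threshold")
    sensitive = beneficiary_tags & SENSITIVE_BENEFICIARIES
    if sensitive:
        alerts.append(f"sensitive beneficiary: {', '.join(sorted(sensitive))}")
    return alerts

print(check_transaction(72_000, {"recently-modified-iban"}))
# -> ['amount 72,000 EUR exceeds the approval threshold',
#     'sensitive beneficiary: recently-modified-iban']
```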

“To achieve good Data Management and data governance, it is essential to combine: a technological solution, good internal organisation, defined processes and team awareness.” – Betty Sfez, Cabinet Solegal

Want to know more about Data Management and Smart Data?

Get the latest white paper, “Data Management: the cure for wire transfer fraud”, co-branded by Trustpair and Altares!
