Last week it was made public that the personal information of 33 million French citizens may have been exposed after two French health insurance operators suffered a data breach in early February. Unfortunately, this isn’t an isolated incident. Other recent breaches have hit organizations such as EuroParcs, Air France-KLM, JD Sports, T-Mobile, Sony, and Cloudflare.

Cybersecurity goes beyond network, application, and endpoint security. Especially in today’s digital age, where data is the lifeblood of organizations, safeguarding sensitive information has become paramount. As organizations amass vast amounts of data, protecting it from unauthorized access, breaches, and misuse has become a complex challenge. In this context, implementing robust lines of defense through techniques such as data masking, data encryption, data security gateways, and data governance policy management is crucial to fortifying an organization’s data management strategy.

Data Masking: Concealing the Vulnerabilities


What is Data Masking?

Data masking involves the transformation of sensitive information within a database, making it unintelligible to unauthorized users. The primary objective is to protect sensitive data while maintaining its usability for testing, analytics, and overall use in your processes.
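To make the idea concrete, here is a minimal sketch (not tied to any particular masking product): identifying fields are replaced with deterministic, irreversible tokens, while non-sensitive fields stay usable for analytics. The field names and the `MASKED-` prefix are illustrative assumptions.

```python
import hashlib

def mask_record(record, sensitive_fields):
    """Return a copy of the record with sensitive fields masked.

    Identifiers are replaced by a deterministic, irreversible token,
    so joins and analytics still work while the real value stays hidden.
    """
    masked = dict(record)
    for field in sensitive_fields:
        if field in masked and masked[field] is not None:
            digest = hashlib.sha256(str(masked[field]).encode()).hexdigest()
            masked[field] = f"MASKED-{digest[:10]}"
    return masked

customer = {"name": "Alice Dupont", "ssn": "2-85-07-75-116-001-23", "balance": 1250}
print(mask_record(customer, ["name", "ssn"]))
# the balance stays usable for analytics; name and ssn are unintelligible
```

Because the token is deterministic, the same input always masks to the same value, which preserves joins across masked data sets.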


The Defense Mechanism:

Data masking acts as the first line of defense by obscuring sensitive data such as personal identifiers, financial details, or confidential business information. This ensures that even if unauthorized access occurs, the exposed information is rendered useless and non-identifiable. This mechanism is also useful in the context of compliance-driven initiatives such as the GDPR.


Data Encryption: Securing the Data Source


What is Data Encryption?

Data encryption is the process of converting plain text into ciphertext, making it unreadable without the appropriate decryption key. It is a fundamental technique in securing data during transmission and storage.
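As a toy illustration of the round trip (plaintext to ciphertext and back), here is a deliberately simplified sketch using an XOR keystream. This is NOT a secure cipher; real deployments use vetted algorithms such as AES, typically via TLS or a key management service.

```python
import hashlib
from itertools import cycle

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Illustrative XOR stream 'cipher' (NOT secure; for explanation only).

    Derives a keystream from the key and XORs it with the plaintext,
    demonstrating the plaintext -> ciphertext -> plaintext round trip.
    """
    keystream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(plaintext, cycle(keystream)))

toy_decrypt = toy_encrypt  # XOR is its own inverse

secret = b"IBAN BE71 0961 2345 6769"
ciphertext = toy_encrypt(secret, b"my-key")
assert ciphertext != secret                           # unreadable without the key
assert toy_decrypt(ciphertext, b"my-key") == secret   # round trip succeeds
```

The point of the sketch is only the mechanism: without the right key, the ciphertext decrypts to garbage, which is exactly why encrypted data is useless to an attacker or an outsourced administrator.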


The Defense Mechanism:

By implementing data encryption, organizations create a robust barrier against unauthorized access to sensitive information. It safeguards data in transit, preventing interception and tampering, and protects stored data from being deciphered by unauthorized entities. This mechanism is also useful when your infrastructure is outsourced to a third party: depending on the setup, even internal or external IT personnel do not have access to the encrypted data.


Data Security Gateway: Active Control for Holistic Protection


What is a Data Security Gateway based on Data Virtualization?

A Data Security Gateway based on Data Virtualization acts as a centralized control point for securing data access, ensuring that only authorized users can retrieve and interact with sensitive information based on their role or profile.


The Defense Mechanism:


By implementing a Data Security Gateway, organizations gain real-time visibility into data access and usage. This proactive approach allows for immediate detection and response to potential threats, providing an additional layer of defense alongside masking, encryption, and governance.


The security layer of the data virtualization platform not only offers extra authentication functionality but also provides row-, column-, and even cell-level security. With this approach you can enforce a security layer that is stricter than that of the underlying data sources.
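As a rough sketch of how row- and column-level rules work (this is not the syntax of any specific virtualization product; the roles, columns, and filter rules are invented for the example), each role gets a column allow-list and a row filter that the gateway applies before results leave it:

```python
# Hypothetical policies: which columns and rows each role may see.
POLICIES = {
    "analyst":  {"columns": {"region", "revenue"},
                 "row_filter": lambda r: True},
    "sales_eu": {"columns": {"region", "revenue", "customer"},
                 "row_filter": lambda r: r["region"] == "EU"},
}

def enforce(role, rows):
    """Apply the role's row filter and column allow-list to a result set."""
    policy = POLICIES[role]
    return [{k: v for k, v in row.items() if k in policy["columns"]}
            for row in rows if policy["row_filter"](row)]

data = [
    {"customer": "Acme",   "region": "EU", "revenue": 100},
    {"customer": "Globex", "region": "US", "revenue": 250},
]
print(enforce("sales_eu", data))  # only EU rows, customer column allowed
print(enforce("analyst", data))   # all rows, but the customer column is hidden
```

Because the rules live in one central gateway rather than in each source, the same restriction is enforced no matter which tool a user connects with.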


Data Access Governance: Establishing Regulatory Compliance


What is Data Access Governance?

Data access governance involves defining and enforcing policies that dictate how data is collected, stored, processed, and shared within an organization. It provides a structured framework for managing data assets and lets you create data access policies in a few clicks and preview them before they are implemented.


The Defense Mechanism:

Data governance policy management acts as the overarching defense strategy, ensuring that data is handled in accordance with regulatory requirements and internal standards. By establishing clear guidelines and enforcing policies, organizations mitigate the risks associated with data breaches and non-compliance. Depending on the technology, this can be enabled with a no-code approach, allowing a policy to be configured and executed across the organization in a matter of minutes.
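As a toy illustration of the declarative idea behind such policies (not any vendor’s policy engine; the category names and retention periods below are invented for the example), a retention rule can be expressed as data and evaluated against records:

```python
from datetime import date, timedelta

# Hypothetical declarative policies: retention period per data category.
RETENTION_POLICIES = {
    "marketing_consent": timedelta(days=365),
    "transaction":       timedelta(days=7 * 365),
}

def is_expired(category, collected_on, today=None):
    """True if a record has outlived its category's retention period."""
    today = today or date.today()
    return today - collected_on > RETENTION_POLICIES[category]

print(is_expired("marketing_consent", date(2020, 1, 1), today=date(2024, 1, 1)))  # True
print(is_expired("transaction", date(2020, 1, 1), today=date(2024, 1, 1)))        # False
```

Because the rules are data rather than code, adding or changing a policy is an edit to a table, which is essentially what a no-code governance tool exposes through its UI.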



Conclusion: Integrating Defense Mechanisms for Holistic Protection

While each technique offers a specific layer of defense, their true strength lies in their integration. Data masking, encryption, data security gateways and governance policy management work synergistically to create a comprehensive and resilient data protection strategy.

By combining these techniques, organizations not only mitigate the risk of data breaches but also ensure compliance with industry regulations and standards. This is crucial in maintaining the trust of customers and stakeholders and avoiding legal repercussions.

In doing so, businesses fortify their data management practices, instill confidence in stakeholders, and navigate the digital data landscape with resilience and security.


Interested in elevating your data security to the necessary standards? Discover how Datalumen can assist you in achieving this goal. 



Customer & household profiling, personalization, journey analysis, segmentation, funnel analytics, acquisition & conversion metrics, predictive analytics & forecasting, … The marketing goal of delivering a trustworthy and complete insight into the customer across different channels can be quite difficult to accomplish.

A substantial number of marketing departments rely on a mix of platforms ranging from CEM/CXM, CDP, CRM, eCommerce, Customer Service, Contact Center, and Marketing Automation to Marketing Analytics. Many of these platforms are best-of-breed and come from a diverse set of vendors, each a leader in its specific market segment. Internal custom-built solutions (Microsoft Excel, homebrew data environments, …) typically complete this type of setup.

According to a Forrester study, although 78% of marketers claim that a data-driven marketing strategy is crucial, as many as 70% of them admit they have poor-quality and inconsistent data.

The challenges

Creating a 360° customer view across this diverse landscape is not a walk in the park. All of these marketing platforms provide added value but are essentially separate silos. Each environment uses different data, and the data they do have in common is typically used in different ways. If you need to join all these pieces together, you need some magical super glue. The reality is that none of the marketing platform vendors actually have this in house.

Another point of attention is your data scope. We hardly need to tell you that customer experience is the hot topic in marketing nowadays. However, marketers need to do much more than just analyze customer experience data in order to create real customer insight.

Creating insight also requires that the data you analyze goes beyond the traditional customer data domain. Combining customer data with, for example, the relevant product/service, supplier, or financial data is fundamental to this type of exercise. These extended data domains are usually missing, or the required level of detail is not available in any single platform.

Recent research from KPMG and Forrester Consulting shows that 38% of marketers claim to have a high level of confidence in the data and analytics that drive their customer insights. That said, only a third of them seem to trust the analytics they generate from their business operations.

The foundations

Regardless of the mix of marketing platforms, many marketing leaders don’t succeed in taking full advantage of all their data. As a logical result, they also fail to make a real impact with their data-driven marketing initiatives. The underlying reason is that many marketing organizations lack a number of crucial data management building blocks that would allow them to break out of these typical martech silos. The most important data capabilities to take into account are:




Master Data Management (aka MDM)

Creating a single view, or so-called golden record, is the essence of Master Data Management. It allows you to make sure that a customer, product, etc. is represented consistently across different applications.
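A minimal sketch of the merge step behind a golden record (the source names and the survivorship order are invented for the example; real MDM tools add matching rules, stewardship workflows, and lineage): for each attribute, keep the value from the most trusted source that actually has one.

```python
def golden_record(records, survivorship=("crm", "ecommerce", "excel")):
    """Merge duplicate customer records into one golden record.

    For each attribute, the value from the most trusted source
    (earliest in the survivorship order) that has a value wins.
    """
    rank = {src: i for i, src in enumerate(survivorship)}
    merged = {}
    # Apply least trusted first, so more trusted sources overwrite them.
    for rec in sorted(records, key=lambda r: rank[r["source"]], reverse=True):
        merged.update({k: v for k, v in rec.items() if k != "source" and v})
    return merged

duplicates = [
    {"source": "excel",     "name": "A. Dupont",    "email": "a@old.example"},
    {"source": "crm",       "name": "Alice Dupont", "email": None,
     "phone": "+32 470 00 00 00"},
    {"source": "ecommerce", "name": "Alice D.",     "email": "alice@example.com"},
]
print(golden_record(duplicates))
```

Note how the CRM wins on name, but its missing email falls back to the eCommerce value: that is the essence of survivorship.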


Business Glossary

Having the correct terms and definitions might seem trivial, but in the majority of organizations there is considerable noise on the line. Crystal-clear terms and definitions are a basic requirement for all stakeholders to manage data in the same way, and they prevent conflicts and waste further down the data supply chain.


Data Catalog

Imagine Google-like functionality to search through your data assets: find out what data you have, where it originates, and how and where it is being used.


Data Quality

The why of proper data quality is obvious for any data-consuming organization. If you have a disconnected data landscape, data quality is even more important, because it also facilitates the automatic match & merge exercise you put in place to arrive at a common view of your data assets.


Data Virtualization

Getting real-time access to your data in an ad hoc and dynamic way is one of the missing pieces for achieving your 360° view on time and within budget. Forget about traditional data consumer headaches such as long waiting times, misunderstood requests, lack of agility, etc.
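Conceptually, a data virtualization layer exposes a virtual view that combines sources at query time instead of copying data into yet another silo. A toy sketch (the source names and structures are invented for the example):

```python
# Two 'sources' that stay where they are; the view joins them on demand.
crm = {1: {"name": "Acme"}, 2: {"name": "Globex"}}        # e.g. a CRM system
orders = [{"customer_id": 1, "total": 100},
          {"customer_id": 1, "total": 40},
          {"customer_id": 2, "total": 250}]               # e.g. an eCommerce system

def customer_360(customer_id):
    """Virtual view: combine both sources at query time, nothing is copied."""
    return {
        "name": crm[customer_id]["name"],
        "order_total": sum(o["total"] for o in orders
                           if o["customer_id"] == customer_id),
    }

print(customer_360(1))  # {'name': 'Acme', 'order_total': 140}
```

Because the join happens at request time, a change in either source is visible immediately, without waiting for a batch load.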



We intentionally use the term capability because this isn’t an IT story. All of these capabilities have a people, process, and technology aspect, and all of them should be driven by the business stakeholders. IT and technology play a facilitating role.

The results

If you manage to put the described data management capabilities in place, you are basically in control. Your organization can find, understand, and make data useful. You improve the efficiency of your people and processes, and you reduce your data compliance risks. The benefits in a nutshell:

  1. Get full visibility of your data landscape by making data available and easily accessible across your organization. Deliver trusted data with documented definitions and certified data assets, so users feel confident using the data. Take back control using an approach that delivers everything you need to ensure data is accurate, consistent, complete and discoverable.
  2. Increase efficiency of your people and processes. Improve data transparency by establishing one enterprise-wide repository of assets, so every user can easily understand and discover the data relevant to them. Increase efficiency using workflows to automate processes, helping improve collaboration and the speed of task completion. Quickly understand your data’s history with automated business and technical lineage that helps you clearly see how data transforms and flows from system to system and from source to report.
  3. Reduce data and compliance risks. Mitigate compliance risk by setting up data policies to control data retention and usage that can be applied across the organization, helping you meet your data compliance requirements. Reduce data risk by building and maintaining a business glossary of approved terms and definitions, helping ensure clarity and consistency of data assets for all users.

42% of data-driven marketers say the technology they currently have in place is out of date and insufficient to help them do their jobs, according to Walker Sands Communications’ State of Marketing Technology report.


The data you need to be successful with your marketing efforts is there. You just have to transform it into usable data so that you can get accurate insights and make better decisions. The key is getting rid of your marketing platform silos by making sure you have the proper data foundations in place: foundations that speed up and extend the capabilities of your data-driven marketing initiatives.

Need help unlocking your marketing data?

Would you like to find out how Datalumen can also help you with your marketing & data initiatives?  Contact us and start our data conversation.


Data Virtualization is definitely on the rise. At its Data and Analytics Summit in London, Gartner projected accelerated data virtualization adoption for both first-time and expanded deployments. Besides the market analysts, we also see high demand and can confirm that this is one of the hottest data solutions. But what are the top use cases for data virtualization?


Interested in Data Virtualization?

Would you like to know how Datalumen can also help you understand how your organization can benefit from using Data Virtualization?  Contact us and start our data conversation.


Summer is here, and the longer days it brings mean more time available to spend with a ripping read. That’s how it ideally works, at least. We selected three valuable books worth your extra time.


The Chief Data Officer’s Playbook

The issues and profession of the Chief Data Officer (CDO) are of significant interest and relevance to organisations and data professionals internationally. Written by two practicing CDOs, this new book offers a practical, direct and engaging discussion of the role, its place and importance within organisations. Chief Data Officer is a new and rapidly expanding role, and many organisations are finding that it is an uncomfortable fit in the existing C-suite. Bringing together views, opinions and practitioners’ experience for the first time, The Chief Data Officer’s Playbook offers a compelling guide to anyone looking to understand the current (and possible future) CDO landscape.

Search on Google

Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility

Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book ever written on the topic of data virtualization, introduces the technology that enables data virtualization and presents ten real-world case studies that demonstrate the significant value and tangible business agility benefits that can be achieved through the implementation of data virtualization solutions. The book introduces the relationship between data virtualization and business agility, but also gives you a more thorough exploration of the technology itself. Topics include what data virtualization is, why to use it, how it works, and how enterprises typically adopt it.

Search on Google

Start With Why

Simon Sinek started a movement to help people become more inspired at work, and in turn inspire their colleagues and customers. Since then, millions have been touched by the power of his ideas, including more than 28 million who’ve watched his TED Talk based on ‘Start With Why’ — the third most popular TED video of all time. Sinek starts with a fundamental question: Why are some people and organizations more innovative, more influential, and more profitable than others? Why do some command greater loyalty from customers and employees alike? Even among the successful, why are so few able to repeat their success over and over? 
People like Martin Luther King, Steve Jobs, and the Wright Brothers had little in common, but they all started with Why. They realized that people won’t truly buy into a product, service, movement, or idea until they understand the Why behind it.  ‘Start With Why’ shows that the leaders who’ve had the greatest influence in the world all think, act, and communicate the same way — and it’s the opposite of what everyone else does. Sinek calls this powerful idea The Golden Circle, and it provides a framework upon which organizations can be built, movements can be led, and people can be inspired. And it all starts with Why.

Search on Google

Summer Giveaways

We’re giving away 50 copies of ‘Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility’.  Want to win? Just complete the form and cross your fingers. Good luck!

Winners are picked randomly at the end of the giveaway. Our privacy policy is available here.


Forrester Research recently published The Forrester Wave™: Enterprise Data Virtualization, Q4 2017 report. The firm profiled 13 vendors in this report. The previous Wave on this topic was published quite a while ago, in March 2015, with 9 vendors. Here is an overview of what has changed in the intervening two-and-a-half years.

Data Virtualization Market has Expanded

According to this Forrester report, the enterprise data virtualization market has expanded along multiple dimensions – customer adoption, more industries, more use cases, new players, and acquisitions.

  • More customer adoption – Forrester states customer adoption of data virtualization has been gaining momentum. In 2017, Forrester surveyed 2,106 global technology decision makers for its Global Business Technographics Data and Analytics Survey, and found that “…56% of global technology decision makers in our 2017 survey tell us they have already implemented, are implementing, or are expanding or upgrading their implementations of DV technology, up from 45% in 2016.”
  • More industries – Forrester states that in its early years, data virtualization was primarily used in financial services, telecom, and government sectors. In the last 5 years, however, Forrester has found significant adoption of DV in insurance, retail, healthcare, manufacturing, oil and gas, and eCommerce verticals as well.
  • More use cases – Further, Forrester found that among the customers who have been using data virtualization, the deployment has increased from single-use case, primarily customer analytics, to a broader enterprise-wide use involving multiple use cases such as internet of things, fraud detection, and integrated insights.
  • New players – In the 2017 Enterprise Data Virtualization Wave report, four new vendors have been included, implying, in our opinion, an expanding data virtualization market.
  • Acquisitions – In a sign that the data virtualization market is maturing, TIBCO Software recently acquired Cisco Information Server, thereby entering the data virtualization market.

We think all these data points are significant indicators that the data virtualization market is a healthy, growing market that is reaching maturity.

Data Virtualization Poised for Further Growth Pushed Forward by Leaders

Forrester expects the data virtualization market to grow further “because more enterprise architecture (EA) professionals see data virtualization as critical to their enterprise data strategy.” It says that these EA pros are looking to support more complex data virtualization deployments. To satisfy such needs, the leaders featured in the report provide high-end scale, security, modeling, and broad use case support with their mature product offerings. “The leaders we identified offer large and complex deployments, and they support a broader set of use cases and more mature data management capabilities,” Forrester says. It is worth noting that four of the five previous Leaders retained their positions, while one vendor slipped into the Strong Performers category.

Read the Complete Report

The Forrester Wave: Enterprise Data Virtualization, Q4 2017 is a must read for enterprise architecture (EA) professionals. According to Forrester, “Enterprise data virtualization has become critical to every organization in overcoming growing data challenges. These platforms deliver faster access to connected data and support self-service and agile data-access capabilities for EA pros to drive new business initiatives.”

More info on Data Virtualization?

Would you like to know what Data Virtualization can also mean for your organization? Have a look at our Data Virtualization section and contact us.


Business intelligence & analytics today have dramatically shifted from the traditional IT-driven model to a modern self-service approach. This is due to a number of changes, including the fact that the balance of power has steadily shifted from IT to the business, and the fact that the business community now has access to more innovative technologies that give it powerful analytical and visualization capabilities (e.g. Tableau). This increased use and capability has put the business in the driver’s seat of much front-end BI decision-making.

In order to help your business community continue to increase its self-service capabilities, there is one important but often-overlooked item: many implementations fail to realize their full potential because they fall into the trap of building out just the proverbial shop window and forgetting the actual shop! It is just as important to add accessibility and flexibility to the underlying data layer (and to ease the access, discovery, and governance of your data) as it is to provide users a powerful front end through analytics and visualization capabilities.

With respect to self‐service analytics, four phases can be identified in the market. These typically mirror how analytics are implemented in many companies. The following diagram describes, in four phases, how data virtualization can strengthen and enrich the self‐service data integration capabilities of reporting and analytics tools:



To support both IT-driven and business-driven BI, two techniques are required: data preparation and data virtualization. There are many scenarios where you can use these techniques to strengthen and speed up the implementation of self‐service analytics:

  • Using data virtualization to operationalize user‐defined data sets
  • Using data virtualization as a data source for data preparation
  • Using data virtualization to make data sets developed with data preparation available for all users 

To learn about how to succeed in your data journey, feel free to contact us. More info about our full spectrum of data solutions is also available on the Datalumen website.

Read more about the different scenarios in the ‘Strengthening Self-Service Analytics with Data Preparation and Data Virtualization’ whitepaper. In addition, the whitepaper describes how these two BI forms can operate side by side in a cooperative fashion without lowering the level of self‐service for business users; in other words, how the best of both worlds can be combined. The whitepaper is written by Rick van der Lans, an independent analyst and expert.