Have you noticed something strange with the main post image?
What you actually see there is a variation of Hermann’s grid, which I generated with the help of Gemini. To be exact, I based it on the modifications of this grid created by Jacques Ninio. The classic Hermann grid creates illusory grey spots at intersections because retinal cells misinterpret the brightness of peripheral stimuli. Jacques Ninio’s variations magnify how ‘easy’ it is to manipulate perception through visual grouping and focus [1].
Returning to the opening question: if your answer is yes, then you may be interested to know that you were deceived by a powerful optical illusion known as the scintillating grid illusion. As you glance at the grid, you will likely see dark, phantom-like ‘ghost’ dots appear inside the white circles at the intersections. These dots seem to ‘scintillate’ or sparkle, appearing and disappearing as your eyes move. The strongest part of the illusion, however, is that when you try to look directly at one of the black dots, it vanishes: the dots appear only in your peripheral vision. This phenomenon is caused by the way neurons in your eyes process high-contrast areas, essentially ‘tricking’ your brain into perceiving a dot that isn’t actually there.
These grids serve as powerful cautionary tales for data and analysis. In my opinion, they illustrate very well the gap between ‘raw data’ (the actual black and white lines) and ‘perceived data’ (the illusory black/grey spots). They demonstrate that the way information is visually presented can fundamentally alter human perception and even create false realities. Does that ring a bell? Consider data visualization: if a chart or graph is designed without taking into account the ‘bugs’ in our own perceptual systems, it can inadvertently mislead the audience, causing them to perceive false trends or correlations—‘grey spots’—that don’t actually exist.
Call for data humanization
‘Traditional’ approaches to data analysis, business intelligence, and data science focus primarily on the technical attributes of data—its volume, velocity, and variety. In this setup, metrics are treated as ends in themselves. The result? Critical insights remain buried in extensive spreadsheets or lengthy reports. ‘Data-driven’ decision-making, in turn, takes ages and often proves ineffective [2]. Even with the most meticulous plans, comprehensive dashboards, and robust data sets, leaders, managers, and colleagues today still find themselves asking:
What is that light at the bottom of the well?
This particular ‘vanishing dot’ illustrates that even perfect data cannot eliminate the fundamental uncertainties of a complex endeavor if people cannot use it properly (in this case, read it upside down).
To escape the ‘data-rich, action-poor’ paradox, organizations need a new philosophy: data humanization.
This concept is more than simply buying a new tool: it’s embracing a new way of thinking. The objective is to transform data from a passive spreadsheet into a compelling narrative that moves stakeholders to action. Implementing a ‘humanized’ approach, in my view, rests on four elements:
- Small fixes (for a start): Rather than launching another complex corporate project, start by making a few small fixes today.
- The Artisan: Establishing ‘Data Artisan’ roles to shape and translate complex data.
- The Story: Embedding ‘Data Storytelling’ as a core competency to make insights clear and actionable.
- The Impact: Enforcing robust ethical governance and, critically, measuring the financial return on analysis.
What does ‘humanized data’ mean?
Before discussing these elements in detail, it is essential to establish a clear understanding of what we mean by ‘humanized data.’
Humanized data is a strategic asset that translates what is happening into why it matters. This context is what makes the data actionable. Instead of just tracking symptoms (Key Performance Indicators, or KPIs), teams can finally solve the root-cause problems.
The true power emerges when traditional KPIs and humanized insights are combined. They mutually enhance each other, making the path forward clear and straightforward.
From metrics to meaning: examples
| Standard KPI (The What) | Humanized Insight (The Why & Who) |
| --- | --- |
| Cart abandonment rate is 75%. | 75% of shoppers abandon carts. Our analysis shows 60% of them drop off at the shipping page, citing ‘unexpected fees’ as the primary reason. |
| Project ‘Phoenix’ is 30% over budget. | Project ‘Phoenix’ is 30% over budget, driven by 800 hours of unplanned overtime from the core engineering team to fix scope creep in Module 3. |
| Production line B uptime is 88%. | Line B’s 12% downtime is almost entirely due to manual changeovers. Automating this specific process will reclaim 10 hours of production per week. |
| Q3 customer churn increased by 8%. | Our 8% Q3 churn increase was driven by long-time customers (3+ years) who experienced our new support system, reporting a 50% drop in ‘first-call resolution.’ |
Source: Table by the author.
Table 1 illustrates the claim mentioned above. The left side shows simple comments based on raw KPIs, while the right side enriches the same metrics with broadened, humanized insights. As this comparison shows, raw KPIs merely reveal symptoms, whereas humanized insights expose the root causes—such as customer motivations or process roadblocks. This resulting clarity is far more actionable, enabling teams to move beyond just tracking metrics and begin solving the core problems that stifle success.
Key benefits of humanized data:

Icons in the center and in the top right corner were generated in Gemini.
Elements of data humanization
Small fixes & quick wins on the path to humanized data.
To streamline my weekly reporting process, which involves pulling data from multiple sources, such as an extensive KPI deck, I recently developed an agent. To ensure the report provides more than just a simple number update, I give the agent an additional task, prompting it:
Find me a unique insight for this week… look for something out of the ordinary: an anomaly, a trend breakout, or something simply interesting I could share.
Whatever the agent produces, I always review it and enrich with my own qualitative insights gathered from business meetings. Occasionally, I feed this enhanced, final comment back into the model, allowing it to learn and improve its suggestions for the following week.
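To make this concrete, here is a minimal sketch of the kind of ‘anomaly hunt’ such an agent could run under the hood, using a simple z-score test. The function name, the weekly metric, the numbers, and the threshold are all illustrative assumptions, not my actual setup.

```python
from statistics import mean, stdev

def find_anomalies(weekly_values, threshold=2.0):
    """Flag weeks whose value deviates more than `threshold`
    standard deviations from the series mean (a z-score test)."""
    mu = mean(weekly_values)
    sigma = stdev(weekly_values)
    if sigma == 0:
        return []  # a flat series has nothing out of the ordinary
    return [
        (week, value)
        for week, value in enumerate(weekly_values, start=1)
        if abs(value - mu) / sigma > threshold
    ]

# Illustrative weekly order counts; week 6 breaks the trend.
orders = [120, 118, 125, 122, 119, 210, 121]
print(find_anomalies(orders))  # → [(6, 210)]
```

A flagged week is only a starting point: the agent (or you) still has to explain why it is interesting, which is exactly the qualitative step described above.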

This simple example demonstrates one of the techniques I’ll be sharing in this section. All of them have three characteristics in common: they are practical, require hardly any capital investment, and consume just a few minutes of your weekly schedule. You can begin at either a team or individual level, applying them directly to your own work.
Here are eight simple ways to get started.
| Quick win | How to do it |
| --- | --- |
| Find real problems | Talk to your colleagues in other departments. Ask about their frustrations with data or what data-related tasks take too long to complete. Listen to their challenges to find the real problems worth solving. This builds trust and allows you to address issues that matter. |
| Tell the human story | Metrics like ‘Monthly churn rate’ are often abstract. Reframe them. Instead of ‘Churn: 3.4%’, write ‘Last month: 452 customers left us.’ This small change on a dashboard connects data to real people, making the metric more meaningful and actionable. |
| Share a data story of the week | Each week, find one simple, interesting insight from your data. Create a clear chart for it, write 2–3 sentences explaining why it matters, and share it in a company-wide channel, such as Slack. This makes data a regular, non-intimidating conversation. |
| Add a quick ethics check before sharing your data or insight | Take a few minutes to ask key ethical questions. For example: ‘Could this analysis harm any group?’ or ‘How could this data be misinterpreted?’ Make this a required step to ensure you are using data responsibly. |
| Add customer voices to dashboards | Your charts show what is happening, but customer comments explain why. Add a section to your dashboards that shows real, anonymized customer quotes from surveys or support chats. This provides crucial context for the numbers. |
| Build a ‘5-minute dashboard’ | Use a simple, free tool (such as Looker or Datawrapper) or an AI assistant (like Gemini or ChatGPT) to quickly answer one urgent question for a stakeholder. Don’t aim for perfection. Create two or three simple charts, share them immediately, and get feedback. This collaborative approach delivers value fast. |
| Master one visualization tool | In most cases, you don’t need complex, expensive software. Become proficient with one tool, free or paid; even Excel or Sheets will do the job. What matters most is that you can create clean, compelling charts with it. Use this tool for your ‘Data story of the week’ to practice and improve your storytelling. |
| Use AI for drafts, not final reports | Let generative AI write the first draft of a summary or report (similarly to my little agent). Then, use a tool like Grammarly to make it sound more natural. Always have a human review the final text to check for accuracy, tone, and empathy (!!!). |
Source: Table by the author, based on own experience.
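The ‘Tell the human story’ row above can even be automated with a one-line helper that converts an abstract rate into a human-scale statement. This is a hypothetical sketch: the function name, the 3.4% rate, and the customer base of 13,300 are illustrative figures, not real data.

```python
def humanize_churn(rate, customer_base, period="Last month"):
    """Turn an abstract churn rate into a human-scale statement
    for a dashboard, e.g. 'Last month: 452 customers left us.'"""
    lost = round(rate * customer_base)
    return f"{period}: {lost} customers left us."

# Illustrative numbers echoing the table above (the base is an assumption).
print(humanize_churn(0.034, 13300))  # → Last month: 452 customers left us.
```

Wiring a helper like this into a report means the reframing happens every refresh, not just once.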
The Artisan

Humanizing data is the key to making complex information accessible. By adding context, raw data is transformed into consumable insights, empowering business analysts without requiring them to become programming or statistics experts.
This transformation requires elevating the role of the data analyst into that of a ‘Data Artisan.’
The Data Artisan must learn how to act as an ‘architect of context.’ This effectively becomes a hybrid role that combines deep business knowledge with technical skills to build sophisticated data workflows. Their primary function is to make data ‘tell its story,’ enabling and driving strategic decisions.
Data Artisans should fulfil these functions:
- Ingest and integrate: They master ‘the art’ of combining traditional structured data with unstructured context from sources such as social media or sensors. They do what machines still can’t: seek unexpected patterns and associate facts (or assumptions) that have no obvious, clear linkage, connections an AI assistant would otherwise miss.
- Seek patterns over perfection: They shift the analytical goal from ‘pixel-perfect’ accuracy to identifying meaningful, predictive patterns within large data volumes. Sometimes a bold hypothesis that is later disproved brings more value than spotless data. Sometimes an answer with 80% accuracy tomorrow is worth more than one with 99.9% accuracy in three weeks.
- Insight at the point of decision: Artisans help to decentralize powerful analytical tools, making them accessible to empower decision-makers. They advocate for utilizing simple dashboard creation tools, such as Looker or Datawrapper, even if they are fed with static data. The goal is not the flawless UX or beautiful design. The goal is to facilitate faster decision-making. If the insights ‘click’, it is always easy (or at least easier) to find time and resources to ensure proper data uploads or a friendly interface.
- Reuse analytical IP: Create robust, reusable data objects and analytic workflows. Optimize your work. Create Agents to handle repetitive tasks, but give them ‘freedom’ to spot something beyond the basic algorithm.
The principal goal of this role is to democratize complex analytics. The Data Artisan absorbs the burden of complexity by creating reusable IP and accessible platforms. This, in turn, enables non-specialists across the organization to make informed, rapid decisions and fosters true organizational agility [3].
The Story
Data storytelling is the primary conversion mechanism that translates technical insights into persuasive, human action. If insights are the currency of the insight-centric organization, storytelling is the transaction system.
Every compelling data story must intentionally acknowledge and integrate three foundational elements:

Choosing a narrative framework is a critical, strategic decision that hinges on the communication’s primary goal. This selection becomes paramount when the audience consists of executive stakeholders. Executives operate under intense time pressure and are focused on strategy, risk, and ROI. A data story built for a technical team—perhaps a deep, exploratory dive—will fail to resonate.
The framework must be tailored to the goal. If the goal is to secure funding for a new platform, a persuasive structure like AIDA (Attention, Interest, Desire, Action) is crucial for building a compelling business case. If the goal is to report on an operational bottleneck and propose a solution, the logical, problem-centric SCQA (Situation, Complication, Question, Answer) framework will more effectively demonstrate due diligence and lead to a clear recommendation. The framework serves as the vehicle for insight, and for an executive audience, that vehicle must be fast, clear, and directly targeted at a decision.
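To show how lightweight such a framework can be in practice, here is a minimal sketch of SCQA kept as a reusable template. This assumes Python; the class name and the example values (borrowed loosely from the Line B example earlier) are illustrative.

```python
from dataclasses import dataclass

@dataclass
class SCQAStory:
    situation: str      # the stable context the audience already agrees on
    complication: str   # what changed or went wrong
    question: str       # the decision the audience must make
    answer: str         # the recommendation, stated clearly

    def render(self) -> str:
        """Render the story in SCQA order, one element per line."""
        return "\n".join(
            f"{label}: {text}"
            for label, text in [
                ("Situation", self.situation),
                ("Complication", self.complication),
                ("Question", self.question),
                ("Answer", self.answer),
            ]
        )

story = SCQAStory(
    situation="Line B runs a planned 40-hour production week.",
    complication="Manual changeovers cause 12% downtime.",
    question="How do we recover the lost hours?",
    answer="Automate changeovers to reclaim roughly 10 hours per week.",
)
print(story.render())
```

The point is not the code itself but the discipline: forcing every operational story through the same four slots makes the recommendation impossible to bury.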
Strategic data storytelling frameworks examples

For executives, effective data storytelling is a strategic translation, not a data dump. Leaders don’t need raw data; they need insights. They require data to be presented clearly and concisely so they can quickly grasp implications, identify critical trends, and communicate those findings to other stakeholders. A strong narrative structure—one that moves from a clear problem to a viable solution—prevents valuable insights from being lost in a poorly presented argument. This ability to translate data into strategy is what elevates data professionals from mere statisticians to true strategic partners capable of influencing high-level business direction.
Principles of high-impact data visualization
Data visualization is the bridge between complex datasets and human understanding (and a subject of a number of my articles). To be effective, the choice of a chart or graph must align with the message. For example, line charts are best for showing trends over time, bar graphs for making clear comparisons, and scatter plots for revealing relationships between variables.
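The chart-to-message mapping above can be encoded as a simple lookup, so the choice becomes an explicit rule rather than a habit. This is a deliberate simplification; the mapping and the function name are illustrative, not an exhaustive taxonomy.

```python
# Message type → suggested chart type, following the guidance above.
CHART_FOR_MESSAGE = {
    "trend_over_time": "line chart",
    "comparison": "bar graph",
    "relationship": "scatter plot",
}

def choose_chart(message_type):
    """Return a suggested chart type, or None if no rule applies."""
    return CHART_FOR_MESSAGE.get(message_type)

print(choose_chart("trend_over_time"))  # → line chart
```

Even a trivial table like this is useful as a team convention: it forces the author to name the message before picking the visual.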
Beyond selecting the right chart type, the intentional use of color and text is critical. Color should not be decorative; it should be used purposefully to highlight the most important information, enabling the audience to grasp the key takeaway more quickly. Text should be minimal, used only to clarify points that the visual cannot make on its own.
Finally, all visualization carries an ethical mandate. Data integrity must be maintained. Visualizations must never intentionally misrepresent the facts, for instance, by using misleading scales or inappropriate color contrasts.
The Impact
The core idea: Prove data’s value to get support
To get executives to fund ‘data humanization’ (making data clear and easy to use), you must prove its financial value. The best way to do this is by showing its Return on Investment (ROI).
How to prove the value: a two-step ROI plan
The ROI calculation is a simple comparison:
The value of action (from clear data) vs. The cost of inaction (from confusing data)
A confusing dashboard that gets ignored isn’t cost-neutral: its ROI is negative, because it wastes time and money. A clear, humanized dashboard is an investment that makes teams smarter and faster.
Step 1: Find the true cost of bad data
First, measure the real cost of your existing, ‘non-humanized’ reports. This baseline is more than just an analyst’s salary. Include the hidden costs of confusion:
- Time to insight: How many hours do managers waste trying to understand the complex report?
- Translation labor: How many hours do analysts spend re-explaining findings or making simpler PowerPoint versions?
- Insight adoption: How many key decisions are actually based on the report? (If it’s zero, the report is worthless.)
This total is the high price you are currently paying for confusion.
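A hedged sketch of how this baseline could be computed: the hourly rates and hours below are placeholder assumptions, chosen to echo the $10,000-per-month figure used in the worked example later in this section.

```python
def cost_of_confusion(manager_hours, analyst_hours,
                      manager_rate=120, analyst_rate=80):
    """Monthly baseline cost of a 'non-humanized' report:
    hours managers lose decoding it plus hours analysts spend
    re-explaining it. Rates are illustrative placeholders."""
    return manager_hours * manager_rate + analyst_hours * analyst_rate

# Example: 50 manager-hours and 50 analyst-hours wasted per month.
print(cost_of_confusion(50, 50))  # → 10000
```

The third item, insight adoption, resists a formula: if no decision cites the report, the whole baseline is pure waste regardless of the hourly math.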
Step 2: Measure the gains from humanized data
Once you launch your new, clear dashboard, measure the return against that baseline. The gains are twofold:
- Efficiency gains (saving money):
- The manager’s Time to insight might drop from one hour to five minutes.
- The analyst’s Translation labor (re-explaining) all but disappears.
- Value gains (making money):
- This is the real prize. Track the new, better, or faster decisions made because the data was finally clear.
- Example: A marketing team shifts its budget 10 days sooner, or a sales team spots a new opportunity, generating measurable new revenue.
A simple example
- Before (bad data): A 10-tab data dump spreadsheet costs the company $10,000 a month in wasted manager time and analyst support.
- After (humanized data): A new, one-page dashboard costs $1,500 to build.
- The return (Month 1): It saves $8,000 in recovered time and helps a sales team generate $20,000 in new value.
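These month-one figures can be plugged into a small ROI helper. The function is a sketch; the numbers come straight from the example above.

```python
def dashboard_roi(build_cost, time_savings, new_value):
    """Net return and ROI multiple for a humanized dashboard,
    using the month-one figures from the example above."""
    gain = time_savings + new_value
    net = gain - build_cost
    return net, net / build_cost

net, multiple = dashboard_roi(build_cost=1_500,
                              time_savings=8_000,
                              new_value=20_000)
print(f"Net return: ${net:,}, ROI: {multiple:.1f}x")
# → Net return: $26,500, ROI: 17.7x
```

Even halving the value-gain estimate leaves the multiple comfortably positive, which is the robustness an executive audience will probe for.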
The bottom line: Humanizing data isn’t a ‘nice-to-have’ design choice. It’s a high-return business strategy that converts organizational waste into decisive action [7].
Conclusions
Ultimately, the journey from raw data to real-world impact is fraught with perceptual traps, much like the illusory dots of the Hermann Grid. As we’ve seen, numbers alone are not self-evident; they are passive spreadsheets and abstract KPIs that often leave us ‘data-rich but action-poor.’
Breaking this cycle requires a strategic and cultural shift to data humanization. This transformation is not about a new piece of software but about a new way of thinking—one that empowers Data Artisans to find context, embeds Data Storytelling as a core competency, and relentlessly proves its Impact through a clear ROI.
By embracing these principles, we move beyond the ‘ghosts’ in the grid—the false correlations and missed opportunities—to see the human reality underneath. This is how we finally close the gap between analysis and action, transforming data from a simple report of what happened into a compelling catalyst for what happens next.
Sources
[1] Ninio, J. and Stevens, K. A. (2000) Variations on the Hermann grid: an extinction illusion. Perception, 29, 1209-1217.
[2] Data Storytelling 101: How to Tell a Powerful Story with Data – StoryIQ, 2025, https://storyiq.com/data-storytelling/
[2] Humanizing Big Data – DLT Solutions, https://www.dlt.com/sites/default/files/sr/brand/dlt/PDFs/Humanizing-Big-Data.pdf
[3] Gouranga Jha, Frameworks for Storytelling with Data, Medium, https://medium.com/@post.gourang/frameworks-for-storytelling-with-data-5bfeb1fbc37b
[4] Michal Szudejko, Turning Insights into Actionable Outcomes, https://towardsdatascience.com/turning-insights-into-actionable-outcomes-f7b2a638fa52
[5] Michal Szudejko, How to Use Color in Data Visualizations, https://towardsdatascience.com/how-to-use-color-in-data-visualizations-37b9752b182d
[6] Michal Szudejko, How Not to Mislead with Data-Driven Story, https://towardsdatascience.com/how-not-to-mislead-with-your-data-driven-story
[7] ROI-Driven Business Cases & Realized Value – Instrumental, https://instrumental.com/build-better-handbook/roi-business-cases-realized-value-technology-investments
Disclaimer
This post was written using Microsoft Word, and the spelling and grammar were checked with Grammarly. I reviewed and adjusted any modifications to ensure that my intended message was accurately reflected. All other uses of AI (image and sample data generation) were disclosed directly in the text.