Why Smart Companies Skip Cleaning Data

This article critiques the common practice of exhaustive data cleaning before implementing AI, calling it a consultant-driven "scam." Data cleaning is an expensive, never-ending process that delays AI implementation while competitors move forward. Instead, it champions a "clean as you go" approach: start with a specific AI use case and clean data only as needed. Smart companies prioritize iterative improvement, using AI to fill data gaps and building safeguards around imperfect data to reach results faster. The core message: prioritize action over perfection, enabling quicker AI adoption and, with it, competitive advantage.

The digital transformation consultants have sold you a lie. They’ve convinced executives everywhere that before you can even think about AI, you need to embark on a months-long (or years-long) data cleaning odyssey. Clean everything! Standardize everything! Make it perfect!

It’s expensive, time-consuming, and worst of all—it’s completely backwards.

The Great Data Cleaning Scam

Here’s what’s really happening: consulting firms have discovered the perfect business model. Tell companies they need to clean all their data first, charge premium rates for the work, and enjoy projects with no clear endpoints. How do you know when your data is “clean enough”? You don’t. The goalposts keep moving, the invoices keep coming, and meanwhile, your competitors are already using AI to solve real problems.

This isn’t incompetence; it’s a feature, not a bug. Data cleaning projects are consultant gold mines because they’re nearly impossible to finish, and their success is even harder to measure.

Why Perfect Data is a Myth

Let’s be brutally honest: your data will never be perfect. It can’t be. Here’s why:

Your data is constantly changing. While you’re spending six months cleaning historical warehouse data, new inventory is arriving, items are moving, specifications are updating. By the time you finish, your “clean” dataset is already outdated.

You don’t know what “clean” means yet. Until you understand exactly how you’ll use the AI system, you can’t know how to prepare the data. You might spend months standardizing product categories one way, only to discover your AI application needs them classified completely differently.

Unbalanced datasets make most cleaning irrelevant anyway. You could have the most pristine data in the world, but if you have 10,000 examples of one thing and 50 examples of another, most of that perfectly cleaned data is useless for training.
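To see why the ratio matters more than the polish, here is a minimal sketch in Python with toy numbers mirroring the example above. It computes "balanced" class weights using the same heuristic scikit-learn applies for class_weight="balanced"; the labels are illustrative, not from any real dataset:

```python
from collections import Counter

# Toy label distribution mirroring the article's example:
# 10,000 records of one class vs. 50 of another.
labels = ["common_item"] * 10_000 + ["rare_item"] * 50
counts = Counter(labels)

# "Balanced" class weights: rare classes get proportionally
# larger weights so they aren't drowned out during training.
n_total, n_classes = len(labels), len(counts)
weights = {c: n_total / (n_classes * n) for c, n in counts.items()}
print(weights)  # {'common_item': ~0.50, 'rare_item': ~100.5}
```

The rare class needs a weight roughly 200 times larger just to register. No amount of scrubbing the 10,000 majority records changes that; only collecting or reweighting the rare cases does.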

The Clean-As-You-Go Revolution

Smart organizations are taking a fundamentally different approach: they clean only what they need, when they need it, for the specific AI application they’re building.

Here’s how it works:

Start with your AI use case, not your data. Define exactly what problem you’re solving and what the AI needs to accomplish. Only then do you look at what data you actually need.

Let AI help clean the data. Cutting-edge AI systems are remarkably good at working with messy, incomplete data. They can fill in missing values, standardize formats, and flag inconsistencies that traditional data-cleaning tools miss (see the sketch after this list).

Curate, don’t clean everything. Instead of trying to perfect your entire dataset, create focused, high-quality subsets for your specific AI applications. This produces better results in a fraction of the time.

Embrace iterative improvement. Start with what you have, see what works, then clean and improve incrementally based on actual performance needs.
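To make the "let AI help" step concrete, here is a minimal sketch of LLM-assisted standardization. The complete() helper is a hypothetical stand-in for whatever model API you use (stubbed here so the example runs), and the guardrail at the end reflects the iterative mindset: anything the model can't map cleanly gets routed for review rather than trusted blindly.

```python
# Sketch of LLM-assisted category cleanup. `complete` is a
# hypothetical placeholder for your model provider's API call.

def complete(prompt: str) -> str:
    # Stub: swap in a real LLM call from your provider of choice.
    return "Electronics"

def standardize_category(raw_category: str, allowed: list[str]) -> str:
    """Map a messy free-text category onto a controlled vocabulary."""
    prompt = (
        f"Map the product category '{raw_category}' to exactly one of: "
        f"{', '.join(allowed)}. Reply with the category name only."
    )
    answer = complete(prompt).strip()
    # Guardrail: never trust the model blindly; route misses to review.
    return answer if answer in allowed else "NEEDS_REVIEW"

print(standardize_category("elec. gadgets & misc",
                           ["Electronics", "Apparel", "Food"]))
# -> "Electronics" (or "NEEDS_REVIEW" if the model answers off-list)
```

Note the design choice: the AI does the tedious mapping, but the controlled vocabulary and the review fallback keep humans in charge of anything ambiguous.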

Real-World Examples

Consider a warehouse management system. The traditional approach says you need to track down size and weight information for every single item before you can start. That could take months and cost a fortune.

The smart approach? Use AI to estimate missing information based on available data, product categories, and similar items. Deploy the system, let it learn from real operations, and improve the data quality over time through actual use.
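Here is a minimal sketch of that approach, assuming a pandas DataFrame with hypothetical column names: estimate missing weights from the median of similar items, and flag the estimates so real measurements can replace them over time.

```python
import pandas as pd

# Toy warehouse data; column names are illustrative.
items = pd.DataFrame({
    "sku":       ["A1", "A2", "B1", "B2", "B3"],
    "category":  ["box", "box", "drum", "drum", "drum"],
    "weight_kg": [2.0, None, 40.0, 38.0, None],
})

# Flag which rows are estimates, then fill missing weights
# from the median of items in the same category.
items["weight_estimated"] = items["weight_kg"].isna()
items["weight_kg"] = items["weight_kg"].fillna(
    items.groupby("category")["weight_kg"].transform("median")
)
print(items)
```

The estimated flag is the key: the system is usable today, and every real measurement that arrives later quietly upgrades the dataset.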

Or let’s take customer data. Instead of spending a year standardizing every customer record, start with the customers you actually interact with regularly. Clean as you go, focusing on the data that matters for your specific AI applications.

The Swiss Cheese Principle

AI systems don’t need perfect data—they need appropriate safeguards. Think of it like the Swiss cheese model: each layer of protection (human oversight, validation rules, AI confidence scoring, business logic checks) covers the holes in other layers.

Your data quality is just one layer in this system. Instead of trying to make it perfect, make it good enough and focus on building robust safeguards around it.
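As an illustration, here is a sketch of what those layers might look like in code. The thresholds, rules, and function names are assumptions you would tune per application, not a prescribed design:

```python
# Sketch of layered safeguards around an imperfect AI prediction.
# Each "slice" can reject independently; all values are illustrative.

CONFIDENCE_FLOOR = 0.80  # assumed threshold, tune per application

def passes_validation_rules(pred: dict) -> bool:
    # Layer 1: hard validation rules (e.g., weight must be positive).
    return pred["weight_kg"] > 0

def passes_confidence_check(pred: dict) -> bool:
    # Layer 2: AI confidence scoring.
    return pred["confidence"] >= CONFIDENCE_FLOOR

def passes_business_logic(pred: dict) -> bool:
    # Layer 3: business logic (e.g., nothing over the rack limit).
    return pred["weight_kg"] <= 1500

def route(pred: dict) -> str:
    checks = (passes_validation_rules, passes_confidence_check,
              passes_business_logic)
    if all(check(pred) for check in checks):
        return "auto-accept"
    return "human-review"  # Layer 4: human oversight covers the holes

print(route({"weight_kg": 42.0, "confidence": 0.93}))  # auto-accept
print(route({"weight_kg": 42.0, "confidence": 0.41}))  # human-review
```

Notice that no single layer has to be perfect, including the data itself; a low-confidence prediction built on a patchy record simply lands in front of a human instead of in production.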

The Bottom Line

The companies winning with AI aren’t the ones with the cleanest data; they’re the ones who started first and learned fastest. While their competitors are still debating data governance frameworks, they’re already on their third iteration of working systems.

Stop letting consultants hold your AI initiatives hostage with endless data cleaning projects. Your data doesn’t need to be perfect. It just needs to be good enough to start, with a plan to improve it through actual use.

The future belongs to organizations that embrace “clean as you go” and start building AI systems today, not to those still preparing for a perfect tomorrow that will never come.

Start messy. Start now. Clean as you learn. Your competitors are already doing it—and they’re not waiting for perfect data to get started.

