How to Simplify Data Collection for Tech Companies

Explore top LinkedIn content from expert professionals.

  • View profile for Jordan Nelson
    Jordan Nelson is an Influencer

    Founder & CEO @ Simply Scale • Grow Faster by Automating Salesforce

    99,973 followers

    I saved this tech company $128,000 in 22 days. Here's how:

    6 months back I worked with Overjet. They're the world leader in dental AI. But their marketing department was struggling with Salesforce.

    They were using 3 tools for their CRM:
    • HubSpot
    • Google Sheets
    • Salesforce
    Because they didn't have enough trust in Salesforce.

    Now, they're a mid-market company with 185+ employees, and they'd just hired a new marketing director. We'll call him "Steve". He was receiving high-level pay but doing low-level work. It was a poor use of resources, to say the least. It led to:
    • Typos and bad data (human error)
    • Hours of time wasted compiling data
    • Incorrect reporting and analysis
    • Delayed business decisions
    • Slow company growth because of the bottlenecks
    They were making gut-feeling decisions instead of following the data, because they knew their data was bad.

    Now let's crunch the numbers (the math is sketched below):
    Steve was wasting 5.5 hours per day on manual entry.
    The avg. marketing director makes $186,162/yr. That's an hourly rate of $89.50 down the drain.
    A whopping $128,557/year wasted.
    Again, 5.5 hours/day lost to manual entry is 110 hours/month, adding up to 165 working days/year: LOST.

    So, here's what we did to help.

    We always start with a discovery process. We asked them questions like:
    • What tools are you using?
    • What platforms are you using?

    Then we monitor the process:
    • What does each employee do on a daily basis?
    • Where are they spending the most time?
    • What does their process look like?
    • Why do they do it this way?
    We're then able to see what's repeatable. If it's a repeated task, you can delegate it, outsource it, or we can automate it for you.

    Next comes the build:
    They were using a tool called Pardot (basically HubSpot for Salesforce) to collect their marketing data. So we integrated their HubSpot with Salesforce, which let us remove their 3rd platform: the manual tracker in Google Sheets. Then we were able to decide where we want to push the data and which field it goes to. We do this for all our clients.

    Next comes the testing phase:
    A sandbox lets us make a copy of your data and run it in a testing environment so we don't "mess anything up". Before anything goes LIVE, we have a User Acceptance Testing (UAT) process: if you requested something within Salesforce, we have you sign off on it. We want to hear, "Okay, this works. I tested it. We're good to go."

    That leads us into approval: my green light that everything we've built works.

    Then we go LIVE. We make everything we've built for you ACTIVE and test one more time to make sure it runs correctly.

    Here are the results we got for Overjet:
    With their marketing director no longer tied up, they were able to make more money and scale better and faster. For every $1 they paid us, they got $12 back.

    P.S. Are you a mid-market tech startup that needs help automating Salesforce? Let's connect.
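    For anyone who wants to check the math above, here is a minimal sketch. The salary, 5.5 hours/day, and 110 hours/month figures come from the post; the 2,080-hour work year, 20 working days per month, and ~261 working days per year are my assumptions to reproduce the quoted numbers.

    ```python
    # Back-of-the-envelope cost of manual data entry, using the post's figures.
    # Assumes a 2,080-hour work year and ~20 working days/month (my assumptions).

    annual_salary = 186_162                      # avg. marketing director salary, $/yr
    hourly_rate = annual_salary / 2_080          # ~$89.50/hr

    wasted_hours_per_day = 5.5                   # time spent on manual entry
    wasted_hours_per_month = wasted_hours_per_day * 20       # 110 hrs/month
    wasted_days_per_year = wasted_hours_per_month * 12 / 8   # 165 eight-hour days

    # Annual cost over ~261 working days, roughly the post's $128,557 figure.
    wasted_cost_per_year = hourly_rate * wasted_hours_per_day * 261

    print(f"hourly rate:     ${hourly_rate:,.2f}")
    print(f"days lost/year:  {wasted_days_per_year:,.0f}")
    print(f"cost lost/year:  ${wasted_cost_per_year:,.0f}")
    ```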

  • View profile for Bill Shube

    Gaining better supply chain visibility with low-code/no-code analytics and process automation. Note: views are my own and not necessarily shared with my employer.

    2,592 followers

    Last-minute Q4 #data requests used to catch us flat-footed every year. But we just handled a wave of them without breaking a sweat. Here's how:

    - Automated analytics: We broke up with #Excel. We adopted #nocode analytical tools to super-charge our ability to work with data. No more manual spreadsheets; we've automated everything, which lets us manage more data more quickly, with fewer errors (see the sketch below). (Of course, we still use Excel. We just use it for its intended purpose: small ad hoc analyses and quick data checks.)

    - Organize around data: It's not enough to have good tools. You need an organized approach to your data work, or you run the risk of burying yourself under the weight of everything you build. We've borrowed a lot of best practices from our friends in IT: thorough documentation, data quality monitoring, and an iterative approach to development, to name a few. We're not perfect, but we're getting better all the time, and the payoff is real.

    - Improved data literacy: We're not a tech team; we're a business team. But we've learned to "speak data," and it's opened up clearer lines of communication with IT. We can now tackle many challenges ourselves, and for bigger ones, we know how to convey our needs more effectively.

    The result? When those last-minute Q4 requests came in, we had the data at our fingertips, and the skills and tools to deliver quickly. What used to take a week or more now takes a day or two, sometimes less.

    I won't pretend that getting here was easy, but it's also not as hard as it may seem, and the value of having our data ready to go is hard to overstate.

    #citizendevelopment #supplychainanalytics
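    The post doesn't name the tools involved, so this is purely illustrative: a minimal pandas sketch of the kind of automated data quality check that replaces eyeballing a spreadsheet. The file name, column names, and rules are all made up.

    ```python
    import pandas as pd

    # Illustrative only: the file, columns, and rules below are assumptions,
    # not this team's actual setup. The point is that checks like these run
    # automatically instead of being done by hand in Excel.

    def check_shipments(df: pd.DataFrame) -> list[str]:
        """Return a list of data quality problems found in a shipments extract."""
        problems = []
        if df["order_id"].duplicated().any():
            problems.append("duplicate order_id values")
        if (df["quantity"] <= 0).any():
            problems.append("non-positive quantities")
        if df["ship_date"].isna().any():
            problems.append("missing ship dates")
        return problems

    if __name__ == "__main__":
        shipments = pd.read_csv("shipments.csv", parse_dates=["ship_date"])
        issues = check_shipments(shipments)
        print("; ".join(issues) if issues else "All checks passed.")
    ```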

  • At TripleLift, we had to deal with incredibly large datasets with dozens of dimensions and metrics, and I wanted to share the techniques we used to make this possible:
    - Store everything: Ingest all data for immediate access, but beware of scaling challenges and rising storage costs over time.
    - Optimize storage: Separate storage and compute to enhance efficiency, focusing on data formats, partitioning, and exploring various data warehousing options.
    - Roll up: Aggregate data to reduce cardinality, eliminate unused dimensions, and summarize data at higher levels for quicker and cheaper analysis.
    - Cardinality collapse: Group less significant data into broader categories to manage long-tail dimensions and speed up analysis.
    - Sampling: Implement data sampling to collect a subset of events. (The roll-up, collapse, and sampling steps are sketched below.)
    More in-depth blog post: https://lnkd.in/eP_YncUe
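    The linked post has the full details; purely as a rough illustration (toy data, made-up column names, pandas rather than whatever TripleLift runs at scale), the roll-up, cardinality-collapse, and sampling steps look something like this:

    ```python
    import pandas as pd

    # Toy event-level data; the columns are illustrative, not TripleLift's schema.
    events = pd.DataFrame({
        "publisher":   ["A", "A", "B", "C", "D", "D", "E"],
        "country":     ["US", "US", "US", "DE", "FR", "FR", "BR"],
        "impressions": [100, 150, 80, 5, 3, 4, 2],
        "revenue":     [1.20, 1.80, 0.90, 0.05, 0.03, 0.04, 0.02],
    })

    # Roll up: drop dimensions you don't query (here, country) and aggregate
    # metrics at a coarser grain so analyses scan far fewer rows.
    rollup = events.groupby("publisher", as_index=False)[["impressions", "revenue"]].sum()

    # Cardinality collapse: keep the top publishers and bucket the long tail
    # into a single "other" value to cap the dimension's cardinality.
    top_publishers = rollup.nlargest(3, "impressions")["publisher"]
    events["publisher"] = events["publisher"].where(
        events["publisher"].isin(top_publishers), "other"
    )
    collapsed = events.groupby(["publisher", "country"], as_index=False)[
        ["impressions", "revenue"]
    ].sum()

    # Sampling: keep a 10% random subset of events (scale metrics back up at
    # query time if you need population-level estimates).
    sample = events.sample(frac=0.10, random_state=42)

    print(rollup, collapsed, sample, sep="\n\n")
    ```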