The Data Daily
Less than 2 minutes to read each morning.
Not sure if you want in? Read the archives below.
5 days a week since May 1st, 2023.
Assuming the way things are is the way they will always be
Are you wasting your data?
Oh, you are doing lots of data stuff.
An occasional new data source is integrated.
Bug fixing here, there, and everywhere.
New dashboards are happening.
Plenty of performance tuning.
Some code refactoring.
Emailing spreadsheets.
See? Lots of stuff.
But you aren’t confident it’s the right stuff. Are you moving the needle on what matters?
The stakeholder teams ask for things all the time, and you try to accommodate them as much as you can. Yet it still feels like misaligned goals and constantly shifting priorities are your daily life.
You know this isn’t the best way.
You feel like you are wasting data.
More importantly, if you are in a mission-driven organization, it isn't just profit on the line.
It's changing lives.
Empowering lives.
Saving lives.
Wasting data means missing out on your mission.
--> Imagine clarity and confidence
…about your data team’s current situation
…and a compelling vision for where you are headed.
…and how your organization can start using data to change their world.
Imagine knowing not just how to build the thing right, but how to build the right thing. Imagine the feeling of confidence when your team tackles a bug fix, new report request, or data integration initiative because you know how it creates value for the organization and the people you serve.
Data leaders waste a tremendous amount of time thinking the data problems they face are unfixable.
Stakeholders will always change requirements
Leadership will always ignore your data projects
Your data team will always have a high turnover rate
Wasting data happens when you assume that the way things are is the way they will always be.
There are roadmaps to fix these challenges.
The patterns are not a black box.
The first step is acknowledging: “I’m stuck”.
The second step is beginning to look for the answers.
[If this is you... Book a call and let’s talk]
I’m here,
Sawyer
Primary leverage point of data
You are tired of hearing it.
As a leader trying to leverage data, you hear people say it all the time.
But it's frustrating and exhausting.
"Win with data"
"Data is the new oil"
"You need to be data-driven".
"You are falling behind if you aren't using AI".
It's all buzzwords with no actionable advice. As a leader, you feel a responsibility to act on it, but you have no clarity about where to start.
And so you fall into one of two extremes:
❌ Go all in on a major data initiative and invest tons of funds - somehow hoping to move the needle on this data stuff.
❌ Or you play it safe and conservative. You don't rock the boat. You don't want to waste donor funds on something you don't understand.
Both will fail you in the long run.
Instead, start here:
✅ Practice using data in making decisions.
Yes, you probably think you already do this. But for a couple of weeks, give it explicit and overt attention. Call it out in your process.
Do this before you spend tens of thousands building data infrastructure, buying analytical tools, and hiring data staff.
Here's a simple framework to understand how to use data in your org this week:
-> 1. Identify a key decision you are facing this week.
-> 2. Describe or quantify how much uncertainty you have in the decision you are making.
-> 3. What information would help you reduce your level of uncertainty?
-> 4. Find that information from your team.
-> 5. Reassess your level of uncertainty.
-> 6. Make your decision with more confidence.
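For teams that want to make this practice concrete, the six steps above could be captured in a simple decision log. Here's a hypothetical sketch in Python (every field name and value is invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class DecisionLogEntry:
    """One pass through the uncertainty-reduction loop (all fields illustrative)."""
    decision: str                   # 1. the key decision you face this week
    uncertainty_before: float       # 2. rough 0-1 estimate of your uncertainty
    information_needed: str         # 3. what would reduce that uncertainty
    information_found: str = ""     # 4. what your team actually surfaced
    uncertainty_after: float = 1.0  # 5. reassessed uncertainty
    outcome: str = ""               # 6. the decision you made

# Invented example: deciding whether to renew a BI tool license
entry = DecisionLogEntry(
    decision="Renew the annual BI tool license?",
    uncertainty_before=0.7,
    information_needed="Monthly active users of the tool over the last quarter",
)
entry.information_found = "Only 4 of 30 licensed seats were active last quarter"
entry.uncertainty_after = 0.2
entry.outcome = "Downgrade to a smaller license tier"

print(f"Uncertainty reduced by {entry.uncertainty_before - entry.uncertainty_after:.1f}")
# prints "Uncertainty reduced by 0.5"
```

Even a spreadsheet version of this log works; the point is making the uncertainty explicit before and after you gather the information.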
A primary point of leverage for data is in reducing uncertainty in decision-making.
DM if you are ready to start doing something with data.
I’m glad you are here,
Sawyer
How to get leadership to notice you
After talking with dozens of data leaders over the last couple of years, I’m convinced: great data teams know what success is, how to measure it, and how to communicate it.
That can be you.
But why is measuring success for data teams so hard?
What are the consequences of doing this poorly?
And...how do I get leadership to pay attention to data and our data team?
Yesterday, I hosted a live webinar to discuss these questions, and to provide core principles along with a simple framework for measuring success on your data team.
If your data team could benefit from this, I’m opening up slots to work with only 3 data teams this summer. For two weeks, we will work through a Measuring Success Launchpad, where you will define and measure peak success on your data team so you can execute, prioritize, and get noticed by leadership like never before.
Hit reply if you want more details.
I’m glad you are here,
Sawyer
A lighthouse in the fog of night
pre-s: Free Measuring Success for Data Teams Live Stream today at 12 pm ET on LinkedIn.
See you there!
—————————————————
Confusion costs more than you think.
Clarity is worth more than you dream.
…
You inherited a data team, a data platform, a random assortment of reports, pipelines, and tools.
It can feel like a mess.
Daily, you see the pain points.
The database is overloaded with ETL processing and batch reports.
The data model can't accommodate most of the new report requests.
And leadership isn't convinced you're making progress in the right areas, so your budget is under review.
I've talked with many leaders in this position. It's exhausting, frustrating, and often feels hopeless.
Getting clarity through this noise feels like a lighthouse in the fog of night.
Not just doing the next thing to stay busy. But doing the next thing that will move you toward your outcomes. That will move your organization toward its goals.
And get your team out of neutral.
Here’s a simple and effective framework to help you start the 2nd half of 2024 well and give your data team clarity.
Two core parts:
Audit and assess
Where are we? Where are we winning? Where do we want to be?
Strategize and roadmap
What's our unique area to excel? What resources do we have? How can we best execute on our vision?
You can go from noise to music. From pain to growth. From being lost to finding a trail.
Confusion is costing you more than you think.
Clarity is worth more than you dream.
An outside perspective is often essential in this process. Here’s a two-week offer to take you on this journey.
Sawyer
How to build a mental model that will make you an in-demand data leader
pre-s: Free Measuring Success for Data Teams Live Stream tomorrow at 12 pm ET on LinkedIn.
See you there!
——————
How to build a mental model for data that will make you an in-demand data leader.
As a data leader, if you can develop a robust framework for these 3 things, you will set yourself apart from your data peers and capture the attention of leadership.
Why
Why do we do what we do with data? Why is our organization investing so much money into data talent and technology? Why is data important to our organization?
Most people don't ask these questions seriously. They are happy with their assumptions about data. Establishing a real framework for answering the "Why" will take you far.
What
What do you need to build with data? What are the technical pieces that matter most? What are important technology elements that will allow us to be successful?
Many people start with the "what", but they quickly get confused because they don't have a clear "why". Once your "why" foundation is firm, you can build the invaluable knowledge required for the "what".
How
Naturally, after the "why" and "what" of data you have to ask about the "how". How do you actually build, deliver, and scale with data? How do the people, processes, and technology work together?
This is the phase where you move from theory to reality. Leading with data requires functional and practical skills in operating and managing a data team, stakeholders, and data technologies.
Which of these do you feel most comfortable with?
Where do you have gaps in your understanding?
———————-
This "Why", "What", and "How" is the framework we follow for our data leadership cohort this summer.
Starting on July 16th we are gathering for 2 hours a week in a small group to discuss these topics. In between meetings, we will continue the conversations through async chat and video messaging.
If you are looking to transform yourself into an invaluable data leader this summer, this is your opportunity.
Join Ahmad, James, and myself.
I’m here,
Sawyer
Why don’t people use their turn signal?
One of the core metrics I help customers define is what I call Turn Signals.
You could also call them Decision Metrics.
These are specifically defined metrics that are connected to a decision or “turn” you will make. Often, those I work with struggle with this concept at the beginning, so here are a couple of different ways to approach them. This always happens after you have defined Destination and Waypoint metrics.
Two main ways to figure out Turn Signals:
Option 1) —> Start with what you are already measuring. And turn them into decisions.
You likely already have some important things in mind you are measuring. Work backward from the measurement to identify what decision should be made connected with that measurement.
Example: You might start with tracking the conversion rate for website visitors. Turning it into a decision metric means defining: “We will A/B test our landing page. When the conversion rate between the two pages differs by more than X%, we shift traffic toward the higher-converting page.”
Option 2) —> Start with your dilemma. What key decisions do you face that are crucial to helping you reach your destination?
You start with a blank slate for the metric. Explore what the key decisions are that you regularly make that are connected to your Destination metric.
Example: Your Destination Metric is to grow the donor base, and newsletter subscribers are a key Waypoint. You regularly face a decision about what to do with the landing page to convert the most subscribers. So you define a decision metric like this: “We will A/B test our landing page. When the conversion rate between the two pages differs by more than X%, we shift traffic toward the higher-converting page.”
Those are two methods that lead you to the same Turn Signal: when to shift traffic to the other landing page. Deciding what that conversion-rate difference should be is an important part of defining the metric.
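A Turn Signal like this is easy to express in code. Here is a hypothetical sketch (the conversion rates and the threshold value are all invented for illustration):

```python
def turn_signal(rate_a: float, rate_b: float, threshold_pct: float) -> str:
    """Return the decision when the conversion-rate gap exceeds the threshold.

    rate_a, rate_b: conversion rates of the two landing pages (0-1 fractions)
    threshold_pct: the X% difference that triggers the "turn" (e.g. 2.0)
    """
    gap = abs(rate_a - rate_b) * 100  # gap in percentage points
    if gap <= threshold_pct:
        return "keep testing"  # no clear winner yet, no decision triggered
    winner = "page A" if rate_a > rate_b else "page B"
    return f"shift traffic toward {winner}"

# Hypothetical: page A converts at 4.1%, page B at 6.8%, threshold X = 2
print(turn_signal(0.041, 0.068, 2.0))  # -> shift traffic toward page B
```

The value of writing it this way is that the decision is pre-committed: when the metric crosses the line, nobody has to debate what happens next.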
As a leader, you are a full-time decision-maker. Turn Signals are a fundamental way of turning your decision-making into an informed and data-driven process.
I’m here,
Sawyer
Behold and beware of the games you play
Pre-s: I have a limited number of 1 Hour Strategy Session slots this summer. One hour of pure focus where we strategize, design, and creatively engage your data challenges.
—————————
Behold! Designing success metrics for your team can unlock a new level of energy, clarity, and confidence for your team.
Beware! Success metrics are the primary way your team members and organization are incentivized. Without proper care, and eyes wide open, your metrics could introduce toxic dynamics on your team.
Incentives are all about what people optimize for.
Salespeople on a commission plan optimize for closing the most deals to increase their commission.
Customer Support people measured on how many support cases they close will optimize for speedy resolutions.
Software Engineers who are measured on how many commits they make each sprint will optimize for smaller, more frequent commits regardless of how it impacts code quality or processes.
Your success metric will be gamed. That’s what people do when they are incentivized to optimize for something. You can’t change that.
What can you control?
Make sure your success metric is something you want them to game.
Don’t hate the game. Play it.
I’m here,
Sawyer
From fluffy unicorns to clear progress
Having a destination is crucial. There is no practical way to be successful without it.
But a destination by itself is a fluffy unicorn. Mythical. Imaginary. Disconnected from reality.
So here’s how you create measurements to enable your destination metric.
Waypoint Metrics. These metrics are your key indicators of progress toward your destination metric.
Key Question: What things have to be true in order for our destination metric to improve?
You will likely come up with numerous things that need to be true, but narrow your focus and identify the top two most influential things. It’s subjective and that's ok.
Turn Signals. These are specifically designed decision metrics. When a turn signal hits a specific mark, a decision is made.
Key Questions: What is the dilemma I’m facing? Or What decisions will make the biggest impact on our destination metrics?
These are not “keep an eye on our conversion rate”. These metrics are specifically designed as a decision statement: When X goes above/below Y, we will [make this specific decision].
Gauges. A gauge gives you an overall operational perspective, but it is not actively monitored. Imagine staring at your speedometer the entire length of a 10-hour road trip. Pointless. Gauges are reviewed on an infrequent schedule (say, once a month), not actively managed.
Key Question: If XYZ fails, what’s the metric I’m going to need to check?
Here’s a fictitious example of how a non-profit child safety organization might work through this framework at an Org-Wide level.
Destination Metric (only 1) - # of At-risk children in safe and secure housing and communities
Waypoint Metrics (2 total) - 1) % Of Children Visited on time by Staff 2) Amount of housing and communities available to receive children
Note: Increasing both of these metrics will increase our destination metric
Turn Signals (3 total): 1) When the average length of a child's stay in a home drops below X months, we revise the host family assessment process. 2) When the % of children visited on schedule drops below Y%, we hire another staff member. 3) When the number of repeated safety incidents for children in our care rises above XY, we revise the Staff Visit Protocols.
Note: Occasionally your turn signal metrics will be similar to a waypoint metric.
Gauges (4 total): 1) # of New Children introduced into our program. 2) # of Program Volunteers. 3) # of Community Education Initiatives Completed. 4) # of Family Counseling Sessions provided.
Note: No one is specifically incentivized to improve gauges. They exist to monitor operational health, not to indicate performance.
This is a powerful framework for getting clear on success on your team.
This summer I’m piloting a two-week program to help data teams define and deliver on success - using this and other frameworks. Hit reply if you want in.
I’m glad you are here,
Sawyer
5 Reasons NOT to do this…
Here are 5 Reasons to NOT Join the Technical and Strategic Data Leader cohort this summer.
We launch in 3 weeks. In case any of you are on the fence, here are key reasons you should stay on the fence.
1. You hate following recommendations.
100% of cohort members (people like data architects, directors of analytics, VPs of BI, etc.) recommend it to others, and you like bucking the trend.
2. You have all the network, connections, and peers you need.
What you have is good enough for you. You see no benefit in building a deeper network or connections with other data leaders.
3. You are too busy and wouldn't have the time to engage.
The cohort is an intensive experience. You need to commit to 2-4 hours a week for 6 weeks to engage in the content and conversations happening.
4. You are already an expert on Data Architecture, Power BI, Data Leadership, etc., and have nothing to learn.
Perhaps you already wrote a book on Data Architecture (like cohort mentor James Serra). Or you already teach Power BI at a university, consult with large organizations on Power BI strategy, and coach Power BI developers and leaders like Ahmad Chamy. Don't join if you don't have anything to learn.
5. You navigate organizational or data leadership challenges with ease.
Communicating with executive leadership, gaining strategic buy-in from peers, and aligning your team around core objectives - these are skills you've mastered. You don't need to learn any more here.
6. (Bonus) You aren’t interested in bonus content.
We just added a 7th week to the cohort to focus on Generative AI and LLMs. Same price to join, with an additional week of content and conversation led by another industry expert.
But.
If by chance those things aren't true about you.
And you actually do want to build your network, gain invaluable technical skills, and expand your capacity for leading in complex organizations.
You SHOULD join us. We have limited seats.
Register now to guarantee your spot.
thedatashop.co/leader
I’m here,
Sawyer
Your mantra for data team success
We’ve been talking about measuring success for data teams for a few days. Catch up here, here and here.
————————-
Most teams struggle to understand how to pick a Destination Metric. They can’t determine if they should be Inside the Box, on the Edge of the Box, or Outside the Box.
Asking where you “should” be is the wrong question right now. Start with what is already true about your data team.
Which of the following data teams best describes you?
Enablers:
This data team creates reports and ensures data is accurate and available when needed. They throw the data over the fence for the business teams to use. Their focus is on responding to tickets, optimizing data infrastructure, and providing quality data. This is a team of DBAs and Data Engineers.
Success for these teams is measured by delivering good data, fast. They define success inside the box. Their mantra? “Here’s the data. Go be successful.”
Advisors:
This data team accomplishes the same tasks as the enablers, but they have a focus on engaging with stakeholders, providing context around the data, and offering analytical insight. This team includes Data Analysts, BI Developers, and Data Scientists.
Success of this team is measured by making the stakeholders happy. They define success on the edge of the box. Their mantra? “Let me help you be successful”.
Partners:
This data team provides accurate data, and offers context, advice, and strategic perspective around the data. But, uniquely, they attach themselves to business objectives outside their data team. They are partners with the business and at the decision-making table.
Success of this team is measured by the success of the organization as a whole. They define success outside their box. Their mantra? “We win or lose together”.
Before you can pick a Destination Metric for your team, you have to understand what kind of team you are. An Enabler data team can’t set an Outside the Box metric without creating confusion and frustration.
It’s possible to move up from Enabler to Advisor. Or from Advisor to Partner. But that’s a conversation for another time.
Tomorrow, we will talk about setting your Waypoint, Turn Signal, and Gauge Metrics.
It was good to see you today,
Sawyer
Defining success with your box
Data teams are great at metrics and goals. We love numbers, charts, graphs, and tables.
But we are terrible at understanding and defining terminal goals. Or what I call Destination Metrics.
Destination Metrics are the ultimate goal of your team.
“Our team exists to do XYZ”
That kind of statement is hard in itself.
But the next part is harder.
“And we measure our progress and success toward that purpose by tracking XYZ”.
You have three main options when picking a Destination Metric for your data team. Before I explain the options, you need to understand the Box Concept (I’ve written about it more here, and here)
You, as a team, are a box. A contained unit of people, purpose, strategy, flaws, personality, processes, and goals. There are clear walls to your box. You understand where your team begins and ends.
At the same time, your box exists within the context of larger boxes. Most obviously, your department and your organization as a whole. Each of those is a self-contained unit, so you can clearly see where the walls are. You know where your department ends. You know where your organization ends.
Yes, the walls of the boxes are fuzzy at times. The culture of the department will bleed into the box of your team. You might have a team member who splits time across different teams and therefore fits into multiple boxes.
But fundamentally, our brains create these boxes with boundaries so that we can make sense of our complex world.
So when it comes to setting a destination metric (i.e. what’s the primary purpose of our data team), here are your three options.
Option 1: Inside the Box
Option 2: Edge of the Box
Option 3: Outside the Box
Inside the Box: Your Destination Metric is defined by the actions and results inside the box. Inside the Box metrics look like things the data team explicitly controls. “Decrease cloud data costs by X%”. “Maintain XYZ% Uptime on data systems”. “Deliver new data requests in Y days”
In this scenario, you optimize for things in the box.
Edge of the Box: Your Destination Metric is connected in some way to an element outside your box. You don’t control all aspects of the metric, but it’s highly related to your work. The primary example of an Edge of the Box destination metric is stakeholder satisfaction. This could be NPS or some other method to assess and measure how happy your stakeholders are.
In this scenario, you optimize for the edge-of-box interaction.
Outside the Box: This is the scary one. Your destination metric is not set based on what’s inside the box nor is it based on the edge relationship (that you have some control over). It’s beyond your box entirely. Very often, when you set an Outside the Box destination metric it is the terminal metric of the entire Organization. “Net Revenue”, “Student Outcomes”, “Program Participation/Engagement”, “Community Impact”
In this scenario, you are the data team, optimizing your activities to impact something completely outside the box.
What you set as your Destination is how you define success. It’s what you optimize for.
Tomorrow I’ll share another paradigm to think about these 3 options.
I’m glad you are here,
Sawyer
Escape from KPI purgatory
What if we made this easier?
Data teams are terrible at measuring their own success and progress.
You come up with a smattering of metrics and throw them on a dashboard to track “progress”.
Or you do nothing at all and hope for the best.
So you try for KPIs.
And try to think of everything that is “key” or relates to “performance”. The end result is a random assortment of things that seem important. Some are going up and some are going down. What do we do with them? Nobody knows.
So you try for OKRs.
The framework helps you identify important things you want to focus on. However, the ambiguous nature of key results makes them difficult to quantify. The art of OKRs feels complex to learn, and you don’t have 18 months to get over the learning curve.
Here’s a simpler framework to try. It’s the 1-2-3-4 Roadtrip Method
Define 1 (and only one) Destination Metric. This is your terminal goal for your team. It’s not progress toward something else, it’s the end purpose.
Define (up to) 2 Waypoint Metrics. Waypoints are key markers along a journey. These are crucial for tracking your progress toward your Destination Metric.
Define (up to) 3 Turn Signal Metrics. These are decision metrics, specifically defined to alert you when you hit an important decision point.
Define (up to) 4 Gauge Metrics. A gauge is an indicator of what’s happening operationally on your journey. What’s your speed? How much gas is left in the tank? Important to have identified, but pointless to stare at constantly.
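The 1-2-3-4 shape of the method can even be enforced programmatically if your team tracks its metrics in code or config. A sketch, with all metric names invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class RoadtripMetrics:
    """One team's metric set for the 1-2-3-4 Roadtrip Method."""
    destination: str                                  # exactly 1 Destination Metric
    waypoints: list = field(default_factory=list)     # up to 2 Waypoint Metrics
    turn_signals: list = field(default_factory=list)  # up to 3 Turn Signal Metrics
    gauges: list = field(default_factory=list)        # up to 4 Gauge Metrics

    def __post_init__(self):
        # Enforce the 1-2-3-4 caps so the framework stays focused
        for name, items, cap in [("waypoints", self.waypoints, 2),
                                 ("turn_signals", self.turn_signals, 3),
                                 ("gauges", self.gauges, 4)]:
            if len(items) > cap:
                raise ValueError(f"{name}: at most {cap} allowed, got {len(items)}")

# All metric names below are illustrative only
team = RoadtripMetrics(
    destination="# of at-risk children in safe and secure housing",
    waypoints=["% of children visited on time", "Housing capacity available"],
    turn_signals=["When visits-on-time drops below Y%, hire another staff member"],
    gauges=["# of new children in the program", "# of program volunteers"],
)
```

Adding a fifth gauge raises an error, which is the point: the caps force you to choose what actually matters.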
It seems easy at first. But where I see most data teams struggle is in defining a Destination. Consequently, the rest of the metrics fall flat.
Tomorrow we’ll talk more about how to pick a destination metric.
I’m here,
Sawyer
[Podcast] How to design and build a data architecture for maximum impact
In this episode of Making Data Matter we sit down with guest James Serra - Technical Data Architect at Microsoft.
We discuss:
What is a data architecture and what purpose does it serve in an organization?
History of data architecture, data technologies, and how some things haven't changed.
How to decide on a data architecture based on the size and maturity of your organization.
Data Mesh, decentralization vs centralization, and picking and choosing the best pieces for your data architecture.
and more.
We are making this 20x more effective
We are one month away from the launch of The Technical and Strategic Data Leader and we have a few seats left in the cohort. I wanted to share with you a quick update:
---------
Did you know only ~5-15% of self-paced online courses get completed? Most aren't even started. Your Udemy account is likely a graveyard of unfinished good intentions.
We need to do learning and professional development differently. So for this summer's cohort of The Technical and Strategic Data Leader, we are launching a learning experience that is far more successful than a video course.
5 ways I'm transforming the data leader learning experience to be 20x more effective:
1. A small group of data leaders (<12) for you to interact with, ask questions, and build relationships.
2. Live meetings. Conversation, questions, and interactions in real time between peers and mentors are crucial for in-depth engagement.
3. Async Communication. We improved communication methods in between the live sessions so you can continue the conversations and interact with your peers when it fits with your schedule.
4. Accountability. We all know each other in this small cohort. We know when people are missing. And we all lean on each other to learn the best. You are wanted and needed at every meeting.
5. Direct Access to Mentors. The cohort mentors are close at hand at all times, ready to answer questions, direct you to resources, and provide discussion topics.
The results of this cohort model?
100% of participants recommend it to others.
100% of participants said it was a great value.
We are launching the next cohort on July 16th.
Plus, I’m excited to share we’ve added a 7th bonus week to the cohort - Focused on LLM and Generative AI. Incredibly practical and tactical to equip you to lead in this new age of AI. Same price - more value.
Sawyer
Why is measuring success so hard?
“How do you measure success as a data team?”
So many different answers surface when I ask this question of data leaders. “Stakeholder satisfaction”, “Employee retention/morale”, “Data Quality”, “Completing our new data warehouse”, “Decreasing cloud costs”, “No one on the team working overtime”, “Data turnaround time” etc.
These answers range from operational efficiency and effectiveness to internal team culture, to technical goals, to stakeholder perception of the data team.
Why is this such a hard question for us to answer?
Here are the first few reasons that came to mind:
We don’t have a clear vision for data in our org
Is the data team a group of ticket-takers that churn out reports? In what way, and by what methodology, does data support leadership decisions? Can you tie any business results to something the data team delivered?
Data teams align to many different functions in a company
Is your data team embedded in the business units? Is it centralized? Does it roll up to finance, marketing, IT, or software engineering? Because of the numerous places data teams sit on the org chart, data leaders have constantly shifting views of what success looks like based on who they report to.
We haven’t been asked to define success before
For the last decade of low interest rates and cheap cloud tools, data teams could build all sorts of fancy tools and exciting data projects. The actual ROI was never demanded because R&D budgets were flush. That’s changed, and the cheap money is gone. Now that budget and staffing cuts are here, we are scrambling to define success.
We love data too much.
It’s hard to find clarity about success when we have (or know how to find) any data point we want. Our dashboards are clouded with KPIs that make any sort of uniform message unclear. When “these six success markers are up, but these five are down,” good luck sharing your progress clearly with leadership.
Ambiguous definitions of measurements
Leaders will tell me they measure success by “customer satisfaction”, “data quality”, or “by contributing to business goals”. But they never clearly define what those terms mean or how they quantitatively measure them to track success.
For as much as you love data, you are relying a lot on your gut feeling about success.
There’s a better way to do this. This summer, I’m piloting a program called the Measuring Success Launchpad.
Hit reply if you want a short video explaining how the program works.
I’m here,
Sawyer
Just yeet the code change into prod and hope for the best
Pre-s - I’m piloting a two-week program this summer that walks data leaders through how to define and adopt success metrics for their data teams. It’s built on my experience working with dozens of data teams over the last several years.
If that sounds beneficial to your team, hit reply and I’ll send you a short video walkthrough of the program.
——————
Don't approach your data development lifecycle this way.
The average data team has:
Minimal source control.
Limited testing framework.
Haphazard deployment process
Indescribable environment strategy.
It's costing you. Far more than you realize.
Without a meaningful dev lifecycle process, you:
Are slower.
Are more error-prone.
Are confused about bug fixes.
Are losing the trust of your stakeholders.
It's painful to watch. Painful to be a part of.
It drains the life from your team. It kills morale.
But most data professionals don't know any different. They assume prod failures are just a part of the process. They expect merges to always take days to unravel conflicts.
-> If you want to attract and retain better talent.
-> If you want your data team to take the next step.
-> If you want to deliver better data faster and more reliably.
Then your best next step is investing in CI/CD and a development lifecycle process.
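One low-cost first step toward that lifecycle is a set of automated data tests that run before every deploy. A minimal sketch (the schema, column names, and checks are all hypothetical):

```python
def run_data_tests(rows):
    """Run basic quality checks on a batch of records before it ships.

    rows: list of dicts, e.g. loaded from a staging table (invented schema).
    Returns a list of failure messages; an empty list means the batch passes.
    """
    failures = []
    if not rows:
        failures.append("batch is empty")
        return failures
    for i, row in enumerate(rows):
        if row.get("id") is None:
            failures.append(f"row {i}: missing id")
        amount = row.get("amount")
        if amount is None or amount < 0:
            failures.append(f"row {i}: invalid amount {amount!r}")
    return failures

# Hypothetical staging batch with one good row and one bad row
batch = [{"id": 1, "amount": 25.0}, {"id": None, "amount": -5.0}]
for failure in run_data_tests(batch):
    print(failure)
```

Wired into a CI pipeline, a non-empty failure list blocks the deploy, which is exactly the "yeet into prod" habit this replaces.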
I’m here,
Sawyer
The highest ROI for your career
Pre-s: I’m opening up a limited number of 1 Hour Strategy Session slots this summer. One hour of pure focus where we strategize, design, and creatively engage your data challenges.
——————————————
Great Data leaders know this.
Most others don't.
Your career won't be spent solving coding problems (like technical interviews or boot camps would have you believe).
I sit down and talk with data leaders every week, and the same themes always surface.
Instead, your time and energy will be spent on these two things:
Solving people problems
Solving (technical) design problems.
It looks like this.
Solving People Problems
Advocating for your team and opportunities with executives
Partnering effectively with other department leaders
Casting a compelling vision for your team to follow
Resolving conflict among team members
Solving (technical) design and process problems:
Which technology is worth investing in for the long-term of our team?
How can we resolve bottlenecks without spending more on tech?
How can we improve stakeholder satisfaction with our data?
How can we best manage a scaling data team?
Sure, learn to write good code. But the highest ROI for your team and career is in people problems and design problems.
I’m glad you are here,
Sawyer
Why the tech hype cycle will fail you
Every 12-18 months the tech hype cycle gets high on a new idea, framework, or model. Most recently it’s LLMs.
On the whole, as the pace of technology innovation accelerates, so does the hype cycle. If you try to follow along, it makes you dizzy.
I have data leaders both at Global 100 companies and at small manufacturing businesses asking me the same questions - What about the cloud? What about AI? What about blockchain? What about IoT? Data Lakehouses?
Some of these are sticky tech and some aren’t. But that’s not the point of this email.
Rather, I want to note how tech functions very differently based on the size of your organization. Here’s how it looks in non-profit organizations (my focus area):
Large Organizations (1,000+ employees): These groups have dedicated teams to manage and configure technology infrastructure. They spin up VMs and have the skills to leverage IaaS solutions. They have a team of people solely responsible for licensing and administration of core software. Their main constraints are not about budget, but internal resistance and company politics.
Medium Organizations (250-1,000 employees): This size group has reduced capacity and capabilities, but is still able to operate modern tools and have dedicated tech teams. They lean heavier on managed services, but their size and unique needs also mean that they need to customize and occasionally invest in PaaS solutions.
Small Organizations (<250 employees): Little or limited dedicated IT staff. It’s common for this work to be outsourced to a third-party IT firm. The focus is on running lean and core constraints more often revolve around budget. SaaS solutions are the primary meaningful option.
Here’s the thing
The hype cycle will show you examples and use cases that won’t work for your size org. Before you can even evaluate how your company can adopt the latest tech cycle, you have to first understand what solutions are the most optimal for your size org.
Large Orgs: Investing in a custom solution.
Medium Orgs: Leverage existing frameworks, or lean on a managed platform.
Small Orgs: If it’s not a managed solution built for your size org, ignore it and move on.
This does not mean that small orgs are at a disadvantage. In fact, they can adopt tech so much faster than huge orgs can. But the types of platforms you should consider are vastly different.
I have these conversations with organizations every week. Hit reply if you want help navigating data solutions at your org.
It’s great to be back here with you,
Sawyer
Avoiding vanity metrics, making decisions, and driving outcomes with marketing data
New episode of Making Data Matter
In this episode, we sit down with guest Josh Burns - Founder and CEO of Spark Collective - a Digital Marketing agency for nonprofits.
We discuss:
Why marketing matters for nonprofits
How do you make data-driven decisions about marketing?
What are some fundamental marketing best practices for nonprofits?
How do you decide which metrics to track in your marketing campaigns?
and more.
Bad creativity
Creativity isn’t always a good thing. In fact, there are numerous areas where you shouldn’t be creative.
Bad creativity:
Not remembering how a dataset should be cleaned, so you create a new cleaning process each time.
Not having a standard color palette, and so you spend an hour with the color wheel each time you build a new data viz report.
Not having an established naming convention for database tables, so you think up a different standard each time you create a table.
Not having a shared place for documentation, so you write documentation sometimes in Notion, other times in a Google Doc, a Wiki, or Slack.
I could go on.
Don’t be creative in these areas.
Good creativity:
The questions you ask when exploring a new problem.
The workflow design for an analytics experience.
Setting tactics and strategies for hitting your goals next quarter.
Options for how AI can improve the work experience for your team.
Some things we should automate, standardize, and scale.
Others we need to innovate, create waste, and explore.
I’m here,
Sawyer
p.s. The Nyquist family is out of town for vacation for the next several days. It will be quieter here on the daily email front than normal while I enjoy some busy family fun.