Data is just a fad
In my last blog post, we took a high-level look at data analytics in the church. We briefly touched on the promise it holds and the challenges and objections that often surface against it. In this post, we'll address one of the first and most frequently heard objections: data analysis in the church is just another fad.
With the history of fads that have invaded the church, and then beaten a hasty retreat from it, this objection is completely understandable. Ministers, pastors, admins, ministry directors – we've all been inundated with the "newest," "most innovative," "effective" approach to you-name-it: children's ministry, worship services, outreach programs, evangelism. We've all seen these "can't miss" approaches implemented, promptly fail (or at least fail to deliver), and then be replaced by the next hot approach.
From this perspective, it’s valuable to review the definition – and history of – data analytics. The definition that we’ll use is this:
Analytics is the discovery, interpretation, and communication of meaningful patterns in data; and the process of applying those patterns towards effective decision making. In other words, analytics can be understood as the connective tissue between data and effective decision-making within an organization.
To summarize: it's about analyzing our data to gain understanding, finding connections in the data, and then making wise decisions based on those insights (along with other factors). This concept – extracting patterns from data – is not new by any means. A very brief history of key data analysis developments could be summarized like this:
- Bayes’ theorem, which dealt with probability, was created in the 1700s,
- Regression Analysis became very popular in the 1800s,
- Computer-based techniques began coming into their own (not coincidentally, with the advent of computers) in the 1950s,
- Data mining and programs like SAS® / SPSS® originated in the late 1960s,
- Business Intelligence (“BI”) started in earnest in the 1990s, and
- The R programming language, one of the most widely used languages for statistical analysis, was created in 1993.
So the foundations of data analysis go back hundreds of years, and one of its most widely used programming languages was designed over 25 years ago. Hopefully, this helps begin to dispel that objection.
The other approach that can help demystify data analytics is simply describing what exactly it is and how it works.
Thankfully, data analytics is a progression of perfectly logical and understandable (albeit complex) steps in an overall process. So while we won’t go into great depth here, we will highlight the components and how they progress and fit together. Although the names (and numbers) of these steps can be described differently, we’ll cover the four major types of analytics.
The first is Descriptive analytics. This is the simplest and most widely used type of analytics, and it typically answers the question "What has happened?" It is primarily concerned with aggregating and summarizing groups of data. Easy ways to envision this in church terms would be things like "giving to date" or "past event attendance." Given that understanding, these types of "descriptive analytics" have been used by churches for decades (if not centuries). The first church analytics dashboard was the church bulletin board…Who knew?!
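To make this concrete, here is a minimal sketch of descriptive analytics in plain Python, using hypothetical weekly giving records (the dates and dollar amounts are invented for illustration):

```python
from collections import defaultdict

# Hypothetical weekly giving records: (date, amount in dollars)
giving = [
    ("2023-01-01", 4200.0),
    ("2023-01-08", 3900.0),
    ("2023-02-05", 4100.0),
    ("2023-02-12", 4500.0),
]

# Descriptive analytics answers "what has happened?" by aggregating
# and summarizing: here, rolling weekly gifts up into monthly totals.
monthly_totals = defaultdict(float)
for date, amount in giving:
    month = date[:7]  # "YYYY-MM"
    monthly_totals[month] += amount

print(dict(monthly_totals))
# {'2023-01': 8100.0, '2023-02': 8600.0}
```

This is exactly the kind of summary a church bulletin board has always displayed; the only change is that the aggregation is automated.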
The next type is Diagnostic analytics. This approach moves from the “what” (answered by descriptive analytics) to the “why.” Here we begin to look for contributing factors (e.g., other related pieces of data) that might help explain what we’re seeing. For a simplistic example (in a pre-Covid world), we might overlay our giving data with weather and attendance data. Attendance is often correlated with giving – and weather can impact church attendance. So the drop in giving we see on certain dates could be related to severe weather events.
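One common diagnostic technique is checking how strongly two factors move together. Below is a sketch using the Pearson correlation coefficient on hypothetical paired attendance and giving figures (all numbers invented; the low-attendance week stands in for a severe-weather Sunday):

```python
import statistics

# Hypothetical paired observations for the same five Sundays
attendance = [250, 180, 260, 120, 240]        # headcount (120 = snowstorm week)
giving = [5000, 4000, 5200, 2600, 4400]       # dollars

# Diagnostic analytics moves from "what" to "why": the Pearson
# correlation coefficient r measures how strongly the two series
# move together (r near 1 = strong positive relationship).
mean_a = statistics.mean(attendance)
mean_g = statistics.mean(giving)
cov = sum((a - mean_a) * (g - mean_g) for a, g in zip(attendance, giving))
r = cov / (
    sum((a - mean_a) ** 2 for a in attendance) ** 0.5
    * sum((g - mean_g) ** 2 for g in giving) ** 0.5
)
print(round(r, 3))  # a value near 1, i.e. attendance and giving track closely
```

A high correlation doesn't prove causation, but it flags attendance (and, one layer back, weather) as a factor worth investigating when giving dips.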
The next phase is Predictive analytics. Although it may sound mystical, predictive analytics simply builds on the previous steps and takes them one step further. It looks at connected patterns or sequences in the data (in our example, weather → attendance → giving) and shows what could happen in the future based on that data. This often takes the form of projections or forecasts – in mathematical terms, we're just talking about probabilities based on underlying, related factors. Sticking to our example, this could play out by looking at next month's projected weather patterns for our area, applying that to our historical church attendance data for that month in prior years, and then calculating projected giving for next month based on those factors. We also learn which factors help to accentuate positive outcomes or mitigate negative ones. For example, we learned that when the church proactively communicated about upcoming weather events and contingency options (streaming, online church giving), the negative impacts were reduced. Obviously, predictive analytics can take many more factors into account – but this is what it does at its essence.
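In its simplest form, this kind of projection is just a least-squares line fit to historical data. Here is a sketch, reusing the same hypothetical attendance/giving history and an invented weather-driven attendance forecast of 150 people:

```python
# Hypothetical history: Sunday attendance vs. giving (dollars)
attendance = [250, 180, 260, 120, 240]
n = len(attendance)
giving = [5000, 4000, 5200, 2600, 4400]

mean_a = sum(attendance) / n
mean_g = sum(giving) / n

# Predictive analytics: fit a least-squares line
#   giving ≈ slope * attendance + intercept
slope = sum(
    (a - mean_a) * (g - mean_g) for a, g in zip(attendance, giving)
) / sum((a - mean_a) ** 2 for a in attendance)
intercept = mean_g - slope * mean_a

# If next month's weather forecast suggests roughly 150 attendees,
# project what giving is likely to look like.
projected = slope * 150 + intercept
print(round(projected))
# 3220
```

A real model would weigh many more factors (season, holidays, communication efforts), but the principle is the same: learned patterns in past data, extrapolated forward.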
The final phase is Prescriptive analytics. Like predictive, this approach simply takes everything that has come before and carries it to its logical conclusion. If we understand what has happened (descriptive), why it happened (diagnostic), and the sequence of events that leads to positive or negative outcomes (predictive), we can then take appropriate actions to influence the outcome we want to see (prescriptive). In our scenario (and as has been similarly driven home with Covid), if we see a contributing factor come into play (e.g., a huge snowstorm is going to hit this weekend), then we can take (or even automate) the appropriate steps (send out notifications about online church streaming capabilities, online giving, etc.) that will alleviate the unwanted outcome (decreased attendance, engagement, and giving) and encourage the desired outcome (online attendance, engagement, and online giving).
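At its simplest, prescriptive analytics can be expressed as rules that turn a prediction into a recommended (or automated) action. This sketch uses invented thresholds and action names purely for illustration:

```python
def recommend_actions(weather_forecast, projected_attendance, typical_attendance):
    """Prescriptive analytics as simple rules: when a known risk factor
    (severe weather, or a projected attendance drop) appears, recommend
    the mitigations that historically reduced the negative impact."""
    actions = []
    big_drop = projected_attendance < 0.7 * typical_attendance
    if weather_forecast == "severe" or big_drop:
        actions.append("send notification about livestream options")
        actions.append("remind members about online giving")
    return actions

# A snowstorm is forecast and attendance is projected well below typical:
print(recommend_actions("severe", 150, 230))
```

Production systems would rank and automate such actions rather than hard-code two rules, but the progression is the same: prediction in, recommended action out.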
Hopefully, those descriptions help take some of the mystery out of what data analytics encompasses. As opposed to being yet another “fad” to distract churches, I believe it’s a great gift (like many technologies) that, used properly, can provide immense benefit to the Church as we strive to mature and grow the body of Jesus.
For more resources on Data Analytics, please visit Church Growth.
Brett Herzog is a husband, father, pastor, and tech nerd. He has served in new product development since 2003 for industry-leading companies such as Thomson Reuters, Merrill Corporation, and Follett Corporation. He’s also co-vocational – pastoring a group of home churches in the Greenville, South Carolina area. As the Director of Ministry Intelligence at ACS Technologies, Brett is responsible for leveraging ACST’s research, data, and analytical IP to deliver true “Ministry Intelligence” to its ministry partners and the Church.