How To Create A Weekly Google Analytics Report That Posts To Slack

Google Analytics is great, but not everyone in your organization will be granted access. In many places I’ve worked, it was on a kind of “need to know” basis.

In this article, I'll flip that on its head and show you how I wrote a GitHub Action that queries Google Analytics and generates a top ten list of the most frequently viewed pages on my site from the last seven days, then compares them to the previous seven days to tell me which pages have increased in views, which have decreased, which have stayed the same, and which are new to the list.

The report is then nicely formatted with icon indicators and posted to a public Slack channel every Friday at 10 AM.

Not only is this surfaced data useful for folks who might need it, but it also makes it easy to copy and paste or screenshot the report and add it to a slide for the weekly company or department meeting.

Here’s what the finished report looks like in Slack, and below, you’ll find a link to the GitHub repository.

GitHub

To use this repository, follow the steps outlined in the README.

https://github.com/PaulieScanlon/smashing-weekly-analytics

Prerequisites

To build this workflow, you’ll need admin access to your Google Analytics and Slack accounts, as well as administrator privileges for GitHub Actions and Secrets on a GitHub repository.

Customizing the Report and Action

Naturally, all of the code can be changed to suit your requirements, and in the following sections, I’ll explain the areas you’ll likely want to take a look at.

Customizing the GitHub Action

The file name of the Action, weekly-analytics-report.yml, isn’t seen anywhere other than in the code/repo, so feel free to change it to whatever you like; you won’t break anything.

The name and jobs: names detailed below are seen in the GitHub UI and Workflow logs.

The cron syntax determines when the Action will run. Schedules use POSIX cron syntax, so by changing the five values, you can control exactly when the workflow fires.
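
As a quick reference, the five fields are, in order: minute, hour, day of month, month, and day of week. The first schedule below is the one used in this article; the second is a hypothetical alternative:

# minute hour day-of-month month day-of-week
# '0 10 * * 5' runs every Friday at 10:00 AM UTC (the schedule used here)
# '30 8 * * 1' would run every Monday at 8:30 AM UTC (a hypothetical alternative)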

You could also change the secrets variable names; just make sure you update them in your repository Settings.

# .github/workflows/weekly-analytics-report.yml

name: Weekly Analytics Report

on:
  schedule:
    - cron: '0 10 * * 5' # Runs every Friday at 10 AM UTC
  workflow_dispatch: # Allows manual triggering

jobs:
  analytics-report:
    runs-on: ubuntu-latest

    env:
      SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
      GA4_PROPERTY_ID: ${{ secrets.GA4_PROPERTY_ID }}
      GOOGLE_APPLICATION_CREDENTIALS_BASE64: ${{ secrets.GOOGLE_APPLICATION_CREDENTIALS_BASE64 }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20.x'

      - name: Install dependencies
        run: npm install

      - name: Run the JavaScript script
        run: node src/services/weekly-analytics.js

Customizing the Google Analytics Report

The Google Analytics API request I’m using pulls the fullPageUrl and pageTitle dimensions and the totalUsers metric for the last seven days, makes a second request for the previous seven days, then aggregates the totals and limits the responses to 10.

You can use Google’s GA4 Query Explorer to construct your own query, then replace the requests.

// src/services/weekly-analytics.js#L75

const [thisWeek] = await analyticsDataClient.runReport({
  property: `properties/${process.env.GA4_PROPERTY_ID}`,
  dateRanges: [
    {
      startDate: '7daysAgo',
      endDate: 'today',
    },
  ],
  dimensions: [
    {
      name: 'fullPageUrl',
    },
    {
      name: 'pageTitle',
    },
  ],
  metrics: [
    {
      name: 'totalUsers',
    },
  ],
  limit: reportLimit,
  metricAggregations: ['MAXIMUM'],
});
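
The second, near-identical request covers the previous seven days. As a rough sketch (the dateRanges values below are my assumption; check src/services/weekly-analytics.js in the repo for the exact ranges it uses):

// A sketch of the second request for the previous seven days.
// The startDate/endDate values are assumptions, not copied from the repo.
const [lastWeek] = await analyticsDataClient.runReport({
  property: `properties/${process.env.GA4_PROPERTY_ID}`,
  dateRanges: [
    {
      startDate: '14daysAgo',
      endDate: '8daysAgo',
    },
  ],
  dimensions: [{ name: 'fullPageUrl' }, { name: 'pageTitle' }],
  metrics: [{ name: 'totalUsers' }],
  limit: reportLimit,
  metricAggregations: ['MAXIMUM'],
});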

Creating the Comparisons

There are two functions to determine which page views have increased, decreased, stayed the same, or are new.

The first is a simple reduce function that builds an object mapping each URL to its view count.

// Build a lookup of last week's counts keyed by URL
const lastWeekMap = lastWeekResults.reduce((items, item) => {
  const { url, count } = item;
  items[url] = count;
  return items;
}, {});
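
Assuming each result is shaped like { url, title, count }, the resulting map keys page URLs to last week’s counts. The values below are purely illustrative:

// Example shape of lastWeekMap (URLs and counts are made up):
// {
//   'https://example.com/posts/hello-world': '42',
//   'https://example.com/about': '17',
// }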

The second maps over the results from this week and compares them to last week.

// Generate the report for this week
const report = thisWeekResults.map((item, index) => {
  const { url, title, count } = item;
  const lastWeekCount = lastWeekMap[url];
  const status = determineStatus(count, lastWeekCount);

  return {
    position: (index + 1).toString().padStart(2, '0'), // Format the position with a leading zero if it's less than 10
    url,
    title,
    count: { thisWeek: count, lastWeek: lastWeekCount || '0' }, // Ensure lastWeekCount is displayed as '0' if not found
    status,
  };
});

The final function is used to determine the status of each.

// Function to determine the status
const determineStatus = (count, lastWeekCount) => {
  const thisCount = Number(count);
  const previousCount = Number(lastWeekCount);

  if (lastWeekCount === undefined || lastWeekCount === '0') {
    return NEW;
  }

  if (thisCount > previousCount) {
    return HIGHER;
  }

  if (thisCount < previousCount) {
    return LOWER;
  }

  return SAME;
};
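
The NEW, HIGHER, LOWER, and SAME constants aren’t shown in the snippets above. In the repository, they correspond to the icon images referenced in the Slack message later on; here’s a minimal sketch, assuming the icons are .png files served from the report URL (the real file names may differ):

// Hypothetical status constants: each maps to an icon file name
// used by the Slack message's image block below.
const NEW = 'new.png';
const HIGHER = 'higher.png';
const LOWER = 'lower.png';
const SAME = 'same.png';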

I’ve purposely left the code fairly verbose, so it’ll be easier for you to add console.log to each of the functions to see what they return.

Customizing the Slack Message

The Slack message config I’m using creates a heading with an emoji, a divider, and a paragraph explaining what the message is.

Below that, I’m using the context object to construct the report by iterating over the comparisons and returning an object containing Slack-specific message syntax, which includes an icon, a count, the name of the page, and a link for each item.

You can use Slack’s Block Kit Builder to construct your own message format.

// src/services/weekly-analytics.js#151

const slackList = report.map((item, index) => {
  const {
    position,
    url,
    title,
    count: { thisWeek, lastWeek },
    status,
  } = item;

  return {
    type: 'context',
    elements: [
      {
        type: 'image',
        image_url: `${reportConfig.url}/images/${status}`,
        alt_text: 'icon',
      },
      {
        type: 'mrkdwn',
        text: `${position}. <${url}|${title}> | *\`x${thisWeek}\`* / \`x${lastWeek}\``,
      },
    ],
  };
});
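
With the blocks constructed, sending the report is a single HTTP POST to the Webhook URL. Here’s a minimal sketch (the header text is illustrative, and the repo builds a fuller block list, including the explanatory paragraph):

// Post the report to Slack via the incoming Webhook.
// Node 20 provides fetch globally, so no extra dependency is needed.
const response = await fetch(process.env.SLACK_WEBHOOK_URL, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    blocks: [
      {
        type: 'header',
        text: { type: 'plain_text', text: 'Weekly Analytics Report', emoji: true },
      },
      { type: 'divider' },
      ...slackList,
    ],
  }),
});

if (!response.ok) {
  throw new Error(`Slack Webhook request failed: ${response.status}`);
}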

Before you can run the GitHub Action, you will need to complete a number of Google, Slack, and GitHub steps.

Ready to get going?

Creating a Google Cloud Project

Head over to your Google Cloud console, and from the dropdown menu at the top of the screen, click Select a project, and when the modal opens up, click NEW PROJECT.

Project name

On the next screen, give your project a name and click CREATE. In my example, I’ve named the project smashing-weekly-analytics.

Enable APIs & Services

In this step, you’ll enable the Google Analytics Data API for your new project. From the left-hand sidebar, navigate to APIs & Services > Enable APIs & services. At the top of the screen, click + ENABLE APIS & SERVICES.

Enable Google Analytics Data API

Search for “Google Analytics Data API,” select it from the list, then click ENABLE.

Create Credentials for Google Analytics Data API

With the API enabled in your project, you can now create the required credentials. Click the CREATE CREDENTIALS button at the top right of the screen to set up a new Service account.

A Service account allows an “application” to interact with Google APIs, provided the credentials include the required services. In this example, the credentials grant access to the Google Analytics Data API.

Service Account Credentials Type

On the next screen, select Google Analytics Data API from the dropdown menu and Application data, then click NEXT.

Service Account Details

On the next screen, give your Service account a name, ID, and description (optional). Then click CREATE AND CONTINUE.

In my example, I’ve given my service account a name and ID of smashing-weekly-analytics and added a short description that explains what the service account does.

Service Account Role

On the next screen, select Owner for the Role, then click CONTINUE.

Service Account Done

You can leave the fields blank in this last step and click DONE when you’re ready.

Service Account Keys

From the left-hand navigation, select Service Accounts, then click the “more dots” to open the menu and select Manage keys.

Service Accounts Add Key

On the next screen, locate the KEYS tab at the top of the screen, then click ADD KEY and select Create new key.

Service Accounts Download Keys

On the next screen, select JSON as the key type, then click CREATE to download your Google Application credentials .json file.

Google Application Credentials

If you open the .json file in your code editor, you’ll see something similar to the redacted example below.
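
For reference, a Google service account key file has the following general shape (values redacted; only the structure matters here):

// Example shape of the downloaded credentials .json file (values redacted)
{
  "type": "service_account",
  "project_id": "smashing-weekly-analytics",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "smashing-weekly-analytics@smashing-weekly-analytics.iam.gserviceaccount.com",
  "client_id": "...",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}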

In case you’re wondering: no, you can’t use an object as a variable defined in an .env file. To use these credentials, it’s necessary to convert the whole file into a base64 string.

Note: I wrote a more detailed post about how to use Google Application credentials as environment variables here: “How to Use Google Application .json Credentials in Environment Variables.”

From your terminal, run the following, replacing name-of-creds-file.json with the name of your .json file:

cat name-of-creds-file.json | base64

If you’ve already cloned the repo and followed the Getting started steps in the README, add the base64 string returned by the command above to the GOOGLE_APPLICATION_CREDENTIALS_BASE64 variable in your .env file, making sure you wrap the string in double quotation marks.

GOOGLE_APPLICATION_CREDENTIALS_BASE64="abc123"
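
At runtime, the script decodes this string back into the original JSON object and uses it to authenticate the Google Analytics client. A minimal sketch, assuming the @google-analytics/data package (which provides the analyticsDataClient used in the snippets above):

import { BetaAnalyticsDataClient } from '@google-analytics/data';

// Decode the base64 string back into the service account credentials object.
const credentials = JSON.parse(
  Buffer.from(process.env.GOOGLE_APPLICATION_CREDENTIALS_BASE64, 'base64').toString('utf-8')
);

// Authenticate the Google Analytics Data API client with the decoded credentials.
const analyticsDataClient = new BetaAnalyticsDataClient({ credentials });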

That completes the Google project side of things. The next step is to add your service account email to your Google Analytics property and find your Google Analytics Property ID.

Google Analytics Properties

Whilst your service account now has access to the Google Analytics Data API, it doesn’t yet have access to your Google Analytics account.

Get Google Analytics Property ID

To make queries to the Google Analytics API, you’ll need to know your Property ID. You can find it by heading over to your Google Analytics account. Make sure you’re on the correct property (in the screenshot below, I’ve selected paulie.dev — GA4).

Click the admin cog in the bottom left-hand side of the screen, then click Property details.

On the next screen, you’ll see the PROPERTY ID in the top right corner. If you’ve already cloned the repo and followed the Getting started steps in the README, add the property ID value to the GA4_PROPERTY_ID variable in your .env file.

Add Client Email to Google Analytics

From the Google application credential .json file you downloaded earlier, locate the client_email and copy the email address.

In my example, it looks like this: smashing-weekly-analytics@smashing-weekly-analytics.iam.gserviceaccount.com.

Now navigate to Property access management from the left-hand side navigation and click the + in the top right-hand corner, then click Add users.

On the next screen, add the client_email to the Email addresses input, uncheck Notify new users by email, and select Viewer under Direct roles and data restrictions, then click Add.

That completes the Google Analytics properties section. Your “application” will use the Google application credentials containing the client_email and will now have access to your Google Analytics account via the Google Analytics Data API.

Slack Channels and Webhook

In the following steps, you’ll create a new Slack channel that will be used to post messages sent from your “application” using a Slack Webhook.

Creating The Slack Channel

Create a new channel in your Slack workspace. I’ve named mine #weekly-analytics-report. You’ll need to set this up before proceeding to the next step.

Creating a Slack App

Head over to the Slack API dashboard and click Create an App.

On the next screen, select From an app manifest.

On the next screen, select your Slack workspace, then click Next.

On this screen, you can give your app a name. In my example, I’ve named mine Weekly Analytics Report. Click Next when you’re ready.

On step 3, you can just click Done.

With the App created, you can now set up a Webhook.

Creating a Slack Webhook

Navigate to Incoming Webhooks from the left-hand navigation, then switch the Toggle to On to activate incoming webhooks. Then, at the bottom of the screen, click Add New Webhook to Workspace.

On the next screen, select your Slack workspace and the channel you’d like to post messages to, then click Allow.

You should now see your new Slack Webhook with a copy button. Copy the Webhook URL, and if you’ve already cloned the repo and followed the Getting started steps in the README, add it to the SLACK_WEBHOOK_URL variable in your .env file.

Slack App Configuration

From the left-hand navigation, select Basic Information. On this screen, you can customize your app and add an icon and description. Be sure to click Save Changes when you’re done.

If you now head over to your Slack, you should see that your app has been added to your workspace.

That completes the Slack section of this article. It’s now time to add your environment variables to GitHub Secrets and run the workflow.

Add GitHub Secrets

Head over to the Settings tab of your GitHub repository, then from the left-hand navigation, select Secrets and variables, then click Actions.

Add the three variables from your .env file under Repository secrets.

A note on the base64 string: You won’t need to include the double quotes!

Run Workflow

To test if your Action is working correctly, head over to the Actions tab of your GitHub repository, select the Job name (Weekly Analytics Report), then click Run workflow.

If everything worked correctly, you should now be looking at a nicely formatted list of the top ten page views on your site in Slack.

Finished

And that’s it! A fully automated Google Analytics report that posts directly to your Slack. I’ve worked in a few places where Google Analytics data was on lockdown, and I think this approach to sharing Analytics data with Slack (something everyone has access to) could be super valuable for various people in your organization.

A Better Google Analytics Alternative

Our recent migration to GA4 left a lot to be desired and led us to explore better Google Analytics alternatives. We tried just about everything out there, including Plausible, Fathom, and several others, each with its own pros and cons. The biggest hurdles were limited features and higher costs.

That’s why we were so excited when we stumbled across Fullres recently. Not only do they have the best pricing around, but they’re also bundling multiple tools we use (ad revenue, analytics, web vitals) into a single platform. Usually, you have to subscribe to multiple services and jump between browser tabs to see that amount of data together. Looking at their roadmap, there’s a lot more coming, too.

Fullres also stood out with its quick five-second installation. You get instant access to audience statistics in a GDPR-compliant manner, plus built-in Web Vitals data to continuously improve key metrics such as First Contentful Paint (FCP) and Largest Contentful Paint (LCP), among others.

For those who found the switch to GA4 challenging, Fullres is worth a try. It’s currently invite-only, so join the waitlist as soon as possible to get early access.

Data Analytics Trends To Watch in 2024

Technological advances in data analytics have changed how data is accessed, stored, and managed over the years. Many companies today have robust tools, cutting-edge technology, and the flexibility to identify and adopt new technologies and trends as they emerge each year, improving best practices and shortening bad data cycles.

In this blog post, we explore the latest trends in data analytics services for organizations of all sizes in 2024 and beyond.

Knowledge Graphs and Analytics Without Graph Databases for Gen-AI

Graphs are more relevant and useful today than ever. Thanks to the AI revolution happening right now, engineers are thinking about the opportunities around Gen-AI, leveraging open Gen-AI solutions with dynamic prompting, data grounding, and masking which further pushes them to think about effective solutions like knowledge graphs.

Mary, an engineer, is working on a data grounding problem and is considering building a knowledge graph for an AI solution for personalized product recommendations at work, and she starts to wonder about…

How Big Data Is Saving Lives in Real Time: IoV Data Analytics Helps Prevent Accidents

Internet of Vehicles, or IoV, is the product of the marriage between the automotive industry and IoT. IoV data is expected to get larger and larger, especially with electric vehicles being the new growth engine of the auto market. The question is: Is your data platform ready for that? This article shows you what an OLAP solution for IoV looks like.

What Is Special About IoV Data?

The idea of IoV is intuitive: to create a network so vehicles can share information with each other or with urban infrastructure. What’s often under-explained is the network within each vehicle itself. On each car, there is something called a Controller Area Network (CAN) that works as the communication center for the electronic control systems. For a car traveling on the road, the CAN is the guarantee of its safety and functionality because it is responsible for:

Achieve Continuous Improvement With Augmented Analytics

If you want to encourage and enable continuous improvement within your enterprise, analytics can go a long way toward helping you achieve your goal. When you provide access to augmented analytics solutions for your business users, you distribute knowledge and engender fact-based decision making, helping your users to accurately complete tasks and plan for future success. 

McKinsey recently surveyed 2,000 businesses and found that 83% of high-tech/media/telecom companies, 76% of banking companies, and more than 50% of consumer companies identified as continuous improvement organizations. There is good reason for these results: continuous improvement improves results! And, as McKinsey reports, continuous improvement companies seek to eliminate costs and to empower employees to improve efficiency and drive growth in product and service innovation.

Amazon Neptune Introduces a New Analytics Engine and the One Graph Vision

It's not every day that you hear product leads questioning the utility of their own products. Brad Bebee, the general manager of Amazon Neptune, was all serious when he said that most customers don't actually want a graph database. However, that statement needs contextualization.

Had Bebee meant that in the literal sense, the team he and Amazon Neptune Principal Product Manager Denise Gosnell lead would not have bothered developing and releasing a brand-new analytics engine for their customers. We caught up with Bebee and Gosnell to discuss Amazon Neptune's new features and the broader vision.

Data Ingestion for Batch/Near Real-Time Analytics

In the midst of our ever-expanding digital landscape, data management has taken on the role of custodian of the digital realm, responsible for the ingestion, storage, and comprehension of the immense volumes of information generated daily. At a broad level, data management workflows encompass the following phases, which are integral to ensuring the reliability, completeness, accuracy, and legitimacy of the insights (data) derived for business decisions.

  1. Data identification: Identifying the required data elements to achieve the defined objective.
  2. Data ingestion: Ingest the data elements into a temporary or permanent storage for analysis.
  3. Data cleaning and validation: Clean the data and validate the values for accuracy.
  4. Data transformation and exploration: Transform, explore, and analyze the data to arrive at the aggregates or infer insights.
  5. Visualization: Apply business intelligence over the explored data to arrive at insights that complement the defined objective.

Within these stages, Data Ingestion acts as the guardian of the data realm, ensuring accurate and efficient entry of the right data into the system. It involves collecting targeted data, simplifying structures if they are complex, adapting to changes in the data structure, and scaling out to accommodate the increasing data volume, making the data interpretable for subsequent phases. This article will specifically concentrate on large-scale Data Ingestion tailored for both batch and near real-time analytics requirements. 

Data Anonymization in Test Data Management

The potential for data analytics to unlock economic opportunities is immense; however, as this potential expands, it also gives rise to new privacy challenges. Data anonymization is a crucial technique in this landscape, ensuring that sensitive information is removed or concealed. This results in anonymous data that can be used without risk of data breaches or authorization requirements. 

The implications of a data breach, as discovered by an IBM study, reveal a significant temporal distribution of financial impacts, making data anonymization a critical consideration.

How can I learn about digital marketing?

Learning about digital marketing can be an exciting and rewarding journey. Here's a step-by-step guide to help you get started:

  1. Understand the Basics:
    Begin by familiarizing yourself with the fundamental concepts of digital marketing, such as SEO (Search Engine Optimization), SEM (Search Engine Marketing), social media marketing, email marketing, content marketing, and more.
  2. Online Courses and Tutorials:
    Enroll in online courses on platforms like Udemy, Coursera, LinkedIn Learning, and Skillshare. Look for courses specifically focused on digital marketing. Some recommended courses include "Digital Marketing Specialization" on Coursera and "The Complete Digital Marketing Course" on Udemy.
  3. Read Blogs and Books:
    Follow reputable digital marketing blogs such as Neil Patel, Moz, HubSpot, and DigitalMarketer. Reading books like "Contagious" by Jonah Berger and "Jab, Jab, Jab, Right Hook" by Gary Vaynerchuk can also provide valuable insights.
  4. YouTube and Webinars:
    Watch educational YouTube channels and webinars that cover various aspects of digital marketing. Channels like "Google Ads" and "Digital Marketer" offer insightful content.
  5. Practice Hands-On:
    Apply what you've learned by creating your own website or blog. Experiment with different digital marketing strategies and tactics to see what works best.
  6. Google Analytics and Google Ads:
    Gain proficiency in using tools like Google Analytics and Google Ads. Google's online resources and certification programs can help you understand how to analyze website traffic and create effective ad campaigns.
  7. Social Media Platforms:
    Learn how to leverage different social media platforms for marketing purposes. Each platform has its own nuances and best practices.
  8. Content Creation and SEO:
    Study content creation techniques and SEO principles to improve your website's visibility on search engines.
  9. Email Marketing:
    Explore email marketing tools and learn how to create engaging email campaigns that resonate with your target audience.
  10. Networking:
    Join online communities, forums, and social media groups dedicated to digital marketing. Engage with professionals in the field, ask questions, and share your insights.
  11. Certifications:
    Consider earning certifications such as Google Ads Certification, HubSpot Academy certifications, and others. These can boost your credibility and demonstrate your expertise.
  12. Stay Updated:
    Digital marketing is a rapidly evolving field. Subscribe to industry newsletters, follow influencers, and keep up with the latest trends and updates.
  13. Hands-On Projects:
    Work on real-world projects to apply your knowledge and build a portfolio. This can be immensely valuable when seeking job opportunities or freelance work.
    Remember, digital marketing is a broad field, so take your time to explore different areas and find what resonates with you the most. Continuously learn, adapt, and experiment to stay ahead in this dynamic landscape.

Five Critical Questions To Inform an IPaaS Selection for SMBs

For a long time, only large, well-funded businesses made data analytics the centerpiece of their operations. That's because standing up an analytics department was costly, and positive results were far from guaranteed. In fact, some estimates indicate that only a paltry 20% of analytics-driven insights yielded meaningful business outcomes as of 2022. At scale, though, that's not a major problem. After all, a single well-placed insight within that 20% could mean that a big analytics program had paid for itself.

However, for a small business, those odds don't always justify spending scarce resources on an analytics program. That was, of course, until a panoply of plug-and-play SaaS analytics solutions appeared that took much of the heavy lifting out of mining business data for insights. As a result, some 67% of small businesses now spend at least $10,000 per year on analytics operations.

Conducting UX Surveys: A Practical Guide

UX surveys can be pivotal tools for designers seeking to understand user preferences, opinions, and behaviors. They foster alignment between design strategies and user expectations and can improve product or service usability. Our overview unravels the process of conducting UX surveys, highlighting how both quantitative and qualitative approaches can yield essential user insights.

Conducting UX Surveys: Their Role and Execution

UX surveys serve as channels to collect insights directly from users about a product or service. They come in various forms, from online questionnaires to in-person discussions. These surveys aim to acquire both qualitative and quantitative data about user satisfaction, ease of use, and areas of potential improvement.

Conducting UX surveys follows a structured process. You begin by setting clear goals and deciding what you aim to learn from the users. Then, you design a set of questions that invite insightful and actionable responses. Following the data collection, the task of data interpretation begins, leading to design changes that respond to the users’ needs.

Quantitative vs Qualitative: A Balancing Act

Quantitative surveys are useful when your goal is to collect numerical data. These types of surveys are great for tracking metrics such as usage frequency, user demographics, or user preferences. They offer the advantage of capturing data from a large audience, which can then be statistically analyzed to discern broader patterns and trends.

However, qualitative surveys offer something different. They are used when you want to dive deeper into the user’s thoughts, emotions, and experiences. Crucially, open-ended questions are the cornerstone of qualitative surveys, encouraging users to express their opinions freely. Although they might not yield broad statistical data, qualitative surveys provide detailed, nuanced information that can be invaluable for your design process.

Effective UX Survey: The Practical Steps

A well-designed UX survey is a careful process, requiring both strategic thinking and an empathetic understanding of your users. We’ll observe some of the indispensable steps that can guide your survey creation.

Objective Setting

Every UX survey must start with clear objectives. Whether you’re seeking to understand user behavior, assess user satisfaction, or gather feedback on a new feature, defining these goals will steer the development of your survey. It influences the kind of questions you will ask, the selection of respondents, and even the choice of the survey method. Clear goals ensure the collected data is genuinely useful and purpose-driven for your design strategy.

Drafting and Revision

The initial draft of your survey questions serves as a blueprint that should ideally be subjected to a review process. Don’t hesitate to involve your team, respected peers, or mentors in refining the questions. Their feedback will help eliminate ambiguities, prevent biased questions, and ensure the questionnaire resonates with your target audience.

Choosing the Right Platform

Selecting the most suitable platform for your UX survey significantly affects response rates and data quality. The nature of your survey – whether it’s a quick poll, an in-depth questionnaire, or an interactive survey – plays a huge role in this decision. Other factors to consider include the complexity of your survey, the technical competency of your target demographic, the platform’s user-friendliness on various devices, its visual appeal, and cost-effectiveness.

Question Design

The construction of your questions can be vital for the insights you gather. Closed-ended questions, such as multiple-choice or Likert scale items, provide structured responses that are easier to analyze and compare. Meanwhile, open-ended questions encourage users to express their thoughts freely, providing deeper context and insight into their experiences. The key is to strike a balance: ask specific, direct questions to capture hard data, and open-ended ones to allow space for unexpected but valuable feedback.

Strategic Question Ordering

The placement of questions in your survey requires careful thought. Given the reality that some respondents will not complete the entire survey, it’s practical to position the most critical questions at the beginning. With this, you can somewhat secure the most valuable data, regardless of whether the user completes the entire questionnaire. Still, ensure a natural flow that doesn’t feel abrupt to the participant.

Testing the Waters

Prior to a full-scale launch of the survey, it’s beneficial to conduct a pilot test with a smaller, yet representative, sample of your user base. This approach allows for the identification and rectification of any potential issues – from ambiguous questions and technical glitches to unexpectedly long completion times. Moreover, pilot testing provides an opportunity to assess the survey’s ease and relevance, ensuring that the final version is as refined as possible before it reaches all users.

Wrapping Up

UX surveys can yield valuable user perspectives, but they should be seen as guides rather than definitive decision-makers in design choices.

Additionally, remember that a survey is a time commitment for your users. Avoid deterring completion or introducing response bias by overloading it with questions. Aim for a concise, engaging survey with a balance of question types.

Instead of duplicating data from analytics, use surveys to uncover user motivations, thoughts, and feelings that analytics can’t capture.

Lastly, consider both the user experience and your analysis capabilities when formatting questions. Open-ended questions offer rich insights but can overwhelm users and complicate analysis. Pilot-test these questions and refine them based on feedback. Some may work better as closed-ended questions for easier response and analysis.

For additional insights on managing broader yet valuable UX aspects, such as minimizing decision fatigue, feel free to check out this article.

Building A Log Analytics Solution 10 Times More Cost-Effective Than Elasticsearch

Logs often take up the majority of a company's data assets. Examples of logs include business logs (such as user activity logs) and Operation and Maintenance logs of servers, databases, and network or IoT devices.

Logs are the guardian angel of business. On the one hand, they provide system risk alerts and help engineers quickly locate root causes in troubleshooting. On the other hand, if you zoom them out by time range, you might identify some helpful trends and patterns, not to mention that business logs are the cornerstone of user insights.