If you'd be up for sharing some lessons / takeaways / challenges here, or even better, having a chat (I'll reach out) that would be amazing.
We'll of course fully attribute learnings / quotes etc.
I think "self-serve" analytics is silly, the idea that you put all of the data in front of people and they'll derive "insights". That's not how normal people or data work. We just had a discussion on HN the other day about Facebook's Prophet, and its pitfalls. Meanwhile we expect Joe in sales to be able to identify useful trends on a chart he made. Every company needs to forecast, regardless of their sophistication. That stuff needs to be defined by the right people and given to the users.
Good decision support is where most of the value is, and it’s about building things that draw conclusions, not just throwing the data over the fence with 50 filters and expecting the end consumer to do the actual analysis.
I now work on an open source, code-based BI tool called Evidence, which incorporates a lot of these ideas, and might be of interest to people in this thread.
https://github.com/evidence-dev/evidence
Previous discussions on HN:
https://news.ycombinator.com/item?id=28304781 - 91 comments
https://news.ycombinator.com/item?id=35645464 - 97 comments
Too many systems have too much data for too many customer categories and end up being useless to everybody.
Also, it might surprise a lot of less experienced developers just how many reporting tools are actually pieces of a workflow, not just reports. If you sniff this out during the requirements phase, do your best to convert these reports into features of an actual workflow app/system rather than allow them to persist as standalone reports.
I think some people have a skewed view if they do most of their work with VC funded/SV companies. The average person at these companies is way more data savvy than average.
But there are so many companies out there that make a ton of money and have data-unsophisticated-but-domain-wise users, and old systems. Low hanging fruit.
These things happen all the time. And yet most companies out there think that the solution is to just build a bunch of dashboards, foreseeing what everyone will ask in the future. And then nobody checks the dashboards. Or finds the right one. And then they have a team of SQL translators pulling data for ad-hoc questions. That's silly IMO.
I'm obviously biased as a founder of a self-service analytics company based on AI (https://www.veezoo.com). But this is just my 2 cents on a topic I really care about.
In my experience, what "self-serve" really means is "non-developer". The end user won't build it, they'll have a BA do it. But it does mean they don't need IT to help.
I’ve heard pretty high-level managers respond to that question with things like “we were hoping your data would tell us” in response and I’m not sure what to make of it.
Hah, 90% of the time. I think a big part of being good at this job is being able to coax that information out of people.
You need a process of drilling down, kind of like the 5 Whys[0]. You want to make more profits, right? That means we need to either increase revenues or decrease costs. Are we measuring all these things (you'd be surprised at the number of seemingly successful companies who can't)? Okay, how do we affect revenue? By increasing the number of users or increasing the revenue per user. Are we measuring those things? And on and on. It's a perfect way to iterate, and as the company matures it can be infinitely more and more sophisticated. For lower level people, sometimes it means sitting there and watching them do their job.
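The drill-down above is essentially walking a metric tree and checking each driver against what's actually measured. A toy sketch of that idea (all metric names and the tree itself are invented for illustration):

```python
# Decompose a target metric into its drivers and flag any driver
# nobody is measuring yet. Names and structure are hypothetical.
metric_tree = {
    "profit": ["revenue", "costs"],
    "revenue": ["active_users", "revenue_per_user"],
    "active_users": ["new_signups", "churned_users"],
}

measured = {"profit", "revenue", "costs", "active_users"}  # what we track today

def unmeasured_drivers(metric, tree, measured):
    """Walk the tree under `metric`, collecting drivers nobody measures."""
    gaps = []
    for driver in tree.get(metric, []):
        if driver not in measured:
            gaps.append(driver)
        gaps.extend(unmeasured_drivers(driver, tree, measured))
    return gaps

print(unmeasured_drivers("profit", metric_tree, measured))
# ['new_signups', 'churned_users', 'revenue_per_user']
```

In practice the "tree" lives in conversations, not code, but writing it down once makes the measurement gaps impossible to ignore.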
> I think "self-serve" analytics is silly, the idea that you put all of the data in front of people and they'll derive "insights". That's not how normal people or data work
So well said. It doesn't shock me anymore when someone asks for a succinct summary or a PDF version rather than digging through dashboards on their own. In my company, we have a user-facing analytics product, and we added the option to take a PDF snapshot on a recurring basis and send it via email!
- Everyone asks to translate simpler spreadsheets and Excel charts/graphs into dashboards in your BI tool of choice. As soon as it's there, they'll ask you why they can't export to manage the data themselves. This vicious cycle can sometimes be stopped but is a slow-motion drag on productivity in lots of orgs.
- Build in validations, and/or work on ways to check the dashboard. Dashboards sometimes put their builders and consumers on auto-pilot. The dashboard "must be right" but could easily have a bug or inaccuracy for weeks/months/etc. that isn't obvious without some external validation.
- The dashboard never has the "right" metrics - users will continue asking for changes. Be your own best advocate and say no at first, as a way of gauging how important the ask really is
- Related: always ask why about everything you're building into or modifying in dashboards. Business users often ask for things without an ounce of rationale.
- Related: taking away is harder than not doing at all!
Finally, I think most dashboards miss one fundamental point. Imagine you're the CEO/COO and you've got this beautiful 3 or 4-chart dashboard in front of you. What should you know about what you're seeing? What's the succinct summary?
I like building in spots to write 2-3 sentence executive summaries.
Take a metric like Average Order Value (AOV). It may be defined as total sales / order quantity. But as that metric is used it's often compared to something like last year, last month, or a plan, and anyone interested in that number is really interested in understanding why it has changed from some other point in time/scenario.
For that, you actually need to bring in line item details behind orders as each order has multiple products/skus and they likely sold at different prices from a year ago or what was expected in a plan. An analysis of this has a name, price-volume-mix analysis or PVM.
I always seem to have to explain this to BI teams when I join a new company and am seeking data. I'm currently going through it with a BI team; apparently their BI tool doesn't store this information. It only stores aggregate values, so it's not even possible to get base-level data for analysis (without major architectural changes). I don't know if that's normal in BI or was an implementation decision at some point, but I've come across this same thing at a handful of companies, and as I said, I really have to drive this concept home for those teams. When I ask for it I'm usually met with a "why would you need that info / give us a use case". Which means they don't even understand how un-intelligent their BI tool is, or why the execs likely aren't feeling like investing in BI has been worthwhile (e.g. ever build a dashboard that then goes unused? It probably wasn't perceived as useful for some reason like this).
This could be more concisely put as: understand your end users' needs. What people ask for is often different from what they need. If they ask for AOV metrics, they're really saying "I need to understand AOV", and that's done via PVM analysis.
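A minimal PVM sketch over line-item data. All SKUs and numbers are invented, and the split used here (volume effect at total growth, mix at old prices, price at current volumes) is one common convention among several:

```python
# Price-volume-mix: explain the change in revenue between two periods
# as price + volume + mix effects that reconcile exactly to the total.
last_year = {"sku_a": (10.0, 100), "sku_b": (20.0, 50)}   # sku -> (price, qty)
this_year = {"sku_a": (11.0, 90),  "sku_b": (19.0, 80)}

def pvm(prev, curr):
    growth = sum(q for _, q in curr.values()) / sum(q for _, q in prev.values())
    volume = mix = price = 0.0
    for sku in prev:
        p0, q0 = prev[sku]
        p1, q1 = curr[sku]
        volume += p0 * q0 * (growth - 1)    # same mix, scaled total volume
        mix += p0 * (q1 - q0 * growth)      # shift between SKUs at old prices
        price += (p1 - p0) * q1             # price change at current volumes
    return volume, mix, price

vol, mix, pri = pvm(last_year, this_year)
rev_change = (sum(p * q for p, q in this_year.values())
              - sum(p * q for p, q in last_year.values()))
assert abs((vol + mix + pri) - rev_change) < 1e-9  # effects reconcile exactly
```

This is exactly why the line-item detail matters: none of these effects can be computed from the pre-aggregated AOV number alone.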
> Which means, the don’t even understand how un-intelligent their BI tool is or why the execs likely aren’t feeling like investing in BI has been worthwhile
And this relates to what I was thinking about in my first comment. I was once talking with the COO at my last job, a 1000+ person company, and asked him whether more concise requests for things would drive productivity. He, point blank, said: "sometimes I don't even know what I'm asking for"
I've remembered that moment for years. In so many situations, the actual BI/dashboard is the least important part of the puzzle. Instead it's all of the conversation and discovery to understand the real need(s)
Deserves extra upvotes just for this statement.
This has always been painful to me working in the data analysis and reporting space. When I get requests for dashboards or reports that lack an answer to the question "how will this be used?", I tend to find the requesting groups are cost-centers in the larger organization, somewhat obsessed with processes and procedures.
This is rarely a good group to build a career with . . .
Other times it’s because a supervisor or internal customer likes to see certain things on a chart, and putting the chart in Power BI/Tableau/other tool will make them prettier than Excel charts.
Very few people, starting from the top on down, have a good understanding that dashboards mean very little in and of themselves.
Validation/testing has always been a challenge, especially given that dashboards are by definition quite “full stack” implementations where testing just the front end or back end is not sufficient and testing both in isolation can also often be challenging due to the huge possible variations in input data.
Mocking data is also hard because dashboards may also lean a lot on database-side calculations/filtering.
All of this has led me to take quite a full-fat approach to testing dashboards: using a real DB populated with test data and testing the complete application stack (driven by something like Playwright or Cypress), as well as more granular unit tests where a mocked data layer may be used.
I’m also looking at introducing visual regression tests next time I work on this kind of thing. The visual aspects of dashboards can easily drift over time even if the data is correct. You’re often theming charting libraries for example and the compliance of the theme can drift slightly if you update the library without really checking every detail of the visual appearance/layout every time. Or you may not even notice the “visual drift”…
> Validation/testing has always been a challenge, especially given that dashboards are by definition quite “full stack” implementations where testing just the front end or back end is not sufficient and testing both in isolation can also often be challenging due to the huge possible variations in input data.
Constantly evolving but I've always tried hard to keep calculations away from the display tools. So, I put lots of things in SQL SPs, or in Python, or more broadly in tooling that allows me to recreate the summary data without the front-end. My nightmare is having to check a PowerBI calc that itself is based on an underlying SQL calc. Which one is wrong? Now spend twice as long figuring it out!
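One concrete way to keep the calculation out of the display tool is to define it once as a SQL view, so the dashboard, the tests, and any notebook all read identical numbers. A sketch with an invented schema:

```python
# The summary calculation lives in a SQL view -- the single source of
# truth -- so it can be recreated and checked without any front-end.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'EU', 100.0), (2, 'EU', 50.0), (3, 'US', 75.0);

    -- The one and only definition of 'revenue by region'.
    CREATE VIEW revenue_by_region AS
        SELECT region, SUM(amount) AS revenue
        FROM orders
        GROUP BY region;
""")

# The BI tool and the test suite both query the view; neither redefines the math.
rows = dict(con.execute("SELECT region, revenue FROM revenue_by_region"))
assert rows == {"EU": 150.0, "US": 75.0}
```

If a PowerBI card ever disagrees with this query, you know immediately which layer is wrong.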
> The visual aspects of dashboards can easily drift over time even if the data is correct. You’re often theming charting libraries for example and the compliance of the theme can drift slightly if you update the library without really checking every detail of the visual appearance/layout every time. Or you may not even notice the “visual drift”..
Love it, very smart. Why I prefer tables for many things too - one less thing to maintain and check.
Having been on both sides of this, I think the challenge is that the CEO/COO's job is to figure out "what should we do about this?," which is the right approach to coming up with that summary (it's not just "here's a text version of the chart"). And the corollary challenge is that, in most cases, non-technical people with domain knowledge are the ones who need to produce the analysis: so any feature-incomplete dashboard is going to stymie them, and any general framework that requires a technical person to step in for code or configuration is going to slow the process to a crawl.
It's the rule (not the exception) that (especially if things are going poorly) the next step is asking more questions, which involves investigating something else in more detail. A dashboard, however pretty, is as useless as a doorknob if it doesn't have the needed information.
I have found that dashboards per se are always great as the high-level KPI trackers, like the things you would consider hanging on a wall in an office (e.g. "revenue growth this month" or "new customers acquired"). You'll always want to know that information, and many people in unrelated departments need to have that information shared with them.
The other helpful area is deep-dive, domain-specific analytics products, like for example Google Analytics, which has a very full feature set for non-technical marketing people to go in and drill down to answer questions. The UI/UX designers of that product have spent years honing and A/B testing which types of graphs to show where, and mapping out how people click around to find what they're looking for, to the point it is pretty easy for non-technical people. They even have courses and certifications on how to use the system.
Organizations that try to internally build a feature-complete system like Google Analytics for a specific domain need to consider it like building an entire software product (even if there's a general low/no-code BI SaaS to assist) because you'll need collaboration between general technical experts and non-technical stakeholders with changing and vague requirements. It can be done, but likely only with years of investment and UI/UX research, just like any other software product that solves a domain problem well. In practice: millions of dollars.
Technologists often forget that Excel *is* a Turing-complete programming language (and it's a functional programming paradigm too!). If an org is not committed to spending years and millions of dollars on deep-dive analytics for a specific domain, the right choice is almost always a commercial analytics system for that domain that costs less than the internal build, or embracing the trusty spreadsheet.
Totally agree. I'd even go a little further and say the business is in trouble if the CEO doesn't know "what we should do about this". It's the CEO's job to know those things, and it's the data team's job to provide the tools to make those decisions easier, faster and better.
Original creator of (the now woefully dated-looking) GBD Compare [https://vizhub.healthdata.org/gbd-compare/] here, where we found this super useful since we had so many controls that it could take a lot of clicking (and knowledge of the UI) to recreate a graph someone else was looking at. It really helped with reach, as folks could email/tweet their specific view then others could use that as a starting point to dive in without starting from scratch or having to create an account.
Sharing the parameters, filters, etc.
Sharing the results.
They can both be very important.
* Layout for dashboards is almost completely formulaic. A panel for selected high-level stats (user growth % increase from last year, user % increase from last month, # new users added), a panel for breakdowns (user growth by marketing channel, user growth by registration cohort), a panel at the top for filters ("let's filter the entire dashboard by just this marketing channel, or just this registration cohort") identical to all breakdowns provided, and finally a row-level drill-down ("show me the users in this particular aggregation"). It took me a very long time to learn that this design is entirely cookie-cutter for good reason. Users always want the same things: stats, breakdowns, filters and drill-downs.
* Padding matters, font matters, color palette matters, the absence of typos matters, visual hierarchy matters (i.e. big bold numbers versus smaller grey numbers).
* Always define the key metrics first (based on fact tables). All dimensions and drill-downs in the dashboard will derive from these front-and-center stats.
* Reconcile to existing metrics before broadcasting widely - almost always, people have the same stats in extant technologies (e.g. Excel, Mixpanel, Salesforce) and will instantly find inconsistencies between your figures and the extant ones.
* The vast majority of users will be passive viewers. Very few users will be "power" EDA (exploratory data analysis) users. EDA views will look different from the view that passive viewers want - keep them separate
* Obviously, the more things done in code, which promotes modularity and composability, the fewer data integrity issues you will have
Is there any chance you could link an image of what a good version of this looks like?
1. Six top-level stats jump out at you: customers, orders, revenue, growth %, current week revenue, previous week revenue. All of these stats are adorned with a few substats (smaller text), almost always a % up/down from last period
2. A few large panels with breakdowns: revenue over time, revenue vs projections, revenue by referral source, revenue by location
3. The top right has your filter buttons, and generally it includes every breakdown dimension on the page. For example, "let's look at this dashboard by just the Google referral source" or "let's look at our stats from the U.S. geography only" or "let's filter this for last 2 years only"
4. Drill-down is "top selling products." This isn't truly a drill-down, as it is still an aggregation, so you really want to drill-down to the record-level. If you filter the dashboard for "U.S. sales by the Google referral source for the last 2 years only", people invariably want to see what the actual row-by-row sales were, and that is the drill-down. They can easily export this and reconcile to source systems. As an example, for some of the work I do, sales reps don't just want aggregations about their sales leads, they want the actual names of actual sales leads (as rows) so they can contact them.
So again, four major parts to a dashboard, which really drive from two simpler (likely familiar to most data analysts): metrics and dimensions.
Dashboards seem alluring because we imagine that users will sit there and somehow have insights delivered to them automatically. It's often less clear what those insights will be or what is needed to produce them; we somehow hope they will materialize just by displaying some data. Often the focus is on making pretty-looking charts (which only ever look good when you demo with picturesque fake data), because you want the product to feel colorful, welcoming and visual.
A better approach is to either make a focused tool for solving a specific problem you know users have - you won’t think of what you end up with as a “dashboard” but it might occasionally end up looking a little like one - or to make general tools that allow users to dig through data interactively to find the things they care about.
- What do you hope to learn from this tool?
- Is there a less expensive way to get this information?
- The data will move in one of three directions: up, down, or stay the same. Ahead of time, what will you do in each case? Asking me to change the direction of the line is not an acceptable answer. Do we still need to make the chart? Or were all three answers the same?
- This is not a one-and-done project. The moment some visibility emerges in the fog, you will be desperate for more answers. We must set up a process for the never ending litany of questions that will emerge from this work.
- Smaller is better: incremental delivery, fast iteration, and the ability to change are all far more important in dashboard work than being stable, long-term, and deeply reliable.
- This is the conversation I even have with myself as I work on data for my own company.
It's a feedback system. Feedback is only useful if it can trigger behavior change. How can this measurement change the company's behavior?
Anything else is a vanity metric.
For instance, if a marketing head wants to plot CAC (cost of acquiring customers) over time, saying CAC is marketing spend divided by number of customers is manager-speak. Spends are budgeted higher early in the month and adjusted with actuals. Customers ask for refunds and cancel accounts. Some campaigns have volume incentives which are only known later... and so on. The solution is to write well-commented SQL which laymen can audit and improve.
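A hedged sketch of what that "well-commented SQL a layman can audit" might look like, run here through sqlite for self-containment. The tables, columns, and figures are all invented for illustration:

```python
# CAC with the real-world adjustments spelled out in comments,
# so a non-engineer can read and challenge each assumption.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE spend (month TEXT, budgeted REAL, actual REAL);
    CREATE TABLE customers (id INTEGER, month TEXT, refunded INTEGER);
    INSERT INTO spend VALUES ('2024-01', 12000, 10000);
    INSERT INTO customers VALUES
        (1, '2024-01', 0), (2, '2024-01', 0), (3, '2024-01', 1), (4, '2024-01', 0);
""")

cac = con.execute("""
    -- CAC = actual (not budgeted) spend / customers net of refunds.
    -- We use reconciled actuals because budgets run high early in the month;
    -- refunded customers are excluded because they never really converted.
    SELECT s.actual * 1.0 / COUNT(c.id)
    FROM spend s
    JOIN customers c ON c.month = s.month AND c.refunded = 0
    WHERE s.month = '2024-01'
""").fetchone()[0]

print(round(cac, 2))  # 3333.33  (10000 actual spend / 3 net customers)
```

The comments carry the business logic; anyone who disagrees with an adjustment can point at the exact line.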
Another thing customers love is the dynamic ability we give them to switch how certain visuals are grouped or what value is being displayed. We can't foresee all the different ways users will want to slice and dice the data, so giving them that ability was huge.
There is a chicken and the egg problem when it comes to designing these things.
I can ask "What do you want the dashboard to look like" and they'll answer "I don't know before I see the data".
Then I'll ask what data they want to see, and they'll respond "What will it look like?", or we'll spend significant time on data collection only to find they never actually want it in a dashboard after all.
By far and away the most time consuming aspect of this entire domain is to find out what users actually want to see, as they almost never have something specific enough when they approach me.
Answers don't just lead to Eureka moments, they lead to follow up questions and follow up questions.
Not a complaint - it's actually great. Just an observation (and a challenge)
I showed him this tutorial I had recently seen. Just a few minutes and the thumbnail, about how to build a "dashboard" in excel. https://youtu.be/z26zbiGJnd4?si=HWn8qTbozD8vmXiF
"Oh wow, I didn't know excel could look so beautiful!". He asked for the link, never did anything with it of course but was totally satisfied. I am pretty sure he just wanted a shiny toy and also felt inadequate about "just using excel" to do his important founder work. Showing him that excel can look beautiful and is a powerful tool was enough. No more feeling inadequate, no need for an actual (or even excel) dashboard.
In our team’s experience, the most important factor in getting engagement from users is including the right context directly within the report - definitions, caveats, annotations, narrative. This pre-empts a lot of questions about the report, but more importantly builds trust in what the data is showing (vs having a user self-serve, nervous that they’re making a decision with bad data - ultimately they’ll reach out to an analyst to get them to do the analysis for them).
The second most important factor was loading speed - we noticed that after around 8 seconds of waiting, business users would disengage with a report, or lose trust in the system presenting the information (“I think it’s broken”). Most often this resulted in people not logging in to look at reports - they were busy with tons of other things, so once they expected reports to take a while to load, they stopped coming back.
The third big finding was giving people data where they already are, in a format they understand. A complicated filter interface would drive our users nuts and turned into many hours of training and technical support. For this reason, we always wanted a simple UI with great mobile support for reports - our users were on the go and could already do most other things on their phones.
We couldn’t achieve these things in BI tools, so for important decisions, we had to move the work to tools that could offer text support, instant report loading, and a familiar and accessible format: PowerPoint, PDF, and email. Of course this is a difficult workflow to automate and maintain, but for us it was crucial to get engagement on the work we were producing, and it worked.
This experience inspired my colleague and I to start an open source BI tool which could achieve these things with a more maintainable, version controlled workflow. The tool is called Evidence (https://evidence.dev) if anyone is interested.
I also feel that speed builds trust, although I don't know specifically why. Perhaps people envision more errors or error-prone processes when a system is slow. It certainly shows more understanding of the data to present it quickly.
Obey rules of spacing more carefully than other rules, to avoid overwhelming the user.
Do not use colours unless signalling information, so users can stay relaxed and be alerted only when needed.
As soon as you have more than 2 types of information, have expanding panels, which remember whether the user expanded/collapsed them.
Lastly, remember that speed of loading data is much more important for dashboards in general than for a random page. Cache data, or initially load only summary data, or only load the latest day by default and then fetch the week's data. Remember clients may make purchasing decisions based on how fast the stats page of your SaaS loads when they are showcasing it to their C-suite, and a 15 second wait can cost you your enterprise sale.
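A minimal sketch of the "cache the summary data" tactic: a TTL cache in front of the slow warehouse query, so repeat visits render instantly. The query function, timings, and figures are hypothetical:

```python
# Serve precomputed dashboard aggregates from a TTL cache; only hit the
# slow query when the cached copy has expired.
import time

class TtlCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (expires_at, value)

    def get_or_compute(self, key, compute):
        now = time.monotonic()
        hit = self.store.get(key)
        if hit and hit[0] > now:
            return hit[1]                       # fast path: cached summary
        value = compute()                       # slow path: query the warehouse
        self.store[key] = (now + self.ttl, value)
        return value

cache = TtlCache(ttl_seconds=300)
calls = []

def summary_query():
    calls.append(1)                 # stands in for an expensive SQL query
    return {"revenue_today": 4200}

cache.get_or_compute("summary", summary_query)
cache.get_or_compute("summary", summary_query)   # served from cache
assert len(calls) == 1
```

Real deployments would add invalidation and background refresh, but even this much turns a 15-second wait into a millisecond one for everyone after the first viewer.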
>“Data is the new oil.” Clive Humby, 2006
>“Most of my career to date seems to involve redesigning legacy reports to make it easier for existing users (if any) to see that they contain absolutely no actionable insight with a lot less effort.” Jeff Weir, 2023
From my perspective:
In general, I find most users can't actually say whether they need any given number/visual on an ongoing basis. So large amounts of work go into building dashboards that are used for a very short amount of time and then discarded. Probably we should do a better job on one-off analyses and only dashboard after the fact.
Many users don't actually want a dashboard, what they actually want is a live data dump into excel where they can pivot table it. Maybe, maybe a bar or line chart.
In general, I find people always ask for more filters, more slicers, just endless options to reconfigure the data as they please. But they quickly become trapped in a swamp of their own making, now nobody knows how this should be sorted or sliced, does it even make sense to do it this way? People think what they want is a 'data democracy' with hundreds of dashboards with hundreds of options with hundreds of users and so they ask for and usually receive it. But they usually just end up coming back to the data team and asking - 'so what's the answer?' What many orgs need is actually a data dictator.
On the other hand, dashboards do allow you to establish really good feedback loops within the business so when you can identify an ongoing constraint, figure out how to track it and then force people to receive it on a regular cadence and be accountable to it, you can make a lot of headway. But that's a more niche use-case than how they're frequently used and the skills involved are different - less visualization skills, more business analysis - and you need to be positioned to make sure someone is held accountable.
- You can output the most elegant metric; you'll never know if it was the right one until you talk to actual customers. Most of the time, they don't even understand what is presented.
- Use libraries and UI kits made for this; it will save a huge amount of time.
- Whatever you do, it will never be enough, will be wrongly interpreted, and will be used in the wrong context.
- Try to tie graphs and metrics to use cases or questions, e.g. titling "Active users" vs "How many users were active* in the last 30 days?" (* define active in the description) can make a huge difference in comprehension.
I’m not sure what they were trying to manage, but it was purdy and looked dashboardy.
There's nothing special about demographic data.
- What users ask for and what users really want are often extremely different.
- Engineering executives like to place their "thumbprint" on every business analytics dashboard. They want evidence that the "intelligence" being reported has been customized by them. It's their way of imparting branding on the organization.
- UI/UX is far more important to users than how you handle the technical details. When discussing implementation with them, start with the UI so that they have a mental model to build from.
- Leave space to create cool things that you/your team want to make. The developers of BI dashboards often have excellent ideas for visualizing data that an end-user would not immediately think of. Leave room to "delight".
- Never assume the data is clean or accurate (even when there are regulatory reasons for it to be either of those things)
- Not everyone's opinion is equally valuable.
- Beware of corporate politics. I once had an analytics project completely shut down because it would expose certain weaknesses in the business that were not acceptable to discuss publicly.
Bonus: Read "Envisioning Information" by Edward Tufte.
The hard part is knowing what information to surface, and how to drive the user towards those insights in an intuitive way. You need a strong team that intersects product, data science and UX. Engineering is the least important aspect of it.
1. Relevant: Don't just build a dashboard for the sake of building a dashboard. First, understand what the goal of the user is, and what metrics they'll want to look at to understand their progress towards that goal
2. Reliable: You only have one shot to get this one right. As soon as you present incorrect data to your users, you've lost their trust forever, so make sure you have solid tooling in place across your data stack that ensures data quality, from collection, through transformations to query time
3. Accessible: The data the user will be looking at needs to be either self explanatory, or the user has to have access to documentation that describes the data they're looking at in detail.
For point 1/, here's a framework to help you identify which metrics to focus on: https://www.avo.app/blog/tracking-the-right-product-metrics
1. Nobody knows what to monitor exactly, every new dashboard is based on a guess.
2. Not much user feedback to base the decisions on if you don't have many users to begin with.
3. Often, the metrics exposed by the app being monitored prove grossly inadequate, or suitable metrics do not exist.
4. You can't just add new metrics. Users have to update the whole distributed app for the new metric to become available. This has to be accounted for at the UI design stage.
5. Somebody has to spend a significant amount of time gathering all the information from random people in the company, because see 1.
Giving customers "secure" sql access was a must have feature from upper management, and it was very tricky/a nightmare to get right.
Customers actually liked it though, sql is king.
Advice I would give: make sure your analytics API's data models and queries are well thought out and extensible. Otherwise it becomes very hard to change them and rework the UX.
Building dashboards that will actually be useful requires the same approach.
1. You want to build the UI to be config-driven. At some point, adding a new chart in code will not scale. Writing good config is hard and requires a lot of careful thinking.
2. Product owners want special snowflakes; try to push back on any customization, as it increases complexity and makes the config harder. It is better to implement usable search, navigation, and sitemaps, or to focus on developer experience (CI/CD, feature flags, etc.)
3. GraphQL is overrated; for complex charts with filters and multiple options it makes caching hard to use in practice. I would like to try tRPC or a similar RPC-based approach next time.
4. The performance impact of large bundles is minimal in practice. You can be shipping 20MB of JS to users, but inefficient re-renders/re-fetches will have way more impact than the amount of code. For charting, I would try ECharts or any commercial WebGL-based charting library. For tables, I would try something that mimics Excel as closely as possible.
5. Centralize the state of the application via redux/signals/jotai. You want a clear separation between the config and the state of components, and you want to build this as early as possible. I guarantee that product will want URL sharing, and adding it later is very difficult.
6. Designers love whitespace; you should fight for information density as much as possible. A design system sounds like a great idea, but it costs millions in practice.
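The URL-sharing point (5) boils down to making dashboard state round-trip losslessly through the query string, so every view is a shareable link. Sketched here in Python for brevity (in a real app this logic lives in the front-end); the field names are invented:

```python
# Serialize dashboard state to a query string and back. If the round
# trip is lossless, any configured view can be shared as a plain URL.
from urllib.parse import urlencode, parse_qs

def state_to_query(state):
    # Sort keys so the same state always yields the same (cacheable) URL.
    return urlencode(sorted(state.items()))

def query_to_state(qs):
    return {k: v[0] for k, v in parse_qs(qs).items()}

state = {"metric": "revenue", "channel": "google", "range": "last_2_years"}
qs = state_to_query(state)
assert query_to_state(qs) == state   # lossless round trip
```

Doing this early forces the "one centralized state object" discipline the comment recommends; bolting it onto scattered component state later is the painful part.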
Often it's something as simple as different interpretations of the data in multiple places (revenue in one place, profit in another) or differing date logic (one query includes a date in the range, others go "up to" that date, etc.). Caching is another issue, especially if you selectively cache only slow queries.
To minimize this, always have an explanation on the chart/card (even if it's hidden but clickable to show)
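The date-logic trap is easy to make concrete: an inclusive `BETWEEN` and an exclusive `<` boundary give two different "March revenue" numbers on the same rows. A tiny sqlite illustration with an invented schema:

```python
# Two reasonable-looking date filters, two different answers -- exactly
# the kind of discrepancy users spot across dashboards.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (day TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('2024-03-30', 10), ('2024-03-31', 20), ('2024-04-01', 5);
""")

inclusive = con.execute(
    "SELECT SUM(amount) FROM sales WHERE day BETWEEN '2024-03-01' AND '2024-03-31'"
).fetchone()[0]
exclusive = con.execute(
    "SELECT SUM(amount) FROM sales WHERE day >= '2024-03-01' AND day < '2024-03-31'"
).fetchone()[0]

print(inclusive, exclusive)  # 30.0 10.0 -- the last day of March is counted only once
```

Which convention you pick matters less than picking one, writing it into the chart's explanation, and using it everywhere.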
- a clean data pipeline is critical. Is your data pipeline manageable? Is it observable? Is it monitorable? Can you make changes quickly at different stages? How do overrides work? Does your data pipeline have entitlements? (Can private data points be provisioned to specific users?)
- Should you implement your own dashboard, or are you reinventing the wheel? Can you reuse/recycle existing BI tools? What are the licenses involved? Power BI is proprietary to Microsoft and will have per-user economics. Grafana is AGPL; be very careful with anything AGPL in your tech stack because it may force you to open source your code. Apache Superset is pretty cool. I've seen big startup valuations built on off-the-shelf BI tools. If it's an MVP, definitely consider using one of these as opposed to rolling your own.
- Making assumptions for your users is risky because users will always ask for more. So building a flexible framework where users can add/remove visuals and build their own analytics may be necessary. The flip side is that this adds complexity and can confuse the user. It's a delicate balance to cater to all types of users: the basic user vs. the power user.
- How do users send you feedback? Bad data point? How do you find out? Can the user address it themselves?
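The entitlements question above (can private data points be provisioned to specific users?) is one of the easier things to prototype early. A hypothetical sketch, with made-up shapes, of row-level filtering at the serving layer so restricted data never reaches the dashboard:

```typescript
// Sketch: every data point carries the groups allowed to see it,
// and the serving layer filters before anything is rendered.
// Field names and shapes are hypothetical.

interface DataPoint {
  metric: string;
  value: number;
  allowedGroups: string[]; // empty array means public
}

function visibleTo(points: DataPoint[], userGroups: string[]): DataPoint[] {
  return points.filter(
    (p) =>
      p.allowedGroups.length === 0 ||
      p.allowedGroups.some((g) => userGroups.includes(g))
  );
}
```

Doing this server-side matters: filtering in the browser still ships the private rows over the wire.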
So it's super important to get on the same page RE: goals and expectations and keep that alignment going to the end - so that there aren't any unpleasant surprises at the delivery stage. Some more on who to get involved and how here: https://www.avo.app/blog/who-should-be-involved-in-tracking-...
https://jorin.me/gettting-started-building-a-data-driven-bus...
Thought it might be relevant to others here.
As for the ones who actually use it: you won't cover all their edge cases.
I'm looking after a decision support system at the moment, and am encountering all the challenges raised here. Glad to see my experience is not unique.
Don't do that. Show only the things users need to act on what's on the screen. Minimize the information, make it "glanceable".
If you have a troubleshooting dashboard, and you're showing 999 items with nothing going wrong, that one item that's actually wrong is not going to pop.
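One way to read "glanceable" in code terms: surface only the items that need action and collapse the healthy ones into a count. A small sketch with made-up statuses:

```typescript
// Sketch: instead of rendering all 999 rows, show only the items
// that need attention, with the healthy ones collapsed to a count.

type Status = "ok" | "warning" | "critical";

interface Item {
  name: string;
  status: Status;
}

function glanceable(items: Item[]): { alerts: Item[]; healthy: number } {
  const alerts = items.filter((i) => i.status !== "ok");
  return { alerts, healthy: items.length - alerts.length };
}
```

The one broken item becomes the entire screen, and "998 healthy" is a single reassuring line.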
I regularly tell customers, "Open up this JSON, hit Ctrl-F, and search for the stats you need." And they're like, "Thank you, you just saved me 50,000 hours of work."
A few things not emphasized enough:
1. Make it accessible. At some point, virtually all of us will have some form of accessibility issue. 508 compliance is a solid standard for this, though it can be a pain to manage without building it in from the get-go.
2. Make it tabbable (similar to accessible).
3. On the development side, make it able to render client-side OR server-side -- not every dashboard will have or need a rendering server. In Python, Altair is the only client-side-rendering library that is also interactive that I'm aware of. This matters for payload size.
4. Related to 3: pay attention to payload size. Make it transparent, either in debug logs or similar, how large the elements passing across the wire are.
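Point 4 is cheap to implement. A minimal sketch of a payload-size logger (the helper names and the 200KB warning threshold are arbitrary, not from any framework):

```typescript
// Sketch: make wire size visible before shipping each element.

function payloadBytes(data: unknown): number {
  // TextEncoder gives the UTF-8 byte length, which matches the
  // wire size of an uncompressed JSON body.
  return new TextEncoder().encode(JSON.stringify(data)).length;
}

function debugPayload(name: string, data: unknown, warnBytes = 200_000): string {
  const bytes = payloadBytes(data);
  const level = bytes > warnBytes ? "WARN" : "DEBUG";
  return `[${level}] ${name}: ${bytes} bytes`;
}
```

Wiring this into the fetch layer means an oversized chart payload shows up in the logs long before a user complains about load times.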
I understand this, but I disagree with some of this or have trouble understanding how this can be applied in practice:
- Reports can absolutely be built in a way that is flexible enough to enable knowledge discovery. Instead of creating a chart that plots Conversion Rate over time, create a chart that plots a Primary Metric against a Primary Dimension, and use parameters to let users choose what the Primary Metric and Primary Dimension are. This drastically reduces the maintenance cost of reports because you don't need to create more charts; you just need to make new data available.
- This design strategy can be expanded to Secondary Metrics, Secondary Segments, and Splits to enable comparison between segments. This is a big step towards finding out the why.
- If you're a big business with both a team of BI developers and a team of Data Analysts I can imagine you'll have plenty of resources to conduct more thorough analysis whenever they are needed. But if you're a startup, you probably have a few Analytics Engineers doing both BI development and analysis. How do you enable them to do both if stakeholders most often don't know what they need? You have to be efficient and I don't think that means having these few Analytics Engineers holding stakeholders hands through a series of discussions to figure out what the hell do they even need...
- Why would you not want everyone in the business to be able to discover new things in the data? Why only allow data analysts to do that? If you provide a platform that enables data exploration in a guided way to avoid wrong use/interpretation of data, isn't it best to open it up to everyone? More people looking into data = more hypotheses = higher probability that at least one of them will be proven and very impactful.
- I think there are different types of data work: setting up data architecture to collect and transform data into a format that enables easy analysis; building solutions for monitoring KPIs (the what); building solutions for understanding the drivers of KPI fluctuations (the why); advanced analytics to support decision making (the actions). My opinion is that the real value is in the last point. Whatever we can do to serve the other needs with minimal effort, we should do
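The parameterized Primary Metric / Primary Dimension idea above can be sketched as a query builder with allowlists. The table and column names here are hypothetical:

```typescript
// Sketch: one chart, metric and dimension chosen by the user.
// Allowlists keep user input from becoming SQL injection.

const METRICS = ["conversion_rate", "revenue", "signups"] as const;
const DIMENSIONS = ["date", "channel", "country"] as const;

type Metric = (typeof METRICS)[number];
type Dimension = (typeof DIMENSIONS)[number];

// Exposing new data means extending the allowlists,
// not writing a new chart.
function buildChartQuery(metric: Metric, dimension: Dimension): string {
  if (!METRICS.includes(metric) || !DIMENSIONS.includes(dimension)) {
    throw new Error("unknown metric or dimension"); // runtime guard
  }
  return `SELECT ${dimension}, AVG(${metric}) AS value
FROM analytics.facts
GROUP BY ${dimension}
ORDER BY ${dimension}`;
}
```

One generic chart fed by this query replaces a whole family of hand-built "X over time" reports, which is exactly the maintenance win described above.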
I always assumed I had good taste, and that my designs were good-looking and would appeal to others.
But different people have completely different ideas about what is usable and what good taste is, so be open-minded enough to listen and accommodate the tastes of others.