What You'll Learn
- How to access the measureQuick cloud dashboard and what subscription level is required
- What key metrics the dashboard displays and how to interpret them
- How to filter dashboard data by date, technician, location, and test type
- How to track trends over time and identify patterns in team performance
- How to compare technician performance using diagnostic data
- How to identify training needs from subsystem failure patterns
- How to export or share dashboard views
What You'll Need
- Device: Computer with a modern web browser (Chrome, Firefox, Safari, Edge)
- Account: measureQuick Premier subscription with Company Administrator or Manager role
- Data: At least 30 days of synced project data (more data produces more meaningful trends)
- Knowledge: Cloud sync setup (K1) and subscription activation (A4)
- Time: 10 minutes to read, ongoing use for team management
Why Dashboard Analytics Matter
Individual test reports tell you about one system. Dashboard analytics tell you about your entire operation. A manager looking at a single project sees whether that system passed or failed. A manager looking at the dashboard sees whether the team is improving, which technicians need coaching, what types of failures are most common, and whether seasonal patterns affect diagnostic quality.
The cloud dashboard at cloud.measurequick.com aggregates all synced project data from every technician in your company into a single view. No exports or spreadsheets required.
Step-by-Step Guide
Step 1: Access the Dashboard
- Open cloud.measurequick.com in your web browser.
- Sign in with your Company Administrator or Manager account.
- The dashboard home page displays a summary of company-wide metrics.
The dashboard requires a Premier subscription. Basic accounts can sync and view projects, but the analytics views, KPI summaries, and team comparison features are Premier-only.

[Screenshot: Cloud dashboard home page showing the summary metrics panel with test counts, pass/fail rates, and technician activity overview]
Step 2: Review Key Metrics
The dashboard presents several categories of metrics:
Test volume:
- Total tests completed (all time, or within the selected date range)
- Tests per day, per week, or per month
- Breakdown by test type (AC/Heat Pump Cooling, Heat Pump Heating, Gas Furnace/Boiler, Quick Tests)
Pass/fail rates by subsystem:
- Each subsystem (refrigerant charge, airflow, static pressure, electrical, venting, etc.) shows its pass/fail percentage
- These rates reflect post-override results, meaning they include any corrections technicians made to the automated pass/fail determinations
- High failure rates in a specific subsystem may indicate a training gap, a regional installation pattern, or a common equipment issue in your service area
Technician activity:
- Number of tests completed per technician
- Average time between Test In and Test Out (indicates job duration and workflow efficiency)
- Number of projects with Test In only (no Test Out), which may indicate incomplete workflows
Project completion:
- Projects with both Test In and Test Out vs projects with Test In only
- Vitals Score distribution (if Premier reporting is used)
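The pass/fail percentages above are simple ratios over the filtered date range. As an illustrative sketch (the data shape here is hypothetical, not measureQuick's actual export schema), the arithmetic looks like this:

```python
from collections import defaultdict

def subsystem_pass_rates(results):
    """Compute post-override pass percentage per subsystem.

    `results` is a list of (subsystem, passed) tuples, one per
    subsystem evaluation across all tests in the date range.
    """
    passed = defaultdict(int)
    total = defaultdict(int)
    for subsystem, ok in results:
        total[subsystem] += 1
        if ok:
            passed[subsystem] += 1
    return {s: round(100.0 * passed[s] / total[s], 1) for s in total}

# Hypothetical sample: 4 refrigerant charge checks (1 fail),
# 2 static pressure checks (1 fail)
sample = [
    ("refrigerant_charge", True), ("refrigerant_charge", True),
    ("refrigerant_charge", False), ("refrigerant_charge", True),
    ("static_pressure", True), ("static_pressure", False),
]
print(subsystem_pass_rates(sample))
# → {'refrigerant_charge': 75.0, 'static_pressure': 50.0}
```

The same calculation applies whether the dashboard computes it for you or you rebuild it from a CSV export.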

[Screenshot: Dashboard metrics panel showing pass/fail rates by subsystem as a horizontal bar chart, with refrigerant charge, static pressure, and airflow highlighted]
Step 3: Filter by Date Range
Date filtering controls what data the dashboard displays:
- Locate the date range selector at the top of the dashboard.
- Choose a preset range (Last 7 days, Last 30 days, Last 90 days, Last 12 months) or set a custom range.
- The dashboard refreshes to show only data within the selected period.
Common date ranges and their uses:
- Last 7 days: Daily check-in on current team activity
- Last 30 days: Monthly performance review
- Last 90 days: Quarterly business review, seasonal analysis
- Last 12 months: Annual trends, year-over-year comparison
- Custom range: Specific cooling or heating season, before-and-after a training event, or a promotional campaign period
Step 4: Filter by Technician
- Use the technician dropdown to select a specific team member.
- The dashboard updates to show only that technician's data.
- Compare individual performance against the company average by toggling between "All Technicians" and a specific name.
This filter is the foundation of individual performance reviews. A manager can sit down with a technician and walk through their specific metrics: how many tests they completed, what their pass/fail rates look like, and where they differ from the team average.

[Screenshot: Cloud dashboard with technician filter dropdown showing team members]
Step 5: Track Trends Over Time
The dashboard includes time-series views that show how metrics change over weeks and months:
- Test volume trends. Are you doing more tests month-over-month? Seasonal dips in winter (less cooling work) and summer (less heating work) are normal.
- Pass/fail rate trends. Is your team's refrigerant charge pass rate improving? After a training session, you should see a measurable improvement within 30-60 days.
- Vitals Score trends. If your team is using Vitals Reports, track whether average scores are trending upward.
Trend data is most useful after 90+ days of consistent data collection. Short time windows produce noisy results that may not reflect real patterns.
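If you prefer to verify a trend yourself from exported data, the bucketing is straightforward. A minimal stdlib sketch, assuming each test reduces to a (date, passed) pair (not a real measureQuick data structure):

```python
from collections import defaultdict
from datetime import date

def monthly_pass_rate(tests):
    """Bucket (date, passed) results by month and return the pass
    rate per month, in chronological order."""
    buckets = defaultdict(lambda: [0, 0])  # (year, month) -> [passed, total]
    for d, ok in tests:
        key = (d.year, d.month)
        buckets[key][1] += 1
        if ok:
            buckets[key][0] += 1
    return {k: round(100.0 * p / t, 1)
            for k, (p, t) in sorted(buckets.items())}

# Hypothetical two-month sample
tests = [
    (date(2024, 5, 3), False), (date(2024, 5, 20), True),
    (date(2024, 6, 2), True), (date(2024, 6, 15), True),
]
print(monthly_pass_rate(tests))
# → {(2024, 5): 50.0, (2024, 6): 100.0}
```

A month-over-month rise like the one in this toy sample is exactly the shape you hope to see after a training session.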
Step 6: Compare Technician Performance
To compare technicians side by side:
- Set a date range that covers the period you want to evaluate.
- Review the technician activity summary, which shows each tech's test count and pass/fail breakdown.
- Look for outliers: technicians with significantly higher or lower failure rates in specific subsystems.
What the comparisons reveal:
- A technician with a much higher static pressure failure rate than the team average may be testing more carefully (catching issues others miss) or may be working in a territory with worse ductwork.
- A technician with a very low overall failure rate may be overriding failures rather than documenting them. Cross-reference with override rates if available.
- A technician with high test volume but low Test Out completion may be running diagnostics but not completing the full test-in/test-out workflow.
Context matters. Do not use raw pass/fail rates alone to judge technician quality. A tech who works primarily on older equipment in a difficult climate zone will have different numbers than one servicing newer systems in a mild climate.
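One way to keep comparisons honest is to apply a minimum sample size before flagging anyone as an outlier. The sketch below is a hypothetical screening rule, not a measureQuick feature; the thresholds (50 tests, 10 percentage points) are arbitrary starting points you would tune:

```python
def flag_outliers(tech_stats, min_tests=50, delta=10.0):
    """Flag technicians whose failure rate differs from the team
    average by more than `delta` percentage points, skipping anyone
    below the `min_tests` sample-size threshold.

    `tech_stats` maps name -> (failures, total_tests).
    """
    total_fail = sum(f for f, t in tech_stats.values())
    total_tests = sum(t for f, t in tech_stats.values())
    team_rate = 100.0 * total_fail / total_tests
    flagged = {}
    for name, (f, t) in tech_stats.items():
        if t < min_tests:
            continue  # too few tests to judge fairly
        rate = 100.0 * f / t
        if abs(rate - team_rate) > delta:
            flagged[name] = round(rate - team_rate, 1)  # points vs team
    return flagged

# Hypothetical stats: Cruz is skipped (only 10 tests)
stats = {"ana": (30, 100), "ben": (5, 100), "cruz": (2, 10)}
print(flag_outliers(stats))
# → {'ana': 12.4, 'ben': -12.6}
```

A flag is a prompt for a conversation, not a verdict; territory, equipment age, and climate still explain many of the differences.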
Step 7: Identify Training Needs
Dashboard analytics point to specific training opportunities:
- High refrigerant charge failure rate across the team: May indicate a need for charge procedure training or metering device identification refresher.
- One technician with low airflow pass rates: May need coaching on static pressure measurement or duct system evaluation.
- Low Test Out completion rates: Technicians may not understand the value of the complete test-in/test-out workflow. Consider reinforcing this in team meetings.
- Few Vitals Reports generated: Technicians may not be using the reporting features that differentiate your company's service.
Use the dashboard data to make training decisions based on evidence rather than assumptions.
Step 8: Export Dashboard Views
To share dashboard data with others or include it in presentations:
- Apply the filters you want (date range, technician, test type).
- Use the Export function to download the filtered data as a CSV file.
- For visual sharing, take a screenshot of the dashboard view or use your browser's print-to-PDF function.
For more on working with exported data, see Exporting Data.

[Screenshot: measureQuick Cloud Dashboard home page showing Quick Links, company statistics, and Active Projects section with status and technician filters]

[Screenshot: Project Export page showing filter options, date range, CSV and Include Vitals download buttons, and a list of projects with job date, type, name, customer, and status]
Tips & Common Issues
How much data do I need before the dashboard is useful?
The dashboard starts showing data as soon as your team syncs their first project. However, meaningful trend analysis requires at least 30-60 days of consistent data from multiple technicians. Pass/fail rates based on fewer than 50 tests per technician can swing dramatically based on a few unusual jobs.
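The 50-test guideline follows directly from the arithmetic: a single unusual job moves a pass rate by 100/n percentage points. A quick illustration:

```python
def swing_per_job(n):
    """Percentage-point change in a pass rate caused by one
    additional failed (or passed) job, at a sample of n tests."""
    return round(100.0 / n, 1)

print(swing_per_job(20))   # one bad job moves the rate 5.0 points
print(swing_per_job(200))  # one bad job moves the rate 0.5 points
```

At 20 tests, two unlucky jobs can shift a technician's rate by 10 points, which is why small-sample comparisons are unreliable.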
A technician's data is not showing up
Verify three things: (1) the technician's account is linked to your company in measureQuick Cloud, (2) they are syncing their projects (Exit & Sync after each job), and (3) their subscription is active. Projects saved to the device but never synced will not appear in the dashboard.
Pass/fail rates seem too high or too low
Remember that dashboard pass/fail rates are post-override. If your technicians routinely override subjective subsystems (condensate, outdoor unit condition, indoor condition, air filtration), the pass rate for those subsystems will be artificially high. Focus on measurement-based subsystems (refrigerant charge, airflow, static pressure, electrical) for more reliable quality indicators.
Can I set up automated reports or alerts?
The current dashboard does not support scheduled email reports or threshold-based alerts. Check the dashboard manually on a regular cadence (daily for active managers, weekly for periodic reviews). If you need automated reporting, export data (J6) and build reports in your preferred tool.
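If you go the export route, a small script run on a schedule (cron, Task Scheduler) can approximate a recurring report. This is a sketch only: the column headers (`Technician`, `Status`) are assumptions and must be matched to the headers in your actual CSV export:

```python
import csv
from collections import Counter

def summarize_export(lines):
    """Tally projects by status and by technician from a dashboard
    CSV export. Accepts any iterable of CSV lines (e.g. an open file).

    NOTE: the 'Technician' and 'Status' column names are assumed;
    check them against your real export before relying on this.
    """
    by_status = Counter()
    by_tech = Counter()
    for row in csv.DictReader(lines):
        by_status[row["Status"]] += 1
        by_tech[row["Technician"]] += 1
    return by_status, by_tech

# Hypothetical export contents
sample = [
    "Technician,Status",
    "Ana,Complete",
    "Ben,Test In Only",
    "Ana,Complete",
]
status, tech = summarize_export(sample)
print(status)  # → Counter({'Complete': 2, 'Test In Only': 1})
print(tech)    # → Counter({'Ana': 2, 'Ben': 1})
```

In practice you would call `summarize_export(open("export.csv", newline=""))` on the downloaded file and email or post the totals with whatever tooling you already use.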
Related Articles
Need Help?
If you have questions about interpreting dashboard metrics or need help with your analytics setup:
- Check the Related Articles section above
- Contact measureQuick support: support@measurequick.com