Visitor Summary Uploader

The Visitor Summary Uploader action aggregates local visitor counting data and uploads hourly summaries to the central cloud database, enabling organization-wide reporting and dashboards.


Overview

The Visitor Entry Counting action stores detailed records locally. This action:

  1. Reads those local records
  2. Aggregates them into hourly summaries
  3. Uploads summaries to the cloud database
  4. Marks local records as processed

This separation allows continuous local counting even during internet outages, with data syncing to the cloud when connectivity is available.
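The four steps above can be sketched as a single cycle. This is a hedged sketch only: the function parameters are stand-ins for the action's internals, and the return strings mirror the results table later on this page.

```python
# Hypothetical sketch of the uploader's read -> aggregate -> upload ->
# mark cycle. The callables are stand-ins for the action's internals.
def run_upload_cycle(read_unprocessed, aggregate_hourly,
                     upload_to_cloud, mark_processed):
    """One pass of the uploader; returns a result string."""
    records = read_unprocessed()           # 1. read local visitor_entries rows
    if not records:
        return "No new data"
    summaries = aggregate_hourly(records)  # 2. group and sum by hour
    try:
        upload_to_cloud(summaries)         # 3. insert into branch_visitor_entries
    except ConnectionError:
        return "Upload failed"             # rows stay unprocessed for the next run
    mark_processed(records)                # 4. prevent duplicate uploads
    return "Uploaded hourly summaries"
```

A failed upload leaves the local records unmarked, so the next scheduled run retries them, which is what makes counting resilient to internet outages.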


What It Does

1. Reads Local Data

The action queries the local visitor_entries table for records that haven't been uploaded yet (processed = false).
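The read step can be illustrated with SQLite (an assumption; this page does not name the local database engine). The `visitor_entries` table and `processed` flag come from this page; the remaining columns are made up for the example.

```python
import sqlite3

# Illustrative local store. visitor_entries and the processed flag are
# from this page; the other columns are assumed for the example.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE visitor_entries (
    id INTEGER PRIMARY KEY,
    event_time TEXT,
    entries INTEGER,
    exits INTEGER,
    processed INTEGER DEFAULT 0)""")
conn.executemany(
    "INSERT INTO visitor_entries (event_time, entries, exits, processed) "
    "VALUES (?, ?, ?, ?)",
    [("2024-05-01T09:12:00", 1, 0, 0),
     ("2024-05-01T09:40:00", 0, 1, 1)])   # second row already uploaded

# processed = 0 plays the role of processed = false in SQLite.
unprocessed = conn.execute(
    "SELECT id, event_time, entries, exits "
    "FROM visitor_entries WHERE processed = 0").fetchall()
# Only the first row comes back; the already-uploaded row is skipped.
```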

2. Aggregates by Hour

Raw entry/exit events are grouped and summed by hour:

Aggregation         Description
Total Entries       Sum of all entries in the hour
Total Exits         Sum of all exits in the hour
Male Count          Number of male entries (if demographics enabled)
Female Count        Number of female entries (if demographics enabled)
Child Count         Entries in child age group
Young Adult Count   Entries in young adult age group
Adult Count         Entries in adult age group
Senior Count        Entries in senior age group
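A minimal sketch of the hourly roll-up. The event field names (`ts`, `gender`, `entries`, `exits`) are assumptions; the age-group counters from the table above would follow the same pattern as the male/female counters here.

```python
from collections import defaultdict
from datetime import datetime

# Events are truncated to the top of their hour, then counters are
# summed per bucket.
def aggregate_hourly(events):
    buckets = defaultdict(lambda: {"entries": 0, "exits": 0,
                                   "male": 0, "female": 0})
    for ev in events:
        hour = datetime.fromisoformat(ev["ts"]).replace(
            minute=0, second=0, microsecond=0)
        bucket = buckets[hour]
        bucket["entries"] += ev.get("entries", 0)
        bucket["exits"] += ev.get("exits", 0)
        if ev.get("gender") == "male":
            bucket["male"] += ev.get("entries", 0)
        elif ev.get("gender") == "female":
            bucket["female"] += ev.get("entries", 0)
    return dict(buckets)

events = [
    {"ts": "2024-05-01T09:05:00", "entries": 1, "exits": 0, "gender": "male"},
    {"ts": "2024-05-01T09:45:00", "entries": 1, "exits": 0, "gender": "female"},
    {"ts": "2024-05-01T10:10:00", "entries": 0, "exits": 2},
]
summary = aggregate_hourly(events)
# 09:00 bucket: 2 entries (1 male, 1 female); 10:00 bucket: 2 exits.
```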

3. Uploads to Cloud

The hourly summaries are inserted into the cloud database table branch_visitor_entries. This table is used by:

  • Organization-wide dashboards
  • Cross-branch reporting
  • Historical trend analysis
  • Management reports

4. Marks as Processed

After successful upload, the local records are marked as processed to prevent duplicate uploads.
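The upload-then-mark ordering can be sketched as follows. Marking only after a successful upload is what prevents data loss during outages, at the cost of possible duplicates if marking itself fails (see Troubleshooting). `upload_to_cloud` here is a stub, not the real cloud client, and the schema beyond `visitor_entries`/`processed` is assumed.

```python
import sqlite3

# Sketch of the upload-then-mark step: a failed upload raises before the
# UPDATE runs, so rows stay unprocessed and eligible for the next run.
def upload_and_mark(conn, upload_to_cloud):
    rows = conn.execute(
        "SELECT id FROM visitor_entries WHERE processed = 0").fetchall()
    ids = [r[0] for r in rows]
    if not ids:
        return 0
    upload_to_cloud(ids)  # raises on failure -> marking never happens
    conn.executemany(
        "UPDATE visitor_entries SET processed = 1 WHERE id = ?",
        [(i,) for i in ids])
    conn.commit()
    return len(ids)

# Demo against an in-memory database with a stub cloud upload.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE visitor_entries (id INTEGER PRIMARY KEY, processed INTEGER DEFAULT 0)")
conn.executemany("INSERT INTO visitor_entries (processed) VALUES (?)",
                 [(0,), (0,), (1,)])
uploaded = upload_and_mark(conn, lambda ids: None)   # stub: always succeeds
remaining = conn.execute(
    "SELECT COUNT(*) FROM visitor_entries WHERE processed = 0").fetchone()[0]
```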


Configuration

This action typically requires no specific configuration. It automatically:

  • Reads from the correct local table
  • Uploads to the correct cloud table
  • Uses the current branch context

Optional Parameters

Parameter    Type    Default  Description
Hours Back   Number  24       How far back to look for unprocessed data
Batch Size   Number  100      Maximum records to process per run
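Here is one way the two parameters could bound the local query; this is a sketch, since the action's actual SQL is not documented here.

```python
import sqlite3

# Hours Back bounds the time window; Batch Size caps the row count.
HOURS_BACK = 24    # default from the table above
BATCH_SIZE = 100   # default from the table above

query = (
    "SELECT id, event_time FROM visitor_entries "
    "WHERE processed = 0 "
    "AND event_time >= datetime('now', ?) "
    "ORDER BY event_time LIMIT ?"
)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visitor_entries "
             "(id INTEGER PRIMARY KEY, event_time TEXT, processed INTEGER)")
conn.executemany(
    "INSERT INTO visitor_entries (event_time, processed) "
    "VALUES (datetime('now', ?), 0)",
    [("-1 hours",), ("-30 hours",)])   # one recent row, one 30 hours old
rows = conn.execute(query, (f"-{HOURS_BACK} hours", BATCH_SIZE)).fetchall()
# The 30-hour-old row falls outside the default 24-hour window.
```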

Understanding Results

Result                     Meaning                                       Action
Uploaded hourly summaries  Data was aggregated and uploaded successfully Normal operation
No new data                No unprocessed records found                  Nothing to do; normal if counting isn't active
Upload failed              Could not upload to cloud                     Check internet connectivity
Partial upload             Some data uploaded, some failed               May retry automatically

Scheduling Recommendations

Recommended: Frequent

Running frequently (e.g., every 15 minutes) ensures:

  • Near-real-time cloud dashboards
  • Small batches (faster processing)
  • Quick recovery from failed uploads

Alternative: Hourly

Running hourly is acceptable if:

  • Real-time data isn't critical
  • Network bandwidth is limited
  • Server resources are constrained

End of Day

Running once at closing time ensures:

  • All daily data is uploaded
  • Complete daily reports available next morning
  • Minimum network usage during business hours

Common Use Cases

Real-Time Dashboards

  • Schedule: Every 15 minutes
  • Result: Cloud dashboards show data within 15-30 minutes of occurrence

Daily Reporting

  • Schedule: Run at closing time + 30 minutes
  • Result: Complete daily data available for morning reports

Bandwidth Conservation

  • Schedule: Once during overnight hours
  • Result: Minimizes daytime network usage while ensuring data reaches cloud

Multi-Branch Analytics

  • Setup: All branches run this action on similar schedules
  • Result: Consistent, comparable data across all locations

Troubleshooting

Upload Failed

  1. Internet Connectivity: Verify the ResEngine server can reach the internet.

  2. Cloud Credentials: Check that API keys for the cloud database are valid.

  3. Service Availability: The cloud database service may be temporarily unavailable.

  4. Firewall: Ensure outbound HTTPS is allowed.

No New Data

  1. Counting Active: Verify Visitor Entry Counting is running and recording data.

  2. Already Processed: Data may have been uploaded by a previous run.

  3. Time Range: Check the "Hours Back" parameter isn't limiting the query.

Duplicate Data in Cloud

  1. Multiple Runs: If the action runs while a previous run is still processing, duplicates may occur.

  2. Process Flag Issues: If records aren't being marked as processed, they'll upload repeatedly.

  3. Manual Fix: May require database intervention to remove duplicates and fix process flags.
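If intervention does become necessary, a clean-up along these lines can remove duplicated hourly rows. It is shown against a throwaway SQLite copy; the `branch_id` and `summary_hour` column names are assumptions, so adapt the query to the real `branch_visitor_entries` schema and back up the table first.

```python
import sqlite3

# Keep the lowest id per (branch, hour) and delete the rest. Column
# names are assumed; back up before running anything like this for real.
dedup_sql = """
DELETE FROM branch_visitor_entries
WHERE id NOT IN (
    SELECT MIN(id)
    FROM branch_visitor_entries
    GROUP BY branch_id, summary_hour
)
"""

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE branch_visitor_entries (
    id INTEGER PRIMARY KEY,
    branch_id INTEGER,
    summary_hour TEXT,
    total_entries INTEGER)""")
conn.executemany(
    "INSERT INTO branch_visitor_entries (branch_id, summary_hour, total_entries) "
    "VALUES (?, ?, ?)",
    [(1, "2024-05-01T09:00", 12),
     (1, "2024-05-01T09:00", 12),   # duplicate upload of the same hour
     (1, "2024-05-01T10:00", 7)])
conn.execute(dedup_sql)
conn.commit()
remaining = conn.execute(
    "SELECT COUNT(*) FROM branch_visitor_entries").fetchone()[0]
# One duplicate removed; two distinct hourly rows remain.
```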

Data Discrepancy

  1. Time Zones: Local timestamps are converted to UTC for cloud storage. Ensure reports use consistent time zones.

  2. Aggregation Boundaries: Hourly aggregation may split events that cross hour boundaries.

  3. Incomplete Hours: Running mid-hour uploads partial data for the current hour.
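A small example of why mixed time zones shift hourly buckets; the UTC+5:30 offset is an arbitrary choice for illustration.

```python
from datetime import datetime, timezone, timedelta

# A 23:30 local event (UTC+5:30 here) lands in the 18:00 bucket once
# converted to UTC for cloud storage.
local_tz = timezone(timedelta(hours=5, minutes=30))
local_event = datetime(2024, 5, 1, 23, 30, tzinfo=local_tz)
utc_event = local_event.astimezone(timezone.utc)

local_hour = local_event.replace(minute=0, second=0, microsecond=0)
utc_hour = utc_event.replace(minute=0, second=0, microsecond=0)
# Reports mixing the two zones place this visit in different hours
# (23:00 local vs 18:00 UTC).
```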

Large Backlog

If the internet was down for an extended period:

  1. Increase Batch Size: Temporarily increase to process more records per run.

  2. Run Frequently: Run action multiple times to clear backlog.

  3. Check Resources: Large batch processing may temporarily increase CPU/memory usage.


Best Practices

  1. Regular Schedule: Set up automated scheduling rather than relying on manual runs.

  2. Monitor Failures: Set up alerts for repeated upload failures to catch connectivity issues early.

  3. Backup Before Purge: Before purging old local data, ensure it's been uploaded.

  4. Consistent Timing: Run at similar times across all branches for comparable reporting.

  5. Off-Peak Processing: For large deployments, stagger upload times to avoid overwhelming cloud infrastructure.


Data Flow Summary

Local Camera → Visitor Entry Counting → visitor_entries (local)
        ↓
Visitor Summary Uploader
        ↓
branch_visitor_entries (cloud)
        ↓
Dashboards & Reports

Related Actions

  • Visitor Entry Counting: Creates the local records this action uploads
  • Line Crossing Events Reader: Alternative way to query local visitor data
  • Zone Dwell Summary Uploader: Similar uploader for zone occupancy data