Visitor Summary Uploader
The Visitor Summary Uploader action aggregates local visitor counting data and uploads hourly summaries to the central cloud database, enabling organization-wide reporting and dashboards.
Overview
The Visitor Entry Counting action stores detailed records locally. This action:
- Reads those local records
- Aggregates them into hourly summaries
- Uploads summaries to the cloud database
- Marks local records as processed
This separation allows continuous local counting even during internet outages, with data syncing to the cloud when connectivity is available.
What It Does
1. Reads Local Data
The action queries the local visitor_entries table for records that haven't been uploaded yet (processed = false).
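This read step can be sketched against an in-memory SQLite database; the table name and `processed` flag come from the text above, while the other column names (`event_time`, `entries`, `exits`) are assumptions for illustration.

```python
import sqlite3

# Minimal sketch of the "read unprocessed records" step, assuming a
# SQLite-style local store. Column names other than `processed` are
# hypothetical -- the real schema may differ.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE visitor_entries (
        id INTEGER PRIMARY KEY,
        event_time TEXT,              -- ISO-8601 timestamp of the event
        entries INTEGER,
        exits INTEGER,
        processed INTEGER DEFAULT 0   -- 0 = not yet uploaded
    )
""")
conn.executemany(
    "INSERT INTO visitor_entries (event_time, entries, exits, processed) "
    "VALUES (?, ?, ?, ?)",
    [("2024-05-01T09:05:00", 3, 1, 0),
     ("2024-05-01T09:40:00", 2, 0, 1),   # already uploaded
     ("2024-05-01T10:15:00", 5, 2, 0)],
)

# Only rows with processed = 0 are picked up for aggregation.
unprocessed = conn.execute(
    "SELECT id, event_time, entries, exits FROM visitor_entries "
    "WHERE processed = 0 ORDER BY id"
).fetchall()
```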
2. Aggregates by Hour
Raw entry/exit events are grouped and summed by hour:
| Aggregation | Description |
|---|---|
| Total Entries | Sum of all entries in the hour |
| Total Exits | Sum of all exits in the hour |
| Male Count | Number of male entries (if demographics enabled) |
| Female Count | Number of female entries (if demographics enabled) |
| Child Count | Entries in child age group |
| Young Adult Count | Entries in young adult age group |
| Adult Count | Entries in adult age group |
| Senior Count | Entries in senior age group |
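The grouping-and-summing step above can be sketched as follows; the record shape (timestamp, entries, exits) is an assumption for illustration, and the demographic counters are omitted for brevity.

```python
from collections import defaultdict
from datetime import datetime

# Illustrative hourly aggregation: truncate each event's timestamp to the
# top of the hour and sum the counters into that bucket.
records = [
    ("2024-05-01T09:05:00", 3, 1),
    ("2024-05-01T09:40:00", 2, 0),
    ("2024-05-01T10:15:00", 5, 2),
]

summaries = defaultdict(lambda: {"total_entries": 0, "total_exits": 0})
for ts, entries, exits in records:
    # The bucket key is the hour the event falls in.
    hour = datetime.fromisoformat(ts).replace(minute=0, second=0, microsecond=0)
    summaries[hour]["total_entries"] += entries
    summaries[hour]["total_exits"] += exits
```

With demographics enabled, the real action would carry the gender and age-group counters through the same per-hour buckets.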
3. Uploads to Cloud
The hourly summaries are inserted into the cloud database table branch_visitor_entries. This table is used by:
- Organization-wide dashboards
- Cross-branch reporting
- Historical trend analysis
- Management reports
4. Marks as Processed
After successful upload, the local records are marked as processed to prevent duplicate uploads.
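The upload-then-mark ordering can be sketched with two in-memory SQLite databases standing in for the local store and the cloud database (an assumption purely for illustration; the real cloud table is reached over the network):

```python
import sqlite3

# Two in-memory databases stand in for the local store and the cloud.
local = sqlite3.connect(":memory:")
cloud = sqlite3.connect(":memory:")

local.execute("CREATE TABLE visitor_entries "
              "(id INTEGER PRIMARY KEY, hour TEXT, entries INTEGER, "
              "processed INTEGER DEFAULT 0)")
cloud.execute("CREATE TABLE branch_visitor_entries "
              "(branch TEXT, hour TEXT, total_entries INTEGER)")

local.executemany("INSERT INTO visitor_entries (hour, entries) VALUES (?, ?)",
                  [("2024-05-01T09:00:00", 5), ("2024-05-01T10:00:00", 7)])

rows = local.execute(
    "SELECT id, hour, entries FROM visitor_entries WHERE processed = 0"
).fetchall()
try:
    cloud.executemany(
        "INSERT INTO branch_visitor_entries (branch, hour, total_entries) "
        "VALUES (?, ?, ?)",
        [("branch-01", hour, entries) for _, hour, entries in rows],
    )
    cloud.commit()
    # Mark locally only after the upload succeeded, so a failed upload
    # leaves the rows eligible for the next run.
    local.executemany("UPDATE visitor_entries SET processed = 1 WHERE id = ?",
                      [(rid,) for rid, _, _ in rows])
    local.commit()
except sqlite3.Error:
    local.rollback()  # leave records unprocessed; they will be retried
```

Marking records only after a confirmed upload is what prevents duplicates on the happy path while still allowing retries after a failure.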
Configuration
This action typically requires no specific configuration. It automatically:
- Reads from the correct local table
- Uploads to the correct cloud table
- Uses the current branch context
Optional Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| Hours Back | Number | 24 | How far back to look for unprocessed data |
| Batch Size | Number | 100 | Maximum records to process per run |
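One plausible way these two parameters bound the read query is sketched below; the SQL and column names are illustrative, not the action's actual implementation.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical use of the two optional parameters: "Hours Back" sets a
# time cutoff, "Batch Size" caps the rows fetched per run.
hours_back = 24
batch_size = 100

cutoff = datetime.now(timezone.utc) - timedelta(hours=hours_back)
query = (
    "SELECT id, event_time, entries, exits FROM visitor_entries "
    "WHERE processed = 0 AND event_time >= ? "
    "ORDER BY event_time LIMIT ?"
)
params = (cutoff.isoformat(), batch_size)
```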
Understanding Results
| Result | Meaning | Action |
|---|---|---|
| Uploaded hourly summaries | Data was aggregated and uploaded successfully | Normal operation |
| No new data | No unprocessed records found | Nothing to do; normal if counting isn't active |
| Upload failed | Could not upload to cloud | Check internet connectivity |
| Partial upload | Some data uploaded, some failed | May retry automatically |
Scheduling Recommendations
Recommended: Every 15-30 Minutes
Running frequently ensures:
- Near-real-time cloud dashboards
- Small batches (faster processing)
- Quick recovery from failed uploads
Alternative: Hourly
Running hourly is acceptable if:
- Real-time data isn't critical
- Network bandwidth is limited
- Server resources are constrained
End of Day
Running once at closing time ensures:
- All daily data is uploaded
- Complete daily reports available next morning
- Minimum network usage during business hours
Common Use Cases
Real-Time Dashboards
- Schedule: Every 15 minutes
- Result: Cloud dashboards show data within 15-30 minutes of occurrence
Daily Reporting
- Schedule: Run at closing time + 30 minutes
- Result: Complete daily data available for morning reports
Bandwidth Conservation
- Schedule: Once during overnight hours
- Result: Minimizes daytime network usage while ensuring data reaches cloud
Multi-Branch Analytics
- Setup: All branches run this action on similar schedules
- Result: Consistent, comparable data across all locations
Troubleshooting
Upload Failed
- Internet Connectivity: Verify the ResEngine server can reach the internet.
- Cloud Credentials: Check that API keys for the cloud database are valid.
- Service Availability: The cloud database service may be temporarily unavailable.
- Firewall: Ensure outbound HTTPS is allowed.
No New Data
- Counting Active: Verify Visitor Entry Counting is running and recording data.
- Already Processed: Data may have been uploaded by a previous run.
- Time Range: Check the "Hours Back" parameter isn't limiting the query.
Duplicate Data in Cloud
- Multiple Runs: If the action runs while a previous run is still processing, duplicates may occur.
- Process Flag Issues: If records aren't being marked as processed, they'll upload repeatedly.
- Manual Fix: May require database intervention to remove duplicates and fix process flags.
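As a sketch of what such an intervention might look like, the following keeps one row per (branch, hour) pair and deletes the rest. It assumes a SQL-accessible table with a surrogate `id` column (hypothetical here), and should only ever be run against a backed-up copy first.

```python
import sqlite3

# Illustrative deduplication: for each (branch, hour) group, keep the row
# with the smallest id and delete the others. The schema is an assumption.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE branch_visitor_entries "
           "(id INTEGER PRIMARY KEY, branch TEXT, hour TEXT, total_entries INTEGER)")
db.executemany(
    "INSERT INTO branch_visitor_entries (branch, hour, total_entries) "
    "VALUES (?, ?, ?)",
    [("b1", "09:00", 5), ("b1", "09:00", 5), ("b1", "10:00", 7)],  # one duplicate
)

db.execute("""
    DELETE FROM branch_visitor_entries
    WHERE id NOT IN (
        SELECT MIN(id) FROM branch_visitor_entries GROUP BY branch, hour
    )
""")
remaining = db.execute(
    "SELECT COUNT(*) FROM branch_visitor_entries"
).fetchone()[0]
```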
Data Discrepancy
- Time Zones: Local timestamps are converted to UTC for cloud storage. Ensure reports use consistent time zones.
- Aggregation Boundaries: Hourly aggregation may split events that cross hour boundaries.
- Incomplete Hours: Running mid-hour uploads partial data for the current hour.
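The time-zone point above is easy to see with a small example; the UTC+3 offset is arbitrary, chosen only to show an event shifting across a day boundary.

```python
from datetime import datetime, timezone, timedelta

# A 00:30 local event in a UTC+3 branch is stored as 21:30 UTC the
# previous day -- so it lands in a different calendar day in the cloud.
local_tz = timezone(timedelta(hours=3))
local_ts = datetime(2024, 5, 1, 0, 30, tzinfo=local_tz)  # 00:30 local
utc_ts = local_ts.astimezone(timezone.utc)               # 21:30 previous day

# The hourly bucket is taken from the UTC timestamp.
bucket = utc_ts.replace(minute=0, second=0, microsecond=0)
```

This is why reports that mix local and UTC views of the same hour can appear to disagree even when every event was counted exactly once.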
Large Backlog
If the internet was down for an extended period:
- Increase Batch Size: Temporarily increase it to process more records per run.
- Run Frequently: Run the action multiple times to clear the backlog.
- Check Resources: Large batch processing may temporarily increase CPU/memory usage.
Best Practices
- Regular Schedule: Set up automated scheduling rather than relying on manual runs.
- Monitor Failures: Set up alerts for repeated upload failures to catch connectivity issues early.
- Backup Before Purge: Before purging old local data, ensure it's been uploaded.
- Consistent Timing: Run at similar times across all branches for comparable reporting.
- Off-Peak Processing: For large deployments, stagger upload times to avoid overwhelming cloud infrastructure.
Data Flow Summary
Local Camera → Visitor Entry Counting → visitor_entries (local) → Visitor Summary Uploader → branch_visitor_entries (cloud) → Dashboards & Reports
Related Actions
- Visitor Entry Counting: Creates the local records this action uploads
- Line Crossing Events Reader: Alternative way to query local visitor data
- Zone Dwell Summary Uploader: Similar uploader for zone occupancy data