How to Log Data to the Cloud (InfluxDB)
Configure Profinity to log CAN bus data to InfluxDB for cloud-based data analytics and visualization.
Prerequisites
- Profinity V2 installed and running
- An InfluxDB instance (cloud or self-hosted)
- InfluxDB database created and accessible
- Network access to your InfluxDB server
- Admin access to Profinity
Steps
Step 1: Set Up InfluxDB
Option A - InfluxDB Cloud:
1. Sign up for InfluxDB Cloud
2. Create a new organization
3. Create a new bucket for your data
4. Generate an API token with write permissions
5. Record the Organization Name, Bucket Name, API Token, and InfluxDB URL
Option B - Self-Hosted:
1. Install InfluxDB on your server
2. Start InfluxDB service
3. Create a bucket: influx bucket create --name my-profinity-data
4. Generate an API token: influx auth create --org my-org --all-access
Step 2: Add InfluxDB Logger Component
- Navigate to ADMIN → Components
- Click Add Component
- Select InfluxDB Prometheus Logger
- Click Add
Step 3: Configure InfluxDB Connection
- Set Component Name (e.g., "InfluxDB Cloud Logger")
- Set Logger Type to "InfluxDB"
- Enter InfluxDB URL:
  - Cloud: https://us-east-1-1.aws.cloud2.influxdata.com
  - Self-hosted: http://your-server:8086
- Enter Organization name
- Enter Bucket/Database name
- Enter API Token
Step 4: Select Components to Log
- Find Components to Log section
- Select components to send to InfluxDB:
- Check individual components, or
- Select "All Components"
- (Optional) Use filters to include/exclude specific components
Step 5: Configure Logging Interval
- Set Logging Interval (typical values: 1-5 seconds for most applications)
- Shorter intervals give more granular data but use more network bandwidth
- Longer intervals use less bandwidth but produce coarser data
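When choosing an interval, it can help to estimate the resulting data volume. A rough sketch (the 20-signal component is an illustrative assumption, not a Profinity default):

```python
def points_per_day(num_signals: int, interval_s: float) -> int:
    """Estimate how many data points one logger writes to InfluxDB per day."""
    return int(num_signals * 86400 / interval_s)

# e.g. a single component exposing 20 signals, logged every 2 seconds
print(points_per_day(20, 2.0))  # 864000 points/day
```

Halving the interval doubles the write volume, which matters for usage-based cloud pricing (see Tips below).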
Step 6: Test Connection
- Click Test Connection (if available)
- Verify connection successful
- Check for error messages
- Review connection status
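If your Profinity version does not expose a test button, you can probe the server directly. A minimal stdlib-only sketch using InfluxDB 2.x's `/health` endpoint (pass your own URL; this is not part of Profinity):

```python
import json
import urllib.error
import urllib.request

def influxdb_healthy(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if the InfluxDB 2.x /health endpoint reports status 'pass'.

    base_url example: "http://your-server:8086" (placeholder host).
    """
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return json.load(resp).get("status") == "pass"
    except (urllib.error.URLError, OSError, ValueError):
        # Unreachable host, refused connection, timeout, or non-JSON response
        return False
```

A `False` result here usually points at the firewall or URL issues listed under Troubleshooting rather than at Profinity itself.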
Step 7: Save and Activate
- Click Save to save logger configuration
- Ensure logger is added to active profile
- Logger starts automatically when profile is active
Step 8: Verify Data Logging
- Log into InfluxDB web interface
- Navigate to Data Explorer
- Query your data:
- Verify data is arriving
- Check Profinity logs for connection errors
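In the Data Explorer, a minimal Flux query along these lines confirms points are arriving; it reuses the bucket, measurement, and signal names from this guide's examples, so substitute your own:

```flux
from(bucket: "my-profinity-data")
  |> range(start: -15m)
  |> filter(fn: (r) => r._measurement == "Prohelion BMU")
  |> filter(fn: (r) => r.signal == "PackVoltage")
```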
Data Structure in InfluxDB
Profinity logs data with this structure:
- Measurement: Component name
- Tags: Component ID, signal names
- Fields: Signal values
- Timestamp: Data timestamp
Example Data Point:
measurement: "Prohelion BMU"
tags:
  componentId: "bmu-001"
  signal: "PackVoltage"
fields:
  value: 400.5
timestamp: 2024-01-17T10:30:00Z
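On the wire, points like this are written to InfluxDB as line protocol. A simplified stdlib-only sketch of that encoding, using the example point above (this is an illustration of the format, not Profinity's actual implementation; it handles only float fields and whole-second timestamps):

```python
from datetime import datetime, timezone

def to_line_protocol(measurement: str, tags: dict, fields: dict, ts: datetime) -> str:
    """Encode one point in InfluxDB line protocol with a nanosecond timestamp."""
    def esc(s: str) -> str:
        # Line protocol requires escaping commas and spaces in
        # measurement names and tag keys/values.
        return s.replace(",", r"\,").replace(" ", r"\ ")

    tag_str = ",".join(f"{esc(k)}={esc(v)}" for k, v in tags.items())
    field_str = ",".join(f"{k}={v}" for k, v in fields.items())  # floats only
    ns = int(ts.timestamp()) * 1_000_000_000  # whole seconds -> nanoseconds
    return f"{esc(measurement)},{tag_str} {field_str} {ns}"

point = to_line_protocol(
    "Prohelion BMU",
    {"componentId": "bmu-001", "signal": "PackVoltage"},
    {"value": 400.5},
    datetime(2024, 1, 17, 10, 30, tzinfo=timezone.utc),
)
print(point)
# Prohelion\ BMU,componentId=bmu-001,signal=PackVoltage value=400.5 1705487400000000000
```

This is why tag values must be low-cardinality strings (they are indexed), while numeric readings belong in fields.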
Tips
- Start with One Component: Test with a single component first
- Monitor Bandwidth: Cloud logging uses network bandwidth
- Optimize Intervals: Balance data granularity with cost
- Use Tags Efficiently: Tags are indexed and help with queries
- Set Retention Policies: Configure data retention to manage costs
- Monitor Costs: Cloud InfluxDB charges based on data usage
Troubleshooting
- Connection Failed:
  - Verify InfluxDB URL is correct and accessible
  - Check firewall settings
  - Verify API token is valid
  - Test network connectivity
- No Data Arriving:
  - Verify components are selected
  - Check that components are receiving data
  - Verify logging interval is set
  - Check InfluxDB write permissions on the token
- Authentication Errors:
  - Verify API token is correct
  - Check token has write permissions
  - Verify organization name matches
Related Documentation
- InfluxDB Prometheus Logger - Complete logger documentation
- Data Logging Overview - General data logging guide
- InfluxDB Documentation - Official InfluxDB documentation