Export Data to File

Action ID: export_data_to_file

Description

Exports the supplied data to a file. The file is stored in S3 and its URL is returned.

Input Parameters

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| data | string | Yes | - | Data to export to the file |
| file_extension | string | No | txt | File extension for the exported file |

JSON Schema:
{
  "description": "Export Data to File action input.",
  "properties": {
    "data": {
      "title": "Data to export",
      "type": "string"
    },
    "file_extension": {
      "default": "txt",
      "title": "File extension",
      "type": "string"
    }
  },
  "required": [
    "data"
  ],
  "title": "ExportDataToFileActionInput",
  "type": "object"
}
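The schema's rules (data required, file_extension defaulting to "txt") can be applied with a small validation helper. This is a minimal sketch for client-side preprocessing; the function name and error messages are illustrative, not part of the node.

```python
def normalize_input(payload: dict) -> dict:
    """Apply the input schema's rules: `data` is required,
    `file_extension` defaults to "txt"."""
    if "data" not in payload or not isinstance(payload["data"], str):
        raise ValueError("'data' is required and must be a string")
    ext = payload.get("file_extension", "txt")  # schema default
    if not isinstance(ext, str):
        raise ValueError("'file_extension' must be a string")
    return {"data": payload["data"], "file_extension": ext}
```
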

Output Parameters

| Name | Type | Description |
| --- | --- | --- |
| file_url | string | URL of the exported file in S3 storage |

JSON Schema:
{
  "description": "Export Data to file action output.",
  "properties": {
    "file_url": {
      "title": "File URL",
      "type": "string"
    }
  },
  "required": [
    "file_url"
  ],
  "title": "ExportDataToFileActionOutput",
  "type": "object"
}

How It Works

This node takes your data string and writes it to a file with the specified extension. The file is then uploaded to S3 cloud storage with a unique filename. The node returns a publicly accessible URL where the file can be downloaded or referenced in subsequent workflow steps.

Usage Examples

Example 1: Export JSON Data

Input:

data: "{\"name\": \"John Doe\", \"email\": \"[email protected]\", \"age\": 30}"
file_extension: "json"

Output:

file_url: "https://s3.amazonaws.com/bucket/export-12345.json"

Example 2: Export CSV Data

Input:

data: "Name,Email,Age\nJohn Doe,[email protected],30\nJane Smith,[email protected],25"
file_extension: "csv"

Output:

file_url: "https://s3.amazonaws.com/bucket/export-67890.csv"
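A data string like the one in Example 2 is safer to build with Python's csv module than by manual concatenation, since the writer handles quoting of fields that contain commas. A sketch (the email values are kept exactly as shown in the example above):

```python
import csv
import io

rows = [
    ["Name", "Email", "Age"],
    ["John Doe", "[email protected]", "30"],
    ["Jane Smith", "[email protected]", "25"],
]
buf = io.StringIO()
csv.writer(buf, lineterminator="\n").writerows(rows)
data = buf.getvalue()  # pass this string as the `data` parameter
```
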

Example 3: Export Plain Text Report

Input:

data: "Workflow Execution Report\n\nTotal Records: 150\nSuccess: 145\nFailed: 5\nExecution Time: 2.3 seconds"
file_extension: "txt"

Output:

file_url: "https://s3.amazonaws.com/bucket/export-24680.txt"

Common Use Cases

  • Data Backup: Export workflow data to files for backup and archival purposes

  • Report Generation: Create downloadable reports from processed data

  • Data Sharing: Generate files that can be shared with external systems or users

  • Batch Processing Output: Save the results of batch processing operations to files

  • Log Exports: Export logs and execution details for analysis and auditing

  • Configuration Files: Generate configuration files based on workflow data

  • Data Migration: Export data in specific formats for migration to other systems

Error Handling

| Error Type | Cause | Solution |
| --- | --- | --- |
| Empty Data | Data parameter is empty or null | Ensure the data parameter contains valid content before exporting |
| Invalid Extension | File extension contains invalid characters | Use only alphanumeric characters for the file extension (e.g., txt, json, csv) |
| Data Too Large | Data size exceeds the maximum allowed size | Break large data into smaller chunks and export multiple files |
| S3 Upload Failed | Unable to upload the file to S3 storage | Check S3 connectivity and permissions, then retry the operation |
| Encoding Error | Data contains characters that can't be encoded | Ensure the data is encoded as UTF-8 |
| Permission Denied | Insufficient permissions to write to S3 | Verify that the workflow has S3 write permissions configured |
| Network Error | Connection to S3 storage failed | Check network connectivity and S3 service status |
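For the "Data Too Large" case, the suggested fix is to split the data and export several files. A minimal chunking sketch (the chunk size is hypothetical; the node does not define a specific limit here):

```python
def chunk_data(data: str, max_chars: int) -> list[str]:
    """Split a string into pieces of at most max_chars characters each."""
    return [data[i:i + max_chars] for i in range(0, len(data), max_chars)]
```

For line-oriented formats such as CSV, split on row boundaries instead so that no record is cut in half across files.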

Notes

  • File Extension: The file extension determines the file type but doesn't validate the data format—ensure your data matches the extension.

  • Default Extension: If no file extension is provided, the file will be saved with a .txt extension.

  • Unique Filenames: Each export generates a unique filename to prevent overwrites and conflicts.

  • Storage Location: Files are stored in S3 with configurable retention policies based on your setup.

  • URL Accessibility: The returned URL is publicly accessible—avoid exporting sensitive data without encryption.

  • Data Format: The node accepts string data, so ensure complex data structures are serialized (e.g., JSON.stringify).

  • File Size Limits: Be aware of S3 upload limits and workflow data size constraints.

  • Cost Considerations: S3 storage and bandwidth usage may incur costs depending on your plan and usage volume.
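The serialization note above (the JSON.stringify example) maps to `json.dumps` in Python: complex structures must be turned into a string before being passed as the `data` parameter. The record below is illustrative.

```python
import json

record = {"name": "John Doe", "age": 30}  # illustrative record
data = json.dumps(record)  # string suitable for the `data` parameter
# Set file_extension to "json" so the extension matches the data format.
```
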
