Backup SR Connect logs into AWS S3



About the template


This template periodically copies ScriptRunner Connect script invocation logs into AWS S3.

About ScriptRunner Connect


What is ScriptRunner Connect?

ScriptRunner Connect is an AI-assisted, code-first (JavaScript/TypeScript) integration platform (iPaaS) for building complex integrations and automations.

Can I try it out for free?

Yes. ScriptRunner Connect comes with a forever free tier.

Can I customize the integration logic?

Absolutely. The main value proposition of ScriptRunner Connect is that you get full access to the code that powers the integration, which means you can make any changes to the integration logic yourself.

Can I change the integration to communicate with additional apps?

Yes. Since ScriptRunner Connect specializes in enabling complex integrations, you can easily change the integration logic to connect to as many additional apps as you need, with no limitations.

What if I don't feel comfortable making changes to the code?

First, you can try our AI assistant, which can help you understand what the code does and also help you make changes to it. Alternatively, you can hire our professionals to make the changes you need or to build new integrations from scratch.

Do I have to host it myself?

No. ScriptRunner Connect is a fully managed SaaS (Software-as-a-Service) product.

What about security?

ScriptRunner Connect is ISO 27001 and SOC 2 certified. Learn more about our security.

Template Content


README

Scripts

RunBackupJob (TypeScript, Scheduled)

README


📋 Overview

This template periodically copies ScriptRunner Connect script invocation logs into AWS S3.

🖊️ Setup

  1. Set up the API connection for AWS S3.
  2. Set up the API connection for SR Connect:
    1. Click on your name > API Keys.
    2. Generate a new API key and give it a meaningful name so you know what it is being used for.
    3. Grab the generated username and password and use them as values for basic authentication.
    4. Set the base URL to: https://api.scriptrunnerconnect.com/
  3. Configure the scheduled trigger and set it to fire every 15 minutes.
  4. Go to Parameters and configure the necessary parameters (an example set of values is sketched below).
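
The parameter names below mirror the Params interface defined at the bottom of the RunBackupJob script; the values are illustrative placeholders only, not defaults shipped with the template, so adjust them to your own team, bucket, and region.

// Illustrative parameter values only; the names come from the Params interface in
// RunBackupJob, and the values are placeholders you should replace with your own.
const exampleParams = {
    TEAM_ID: 'your-team-id',                  // your ScriptRunner Connect team ID
    S3_BUCKET_NAME: 'src-invocation-log-backup',
    AWS_REGION: 'eu-west-1',
    Advanced: {
        BATCH_SIZE: 50,                       // invocation logs fetched per page
        STOP_AFTER_LOGS_EXIST: 10,            // stop after this many consecutive already-backed-up logs
        RESERVE_BATCH_TIME_MULTIPLIER: 2,     // safety margin when deciding whether another batch fits in the current run
        STORE_LOG_JOB_CONCURRENCY: 4,         // parallel S3 uploads per invocation log
        STORE_LARGE_LOG_MESSAGE_CONCURRENCY: 4
    }
};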

🚀 Usage

When the backup job script is triggered on a schedule, it finds the latest logs and backs them up into S3. When the job is triggered for the first time, only the first batch of logs is backed up. If you wish to back up all the historical logs, remove the condition that checks for this in the RunBackupJob script (shown below) and run the script manually, after which the job will go through all the historical logs.
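
The first-run check in question looks like the following simplified excerpt from RunBackupJob: an empty bucket is treated as a first run, and the if (firstTime) block stops the job after one batch. Removing or commenting out that block lets a manual run continue through all historical batches.

// Simplified excerpt from RunBackupJob: an empty bucket is treated as the first run
const firstTime = ((await AWS.S3.Object.getObjectsV2({
    bucket: S3_BUCKET_NAME,
    region: AWS_REGION
})).Contents ?? []).length === 0;

// ...later, at the end of each batch:
if (firstTime) {
    console.log('Running for the first time, stopping the job');
    break; // remove this block to let a manual run back up all historical logs
}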

Backup format

For each script invocation log, the following files will be created in your S3 bucket (a sketch showing how to read them back follows the list):

  • {invocationId}/invocation.json - Contains information about the script invocation log.
  • {invocationId}/eventPayload.json - Contains the event payload that the script was triggered with. This is not always present; some invocation types, such as MANUAL or SCHEDULED, don't have event payloads.
  • {invocationId}/consoleLogs.json - Contains console logs. Not present if there were no console logs emitted during invocation.
  • {invocationId}/httpLogs.json - Contains HTTP logs. Not present if there were no HTTP API calls made during the invocation.
  • {invocationId}/largeMessages/{messageId}.json - Message contents that are too large are excluded from consoleLogs.json. For each large message in the console log, its contents are stored separately in this file.
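
As a minimal sketch of how one of these objects could be read back, reusing the same AWS API connection and virtual-hosted-style URL pattern that RunBackupJob already uses for its existence check (the readInvocationBackup helper itself is hypothetical and not part of the template):

import AWS from './api/aws';

// Hypothetical helper: fetch a backed-up invocation.json from S3 using the same
// URL pattern RunBackupJob uses when checking whether a log already exists.
async function readInvocationBackup(bucketName: string, awsRegion: string, invocationId: string) {
    const response = await AWS.fetch(`https://${bucketName}.s3.${awsRegion}.amazonaws.com/${invocationId}/invocation.json`);

    if (response.status === 404) {
        return undefined; // this invocation was never backed up
    }

    if (!response.ok) {
        throw Error(`Unexpected response while reading backup: ${response.status}`);
    }

    return await response.json();
}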

Known limitations

  • There is no retry logic should a backup fail. You should look into handling failed jobs yourself, for example by sending a notification when one of the log backups fails and then backing it up manually later; a minimal retry sketch follows this list. Look for a TODO in the RunBackupJob script to see where a failure could be handled.
  • When running the backup job manually to back up historical logs, the job can only run for up to 40 hours, which in most cases should be enough time.
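
As a minimal sketch of one way to react to failures, assuming you wrap the call to processInvocation (or the individual store calls inside it) in a plain retry loop; the withRetries helper, the attempt count, and the delay are assumptions, not part of the template:

// Hypothetical retry wrapper: retries a failing backup step a few times with a
// fixed delay before giving up and rethrowing, at which point you could notify yourself.
async function withRetries<T>(attempts: number, delayMs: number, fn: () => Promise<T>): Promise<T> {
    let lastError: unknown;

    for (let attempt = 1; attempt <= attempts; attempt++) {
        try {
            return await fn();
        } catch (e) {
            lastError = e;
            console.error(`Attempt ${attempt} of ${attempts} failed`, e);
            await new Promise(resolve => setTimeout(resolve, delayMs));
        }
    }

    // TODO: send yourself a notification here (see the TODO in processInvocation)
    throw lastError;
}

// Example: await withRetries(3, 5000, () => storeInvocationLog(bucketName, awsRegion, invocation));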

API Connections


RunBackupJob (TypeScript)

import AWS from './api/aws';
import SRC from './api/src';
import { ScriptInvocationLog, getConsoleLogs, getEventPayload, getHttpLogs, getInvocationLogs, getLargeLogMessage } from './SRConnectAPI';
import { triggerScript } from '@sr-connect/trigger';
import { throttleAll } from 'promise-throttle-all';

interface Event {
    lastInvocationStartTime?: string;
}

export default async function (event: Event, context: Context): Promise<void> {
    // Pull parameters from environment variables
    const { AWS_REGION, S3_BUCKET_NAME, TEAM_ID, Advanced: { BATCH_SIZE, RESERVE_BATCH_TIME_MULTIPLIER, STOP_AFTER_LOGS_EXIST, STORE_LOG_JOB_CONCURRENCY, STORE_LARGE_LOG_MESSAGE_CONCURRENCY } } = context.environment.vars as Params;

    let nextToken: string | undefined;
    let lastInvocationStartTime: string | undefined;
    let logsExist = 0;
    let batchCount = 1;
    const batchProcessingTimes: number[] = [];

    const firstTime = ((await AWS.S3.Object.getObjectsV2({
        bucket: S3_BUCKET_NAME,
        region: AWS_REGION
    })).Contents ?? []).length === 0;

    do {
        const batchStartTime = Date.now();
        // Get latest invocation logs
        const invocationLogs = await getInvocationLogs(SRC, TEAM_ID, BATCH_SIZE, nextToken, event.lastInvocationStartTime);
        console.log(`Starting batch ${batchCount}, found ${invocationLogs.invocations.length} invocations`);

        // Iterate through each invocation log
        for (const invocation of invocationLogs.invocations) {
            lastInvocationStartTime = invocation.startTime;

            // Check if invocation log already exists in S3
            if (await invocationLogExists(S3_BUCKET_NAME, AWS_REGION, invocation)) {
                // If it does, then increase the logsExist counter
                logsExist++;
                console.log(`Invocation log exists: ${invocation.invocationId} - Log start Time: ${invocation.startTime}`);
            } else {
                // If it does not, then reset the logsExist counter, since we only want to count continuously existing logs
                logsExist = 0;
                // And then process the invocation log
                await processInvocation(STORE_LOG_JOB_CONCURRENCY, STORE_LARGE_LOG_MESSAGE_CONCURRENCY, S3_BUCKET_NAME, AWS_REGION, TEAM_ID, invocation);
            }

            // If the logsExist counter, which counts continuously existing logs, has reached the defined limit, then consider the job finished
            if (logsExist >= STOP_AFTER_LOGS_EXIST) {
                console.log('Continuous invocation log exists threshold reached, stopping job');
                return;
            }

        }

        const batchTime = Math.round((Date.now() - batchStartTime) / 1000);
        batchProcessingTimes.push(batchTime);
        const averageBatchProcessingTime = batchProcessingTimes.reduce((prev, current) => prev + current, 0) / batchProcessingTimes.length;
        console.log(`Batch ${batchCount} finished in ${batchTime} seconds, average batch processing time: ${averageBatchProcessingTime} seconds`);

        nextToken = invocationLogs.nextToken;
        batchCount++;

        // Check if the backup job is running for the first time; if so, stop the job so that historical logs won't be backed up
        if (firstTime) {
            console.log('Running for the first time, stopping the job');
            break;
        }

        // Check if there is enough time to run the next batch in this current script invocation cycle
        if (context.startTime + context.timeout - (averageBatchProcessingTime * RESERVE_BATCH_TIME_MULTIPLIER * 1000) < Date.now()) {
            console.log('Not enough time left to process the next batch, going to start a new job');

            // If not, then trigger a new script invocation and send along the start time of the last processed log, so the new job can continue where it left off
            await triggerScript('RunBackupJob', {
                payload: {
                    lastInvocationStartTime
                } as Event
            });

            break;
        }
    } while (nextToken);
}

async function invocationLogExists(bucketName: string, awsRegion: string, invocationLog: ScriptInvocationLog) {
    try {
        // TODO: replace it with Managed API call once the `getObjectHead` method gets fixed.
        const response = await AWS.fetch(`https://${bucketName}.s3.${awsRegion}.amazonaws.com/${invocationLog.invocationId}/invocation.json`);

        if (response.ok) {
            return true;
        } else if (response.status === 404) {
            return false;
        } else {
            throw Error(`Unexpected response while getting object HEAD: ${invocationLog.invocationId}`);
        }

        // const head = await AWS.S3.Object.getObjectHead<null>({
        //     bucket: BUCKET_NAME,
        //     key: `${invocationLog.invocationId}/invocation.json`,
        //     region: AWS_REGION,
        //     errorStrategy: {
        //         handleHttp404Error: () => null
        //     }
        // });

        // return head !== null;
    } catch (e) {
        console.error(`Failed to check if invocation log exists: ${invocationLog.invocationId}`, e);

        return true;
    }
}

async function processInvocation(concurrency: number, storeLargeLogMessageConcurrency: number, bucketName: string, awsRegion: string, teamId: string, invocationLog: ScriptInvocationLog) {
    try {
        await throttleAll(concurrency, [
            () => storeInvocationLog(bucketName, awsRegion, invocationLog),
            () => storeEventPayload(bucketName, awsRegion, teamId, invocationLog),
            () => storeConsoleLogs(storeLargeLogMessageConcurrency, bucketName, awsRegion, invocationLog.workspace.id, invocationLog.invocationId),
            () => storeHttpLogs(bucketName, awsRegion, invocationLog.workspace.id, invocationLog.invocationId),
        ]);

        console.log(`Invocation log stored: ${invocationLog.invocationId} - Start time: ${invocationLog.startTime}`);
    } catch (e) {
        console.error('Failed to store invocation log', e, invocationLog);
        // TODO: send yourself notification if this happens and try to backup the log manually
    }
}

async function storeInvocationLog(bucketName: string, awsRegion: string, invocationLog: ScriptInvocationLog) {
    await AWS.S3.Object.putObject({
        bucket: bucketName,
        key: `${invocationLog.invocationId}/invocation.json`,
        body: JSON.stringify(invocationLog),
        region: awsRegion,
        contentType: 'application/json'
    });
}

async function storeEventPayload(bucketName: string, awsRegion: string, teamId: string, invocationLog: ScriptInvocationLog) {
    if (invocationLog.triggerType !== 'MANUAL' && invocationLog.triggerType !== 'SCHEDULED') {
        const eventPayload = await getEventPayload(SRC, teamId, invocationLog.invocationId);

        if (!eventPayload) {
            throw Error(`Event payload does not exist, but it should for invocation: ${invocationLog.invocationId}`);
        }

        await AWS.S3.Object.putObject({
            bucket: bucketName,
            key: `${invocationLog.invocationId}/eventPayload.json`,
            body: eventPayload,
            region: awsRegion,
            contentType: 'application/json'
        });
    }
}

async function storeConsoleLogs(concurrency: number, bucketName: string, awsRegion: string, workspaceId: string, invocationId: string) {
    const consoleLogs = await getConsoleLogs(SRC, workspaceId, invocationId);

    if (consoleLogs) {
        await throttleAll(concurrency, [
            () => AWS.S3.Object.putObject({
                bucket: bucketName,
                key: `${invocationId}/consoleLogs.json`,
                body: JSON.stringify(consoleLogs),
                region: awsRegion,
                contentType: 'application/json'
            }),
            ...consoleLogs.invocationLogs.filter(l => l.largePayload).map(l => () => storeLargeLogMessage(bucketName, awsRegion, workspaceId, invocationId, l.id))
        ]);
    }
}

async function storeHttpLogs(bucketName: string, awsRegion: string, workspaceId: string, invocationId: string) {
    const httpLogs = await getHttpLogs(SRC, workspaceId, invocationId);

    if (httpLogs) {
        await AWS.S3.Object.putObject({
            bucket: bucketName,
            key: `${invocationId}/httpLogs.json`,
            body: httpLogs,
            region: awsRegion,
            contentType: 'application/json'
        });
    }
}

async function storeLargeLogMessage(bucketName: string, awsRegion: string, workspaceId: string, invocationId: string, logMessageId: string) {
    const largeMessage = await getLargeLogMessage(SRC, workspaceId, invocationId, logMessageId);

    if (!largeMessage) {
        throw Error(`Large log message does not exist, but it should: ${logMessageId}`);
    }

    await AWS.S3.Object.putObject({
        bucket: bucketName,
        key: `${invocationId}/largeMessages/${logMessageId}.json`,
        body: largeMessage,
        region: awsRegion,
        contentType: 'application/json'
    });

}

interface Params {
    TEAM_ID: string;
    S3_BUCKET_NAME: string;
    AWS_REGION: string;
    Advanced: {
        BATCH_SIZE: number;
        STOP_AFTER_LOGS_EXIST: number;
        RESERVE_BATCH_TIME_MULTIPLIER: number;
        STORE_LOG_JOB_CONCURRENCY: number;
        STORE_LARGE_LOG_MESSAGE_CONCURRENCY: number;
    }
}

SRConnectAPI (TypeScript)

import { GenericAppApi } from "@managed-api/generic-sr-connect";

export async function getInvocationLogs(apiConnection: GenericAppApi, teamId: string, pageSize: number, nextToken?: string, startTime?: string) {
    return await refetch<GetScriptInvocationLogsResponse>(apiConnection, `/v1/team/${teamId}/invocationLogs?pageSize=${pageSize}&orderBy=startTime&orderByDirection=desc&executionStatuses=FINISHED,ABORTED,TIMED_OUT,FUNCTION_ERROR,RUNTIME_ERROR,DENIED,MALFORMED_PAYLOAD_ERROR${startTime ? `&to=${startTime}` : ''}${nextToken ? `&nextToken=${nextToken}` : ''}`);
}

export async function getEventPayload(apiConnection: GenericAppApi, teamId: string, invocationId: string) {
    const response = await refetch<DownloadableResource>(apiConnection, `/v1/team/${teamId}/invocationPayload/${invocationId}`);

    return (await downloadResource(response.url))?.arrayBuffer();
}

export async function getConsoleLogs(apiConnection: GenericAppApi, workspaceId: string, invocationId: string) {
    const response = await refetch<DownloadableResource>(apiConnection, `/v1/workspace/${workspaceId}/invocation/${invocationId}/consoleLogs`);

    return (await downloadResource(response.url))?.json<ConsoleLogsResponse>();
}

export async function getHttpLogs(apiConnection: GenericAppApi, workspaceId: string, invocationId: string) {
    const response = await refetch<DownloadableResource>(apiConnection, `/v1/workspace/${workspaceId}/invocation/${invocationId}/httpLogs`);

    return (await downloadResource(response.url))?.arrayBuffer();
}

export async function getLargeLogMessage(apiConnection: GenericAppApi, workspaceId: string, invocationId: string, logMessageId: string) {
    const response = await refetch<DownloadableResource>(apiConnection, `/v1/workspace/${workspaceId}/invocation/${invocationId}/largeLogMessage/${logMessageId}`);

    return (await downloadResource(response.url))?.arrayBuffer();
}

async function downloadResource(url: string): Promise<Response<any> | undefined> {
    const resourceDownloadResponse = await fetch(url);

    if (!resourceDownloadResponse.ok) {
        // Check if the resource does not exist
        if (resourceDownloadResponse.status === 404) {
            // If so, return undefined
            return undefined;
        } else {
            // Otherwise throw error, since it is unexpected
            throw Error(`Unexpected response while downloading resource: ${resourceDownloadResponse.status} - URL: ${url} - ${await resourceDownloadResponse.text()}`);
        }
    }

    return resourceDownloadResponse;
}

/**
 * Custom fetch function that retries when being throttled
 */
async function refetch<T>(apiConnection: GenericAppApi, url: string): Promise<T> {
    while (true) {
        const response = await apiConnection.fetch(url);

        if (response.ok) {
            return await response.json();
        } else {
            if (response.status === 429) {
                await new Promise(resolve => setTimeout(resolve, 1000)); // Wait 1 second before retrying
                continue; // Retry the request
            } else {
                throw Error(`Unexpected response: ${response.status} - URL: ${url} - ${await response.text()}`);
            }
            }
        }
    }
}

export interface WorkspaceResource {
    id: string;
    name: string;
}

export interface ScriptInvocationLog {
    invocationId: string;
    executionDuration: number;
    consoleLogsCount: number;
    httpLogsCount: number;
    workspace: WorkspaceResource;
    environment: WorkspaceResource;
    script: WorkspaceResource;
    startTime: string;
    invocationType: string;
    triggerType: string;
    rootTriggerType: string;
    executionStatus: 'RUNNING' | 'FINISHED' | 'ABORTED' | 'TIMED_OUT' | 'FUNCTION_ERROR' | 'RUNTIME_ERROR' | 'DENIED' | 'MALFORMED_PAYLOAD_ERROR';
    denialReason: string;
}

export interface GetScriptInvocationLogsResponse {
    invocations: ScriptInvocationLog[];
    nextToken: string;
}

interface DownloadableResource {
    url: string;
}

interface ConsoleLogsResponse {
    invocationLogs: {
        id: string;
        largePayload: boolean;
    }[];
}