Tags: typescript, aws-lambda, vscode-debugger, aws-sam

TypeScript debugging of AWS SAM serverless apps in VS Code


I have a serverless function I'm building using AWS SAM within Visual Studio Code. The runtime I'm using is nodejs12.x, but I'm writing everything in TypeScript and compiling it to JS in a /dist directory. That's the directory my CloudFormation templates point to in order to find the handlers. For example, in the screenshot below, the right side is the TS and the left is my compiled JS.

[Screenshot: VS Code with the compiled JS in /dist (left) and the TypeScript source (right)]

In the sidebar you can see the /dist directory where my JS files are placed after I run tsc, while a little further down are my template and TypeScript source.

My template then looks like this:

  LibraryAddMangaFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: dist/src/handlers.LibraryAddMangaHandler
      FunctionName: Foobar
      Runtime: nodejs12.x
      MemorySize: 256
      Timeout: 300
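
For reference, that Handler string resolves to a named export in the compiled dist/src/handlers.js. A trimmed-down sketch of the src/handlers.ts side (the body here is just a stand-in, not my actual logic):

  // src/handlers.ts - compiled by tsc to dist/src/handlers.js,
  // which is what "dist/src/handlers.LibraryAddMangaHandler" points at
  import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';

  export const LibraryAddMangaHandler = async (
    event: APIGatewayProxyEvent
  ): Promise<APIGatewayProxyResult> => {
    return {
      statusCode: 200,
      body: JSON.stringify({ received: event.body }),
    };
  };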

I followed Amazon's documentation for local debugging within VS Code and got it working for the JS files. I can set breakpoints without any issues and step through the compiled JS code.

What I'm wondering is whether it's possible to debug while stepping through the TypeScript code, similar to what I would do in Chrome with client-side apps in Angular/React. I know that for those, the framework handles mapping the JS back to the TS as part of the build process to support the Chrome debug tools. I'm not sure if what I want to do is possible without support from SAM/AWS.

Also, when my HTTP request completes, the debug session ends instead of continuing to run and listening for the next request - similar to sam local start-api. Is there a way to configure the debug runner so it keeps the app running? I've seen some people recommend Thundra, but that requires the Lambda to be deployed to AWS, and I really like the rapid cycle of debug locally, tweak, debug again. I'd really prefer not to do a sam deploy every 15 seconds and wait for the stack to deploy.

Thanks!


Solution

  • Yes, it can be done: what you need are sourcemaps

    In my case, my lambdas were compiled and bundled into a single index.js file under the dist folder, shipped along with a single .map file, using webpack as the bundler.

    const path = require('path');
    const glob = require('glob');
    const webpack = require('webpack');
    const CreateFileWebpack = require('create-file-webpack');
    const nodeExternals = require('webpack-node-externals');
    
    // Credits: https://hackernoon.com/webpack-creating-dynamically-named-outputs-for-wildcarded-entry-files-9241f596b065
    const entryArray = glob.sync('./src/lambda/**/handler.ts');
    
    const entryObject = entryArray.reduce((acc, item) => {
      let name = path.dirname(item.replace('./src/lambda', ''));
      // conforms with Webpack entry API
      // Example: { ingest: './src/ingest/index.ts' }
      acc[name] = item;
      return acc;
    }, {});
    
    /** @type {import('webpack').Configuration} */
    module.exports = {
      cache: {
        type: 'memory',
      },
      entry: entryObject,
      devtool: false,
      target: 'node',
      // externals: [nodeExternals()],  // use if dependencies should not be bundled (like when using layers)
      module: {
        rules: [
          {
            test: /\.tsx?$/,
            loader: 'ts-loader',
            exclude: /node_modules/,
            options: {
              transpileOnly: true,
            },
          },
        ],
      },
      resolve: {
        extensions: ['.tsx', '.ts', '.js'],
      },
      plugins: [
        new webpack.IgnorePlugin({ resourceRegExp: /^pg-native$/ }),
        new webpack.IgnorePlugin({ resourceRegExp: /^hiredis$/ }),
        ...Object.keys(entryObject).map(lambda => {
          return new CreateFileWebpack({
            path: path.resolve(__dirname, 'build'),
            fileName: `${lambda}/package.json`,
            content: JSON.stringify({
              name: 'dummy_dependencies',
              dependencies: {},
              version: '1.0.0',
            }),
          });
        }),
        new webpack.SourceMapDevToolPlugin({
          columns: false,
          module: true,
          filename: '[file].map',
        }),
      ],
      // Output directive will generate build/<function-name>/index.js
      output: {
        filename: '[name]/index.js',
        path: path.resolve(__dirname, 'build'),
        devtoolModuleFilenameTemplate: '[absolute-resource-path]',
        // Credit to Richard Buggy!!
        libraryTarget: 'commonjs2',
      },
    };
    
    

    I don't know how it would be in your case with multiple compiled files, but I assume it's a similar scenario with webpack or any other "compilation" tool (I've been looking into esbuild lately).

    Even a plain tsc with sourcemaps activated should be sufficient.
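
    For example, a minimal tsconfig.json along these lines (the paths are assumptions matching the /dist layout from the question) emits a .js.map next to every compiled file:

    {
      "compilerOptions": {
        "target": "ES2019",        // appropriate for the nodejs12.x runtime
        "module": "commonjs",
        "outDir": "./dist",        // matches the directory the SAM Handler paths point at
        "sourceMap": true,         // emit .js.map files alongside the compiled JS
        "inlineSources": true,     // embed the original TS in the maps so the debugger can always resolve it
        "esModuleInterop": true,
        "strict": true
      },
      "include": ["src/**/*.ts"]
    }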

    Having done that, you should also use the correct .vscode launch configuration so VS Code can map the remote files in the container to the local ones, and connect them to their corresponding TypeScript files through the sourcemaps.

        {
          "name": "[Serverless] <lambda-name> attach",
          "type": "node",
          "request": "attach",
          "address": "localhost",
          "port": 5678,
          "localRoot": "${workspaceRoot}/.serverless/build/<lambda-folder>",
          "remoteRoot": "/var/task",
          "protocol": "inspector",
          "stopOnEntry": false,
          "outFiles": ["${workspaceRoot}/.serverless/build/<lambda-folder>"],
          "sourceMaps": true
        }
    

    "Is there a way to configure the debug runner so it keeps the app running?"

    If I got it correctly, what you're looking for is keeping the debugger connected between requests so you don't have to reconnect manually every time you invoke a lambda locally.

    If that's the case, you can accomplish it by running sam local start-lambda with the --shutdown and --warm-containers LAZY flags (reference):

    sam local start-lambda -d 5678 --host 0.0.0.0 --shutdown --debug --warm-containers LAZY
    

    This will create an API similar to the one AWS exposes in the cloud, so you can hit it using either the aws-sdk or the AWS CLI:

    aws lambda invoke --function-name "<sam-template-lambda-id>" --payload <stringified-json-input> --endpoint-url "http://127.0.0.1:3001" lambda_invoke_output.log
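
    Or the same thing from TypeScript with the v3 aws-sdk (a minimal sketch - the endpoint, region, dummy credentials and function name are placeholders; sam local doesn't validate credentials):

    import { LambdaClient, InvokeCommand } from '@aws-sdk/client-lambda';

    // Point the client at the local sam endpoint instead of AWS
    const client = new LambdaClient({
      endpoint: 'http://127.0.0.1:3001',
      region: 'us-east-1',                                              // any region works locally
      credentials: { accessKeyId: 'local', secretAccessKey: 'local' },  // dummy credentials
    });

    async function invokeLocal() {
      const response = await client.send(
        new InvokeCommand({
          // logical id of the function in the SAM template
          FunctionName: 'LibraryAddMangaFunction',
          Payload: new TextEncoder().encode(JSON.stringify({ title: 'test' })),
        })
      );
      console.log(new TextDecoder().decode(response.Payload));
    }

    invokeLocal().catch(console.error);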
    

    aws sam will keep the container warm until the lambda code changes, and the debugger remains attached across all the invoke events you trigger against it, effectively stopping on the TypeScript breakpoints you set each time you call the lambda.

    TL;DR

    Use sourcemaps, and search the whole internet on how to glue all the things together


    Personal note: The first time I was dealing with this it took me a looong and painful sail through the internet to get it done.

    Life shouldn't be like that :')