Infrastructure as Code with AWS CDK

In a previous article I showed you how to manage multiple lambdas in a mono repo using webpack and how to deploy them to AWS using the CLI. Of course, for that deployment command to work, the resources had to be created on AWS beforehand. Behind the scenes I had used the AWS Console (web UI) to create the lambdas and a REST API with API Gateway that exposes them to client code, but I didn’t show you how I did it.

Infrastructure as Code

Although creating and configuring resources using the AWS Console is a good way to understand how AWS works, it’s not a recommended approach for an app in production. Instead, your app’s infrastructure should be defined as code and tracked by git as any other part of your system. This practice is known as Infrastructure as Code (IaC) and allows your infrastructure to be explicitly defined and auditable while making your deployments repeatable.

At least to my knowledge there are four different ways to define IaC for an app: CloudFormation, SAM, Terraform and CDK.

CloudFormation is specific to AWS and uses a YAML configuration file to model the infrastructure. Writing CloudFormation templates is daunting as it requires you to define every single detail of every resource using complex configuration options. 

SAM is also an AWS product that tries to overcome CloudFormation complexity, specifically for serverless resources, by providing a higher abstraction configuration layer also written in YAML that eventually compiles to a CloudFormation template. 

Terraform is not specific to AWS and can be used with multiple cloud providers. It uses a configuration language that gives you more flexibility than a YAML file.

CDK (Cloud Development Kit) is another AWS product that allows you to use a regular programming language like Typescript, Javascript, Python, Java, C# or Go to model your infrastructure. 

AWS CDK

As a front-end developer, modeling my infrastructure with AWS CDK is by far my favourite option as I can use the same language (Typescript) that I use for every other part of my app and take advantage of tools like autocompletion. 

The CDK comes with sensible defaults making modeling a resource very easy as you only need to configure what you want to be different from the recommended AWS setup. This greatly reduces the learning curve for a developer to start managing the infrastructure and further brings down the wall between dev and ops to achieve a real devops mindset.

Another advantage is how easy it is to define roles and permissions with the CDK. For me, roles and permissions are usually a headache to get right on AWS, as defining policies is quite complex. The CDK makes this very easy by providing convenient APIs, again with sensible defaults, as we will see later.
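
To give you a flavour right away, here is a hypothetical snippet (it is not part of the project we will build, and it assumes the @aws-cdk/aws-dynamodb package is installed): granting a lambda read access to a DynamoDB table is a single method call, and the CDK generates the IAM policy and attaches it to the function's execution role for you.

// Hypothetical example, not part of this tutorial's project
import { Stack, Construct, StackProps } from "@aws-cdk/core";
import { Function, Runtime, Code } from "@aws-cdk/aws-lambda";
import { Table, AttributeType } from "@aws-cdk/aws-dynamodb";

export class PermissionsExampleStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const fn = new Function(this, "reader", {
      runtime: Runtime.NODEJS_14_X,
      handler: "index.handler",
      code: Code.fromAsset("path/to/bundle"),
    });

    const table = new Table(this, "items", {
      partitionKey: { name: "id", type: AttributeType.STRING },
    });

    // One line: the CDK creates a least-privilege read policy
    // and attaches it to the lambda's execution role
    table.grantReadData(fn);
  }
}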

Last but not least, the CDK also allows you to model your CI/CD pipeline. The pipeline updates itself whenever a change of its workflow is added to the code and the app is redeployed. For a developer like me it’s a win-win scenario.

Example Project

In this article I will show you how to use the AWS CDK to model a couple of lambda functions that are made public through a REST API, and how to deploy them to AWS. The two lambdas that I’m going to use are the ones defined here. I’ll also create some sample client code that connects to those REST endpoints to consume data from the lambdas, and I’ll model the whole project as a monorepo using npm workspaces and typescript project references.

Initial Setup

Just as an FYI, I’ll be using Node 16.7 and npm 8.1. As the usual first step let’s create a folder for our project.

~ $ mkdir cdk-project
~ $ cd cdk-project

Once in the project folder we can create the root package.json file.

$ npm init -y

I always like to have a minimalistic configuration for my projects so I’ll remove any property that’s not absolutely needed.

// package.json (before)
{
  "name": "cdk-project",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
// package.json (after)
{
  "name": "cdk-project",
  "author": "David Barreto <barretollano@gmail.com>",
  "license": "MIT",
  "private": true
}

I don’t ever plan to publish this code to npm as a package so I made it private and removed the main entry file reference along with other properties that only make sense for indexing the package on npm.

As I mentioned earlier, our project will be a mono repo that consists of three internal packages: the lambdas, the client code, and the cdk. We can use the workspaces property to define those internal packages for our project.

// package.json
{
  ...
  "workspaces": [
    "cdk",
    "client",
    "lambdas"
  ]
}

The order in the array is not relevant for us right now so you can define it differently. With this setup we can now require an internal package from any other internal package without using relative paths. It works as if it were a package installed from npm and stored in node_modules.

The example code below shows you how you could reference the internal package cdk from code defined in the package client.

// client/example.js
const { something } = require("cdk");

Because these internal packages are now used in the same way as any other package installed from npm, we need to be careful about name collisions. If you ever want to install a package from npm that’s also called cdk you might have a problem. To avoid this, I’m going to create a @local scope as a way to namespace our internal packages.

// package.json
{
  ...
  "workspaces": [
    "@local/cdk",
    "@local/client",
    "@local/lambdas"
  ]
}

With this new definition our example code shown before will change to use the @local scope to reference the internal package and thus avoid name collisions.

// @local/client/example.js
const { something } = require("@local/cdk");

Keep in mind that for this to work your folder structure needs to follow the same structure defined by the workspaces property. In our case, the folders for our internal packages will need to be nested inside a @local folder as shown below.

.
├── @local
│   ├── cdk
│   ├── client
│   └── lambdas
└── package.json

Creating the “Lambdas” Internal Package

We can focus our attention now on creating our first internal package inside the @local/lambdas folder with the code from my previous article defined in this Github repo. That code defines two very simple lambdas (lambda-a and lambda-b) that are created with typescript but compiled to javascript as a bundle with the help of webpack.
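
If you haven’t read that article, here is a minimal sketch of what one of those handlers conceptually looks like (the file path and body below are illustrative; the real code lives in the linked repo).

// @local/lambdas/src/lambda-a/index.ts (illustrative sketch)
import { APIGatewayProxyHandler } from "aws-lambda";

// Returns a greeting; API Gateway will forward this response to the caller
export const handler: APIGatewayProxyHandler = async () => ({
  statusCode: 200,
  headers: { "Access-Control-Allow-Origin": "*" },
  body: JSON.stringify("Hello DAVID"),
});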

Let’s clone the code in the corresponding folder.

$ git clone git@github.com:barretodavid/lambdas-webpack.git @local/lambdas

Cloning a project also brings along its .git folder with all the previous history. I don’t want to deal with git submodules, as my goal is to have the simplest setup possible and submodules are anything but easy, so I’ll delete that .git folder inside @local/lambdas and simply initialize a new git repo at the root of the project.

$ rm -rf @local/lambdas/.git
$ git init
$ git add -A
$ git commit -m "Initial commit"

Similarly, I’ll move the VSCode configuration settings out of the internal package and into the root folder so they apply to every package and we avoid duplication.

$ mv @local/lambdas/.vscode .vscode

Although not required, I’m going to change the name of the package from lambdas-webpack to @local/lambdas in the package.json file to have it aligned with our workspaces definition in our root configuration file.

// @local/lambdas/package.json
{
  "name": "@local/lambdas",
  ...
}

In a monorepo, the command npm install will only ever be run from the root folder, and most, if not all, dependencies will be stored in the root node_modules folder. Likewise, there will be only a single package-lock.json file, located at the root folder. To avoid any conflicts, let’s remove the nested package-lock.json file from @local/lambdas.

$ rm @local/lambdas/package-lock.json

Keep in mind that there might be more than one node_modules folder in the project. The main one will be located at the root folder, and if two internal packages use the same library but with two different versions, a nested node_modules folder will be created for one of those internal packages to allow for disambiguation.

Luckily for us, folders and files defined in our root .gitignore will apply recursively to any internal package.

// .gitignore
node_modules

Setting Up the CDK Internal Package

Having the code for the lambdas, the next step is to define our infrastructure to deploy those lambdas to AWS with the help of the AWS CDK. 

We will need to start by installing the package from npm globally so we can use the cli to scaffold our internal package.

$ npm install -g aws-cdk

We can now create a CDK project inside the @local/cdk folder, without initializing git or installing dependencies, by using the --generate-only flag. The command has to be run inside the target folder, so we have to cd into it first.

$ cd @local/cdk
@local/cdk $ cdk init app --language typescript --generate-only
@local/cdk $ cd ../..

The created package.json file has a bin property that will attempt to add the file bin/cdk.js to our PATH when running npm install. I’m not quite sure why this is needed, but it creates some problems. As with any typescript project, you don’t add the compiled javascript files to source control, which means that when you clone this project later on and run npm install for the first time, the installation will fail: the file cdk.js doesn’t exist because cdk.ts hasn’t been compiled yet, but you can’t compile it without first installing the dependencies (tsc).

This is a strange chicken-and-egg situation, and given that I couldn’t find a reason why the bin property is needed for the cdk, I just went ahead and removed it from the configuration file along with some other minor changes.

// @local/cdk/package.json
{
  "name": "cdk",         // rename to @local/cdk
  "version": "0.1.0",    
  "bin": {               // delete        
    "cdk": "bin/cdk.js"  // delete
  },                     // delete
  "private": true,         // add
  "scripts": {
    "build": "tsc",
    "watch": "tsc -w",
    "test": "jest",      
    "cdk": "cdk"
  },
  "devDependencies": {
    "@aws-cdk/assertions": "1.132.0",
    "@types/jest": "^26.0.10",
    "@types/node": "10.17.27",
    "jest": "^26.4.2",
    "ts-jest": "^26.2.0",
    "aws-cdk": "1.132.0",
    "ts-node": "^9.0.0",
    "typescript": "~3.9.7"
  },
  "dependencies": {
    "@aws-cdk/core": "1.132.0",
    "source-map-support": "^0.5.16"
  }
}

Another strange finding is that the tsconfig.json file that’s created by the cdk has redundant properties. Setting the strict flag to true automatically enables all related properties like noImplicitAny, noImplicitThis, etc. so there’s no need to enable them explicitly as they have done. 

Also, the value set for the typeRoots property is already its default, so defining it explicitly is again not needed. Because I like minimalism, I’ll delete those redundant properties from tsconfig.json.

// @local/cdk/tsconfig.json
{
  "compilerOptions": {
    "target": "ES2018",
    "module": "commonjs",
    "lib": ["es2018"],
    "declaration": true,
    "strict": true,
    "noImplicitAny": true,     // delete (redundant)
    "strictNullChecks": true,  // delete (redundant)
    "noImplicitThis": true,    // delete (redundant)
    "alwaysStrict": true,      // delete (redundant)
    "noUnusedLocals": false,
    "noUnusedParameters": false,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": false,
    "inlineSourceMap": true,
    "inlineSources": true,
    "experimentalDecorators": true,
    "strictPropertyInitialization": false,
    "typeRoots": ["./node_modules/@types"] // delete (auto discovered)
  },
  "exclude": ["node_modules", "cdk.out"]
}

The simplified configuration file now looks like this:

// @local/cdk/tsconfig.json
{
  "compilerOptions": {
    "target": "ES2018",
    "module": "commonjs",
    "lib": ["es2018"],
    "declaration": true,
    "strict": true,
    "noUnusedLocals": false,
    "noUnusedParameters": false,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": false,
    "inlineSourceMap": true,
    "inlineSources": true,
    "experimentalDecorators": true,
    "strictPropertyInitialization": false,
  },
  "exclude": ["node_modules", "cdk.out"]
}

Consolidation of Dependencies

One of the advantages of having a mono repo with npm workspaces is that shared dependencies can be consolidated at the root level to ensure that the same version is used on all internal packages that depend on it. This reduces the amount of disk space needed for the project and also simplifies maintenance as you no longer have to maintain different versions of the same package.

To take advantage of this feature we have to start by looking at the shared dev dependencies across both internal packages.

Library        Version on @local/cdk    Version on @local/lambdas
@types/jest    ^26.0.10                 ^27.0.2
@types/node    10.17.27                 ^16.11.6
jest           ^26.4.2                  ^27.3.1
ts-jest        ^26.2.0                  ^27.0.7
typescript     ~3.9.7                   ^4.4.4

There are five shared dependencies, each with a different minimum version in the two packages. I’ll consolidate these dependencies at the root level using the latest version of each. This means that we will bump some dependencies to the next major version, which technically includes breaking changes. I didn’t find any issue with this approach, but if you do, consider using a version that’s compatible with both internal packages.

In short, shared dependencies will be moved out of the internal packages’ configuration files and into the root package.json file. The result of those changes is shown below.

// package.json
{
  ...
  "devDependencies": {
    "@types/jest": "^27.0.2",
    "@types/node": "^16.11.6",
    "jest": "^27.3.1",
    "prettier": "^2.4.1",
    "ts-jest": "^27.0.7",
    "typescript": "^4.4.4"
  }
}
// @local/cdk/package.json
{
  ...
  "devDependencies": {
    "@aws-cdk/assertions": "1.132.0",
    "aws-cdk": "1.132.0",
    "ts-node": "^9.0.0"
  },
  "dependencies": {
    "@aws-cdk/core": "1.132.0",
    "source-map-support": "^0.5.16"
  }
}
// @local/lambdas/package.json
{
  ...
  "devDependencies": {
    "@types/aws-lambda": "^8.10.84",
    "@types/ramda": "^0.27.46",
    "clean-webpack-plugin": "^4.0.0",
    "ts-loader": "^9.2.6",
    "webpack": "^5.60.0",
    "webpack-cli": "^4.9.1"
  },
  "dependencies": {
    "ramda": "^0.27.1"
  }
}

With this setup we can finally install our project dependencies.

$ npm install

After the installation completes we will have a single node_modules folder and package-lock.json file at the root level of the project.
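
At this point the root of the project should look roughly like this (the client package will be added later).

.
├── .gitignore
├── .vscode
├── @local
│   ├── cdk
│   └── lambdas
├── node_modules
├── package-lock.json
└── package.json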

Extending tsconfig

In the same way that we have centralized npm dependencies, we should also centralize shared typescript configuration options by defining a root tsconfig.json file and then using the extends keyword to inherit those properties on each nested typescript configuration file found in the internal packages.

// tsconfig.json
{
  "compilerOptions": {
    "target": "ES2020",
    "lib": ["ES2020"],
    "module": "commonjs",
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "skipLibCheck": true
  }
}
// @local/lambdas/tsconfig.json
{
  "extends": "../../tsconfig.json"
}
// @local/cdk/tsconfig.json
{
  "extends": "../../tsconfig.json",
  "compilerOptions": {
    "declaration": true,
    "noUnusedLocals": false,
    "noUnusedParameters": false,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": false,
    "inlineSourceMap": true,
    "inlineSources": true,
    "experimentalDecorators": true,
    "strictPropertyInitialization": false
  },
  "exclude": ["node_modules", "cdk.out"]
}

With this arrangement we can now create our first project-level script that builds both internal packages. We can use the -w flag to indicate that a given script is defined in, and should be run in the context of, the provided workspace.

// package.json
{
  ...
  "scripts": {
    "build": "npm run bundle -w @local/lambdas && npm run build -w @local/cdk"
  },
  ...
}

We can now run the script and see how webpack generates the files for our lambdas and the cdk compiles our ts files into js.

$ npm run build

Modeling Infrastructure with the CDK

We have finally arrived at the central point of this tutorial, the modeling of our infrastructure using code. 

The infrastructure that we want to model is shown below. Client code (Node, React, Postman, etc.) performs an HTTP call to an AWS endpoint provided by API Gateway that, depending on the URL path, invokes either lambda-a or lambda-b.

AWS CDK has three main entities for modeling our infrastructure: the App sitting at the top level, the Stack, where different environments can be configured (dev, prod, staging, etc.), and Constructs, which can come from the AWS Construct Library or be custom made by us.

At a high level, to model this infrastructure we will need to create a hierarchy where the App contains our Stack and the innermost elements are Constructs, as sketched below.
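
In code, the top of that hierarchy is the entry file that cdk init scaffolded for us: an App that instantiates our Stack, inside which we will add Constructs. The generated file looks roughly like this.

// @local/cdk/bin/cdk.ts (as scaffolded by cdk init, roughly)
import "source-map-support/register";
import * as cdk from "@aws-cdk/core";
import { CdkStack } from "../lib/cdk-stack";

// The App is the root of the hierarchy; it can hold one or more Stacks
const app = new cdk.App();
new CdkStack(app, "CdkStack");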

To model our lambdas we will need to make use of the AWS Construct Library. The library is published as independent npm packages grouped by resource type, so in order to model a lambda we will need to install the package @aws-cdk/aws-lambda.

$ npm install @aws-cdk/aws-lambda -w @local/cdk

We can now modify the file cdk-stack.ts to model both of our lambdas using the node runtime as shown below.

// @local/cdk/lib/cdk-stack.ts

import { Stack, Construct, StackProps } from "@aws-cdk/core";
import { Function, Runtime, Code } from "@aws-cdk/aws-lambda";
import * as path from "path";

export class CdkStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const lambdaA = new Function(this, "lambda-a", {
      runtime: Runtime.NODEJS_14_X,
      handler: "index.handler",
      code: Code.fromAsset(
        path.resolve(__dirname, `../../lambdas/dist/lambda-a/`)
      ),
    });

    const lambdaB = new Function(this, "lambda-b", {
      runtime: Runtime.NODEJS_14_X,
      handler: "index.handler",
      code: Code.fromAsset(
        path.resolve(__dirname, `../../lambdas/dist/lambda-b/`)
      ),
    });
  }
}

If this is your first time using the CDK you will need to run the bootstrap command before you can do any deployment so let’s create an npm script for both operations.

// @local/cdk/package.json
{
  ...
  "scripts": {
    ...
    "bootstrap": "cdk bootstrap",
    "deploy": "cdk deploy --require-approval never"
  },
  ...
}

Bootstrapping the CDK will create resources that are required for the CDK to create, update or destroy any other stack you create from now on.

$ npm run bootstrap -w @local/cdk

Running this command creates a default stack named CDKToolkit on CloudFormation as shown below from the AWS Console.

We can now safely deploy our app with the CDK.

$ npm run deploy -w @local/cdk

This should create a new stack on CloudFormation called CdkStack.

You can test both lambdas on the AWS Console to verify that they actually work.

Deployment Refactoring

The code that we copied over to @local/lambdas included npm scripts to directly update our lambdas code on AWS. This is no longer needed as we want to delegate that process to the CDK. Let’s remove all those scripts to avoid future confusion.

// @local/lambdas/package.json
{
  ...
  "scripts": {
    "test": "jest",
    "bundle": "webpack",
    "zip:lambda": "...",    // delete
    "zip:all": "...",       // delete
    "update:lambda": "...", // delete
    "update:all": "...",    // delete
    "deploy:lambda": "...", // delete
    "deploy:all": "..."     // delete
  },
  ...
}

Another change we could make is to reduce code duplication in our stack code by creating a custom construct for our lambdas with the base configuration we want. This way we don’t have to define things like the runtime or handler twice.

To do this, we have to create a new file with a class that inherits from the base class Construct and where we define our lambda based on the id field passed down from the cdk-stack.ts file.

// @local/cdk/lib/lambda.ts

import { Construct } from "@aws-cdk/core";
import { Function, Runtime, Code } from "@aws-cdk/aws-lambda";
import * as path from "path";

export class Lambda extends Construct {
  ref: Function;

  constructor(scope: Construct, id: string) {
    super(scope, id);

    this.ref = new Function(this, id, {
      runtime: Runtime.NODEJS_14_X,
      handler: "index.handler",
      code: Code.fromAsset(
        path.resolve(__dirname, `../../lambdas/dist/${id}/`)
      ),
    });
  }
}

We can now refactor the cdk-stack.ts file to make use of our custom construct as shown below.

// @local/cdk/lib/cdk-stack.ts

import { Stack, Construct, StackProps } from "@aws-cdk/core";
import { Lambda } from "./lambda";

export class CdkStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const lambdaA = new Lambda(this, "lambda-a");
    const lambdaB = new Lambda(this, "lambda-b");
  }
}

Creating the REST API

To expose our lambdas to the world we will use the CDK to configure an API Gateway that reaches out to our lambdas based on the path defined in the URL.

As before, to be able to model this REST endpoint with the API Gateway we need to install the related package from the AWS Construct Library.

$ npm install @aws-cdk/aws-apigateway -w @local/cdk

In our stack we can now define our endpoint using the RestApi construct with CORS enabled so it can be reached from any domain. We also define the mappings between the URL path and the lambda to invoke.

// @local/cdk/lib/cdk-stack.ts
...

export class CdkStack extends Stack {
  constructor(...) {
    ...
    
    const api = new RestApi(this, "example-api", {
      defaultCorsPreflightOptions: {
        allowOrigins: ["*"],
      },
    });

    api.root
      .addResource("lambda-a")
      .addMethod("GET", new LambdaIntegration(lambdaA.ref));
    api.root
      .addResource("lambda-b")
      .addMethod("GET", new LambdaIntegration(lambdaB.ref));    
  }
}

As we have changed the infrastructure we will need to deploy our changes to AWS using the CDK to create a changeset for our stack.

$ npm run deploy -w @local/cdk
>>>
...
Outputs:
CdkStack.exampleapiEndpoint9C2B6BF5 = https://s3icllvorc.execute-api.us-east-1.amazonaws.com/prod/

On every deployment, the URL for our endpoint changes and is shown as part of the output of the command.

We can head out to the API Gateway section on the AWS Console to verify that our new API has been created and it’s connected to the appropriate lambda.

Client App

So far we have only interacted with the lambdas and the REST API using the AWS Console but you most likely will have a client application (node or frontend) that will need to talk to the backend we just deployed.

The challenge here is that the URL of the backend changes on every deployment, so we need to find a way to keep our client code in sync with the new URL every time we deploy.

Let’s create the final internal package inside the folder @local/client, starting with its package.json file.

// @local/client/package.json
{
  "name": "@local/client",
  "private": true,
  "main": "dist/main.js",
  "scripts": {
    "start": "node dist/main.js",
    "clean": "rm -rf dist",
    "build": "tsc"
  },
  "dependencies": {
    "axios": "^0.24.0"
  }
}

The only interesting things here are that our client app will be a nodejs app written in typescript, which will need to be compiled to javascript, and that it uses axios to make the HTTP requests.

Let’s now create the package’s tsconfig.json file which, as before, inherits some base properties from the root configuration file.

// @local/client/tsconfig.json
{
  "extends": "../../tsconfig.json",
  "references": [{ "path": "../cdk" }],
  "compilerOptions": {
    "rootDir": "src",
    "outDir": "dist"
  }
}

Our client code will need to reach into the cdk internal package to obtain the new URL created on each deployment. To allow this cross-package communication, similar to how npm workspaces enables it for javascript code, we will need to use a project reference pointing to the related internal package.

For this wiring to work, we will have to enable the property composite on our base tsconfig.json file as the documentation recommends.

// tsconfig.json
{
  "compilerOptions": {
    ...
    "composite": true
  }
}

We can now write the code for our client, which reads the URL created by the CDK on every deployment and talks to our endpoints using axios.

// @local/client/src/main.ts

import { url } from "@local/cdk";
import axios from "axios";

const init = async () => {
  const urlA = `${url}lambda-a`;
  const urlB = `${url}lambda-b`;
  const { data: responseA } = await axios.get(urlA);
  const { data: responseB } = await axios.get(urlB);
  console.log(`lambda-a: ${responseA}`);
  console.log(`lambda-b: ${responseB}`);
};

init();

Exporting the URL from the CDK

We instructed the client to reach out to our internal package @local/cdk to obtain the url variable, but nothing is exporting that value yet. For this to work we need to define the main entry point for the internal package inside package.json.

// @local/cdk/package.json
{
  ...
  "main": "dist/main.js",
  ...
}

Next, we need to create the file main.ts that we will eventually compile to main.js inside the dist folder.

// @local/cdk/dist/main.ts

import output from "./output.json";

const stackInfo = output["CdkStack"];
export const url = Object.values(stackInfo)[0];
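
The output.json file imported here is the outputs file that we will tell the CDK to write on every deployment (we’ll configure that in a moment). Its shape is roughly the following; the key name includes a hash generated by the CDK, which is why we grab the first value instead of hardcoding the key.

// @local/cdk/dist/output.json (shape only; your key name and URL will differ)
{
  "CdkStack": {
    "exampleapiEndpoint9C2B6BF5": "https://s3icllvorc.execute-api.us-east-1.amazonaws.com/prod/"
  }
}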

At this point typescript will complain that it’s not able to import a json file. To fix it, we need to make some changes to the package’s tsconfig.json file.

// @local/cdk/tsconfig.json
{
  ...
  "compilerOptions": {
    ...
    "resolveJsonModule": true
  },
  "include": [
    "bin/**/*",
    "lib/**/*",
    "test/**/*",
    "dist/**/*",
    "dist/**/*.json"
  ],
  "exclude": ["node_modules", "cdk.out"] // delete
}

Typescript is now able to import json files but it will keep complaining because the file output.json doesn’t yet exist. We will instruct the CDK to output the new REST API URL into a file.

// @local/cdk/package.json
{
  ...
  "scripts": {
    ...
    "deploy": "cdk deploy --require-approval never --outputs-file dist/output.json"
  },
  ...
}

To create the file we will have to run the deployment script again. Because we haven’t made any changes to the infrastructure since the last time we ran the command, no changeset will be created but the output.json file will be created inside the dist folder.

$ npm run deploy -w @local/cdk

To close the loop we have to compile our dist/main.ts file into javascript so the main property of package.json is pointing to a valid (existing) file that can be referenced from another internal package.

$ npm run build -w @local/cdk

Reducing Compilation Time

So far we have compiled typescript files to javascript by running an npm script called build using the -w flag to target the related workspace. This works, but as your codebase starts to grow it will become slower and slower. To overcome that, the typescript team created a build mode for mono repo setups that relies on project references to compile the packages in the right order and improve performance.

To enable this improved mode, we will create a separate tsconfig file that will only have references to the projects we want to compile in tandem.

// tsconfig.build.json
{
  "files": [],
  "references": [{ "path": "./@local/client" }, { "path": "./@local/cdk" }]
}

We didn’t include the package @local/lambdas because we aren’t using tsc to compile the typescript files. Instead we are using webpack with ts-loader to create our bundles so we leave it out of this performance enhancement.

To use this enhanced compilation mode we run the tsc command with the -b flag along with our new configuration file; Typescript will take care of the rest. While we are at it, let’s create two more npm scripts to simplify building and deploying the whole project.

// package.json
{
  ...
  "scripts": {
    "compile": "tsc -b tsconfig.build.json",
    "build": "npm run bundle -w @local/lambdas && npm run compile",
    "deploy": "npm run build && npm run deploy -w @local/cdk"
  },
  ...
}

Once everything is wired up we can now run our client code and verify that it’s able to talk to our lambdas through the configured API Gateway.

$ npm run deploy
$ npm run start -w @local/client
>>>
lambda-a: Hello DAVID
lambda-b: Hello MARIA

And there you have it, it works.

Conclusion

After an incredibly long tutorial we have seen how to create a mono repo project using npm workspaces and how to use the cdk to define the project’s infrastructure.

Working with the CDK has been an amazing experience; being able to use Typescript to model the infrastructure is, for me, a game changer. Yet I can see the project is still in its early stages, as it feels a little rough around the edges. The way the cdk init command scaffolds the app could be improved to produce a cleaner setup, for example. I had to make a lot of small changes to its configuration because there was some redundancy, and the chicken-and-egg situation with the bin property of the package.json file was confusing. Also, having the compiled files (js and d.ts) sit side by side with the source code (ts) is not a clean approach IMO.

I like working with npm workspaces, but I was surprised by how much extra configuration was needed to make typescript behave the same way. I really hope that in the future typescript will be able to read the same workspaces property and configure itself for a mono repo setup, instead of relying on cli flags that feel like a workaround rather than a proper solution. I think I spent more time understanding how to make Typescript handle workspaces than understanding how to use the CDK.

If everything goes to plan, I’ll write a follow-up post on how to use AWS CodePipeline with the CDK to create a CI/CD workflow. See you then. 

As always, you can find the source code for this tutorial here.

So, what do you think?
