Engineering

Managing AWS Accounts at Scale

At Enigma we store and process sensitive client data that we're committed to protecting. To meet a wide range of client and compliance security requirements, we manage more than 30 AWS accounts, each with a different function and purpose. This blog post outlines how the Infrastructure team manages AWS accounts at scale while still providing a simple interface for our developers and clients to interact with the data.

AWS Organizations

We use a single AWS Organization to manage all of our AWS accounts. Within our organization there are three kinds of accounts: management, internal, and client. Management accounts are tightly protected and contain company-wide resources like VPN, monitoring, and source control. Internal accounts contain all Enigma infrastructure, such as the resources running Enigma Public and our shared data pipelines. Lastly, client accounts contain all client-specific data and resources. These accounts are the most siloed, and access is only granted to those directly involved in the work.

Managing Accounts with Terraform

To manage a large and growing list of clients, we have developed a controlled Terraform workflow that empowers any developer to request a new account and quickly get one back that meets all of Enigma's strict compliance and security requirements.

All of our accounts and their shared resources are managed in Terraform, in a single repo that is applied with GitLab CI/CD on merge to master. Direct commits to master on this repo are forbidden, and merge requests must be approved by a member of the Infrastructure team.

The first step in the new account process is either a ticket or a direct merge request to the subaccounts repo. The Terraform for the MR looks something like this:

##########################
### Client - Example   ###
##########################

module "subaccount_client_example" {
  source = "git::https://git.com/terraform-modules/subaccount.git//"

  name  = "client-example"
  email = "aws+client-example@example.com"
}

module "subaccount_client_example_base" {
  source = "git::https://git.com/terraform-modules/subaccount-base.git//"

  account_id      = "${module.subaccount_client_example.id}"
  root_account_id = "${var.root_account_id}"
  account_alias   = "client-example"
  account_type    = "client"
  tags            = "${var.tags}"
}

You can see that we consume two core modules in the subaccounts repo. The first is the subaccount module, which creates a new AWS Organizations member account using the aws_organizations_account resource and outputs the new account's attributes.
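
For a sense of what that first module does, here's a minimal sketch of its core (the variable and output names here are illustrative, not the module's exact contents):

variable "name" {}
variable "email" {}

# Create a new member account inside our AWS Organization.
resource "aws_organizations_account" "subaccount" {
  name  = "${var.name}"
  email = "${var.email}"
}

# Expose the new account's attributes to consumers like subaccount-base.
output "id" {
  value = "${aws_organizations_account.subaccount.id}"
}

output "arn" {
  value = "${aws_organizations_account.subaccount.arn}"
}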

The second module, subaccount-base, does the work of provisioning the account with all of Enigma's base account resources using the new account's OrganizationAccountAccessRole. Here we define the account alias, IAM roles, SSO provider, KMS keys, AWS Config rules, and other mandatory resources that secure our accounts.
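
Concretely, the module assumes that role with an aliased AWS provider and then creates the base resources in the new account. A rough sketch of the pattern, showing only the account alias (resource names here are illustrative):

variable "account_id" {}
variable "account_alias" {}

# Assume the role that AWS Organizations creates in every new member account.
provider "aws" {
  alias = "subaccount"

  assume_role {
    role_arn = "arn:aws:iam::${var.account_id}:role/OrganizationAccountAccessRole"
  }
}

# One example of a base resource: the human-readable account alias.
resource "aws_iam_account_alias" "alias" {
  provider      = "aws.subaccount"
  account_alias = "${var.account_alias}"
}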

Managing Access to Subaccounts

Once the merge request is approved and merged into master, GitLab applies the Terraform and outputs any changes. The account is now created and compliant with all Enigma policies, but how do we grant access to it?

At Enigma we use Okta as our SSO solution, and Terraform has a third-party provider that supports Okta resources. With this combination we can set up an Okta identity provider in every new account, Okta groups that grant access to that account, and Okta rules that add predefined end users to those groups.
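
The wiring looks roughly like the sketch below, which registers Okta as a SAML identity provider in the new account and creates a matching Okta group and rule. The group name, rule expression, and metadata path are placeholders, and the AWS resources are assumed to be applied against the new account:

# Register Okta as a SAML identity provider in the new account.
resource "aws_iam_saml_provider" "okta" {
  name                   = "okta"
  saml_metadata_document = "${file("okta-metadata.xml")}"
}

# An Okta group that maps to a role in this account.
resource "okta_group" "power_user" {
  name        = "aws-client-example-poweruser"
  description = "PowerUser access to the client-example account"
}

# Automatically add predefined end users to that group.
resource "okta_group_rule" "power_user" {
  name              = "aws-client-example-poweruser"
  status            = "ACTIVE"
  group_assignments = ["${okta_group.power_user.id}"]
  expression_value  = "user.department == \"Data Science\""
}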

After applying subaccount-base, end users can request access to one of five roles through Okta: Admin, EnigmaAdmin, PowerUser, ReadOnly, or ViewOnly.

A critical detail here is that no one outside the Infrastructure team is granted Admin access, and that the EnigmaAdmin role has admin permissions except for an IAM permissions boundary that prevents the role from removing any subaccount-base resources (CloudTrail, Config) or creating any resources we already have modules for (VPCs).
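
As a sketch, the boundary is just a managed policy attached to the role via permissions_boundary. The statements and trust policy below are illustrative rather than our actual policy, and the SAML provider reference assumes the resource from the earlier sketch:

# Boundary: allow everything except touching the guardrails we manage.
resource "aws_iam_policy" "enigma_admin_boundary" {
  name   = "enigma-admin-boundary"
  policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": "*", "Resource": "*" },
    {
      "Effect": "Deny",
      "Action": ["cloudtrail:*", "config:*", "ec2:CreateVpc"],
      "Resource": "*"
    }
  ]
}
POLICY
}

# EnigmaAdmin: admin permissions, constrained by the boundary, assumed via SAML.
resource "aws_iam_role" "enigma_admin" {
  name                 = "EnigmaAdmin"
  permissions_boundary = "${aws_iam_policy.enigma_admin_boundary.arn}"

  assume_role_policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Federated": "${aws_iam_saml_provider.okta.arn}" },
    "Action": "sts:AssumeRoleWithSAML",
    "Condition": { "StringEquals": { "SAML:aud": "https://signin.aws.amazon.com/saml" } }
  }]
}
POLICY
}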

Because of the fast-moving and ad hoc nature of Enigma's client commitments, most engineers and data scientists are given PowerUser access, which lets them rapidly prototype with client data while still maintaining a secure AWS environment that separates billing, networking, and API access.

Encouraging Developers to Write Terraform

Terraform is a great tool that can make infrastructure easier to manage and reason about; however, there's a significant learning curve that is often too much for a developer who just wants to create a database and start analyzing data.

At Enigma we want our developers to write their own Terraform for any long-lived production infrastructure, so as part of the subaccount-base module we create a new GitLab repo with pre-templated Terraform code that bootstraps remote state management and some other non-critical resources in the new AWS account. The new repo is set up just like all of our other Terraform repos, with a protected master branch and CI/CD that plans MRs and applies merges to master.
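
The template the new repo starts from is conceptually just a backend block plus a provider, something like the sketch below (the bucket, key, lock table, and region are placeholders):

# Remote state lives in a central S3 bucket with DynamoDB locking,
# so developers never have to manage state files locally.
terraform {
  backend "s3" {
    bucket         = "enigma-terraform-state"
    key            = "client-example/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "terraform-state-lock"
  }
}

# Everything in this repo targets the new client account.
provider "aws" {
  region = "us-east-1"
}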

This lets our developers get started with Terraform easily: they don't have to worry about local setup or remote state, and they can copy from existing templates in the repo.

Next Steps

As Enigma continues to grow, and we move more and more client commitments to the cloud, the infrastructure challenges will only get more complex.

How do we manage access to shared APIs? How do we keep data segregated but still accessible? How can we scale beyond Terraform, to thousands of different customers? The Infrastructure team is responsible for all these challenges. Come join us!
