Alibaba
To track a new Alibaba Data Source in your FinOps account, please select the Alibaba Cloud tab at the Data Source Connection step during the initial configuration or later on in the Settings section of the main page.
Name
In the first field, you can specify any preferred name to be used for this Data Source in FinOps.
Alibaba Cloud Access key ID
The Alibaba Cloud Access Key ID is a unique string that is assigned to your RAM user.
To find it:
- Sign in to the Alibaba Cloud Portal as the RAM User.
- Search for Resource Access Management.
- Navigate to Users → your_user → User AccessKeys page.
Access Secret
The secret access key for the RAM user can be found in the AccessKey.csv file downloaded from the console during access key creation. The secret will not be accessible through the UI after it has been created.
(Optional) Creating a new RAM user with Secret
Alternatively, you can create a separate user in your Alibaba Cloud account for FinOps to operate with.
To do this:
- Sign in to the Alibaba Cloud Portal as the RAM User.
- Search for Resource Access Management.
- Navigate to Users → Create User.
- Provide a unique name for the new RAM user and enable Programmatic access by checking the corresponding box.
- Copy and save the Access Key ID and Secret, or download the AccessKey.csv file to your computer.
Required policy
The account must have the following single permission to support FinOps: Read-only access to Alibaba Cloud services.
To add it:
- Click on the "Add Permissions" button to the right of your RAM user.
- Find ReadOnlyAccess in the list and click on it. It will appear in the Selected section.
- Click OK.
Your Alibaba Data Source account should now be ready for integration with FinOps! Please contact our Support Team at support@megaops.io if you have any questions regarding the described configuration flow.
AWS Root account that has Data Export already configured
FinOps supports the AWS Organizations service that allows linking several Data Sources in order to centrally manage data of multiple users while receiving all billing exports within a single invoice. The Root account (payer) will be the only one having access to collective data related to cloud spending. When registering this type of profile in FinOps, the user is given an option for Data Exports to be detected automatically.
To track a new AWS Data Source in your FinOps account, please select the AWS Root Account tab at the Data Source Connection step during the initial configuration.
Automatic Billing Data Import in AWS
Step 1. Having Data Exports configured for your cloud account is the main prerequisite for the remaining actions. If Data Export hasn't been configured yet, refer to the following section:
Connecting an AWS Root account that doesn't have Data Export configured yet
Step 2. Update bucket policy
- Navigate to the Permissions tab of your AWS S3 bucket and select Bucket Policy.
- Replace <bucket_name> with the name of the bucket.
- Replace <AWS account ID> with your AWS Account ID (12 digits, without “-”):
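The policy body itself is not reproduced on this page. As a minimal sketch, assuming the standard billingreports.amazonaws.com and bcm-data-exports.amazonaws.com service principals that AWS uses to deliver billing exports, the bucket policy could look like the following; compare it with the policy generated in your AWS console before applying it.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBillingExportAclCheck",
      "Effect": "Allow",
      "Principal": { "Service": ["billingreports.amazonaws.com", "bcm-data-exports.amazonaws.com"] },
      "Action": ["s3:GetBucketAcl", "s3:GetBucketPolicy"],
      "Resource": "arn:aws:s3:::<bucket_name>",
      "Condition": { "StringEquals": { "aws:SourceAccount": "<AWS account ID>" } }
    },
    {
      "Sid": "AllowBillingExportWrite",
      "Effect": "Allow",
      "Principal": { "Service": ["billingreports.amazonaws.com", "bcm-data-exports.amazonaws.com"] },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::<bucket_name>/*",
      "Condition": { "StringEquals": { "aws:SourceAccount": "<AWS account ID>" } }
    }
  ]
}
```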
Step 3. Create user policy for read-only access
Go to Identity and Access Management (IAM) → Policies.
Create a new user policy granting read-only access to the bucket (<bucket_name> must be replaced in the policy):
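The exact policy text is not shown here; a minimal sketch, assuming s3:GetObject, s3:ListBucket, and s3:GetBucketLocation are sufficient for FinOps to read the export files, could look like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadBillingExportBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": [
        "arn:aws:s3:::<bucket_name>",
        "arn:aws:s3:::<bucket_name>/*"
      ]
    }
  ]
}
```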
Step 4. Create user and grant policies
- Go to Identity and Access Management (IAM) → Users to create a new user.
- Attach the created policy to the user.
- Confirm creation of the user.
- Create an access key for the user (Identity and Access Management (IAM) → Users → Created user → Create access key).
- Download or copy the Access key and Secret access key. Use these keys when connecting a Data Source in FinOps as the AWS Access Key ID and AWS Secret Access Key, respectively (at Step 5).
Step 5. Create Data Source in FinOps
- Go to FinOps.
- Register as a new user.
- Log in as a registered user.
- Create a Data Source.
- Provide user credentials (see screenshot for more details):
  - AWS Access key ID
  - AWS Secret access key
- Select Export type: AWS Billing and Cost Management → Data Exports → Find the report configured earlier → Export type.
- Select “Connect only to data in bucket”.
- Provide Data Export parameters:
  - Export Name: AWS Billing and Cost Management → Data Exports table → Export name.
  - Export S3 Bucket Name: AWS Billing and Cost Management → Data Exports table → S3 bucket.
  - Export path: AWS Billing and Cost Management → Data Exports table → Click on Export name → Edit → Data export storage settings → S3 destination → last folder name (without “/”).
After creating a Data Source, you will need to wait for the export to be generated by AWS and uploaded to FinOps according to the schedule (performed hourly).
Discover Resources
FinOps needs permissions configured in AWS for the Data Source user in order to discover resources correctly and display them in the corresponding section of the dashboard for the associated employee.
Make sure to include the following policy so that FinOps can parse EC2 resource data:
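The policy itself is not included on this page. A minimal sketch of such a read-only discovery policy is shown below; the ec2:Describe* wildcard and the additional CloudWatch/S3/IAM list actions are assumptions, so verify the action list against the policy recommended for your FinOps deployment.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DiscoverResources",
      "Effect": "Allow",
      "Action": [
        "ec2:Describe*",
        "s3:GetBucketLocation",
        "s3:ListAllMyBuckets",
        "cloudwatch:GetMetricStatistics",
        "iam:ListUsers"
      ],
      "Resource": "*"
    }
  ]
}
```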
Your AWS Data Source should now be ready for integration with FinOps! Please contact our Support Team at support@megaops.io if you have any questions regarding the described configuration flow.
AWS Linked
FinOps supports the AWS Organizations service that allows linking several Data Sources in order to centrally manage data of multiple users while receiving all billing exports within a single invoice.
Selecting the AWS Linked tab makes the registration flow easier by eliminating the option to input bucket information for billing purposes, since this information is received through the root account, whose user can then distribute periodic reports individually if intended by the company management. In this case, only the Access Key and Secret Key are required.
Use "Connect" to create a Data Source in FinOps. If some of the provided values are invalid, an error message will indicate a failure to connect.
Discover Resources
FinOps needs permissions configured in AWS for the Data Source user in order to discover resources correctly and display them in the corresponding section of the dashboard for the associated employee.
Make sure to include the following policy so that FinOps can parse EC2 resource data:
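As in the previous section, the policy is not reproduced here; the same kind of read-only discovery sketch applies (the action list is an assumption, so verify it against the policy recommended for your FinOps deployment):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DiscoverResources",
      "Effect": "Allow",
      "Action": [
        "ec2:Describe*",
        "s3:GetBucketLocation",
        "s3:ListAllMyBuckets",
        "cloudwatch:GetMetricStatistics",
        "iam:ListUsers"
      ],
      "Resource": "*"
    }
  ]
}
```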
Your AWS Data Source should now be ready for integration with FinOps! Please contact our Support Team at support@megaops.io if you have any questions regarding the described configuration flow.
Migrating from CUR to Data Exports CUR 2.0
The information on this page can be useful if an AWS Data Source (Legacy CUR export schema) has already been connected and you want to configure CUR 2.0 data and update the AWS Data Source.
Migrate CUR to CUR 2.0 (using a new bucket)
Create a new Data Export with the CUR 2.0 schema. Navigate to the AWS Billing & Cost Management → Data Exports page.
Step 1. Export type
Select the “Standard data export” export type.
Step 2. Export name
Enter an export name. The content of the "Export name" field will be required when updating an AWS Data Source in FinOps.
Step 3. Data table content settings:
Select "CUR 2.0".
Select the "Include resource IDs" checkbox.
Choose the time granularity for how you want the line items in the export to be aggregated.
Step 4. Data export delivery options:
Pick "Overwrite existing data export file".
Select a compression type.
Step 5. Data export storage settings
Configure a new bucket. The content of the "S3 path prefix" and "S3 bucket name" fields will be required when updating an AWS Data Source in FinOps.
Step 6. Review
Confirm export creation. The Data Export will be prepared by AWS within 24 hours.
Click on the existing AWS Data Source on the Data Source page. The page with detailed information opens. Click the UPDATE CREDENTIALS button to update the Data Source credentials. Switch on Update Data Export parameters to update the billing bucket information.
Select the “Standard data export (CUR 2.0)” export type. Enter the "Export name" from the first step as "Export name", the "S3 bucket name" as "Export Amazon S3 bucket name", and the "S3 path prefix" as "Export path prefix".
Save and wait for the new export to be imported!
Migrate CUR to CUR 2.0 (using an existing bucket)
Use this case if you have already connected an AWS Data Source (with the Legacy CUR export schema) and want to configure CUR 2.0 data export into the same bucket.
Create a new Data Export with the CUR 2.0 schema. Navigate to the AWS Billing & Cost Management → Data Exports page.
Step 1. Export type
Select the “Standard data export” export type.
Step 2. Export name
Enter an export name. The content of the "Export name" field will be required when updating an AWS Data Source in FinOps.
Step 3. Data table content settings:
Select "CUR 2.0".
Select the "Include resource IDs" checkbox.
Choose the time granularity for how you want the line items in the export to be aggregated.
Step 4. Data export delivery options:
Pick "Overwrite existing data export file".
Select a compression type.
Step 5. Data export storage settings:
Select an existing bucket in the Data export storage settings section.
Enter a NEW S3 path prefix.
Click on the existing AWS Data Source on the Data Source page. The page with detailed information opens.
Click the UPDATE CREDENTIALS button to update the Data Source credentials. Switch on Update Data Export parameters to update the billing bucket information.
Select the “Standard data export (CUR 2.0)” export type, update the "Export name" and “Export path prefix” fields, then save and wait for the new export to be imported!
AWS Root account that doesn't have Data Export configured yet
FinOps supports the AWS Organizations service that allows linking several Data Sources in order to centrally manage data of multiple users while receiving all billing reports within a single invoice. The Root account (payer) will be the only one having access to collective data related to cloud spending. When registering this type of profile in FinOps, the user is given an option for Data Exports to be created automatically.
To track a new AWS Data Source in your FinOps account, please select the AWS Root Account tab at the Data Source Connection step during the initial configuration.
Automated Billing bucket and Data Export creation with FinOps
Step 1. Create user policy for bucket and export creation access
Go to Identity and Access Management (IAM) → Policies. Create a new policy for fully automatic configuration (both the bucket and the export are created); <bucket_name> must be replaced in the policy:
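The policy body is not included on this page. The sketch below only illustrates the kind of permissions such a policy needs (bucket creation plus legacy CUR and Data Exports management actions); the exact action list is an assumption, so use the policy text provided by FinOps as the source of truth.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ManageBillingExportBucket",
      "Effect": "Allow",
      "Action": [
        "s3:CreateBucket",
        "s3:PutBucketPolicy",
        "s3:GetBucketPolicy",
        "s3:ListBucket",
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::<bucket_name>",
        "arn:aws:s3:::<bucket_name>/*"
      ]
    },
    {
      "Sid": "ManageDataExports",
      "Effect": "Allow",
      "Action": [
        "cur:PutReportDefinition",
        "cur:DescribeReportDefinitions",
        "bcm-data-exports:CreateExport",
        "bcm-data-exports:GetExport",
        "bcm-data-exports:ListExports"
      ],
      "Resource": "*"
    }
  ]
}
```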
Step 2. Create user and grant policies
- Go to Identity and Access Management (IAM) → Users to create a new user.
- Attach the created policy to the user.
- Confirm creation of the user.
- Create an access key for the user (Identity and Access Management (IAM) → Users → Created user → Create access key).
- Download or copy the Access key and Secret access key. Use these credentials when connecting a Data Source in FinOps as the AWS Access Key ID and AWS Secret Access Key, respectively (at Step 3).
Step 3. Create Data Source in FinOps:
- Go to FinOps.
- Register as a new user.
- Log in as a registered user.
- Create a Data Source.
- Provide user credentials (see screenshot for more details):
  - AWS Access key ID
  - AWS Secret access key
- Select Export type.
- Select “Create new Data Export”.
- Provide the parameters with which the bucket and Data Export will be created: “Export Name”, “Export S3 Bucket Name” (<new bucket name from the user policy from Step 1>), and “Export path prefix”.
After creating a Data Source, you will need to wait for AWS to generate the export and upload it to FinOps according to the schedule (approximately one day).
Discover Resources
FinOps needs permissions configured in AWS for the Data Source user in order to discover resources correctly and display them in the corresponding section of the dashboard for the associated employee.
Make sure to include the following policy so that FinOps can parse EC2 resource data:
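Again, the policy itself is not reproduced on this page; a minimal read-only discovery sketch (the action list is an assumption) looks like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DiscoverResources",
      "Effect": "Allow",
      "Action": [
        "ec2:Describe*",
        "s3:GetBucketLocation",
        "s3:ListAllMyBuckets",
        "cloudwatch:GetMetricStatistics",
        "iam:ListUsers"
      ],
      "Resource": "*"
    }
  ]
}
```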
Your AWS Data Source should now be ready for integration with FinOps! Please contact our Support Team at support@megaops.io if you have any questions regarding the described configuration flow.
Create Data Export in AWS manually
To use automatic or manual billing data import in FinOps, you first need to create a Data Export in AWS. Please refer to the official AWS documentation to become acquainted with the guidelines for Data Exports.
Navigate to AWS Billing & Cost Management → Data Exports.
Create a new Data Export.
Standard data export settings
Step 1. Export type
Select the “Standard data export” export type.
Step 2. Export name
Enter an export name.
Step 3. Data table content settings:
Select "CUR 2.0".
Select the "Include resource IDs" checkbox.
Choose the time granularity for how you want the line items in the export to be aggregated.
Step 4. Data export delivery options:
Pick "Overwrite existing data export file".
Select a compression type.
Step 5. Data export storage settings:
Create a new bucket or use an existing one for the export.
Enter the S3 path prefix that you want prepended to the name of your Data Export.
Step 6. Review
Confirm export creation. The Data Export will be prepared by AWS within 24 hours.
Legacy CUR export settings
Step 1. Export type
Select the Legacy CUR export (CUR) export type.
Step 2. Export name
Enter an export name.
Step 3. Export content
Select the "Include resource IDs" and "Refresh automatically" checkboxes.
Step 4. Data export delivery options:
Choose the time granularity for how you want the line items in the export to be aggregated.
Pick "Overwrite existing report".
Select a compression type.
Step 5. Data export storage settings:
Create a new bucket or use an existing one for the export.
Enter the S3 path prefix that you want prepended to the name of your Data Export.
Step 6. Review
Confirm export creation. The Data Export will be prepared by AWS within 24 hours.
When it's done, follow the steps from the section Connecting an AWS Root account that has Data Export already configured.
Azure
To track a new Azure Data Source in your FinOps account, please select the Azure Subscription tab at the Data Source Connection step during the initial configuration or later on in the Settings section of the main page.
Name
In the first field, you can specify any preferred name to be assigned to this Data Source in FinOps.
Subscription ID
The Subscription ID is a unique string that identifies your Azure subscription. To find it:
Log in to the Microsoft Azure Portal.
Search for Subscriptions to access a list of all subscriptions associated with your Azure account. The list will include a subscription ID for each one.
When FinOps is programmatically signing in to Azure, it needs to pass a tenant ID and an application ID along with a secret, which is an authentication key.
Application (client) ID
The Application (client) ID has to be generated manually in Azure to allow API communication with FinOps:
- Access Azure Active Directory and navigate to App registrations.
- Click + New registration, provide a name, e.g. FinOps, and then click Register at the bottom of the page.
- A new Application ID will become available (as in the screenshot below).
To perform a Role assignment, from the Azure home page navigate to Subscriptions and select the one that you have provisioned to be linked to FinOps.
After being redirected to its dashboard, click Access control (IAM) in the left navigation bar, then go to the Role assignments tab and click + Add → Add role assignment.
You will be prompted to input the Role, which has to be Reader, in the first field. The second one can be left unchanged. The third field should contain the name of the registered application from the previous steps, e.g. FinOps. Click Save to add the role assignment.
Directory (tenant) ID
The Directory (tenant) ID is a globally unique identifier (GUID) that is different from your organization name or domain. Its value is easily accessible in the overview of the application that was added in the previous steps via App registrations.
Go to Home → App registrations → e.g. FinOps → Overview → Directory (tenant) ID.
Secret
A Secret should be created within the newly registered application:
- Go to App registrations and click on your application, e.g. FinOps.
- Select Certificates & secrets in the left navigation bar and add a + New client secret.
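If you prefer the command line, an equivalent app registration, role assignment, and client secret can be created with the Azure CLI. This is only a sketch, assuming you are logged in to the correct tenant; the display name FinOps and the placeholder IDs are illustrative.

```bash
# Register an application ("FinOps" is just an example display name)
az ad app create --display-name FinOps

# Create a service principal for it (use the appId returned by the previous command)
az ad sp create --id <application_id>

# Grant the Reader role on the subscription that will be linked to FinOps
az role assignment create --assignee <application_id> --role Reader \
  --scope /subscriptions/<subscription_id>

# Create a client secret; the password value in the output is the Secret for FinOps
az ad app credential reset --id <application_id> --append
```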
Once the required fields are filled out, you can click Connect to validate the information. After the account is connected, data will be pulled from the source shortly afterwards and become available in the UI.
Your Azure Data Source account should now be ready for integration with FinOps! Please contact our Support Team at support@megaops.io if you have any questions regarding the described configuration flow.
GCP
Enable billing export
Please follow the official GCP guide to enable billing data export: Set up Cloud Billing data export to BigQuery | Google Cloud.
As a result, you should have a new table in your BigQuery project. Note the names of the dataset and the table. You will need them later when connecting your cloud account to FinOps.
Prepare a role for FinOps
With a CLI command
Run the following command in the gcloud CLI:
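The original command is not reproduced on this page. As a sketch, a custom role with the same permission list as in the Console steps below can be created with gcloud iam roles create; the role ID finops_role and <your_project_id> are placeholders.

```bash
gcloud iam roles create finops_role --project=<your_project_id> \
  --title="FinOps" \
  --permissions=bigquery.jobs.create,bigquery.tables.getData,compute.addresses.list,\
compute.addresses.setLabels,compute.disks.list,compute.disks.setLabels,compute.firewalls.list,\
compute.instances.list,compute.instances.setLabels,compute.images.list,compute.images.setLabels,\
compute.machineTypes.get,compute.machineTypes.list,compute.networks.list,compute.regions.list,\
compute.snapshots.list,compute.snapshots.setLabels,compute.zones.list,iam.serviceAccounts.list,\
monitoring.timeSeries.list,storage.buckets.get,storage.buckets.getIamPolicy,storage.buckets.list,\
storage.buckets.update
```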
Via Google Cloud Console
1. Go to Roles page and click Create Role.
2. Give the role any name and description.
3. Add the following permissions:
- bigquery.jobs.create
- bigquery.tables.getData
- compute.addresses.list
- compute.addresses.setLabels
- compute.disks.list
- compute.disks.setLabels
- compute.firewalls.list
- compute.instances.list
- compute.instances.setLabels
- compute.images.list
- compute.images.setLabels
- compute.machineTypes.get
- compute.machineTypes.list
- compute.networks.list
- compute.regions.list
- compute.snapshots.list
- compute.snapshots.setLabels
- compute.zones.list
- iam.serviceAccounts.list
- monitoring.timeSeries.list
- storage.buckets.get
- storage.buckets.getIamPolicy
- storage.buckets.list
- storage.buckets.update
Create service account
Official documentation on service accounts - Service accounts | IAM Documentation | Google Cloud.
- Go to the Service accounts page and click Create Service Account.
- Give it any name and click Create and Continue.
- Specify the role that you created earlier and click Continue and then Done.
Generate an API key for your service account
- Find your service account in the service accounts list and click on its name to go to the service account details page.
- Go to the Keys tab.
- Click Add key → Create new key.
- The service account API key will be downloaded as a .json file. You will need this file at the next stage, when connecting your cloud account to FinOps.
Connect Data Source in FinOps
Use the newly downloaded service account credentials .json file, together with the billing dataset details, to connect your GCP cloud account.
Kubernetes
To track a new Kubernetes cluster Data Source in your FinOps account, please select the Kubernetes tab on the Data Source Connection page.
Use "Connect" to create a Data Source in FinOps.
​
Click on the newly created Data Source on the 'Data Sources' page. The page with detailed information appears.
Use 'KUBERNETES INTEGRATION' or 'instructions' to get the instructions to install the software that collects information about running pods and converts them into cost metrics.
Installing the software on a cluster
To get cost metrics, download and install the Helm chart on the Kubernetes cluster. The Helm chart collects Kubernetes resource information and shares it with the FinOps project. Install one release per cluster.
1. Download MegaOps repo
Use this command to download the repo:
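The exact command is not reproduced here; as a placeholder sketch, adding the Helm repository usually looks like the following, where the repo alias megaops and <repository_url> are illustrative and should be replaced with the values from your FinOps connection instructions.

```bash
# Placeholder values: "megaops" and <repository_url> are illustrative;
# use the exact command shown in your FinOps connection instructions.
helm repo add megaops <repository_url>
helm repo update
```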
2. Install helm chart
The instructions differ depending on whether a Kubernetes Data Source is connected on my.FinOps.com or on a FinOps deployment built from open source. In both cases, the instructions shown are adapted for the selected Data Source and your FinOps deployment. All you need to do is copy-paste them and replace the <password_specified_during_data_source_connection> placeholder with the user's password.
my.FinOps.com
Open source FinOps
For a more detailed description, see the instructions.