AWS S3 VPC Flow Log Access

From Observer GigaFlow Support | VIAVI Solutions Inc.

Revision as of 16:28, 26 August 2022

Observer GigaFlow uses the Amazon Web Services (AWS) Command Line Interface (CLI) tools to access AWS services. You can install the latest version of the AWS CLI tools from https://aws.amazon.com/cli/.

Note: The CLI tools must be configured for the same user account that runs GigaFlow; otherwise GigaFlow cannot access the configuration profile.

For Linux, you can use the "su" command to switch to the correct user and then run the AWS CLI commands.
For Windows, use the Services manager to change the GigaFlow service to run as a local user. Then log in as that user, install the CLI tools, and configure your AWS access.

After installation, perform the following steps:

1. Add a role to your AWS instance with the following permissions:

  • S3 List/Read/Download
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:Get*",
        "s3:List*",
        "s3-object-lambda:Get*",
        "s3-object-lambda:List*"
      ],
      "Resource": "*"
    }
  ]
}
  • EC2 List/Read/Describe
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "ec2:Describe*",
      "Resource": "*"
    }
  ]
}

2. Generate an Access Key ID and Secret Access Key and add them to your AWS CLI configuration (see https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html).
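The simplest way to add the credentials is to run aws configure, which writes them to the shared AWS credentials file (~/.aws/credentials on Linux, %USERPROFILE%\.aws\credentials on Windows) for the user running the command. A typical session looks like the following; the key values shown are AWS's documented example placeholders, not real credentials:

```
$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFIEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: json
```

Remember that this must be run as the same user that runs GigaFlow, per the note above.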

3. To test your configuration, run the following commands from the CLI:

aws ec2 describe-instances

aws s3 ls s3://

aws ec2 describe-flow-logs

GigaFlow uses the following three scripts. You must copy them from the GigaFlow\resources\docs\eventsscripts\aws\ folder to the GigaFlow\resources\prepos\eventscripts folder so that GigaFlow is populated with data from the AWS VPC flow logs.

{| class="wikitable"
|-
! File !! Use !! Event Script Attributes !! Output File
|-
| AWSVPCFlow.js || Processes the downloaded VPC flow logs and ingests the data into the GigaFlow database. || <strong>vpcDownloadFolder</strong>: the folder with the downloaded flow logs (the same folder S3VPCFileDownload.js uses) || Example
|-
| S3EC2InstanceDetails.js || Gathers instance and interface information from AWS and uses it to add attributes to the created devices so that they can be searched on and displayed as appropriate. || Example || Example
|-
| S3VPCFileDownload.js || Downloads the most recent VPC flow logs from one or more S3 buckets. || Example || Example
|}
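For reference, a minimal sketch of what parsing a downloaded VPC flow log record involves. The field order below is AWS's default VPC flow log format (version 2); the helper name and sample record are illustrative and are not taken from AWSVPCFlow.js:

```python
# Field order of the default AWS VPC flow log format (version 2).
DEFAULT_FIELDS = [
    "version", "account_id", "interface_id", "srcaddr", "dstaddr",
    "srcport", "dstport", "protocol", "packets", "bytes",
    "start", "end", "action", "log_status",
]

def parse_flow_record(line):
    """Split one space-separated flow log line into a field dict."""
    values = line.split()
    if len(values) != len(DEFAULT_FIELDS):
        raise ValueError("unexpected field count: %d" % len(values))
    return dict(zip(DEFAULT_FIELDS, values))

# Illustrative sample record, not real traffic.
sample = ("2 123456789012 eni-0a1b2c3d 10.0.1.5 10.0.2.7 "
          "49152 443 6 10 8400 1660000000 1660000060 ACCEPT OK")
record = parse_flow_record(sample)
print(record["srcaddr"], record["dstport"], record["action"])
# prints: 10.0.1.5 443 ACCEPT
```

If you define a custom flow log format in AWS, the field list must be adjusted to match.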