AWS S3 VPC Flow Log Access

Observer GigaFlow uses the Amazon Web Services (AWS) Command Line Interface (CLI) tools to access AWS services. You can install the latest version of the CLI tools for your system from https://aws.amazon.com/cli/.
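
On Linux, for example, the current AWS CLI v2 installer can be fetched and run as shown below (see the AWS page above for Windows and other platforms):

curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install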

Note: The CLI tools must be configured under the same user account that runs GigaFlow so that GigaFlow can access the configuration profile.

For Linux, you can use the "su" command to switch to the correct user and then run the AWS CLI commands.
For Windows, you need to change the GigaFlow service (using the services manager) to run as a local user. Then log in as that user, install the CLI tools, and configure your AWS access.
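
For example, on Linux you can switch to the GigaFlow service account and confirm that the CLI and its profile are visible to that account. The username "gigaflow" below is an assumption; use the account your GigaFlow service actually runs as:

su - gigaflow        # "gigaflow" is an assumed service account name
aws --version
aws configure list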

After installation, perform the following steps:

1. Add a role to your AWS instance with the following permissions:

  • S3 List/Read/Download
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:Get*",
                "s3:List*",
                "s3-object-lambda:Get*",
                "s3-object-lambda:List*"
            ],
            "Resource": "*"
        }
    ]
}
  • EC2 List/Read/Describe
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "ec2:Describe*",
            "Resource": "*"
        }
    ]
}
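
As a sketch, you can also create and attach these permissions from the CLI by saving each policy document above to a file. The role name, policy name, file name, and account ID below are example values, not values defined by GigaFlow:

# Example values: adjust the role name, policy name, file path, and account ID
aws iam create-policy --policy-name GigaFlowS3Read --policy-document file://s3-read-policy.json
aws iam attach-role-policy --role-name GigaFlowInstanceRole --policy-arn arn:aws:iam::123456789012:policy/GigaFlowS3Read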

2. Generate an Access Key ID and password and add them to your AWS CLI configuration (see https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html).
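
For example, a minimal interactive configuration looks like the following. The values shown are placeholders; enter the Access Key ID and Secret Access Key you generated, and your own default region:

aws configure
AWS Access Key ID [None]: <your-access-key-id>
AWS Secret Access Key [None]: <your-secret-access-key>
Default region name [None]: eu-west-1
Default output format [None]: json

This writes the credentials to the .aws folder in the home directory of the user running the command, which is why it must be the same user that runs GigaFlow (see the note above).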

3. To test your configuration, run the following commands from the CLI:

aws ec2 describe-instances

aws s3 ls s3://

aws ec2 describe-flow-logs
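
If you know which bucket receives your flow logs, you can also list it directly to confirm that log files are arriving. The bucket name and account ID below are examples; VPC flow logs delivered to S3 are typically stored under an AWSLogs/<account-id>/vpcflowlogs/ prefix:

# Example bucket name, account ID, and prefix
aws s3 ls s3://my-flow-log-bucket/AWSLogs/123456789012/vpcflowlogs/ --recursive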

GigaFlow uses the three scripts below. You must copy them from the GigaFlow\resources\docs\eventsscripts\aws\ folder to the GigaFlow\resources\prepos\eventscripts folder so that GigaFlow can populate its database from the AWS VPC flow logs.
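
For example, on a Linux installation the copy might look like the following; /opt/gigaflow is an assumed installation directory, so adjust both paths to your own install location:

# /opt/gigaflow is an assumed install location; adjust to your installation
cp /opt/gigaflow/resources/docs/eventsscripts/aws/*.js /opt/gigaflow/resources/prepos/eventscripts/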

AWSVPCFlow.js
  • Use: Processes the downloaded VPC flow logs and ingests the data into the GigaFlow database.
  • Event script attribute: vpcDownloadFolder (the folder with the downloaded flow logs; same as S3VPCFileDownload.js uses).
  • Output file: /static/s3fileprocess.html

S3EC2InstanceDetails.js
  • Use: Gathers instance and interface information from AWS and uses it to add attributes to the created devices so that they can be searched on and displayed as appropriate.
  • Output file: /static/ec2instancedetails.html

S3VPCFileDownload.js
  • Use: Downloads the most recent VPC flow logs from one or more S3 buckets.
  • Event script attribute: vpcBuckets (the folder to download flow logs to; same as AWSVPCFlow.js uses).
  • Output file: /static/s3filedownload.html


After you copy these scripts, they become visible on the GigaFlow Event Scripts page, where you can monitor their status.

[Figure: Gigaflow event scripts.png]

On this page you can find the following information:

  • The filename of the script being run.
  • The size of the script.
  • The date and time the script was last updated on disk, in the eventscripts folder.
  • The date and time the script last ran.
  • Whether the script is currently running.
  • The number of times the script has run (since it was last changed).
  • The average run time of the script.
  • Whether the script is waiting to run again.
  • Whether any events have been dropped.
  • Whether the script is currently paused.
  • Any errors thrown by the script that need to be addressed.
  • The description of the script.
  • The location of the script on disk, relative to the GigaFlow installation directory.

At the bottom of the GigaFlow Event Scripts page you can also find the global variables required by these scripts. The following figure shows an example.

[Figure: Global variables.png]