AWS S3 VPC Flow Log Access

__TOC__
[[Category:Cloud]]
[[Category:NetFlow]]
 
Observer GigaFlow uses the Amazon Web Services (AWS) Command Line Interface (CLI) tools to access AWS services.
 
You can install the latest version of the CLI tools for AWS from https://aws.amazon.com/cli/.

'''Note:''' The CLI tools must be configured with the same user that runs GigaFlow; otherwise GigaFlow cannot access the configuration profile.

* For Linux, you can use the "su" command to switch to the correct user and then run the AWS CLI commands.
* For Windows, you need to change the GigaFlow service (using the Services manager) to run as a local user. Then log in as that user, install the CLI tools, and configure your AWS access.
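
For example, on Linux you can switch to the service account and run the AWS configuration wizard. The account name "gigaflow" below is only an assumption; substitute the user that actually runs the GigaFlow service:

 # switch to the user that runs GigaFlow (account name is an example)
 su - gigaflow
 # interactive prompt that stores your Access Key ID, secret key, and default region
 aws configure
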
After installation, perform the following steps:
 
'''1.''' On your AWS console, add a role to the AWS instance with the following permissions:
 
* S3 List/Read/Download
 
 {
   "Version": "2012-10-17",
   "Statement": [
     {
       "Effect": "Allow",
       "Action": [
         "s3:Get*",
         "s3:List*",
         "s3-object-lambda:Get*",
         "s3-object-lambda:List*"
       ],
       "Resource": "*"
     }
   ]
 }

* EC2 List/Read/Describe

 {
   "Version": "2012-10-17",
   "Statement": [
     {
       "Effect": "Allow",
       "Action": "ec2:Describe*",
       "Resource": "*"
     }
   ]
 }
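
If you prefer the CLI to the console for this step, a policy like the S3 one above could be attached to an existing role as shown below. The role name "GigaFlowS3Role", the policy name, and the file name are assumptions; this sketch presumes you saved the JSON above locally as s3-read.json:

 # attach the S3 read/list permissions as an inline policy (run with admin credentials)
 aws iam put-role-policy --role-name GigaFlowS3Role --policy-name GigaFlowS3Read --policy-document file://s3-read.json
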
 
'''2.''' Generate an Access Key ID and password and add them to your AWS CLI configuration (see https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html).

'''3.''' To test your configuration, run the following commands from the CLI:

 aws ec2 describe-instances
 aws s3 ls s3://
 aws ec2 describe-flow-logs

<br />
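As an additional check, you can confirm which identity the CLI is using; the command below is a standard AWS CLI call that prints the account ID and ARN for the credentials configured in step 2:

 aws sts get-caller-identity
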
GigaFlow uses the following three scripts. You must copy them from the <strong style="color:gray;">GigaFlow\resources\docs\eventsscripts\aws\</strong> folder to the <strong style="color:gray;">GigaFlow\resources\prepos\eventscripts</strong> folder so that GigaFlow can ingest the AWS VPC flow logs.
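
For example, on Linux the copy could look like this. The installation root /opt/GigaFlow is only an assumption, as is the mirroring of the Windows folder layout above; adjust both to your installation:

 # copy the three AWS event scripts into the active event-scripts folder
 cp /opt/GigaFlow/resources/docs/eventsscripts/aws/*.js /opt/GigaFlow/resources/prepos/eventscripts/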
 
 
{| class="wikitable"
|-
! File !! Use !! Event Script Attributes !! Output File
|-
| AWSVPCFlow.js || Processes the downloaded VPC flow logs and ingests the data into the GigaFlow database. || <strong>vpcDownloadFolder</strong>: The folder with the downloaded flow logs (same as <i>S3VPCFileDownload.js</i> uses) || <strong style="color:blue;">/static/s3fileprocess.html</strong>
|-
| S3EC2InstanceDetails.js || Gathers instance and interface information from AWS and uses it to add attributes to the created devices so that they can be searched and displayed appropriately. || || <strong style="color:blue;">/static/ec2instancedetails.html</strong>
|-
| S3VPCFileDownload.js || Downloads the most recent VPC flow logs from one or more S3 buckets. || <strong>vpcBuckets</strong>: The comma-separated list of buckets containing VPC flow files.
<strong>vpcDownloadFolder</strong>: The folder to download flow logs to (same as <i>AWSVPCFlow.js</i> uses)
|| <strong style="color:blue;">/static/s3filedownload.html</strong>
|}
 
<br />
 
After you copy these scripts, they are visible on the <b>GigaFlow Event Scripts</b> page, where you can monitor their status.
 
 
[[File:Gigaflow event scripts.png | 1200px]]
 
 
On this page you can find the following information:
 
* The filename of the script being run.
* The size of the script.
* The date and time the script was last updated on disk, in the <strong>eventscript</strong> folder.
* The date and time the script last ran.
* Whether the script is currently running.
* The number of times the script has run (since it was last changed).
* The average run time of the script.
* Whether the script is waiting to run again.
* Whether there are dropped events.
* Whether the script is currently paused.
* Any errors thrown by the script that need to be addressed.
* The description of the script.
* The location of the script on disk, relative to the GigaFlow installation folder.
 
 
At the bottom of the <b>GigaFlow Event Scripts</b> page you can also find the variables required by these scripts. The following figure shows an example.
 
 
[[File:Global variables.png | 600px]]
 
 
The AWS scripts require the following three variables:
 
 
*"<strong>vpBackInTimeAllowed</strong>" field allows you to specify the period in milliseconds for which GigaFlow accepts historic data. Any older data is reset to the current time. The default value is 900000 (15 minutes).
 
 
 
*"<strong>vpcBuckets</strong>" - the related field must contain the name of your AWS bucket to which you are sending flow records. For example, in the following AWS page, the vpc bucket is called "kwvpc2", consequently this is all you need to enter in the related field on the <b>Global Variables</b> page.
 
{| class="wikitable"
|-
| '''Note:''' If you use multiple buckets, separate their names with commas.
|}
 
 
[[File:Buckets.png | 800px]]
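
To confirm that flow logs are actually arriving in the bucket before GigaFlow starts downloading them, you can list its contents. The bucket name below reuses the "kwvpc2" example from above; substitute your own:

 # list the objects AWS has written to the flow log bucket
 aws s3 ls s3://kwvpc2/ --recursive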
 
 
  
*"<strong>vpcDownloadFolder</strong>" - the location where the flow logs are downloaded on your local machine. You can changed this location, but the default <strong style="color:gray;">./vpcflowlogsdownloaded/</strong> is acceptable.

Once running, GigaFlow automatically creates devices only for those that have VPC flows and instance information. If there is no instance information, GigaFlow creates virtual devices using the interface ID as the device identifier and treats them as normal devices.

As a result, all normal GigaFlow features and reporting should be available.

=== AWS Flow Log format (minimal recommendations) ===

Below are the minimal recommendations for the AWS Flow Log format used for customer flow logs.

 ${version} ${account-id} ${instance-id} ${interface-id} ${flow-direction} ${srcaddr} ${dstaddr} ${srcport} ${dstport} ${protocol} ${packets} ${bytes} ${start} ${end} ${action} ${log-status}
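
As an illustration, a flow log with this format could be created through the AWS CLI as shown below. The VPC ID is a placeholder and the bucket name reuses the "kwvpc2" example; verify the options against your installed AWS CLI version:

 # create an S3-destined flow log using the recommended record format
 aws ec2 create-flow-logs \
   --resource-type VPC \
   --resource-ids vpc-0123456789abcdef0 \
   --traffic-type ALL \
   --log-destination-type s3 \
   --log-destination arn:aws:s3:::kwvpc2 \
   --log-format '${version} ${account-id} ${instance-id} ${interface-id} ${flow-direction} ${srcaddr} ${dstaddr} ${srcport} ${dstport} ${protocol} ${packets} ${bytes} ${start} ${end} ${action} ${log-status}'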
