"body": json.dumps('Failed to fetch quotes') 'message': 'Failed to fetch quotes from db', I will only briefly explain its purpose since the focus here is on integrating CloudWatch, Kinesis Firehose, and S3 to stream logs in near realtime. The code for fetching and saving quotes is shown below. These libraries are specified in the src/quote_fetcher/requirements.txt file like so. The Python packages being used to scrape the quotes from the web which are Requests and BeautifulSoup4 along with the Boto3 library which will used to save quotes to DynamoDB. Lastly, there is a DynamoDB table SAM Resource of type AWS::Serverless::SimpleTable specified which, as mentioned previously, is used to save quotes fetched from the internet. Each function has a CloudWatch Log Group associated with them for capturing log messages produced in the application. The second lambda function named ListQuotesFunction is for retrieving previously saved quotes from the DynamoDB table and returning them back through API Gateway. One function is for fetching new quotes from the web and saving them to a DynamoDB table which I've named FetchQuoteFunction and is initiated by an HTTP GET request. Value: !Sub " This Infrastructure as Code SAM/CloudFormation template specifies two Lambda function resources of type AWS::Serverless::Function. Sample SAM Template for sam-app-kinesis-streamed-logsĭescription: "API Gateway endpoint URL for Prod stage for Fetch Quote API"ĭescription: "API Gateway endpoint URL for Prod stage for List Quotes API" The last bit of initial setup is to define the base SAM/CloudFormation template.yaml file's resources as shown below. To keep thing minimal for this demo I also removed the tests directory. I also remove the original _init_.py, app.py and requirements.txt files initially generated by the Hello World template. ![]() sam-app-kinesis-streamed-logs/įirst I rename hello_world directory to src then create two subdirectories within it, one named quote_fetcher and another named quote_lister, each get api.py and requirements.txt files. This produces the following app structure. sam init -name sam-app-kinesis-streamed-logs \ Go back to the CloudFormation stack and select Delete.To start I initialize a new SAM project with the Python 3.8 runtime and use the Hello World REST API template app.Select the Resources tab and navigate to the DeliveryBucket S3 bucket.In AWS Console, go to your CloudFormation stack.See Unsubscribe from log groups for details. You can use the log groups auto-discovery method described in Subscribe by reading log groups from file and pass the auto-discovery output file to the unsubscribe command. Select one of the listed log streams and look for exceptions in the logs.On the Lambda screen, select the Monitor tab and then select Logs.Select the Resources tab and then select the link next to Lambda.Inspect the dashboard for any obvious issues.It will have a name like DynatraceLogForwarder-SelfMonitoring-eu-north-1-dynatrace-aws-logs, where the middle part is the AWS region and the last part is the stack name you chose (the default is dynatrace-aws-logs). 
Separately, here are notes on deploying and troubleshooting the Dynatrace AWS log forwarder. Its deployment accepts the following parameters:

- Required. The API URL to your Dynatrace SaaS environment logs ingest target. If you choose to use an existing environment ActiveGate, set it to your ActiveGate endpoint. Note: to determine your environment ID, see Environment ID.
- Optional. If true, the log forwarder Lambda function verifies the SSL certificate of your Dynatrace environment URL.
- Optional. The name of the CloudFormation stack where you want to deploy the resources (the default is dynatrace-aws-logs).
- Optional. The maximum log length. If a log exceeds this length, it will be truncated. For values over 8192 there is also a change in Dynatrace settings needed; you need to contact Dynatrace One for that.
- Optional. A list of tags to associate with the stack that is created or updated. Syntax: TagKey1=TagValue1 TagKey2=TagValue2 …

To verify the deployment of the AWS log forwarder:

1. In AWS Console, select your log forwarder stack from the list on the left by stack name (the default value is dynatrace-aws-logs).
2. Check the fields below. If you find any issues or discrepancies in any of them, select Delete to delete the stack, and then repeat the deployment process.
   - In Stack info, check the stack status; it should be CREATE_COMPLETE.
   - In Parameters, check if the parameter values are consistent with the values you provided during deployment.
   - In Events, look for any events with a failed status.

To verify AWS log forwarder connectivity and inspect operational logs:

1. In AWS Console, go to CloudWatch Dashboards.
2. Find the self-monitoring dashboard for AWS log forwarding. It will have a name like DynatraceLogForwarder-SelfMonitoring-eu-north-1-dynatrace-aws-logs, where the middle part is the AWS region and the last part is the stack name you chose (the default is dynatrace-aws-logs).
3. Inspect the dashboard for any obvious issues.
4. Select the Resources tab and then select the link next to Lambda.
5. On the Lambda screen, select the Monitor tab and then select Logs.
6. Select one of the listed log streams and look for exceptions in the logs.

To unsubscribe from log groups, you can use the log groups auto-discovery method described in Subscribe by reading log groups from file and pass the auto-discovery output file to the unsubscribe command. See Unsubscribe from log groups for details.

To remove the log forwarder:

1. In AWS Console, go to your CloudFormation stack.
2. Select the Resources tab and navigate to the DeliveryBucket S3 bucket.
3. Go back to the CloudFormation stack and select Delete.
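The stack checks described above can also be scripted. Below is a minimal boto3 sketch, not part of the Dynatrace tooling, assuming the documented default stack name; adjust it to whatever name you chose at deployment.

```python
# Programmatic version of the manual stack checks: status, parameters,
# and failed events. Illustrative only; assumes the default stack name.
import boto3

STACK_NAME = "dynatrace-aws-logs"  # default stack name from the docs

cloudformation = boto3.client("cloudformation")

# Stack info: the status should be CREATE_COMPLETE.
stack = cloudformation.describe_stacks(StackName=STACK_NAME)["Stacks"][0]
print("Status:", stack["StackStatus"])

# Parameters: confirm they match the values provided during deployment.
for param in stack.get("Parameters", []):
    print(param["ParameterKey"], "=", param["ParameterValue"])

# Events: look for any events with a failed status.
events = cloudformation.describe_stack_events(StackName=STACK_NAME)["StackEvents"]
for event in events:
    if "FAILED" in event["ResourceStatus"]:
        print(event["LogicalResourceId"], event["ResourceStatus"],
              event.get("ResourceStatusReason", ""))
```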