Commit 644824cb authored by Neil Spink

clarifying the readme

parent 3f13a7f8
Pipeline #65359880 failed in 1 second
# AWS Lambda Price Grabber
## About
The solution consists of two AWS Lambda functions: the first produces a list of URLs for scanning, then it triggers the second, which does the actual grabbing.
![system context diagram](./system.png)
Note: the CloudWatch trigger is set up manually.
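Since the trigger is created by hand, here is a minimal sketch of doing the same from the AWS CLI; the rule name, schedule, and function names are assumptions, not values from this repository:

```bash
# Hypothetical rule name and schedule; replace the ARN with the ARN
# of your deployed trigger function.
aws events put-rule \
  --name price-grabber-schedule \
  --schedule-expression "rate(1 day)"

aws events put-targets \
  --rule price-grabber-schedule \
  --targets "Id"="1","Arn"="arn:aws:lambda:eu-west-1:123456789012:function:my-trigger-function"

# CloudWatch Events also needs permission to invoke the function.
aws lambda add-permission \
  --function-name my-trigger-function \
  --statement-id cloudwatch-events \
  --action lambda:InvokeFunction \
  --principal events.amazonaws.com
```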
The web scraper lives in a single module called grabber.py. All AWS Lambda function files start with the prefix lambda_
and the unit tests with test_
## Prerequisites
### Developer's Computer
To run the crawler locally, you need to update/install the packages used by the project:
pip3 install -U -r requirements.txt -r requirements_unit_test.txt
## Getting started
Start by running the unit tests. If you ran the pip install from the prerequisites, drop to the command line and
run:
nose2
One test, 'test_logs_storage_s3.py', should fail because it uploads a file to S3. To skip it, rename the file by
prepending something like disable_ to its name.
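For example (a sketch; adjust the path to wherever the test file sits in your checkout):

```bash
mv test_logs_storage_s3.py disable_test_logs_storage_s3.py
```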
Get unit test code coverage using the command:
nose2 --with-coverage
To upload the source code to AWS manually, you need all packages bundled with the source code, so download them
into the source directory, after which you need to ZIP everything:
pip3 install -r ../source/requirements.txt -t ../source
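A minimal sketch of the ZIP step; the archive name source.zip is an assumption borrowed from the CI artifact mentioned later:

```bash
# Bundle the source directory, including the downloaded packages, into
# a flat ZIP: AWS Lambda expects the handler at the archive root, not
# inside a subfolder.
cd ../source
zip -r ../source.zip .
```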
### AWS Setup
If you want to run the solution as Lambda functions, first create the foundation: in the AWS CloudFormation console,
or using the AWS CLI, run the file deployment/aws-create-foundation.json, which will create the required roles for
the AWS Lambda functions and an S3 bucket.
If you use the following AWS CLI command, change the 'ParameterValue', which is the S3 bucket name. The name must be
globally unique; no other AWS customer may have a bucket with the same name.
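The command itself is not reproduced in this excerpt; a sketch of what it could look like, where the stack name and ParameterKey are placeholders (check the template for the real parameter name):

```bash
# CAPABILITY_IAM is required because the template creates IAM roles.
aws cloudformation create-stack \
  --stack-name price-grabber-foundation \
  --template-body file://deployment/aws-create-foundation.json \
  --parameters ParameterKey=BucketName,ParameterValue=my-globally-unique-bucket \
  --capabilities CAPABILITY_IAM
```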
In the GitLab CI pipeline, the packaged source is downloaded from the job artifacts:
curl --header "JOB-TOKEN: $CI_JOB_TOKEN" -o source.zip "https://gitlab.com/neilspink/aws-lambda-price-grabber/-/jobs/$CI_JOB_NAME/artifacts/download?job=$CI_JOB_NAME"
## Lambda function input parameters on AWS
The Lambda functions can be run locally by going into the source directory, but they expect certain AWS resources to
exist.
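As a sketch of such a local run, assuming a hypothetical module lambda_grab_price.py with the standard handler signature (the real module name and event shape may differ, and the call will fail unless the foundation stack's resources exist):

```bash
cd source
python3 -c "import lambda_grab_price as f; print(f.lambda_handler({'url': 'https://example.com/item'}, None))"
```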