Projects with this topic
A SQL backup and restore manager that safely stores your database dumps in an S3 bucket and provides notifications via Discord webhooks, Pushover and Mailgun.
This exercise guides you through creating an image gallery application using Node.js/Express and AWS S3. You will create an S3 bucket, configure the application with your AWS credentials, and implement the necessary code to upload images to S3 and list all images stored in the S3 bucket.
Reliable PostgreSQL Backup & Restore
Terraform module to manage a file upload infrastructure using S3 presigned URLs and Lambda.
A microservice backend for document analysis using a VLM, built for the sybil project.
This module automates the build and deployment of AWS Lambda functions with a security-first approach. It generates signed ZIP artifacts, SBOMs, vulnerability reports, and integrity hashes to ensure that Lambda code is verifiable and tamper-resistant. The module is designed to be reusable and flexible by accepting an external IAM execution role and customizable Lambda source directories.
A microservice acting as a middle layer between DDRapp and CESNET S3, as part of the DDRplatform.
Terraform module to build an S3 bucket.
Serve a static HTML + CSS + JavaScript website from an S3 backend using AWS credentials so the bucket does not have to be public or have website hosting enabled.
Upload files to S3 with presigned URLs.
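To make the presigned-URL approach above concrete, here is a minimal sketch of how an S3 presigned PUT URL can be built from scratch with only the Python standard library, using AWS Signature Version 4 query-string signing. This is an illustration, not the project's implementation: the bucket, key, credentials, and region are placeholders, and real code would normally just call boto3's `generate_presigned_url`.

```python
import hashlib
import hmac
import urllib.parse
from datetime import datetime, timezone


def _sign(key: bytes, msg: str) -> bytes:
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()


def presign_put(bucket, key, access_key, secret_key,
                region="us-east-1", expires=3600, now=None):
    """Build a SigV4 presigned PUT URL for s3://bucket/key (stdlib only).

    Assumes a simple object key that needs no extra URI encoding.
    """
    now = now or datetime.now(timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    datestamp = now.strftime("%Y%m%d")
    host = f"{bucket}.s3.{region}.amazonaws.com"
    scope = f"{datestamp}/{region}/s3/aws4_request"

    # Query parameters that carry the signing metadata.
    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    # AWS requires %20 (not +) and sorted parameter names.
    canonical_qs = urllib.parse.urlencode(
        sorted(params.items()), quote_via=urllib.parse.quote)

    canonical_request = "\n".join([
        "PUT", f"/{key}", canonical_qs,
        f"host:{host}\n",          # canonical headers
        "host",                    # signed headers
        "UNSIGNED-PAYLOAD",        # body is not part of the signature
    ])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    # Derive the signing key: AWS4 + secret -> date -> region -> service -> "aws4_request".
    signing_key = _sign(_sign(_sign(_sign(
        b"AWS4" + secret_key.encode(), datestamp), region), "s3"), "aws4_request")
    signature = hmac.new(signing_key, string_to_sign.encode(),
                         hashlib.sha256).hexdigest()
    return f"https://{host}/{key}?{canonical_qs}&X-Amz-Signature={signature}"
```

The returned URL can then be handed to any client, which uploads with a plain HTTP PUT and no AWS credentials of its own.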
Terraform module to manage AWS Transfer Server, user, and the backend S3 Bucket for storage.
Backs up whole systems, including Docker volumes, from the container's perspective, and auto-prunes old archives.
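The auto-prune step mentioned above typically boils down to a retention policy. As a hedged sketch (the retention rule and the `{name: created_at}` input shape are assumptions, not this project's actual policy), a pruner might keep the newest N archives and drop anything older than a cutoff:

```python
from datetime import datetime, timedelta, timezone


def select_archives_to_prune(archives, keep_last=7, max_age_days=30, now=None):
    """Return archive names to delete from a {name: created_at} mapping.

    Deletes anything beyond the newest `keep_last` archives, plus anything
    older than `max_age_days` regardless of position.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    by_newest = sorted(archives, key=lambda n: archives[n], reverse=True)
    keep = set(by_newest[:keep_last])
    return [n for n in by_newest if n not in keep or archives[n] < cutoff]
```

The selection is split from the deletion so it can be dry-run and unit-tested without touching any storage backend.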
An Angular app containing my personal website, displaying information such as a blog, photos, and my resume, using Angular Material and Bootstrap.
The project uses traffic data from automatic measurements and corresponding weather data to support analyses aimed at answering the question: do weather conditions impact traffic?
The aim of the project is to design and implement the entire data flow using ETL/ELT tools and methods. The steps include: CSV data extraction and initial processing (Python), loading to an AWS S3 data lake (Python boto3, AWS CLI), staging (Snowflake), transformation (dbt Core), data warehousing (Snowflake), data prep and EDA (Python, pandas), and visualisation (Streamlit).
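The "initial processing" step of a pipeline like this often means aligning the two sources on a common time axis before loading to the data lake. A minimal sketch, assuming hypothetical column names (`timestamp`, `vehicle_count`, `temperature_c`, `precip_mm`) that the real datasets may not use:

```python
import csv
import io


def join_traffic_weather(traffic_csv: str, weather_csv: str) -> list[dict]:
    """Join hourly traffic counts with weather observations on a shared
    'timestamp' column, dropping hours without a matching observation.

    Column names are illustrative assumptions, not the project's schema.
    """
    weather = {row["timestamp"]: row
               for row in csv.DictReader(io.StringIO(weather_csv))}
    joined = []
    for row in csv.DictReader(io.StringIO(traffic_csv)):
        obs = weather.get(row["timestamp"])
        if obs is None:
            continue  # no weather observation for this hour
        joined.append({
            "timestamp": row["timestamp"],
            "vehicle_count": int(row["vehicle_count"]),
            "temperature_c": float(obs["temperature_c"]),
            "precip_mm": float(obs["precip_mm"]),
        })
    return joined
```

The resulting records could then be written back out as CSV or Parquet and uploaded to the S3 data lake with boto3 or the AWS CLI, as the project description outlines.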