Objective / Problem Statement:

With the COVID-19 pandemic, the entire country is under a social and economic shutdown. This has been tough for most, but it is an especially trying time for those who survive on daily wages, the homeless, and under-served communities.

We do not have reliable data to identify those whose livelihoods have been seriously jeopardized by the lockdown. They include landless agricultural labourers, petty traders, tailors, barbers, rickshaw/auto drivers, construction workers, and many others; the most vulnerable among them are migrant workers.

These workers need support to survive the crisis. Governments have announced relief and taken several measures, but given their tight fiscal conditions and the magnitude of the problem, every citizen should step in. Many people are contributing monetarily by donating to the PM, CM, and NGO funds. However, there is also scope for distributing food kits (both ration and cooked food), care kits (basic sanitation), and so on. A few NGOs are already working towards this, partnering with government officials and appealing to individuals to support the distribution of food and care kits to these vulnerable sections. We feel there are very limited options available for an individual to contact a nearby NGO and donate voluntarily.


We built this solution to help fill the gap between NGOs and donors by providing a collaboration platform, so that under-served communities get help. The platform allows donors to register and make donations to an NGO. Once a donor submits a list of donation items, a signal is triggered to inform the NGO. The NGO can then review the donations made and schedule a pick-up order for further processing.

Flow Diagram  


Architecture Diagram 

The overall solution is shown in the diagram below. We tried to use cloud-agnostic tools, to ease migration to other cloud providers, and a serverless architecture as much as possible.


Let us break down the overall architecture and see how we implemented the application using various AWS services.


We are using a polyglot microservice architecture, the twelve-factor app methodology, and domain-driven design to build the application. The application consists of two microservices.

1. User Management Service (developed in Python): this microservice handles the following tasks:

  • Both donors and NGOs can register and log in.
  • Based on location (state and city), a donor can fetch the details of nearby government-approved NGOs.

2. Order Management Service (developed in Java): this microservice handles the following tasks:

  • A donor can donate to a selected NGO.
  • Both the NGO and the donor can view the donation history.
  • The NGO can view the list of donations initiated by different users and schedule a pickup for further processing.
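As an illustrative sketch (not the project's actual code), the NGO lookup in the User Management Service reduces to a simple filter over the NGO records; the field names below are assumptions, not the real schema:

```python
# Hypothetical sketch of the User Management Service's NGO lookup.
# The record fields (name, state, city) are illustrative assumptions.

def nearby_ngos(ngos, state, city):
    """Return the government-approved NGOs in the donor's state and city."""
    state, city = state.lower(), city.lower()
    return [
        ngo for ngo in ngos
        if ngo["state"].lower() == state and ngo["city"].lower() == city
    ]
```

In the deployed service this filter would sit behind a REST endpoint and read from the NGO data store rather than an in-memory list.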

AWS Elastic Beanstalk:

We are using AWS Elastic Beanstalk for deploying and scaling the web application (developed in Django) and its services. Elastic Beanstalk automatically handles deployment, from capacity provisioning, load balancing, and auto-scaling to application health monitoring, at no additional charge.

AWS Fargate:

We are using AWS Fargate (a serverless compute engine for containers) to deploy the microservices. Fargate allocates the right amount of compute, eliminating the need to choose instances and scale cluster capacity. We pay only for the resources required to run the containers, so there is no over-provisioning or paying for additional servers. AWS Fargate supports private registry authentication for Amazon ECS, so we use the (cloud-agnostic) Docker Hub registry to store our images instead of Amazon ECR.
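To make this concrete, a Fargate task definition for one of the microservices might be built as below. This is only a sketch: the family name, port, CPU/memory sizes, image tag, and ARNs are placeholder values, not the project's actual configuration.

```python
# Sketch of an ECS task definition for running a microservice on Fargate,
# pulling the image from Docker Hub with private-registry authentication.
# All names, sizes, and ARNs here are placeholders.

def fargate_task_definition(image, execution_role_arn, dockerhub_secret_arn):
    return {
        "family": "order-management-service",
        "requiresCompatibilities": ["FARGATE"],
        "networkMode": "awsvpc",              # mandatory for Fargate
        "cpu": "256",                         # 0.25 vCPU
        "memory": "512",                      # 512 MiB
        "executionRoleArn": execution_role_arn,
        "containerDefinitions": [{
            "name": "order-service",
            "image": image,                   # Docker Hub image, not ECR
            "portMappings": [{"containerPort": 8080}],
            # private-registry credentials stored in AWS Secrets Manager
            "repositoryCredentials": {
                "credentialsParameter": dockerhub_secret_arn,
            },
        }],
    }
```

Registering it is then a single boto3 call: `ecs.register_task_definition(**task_def)`.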

Amazon DynamoDB

We are using Amazon DynamoDB as a serverless data store (in key-value pairs) for consistent and fast performance. It is serverless with no servers to provision, patch, or manage and no software to install, maintain, or operate. DynamoDB automatically scales tables up and down to adjust for capacity and maintain performance. It also provides both provisioned and on-demand capacity modes so that we can optimize costs by specifying capacity per workload or paying for only the resources we consume.
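As an example of the key-value model, a donation record might be shaped as below before being written with `put_item`. The key design (donor_id as partition key, created_at as sort key) and the attribute names are illustrative assumptions, not the project's actual table schema.

```python
import uuid
from datetime import datetime, timezone

# Hypothetical shape of one donation item in the DynamoDB donations table.
# Attribute names and the key design are illustrative assumptions.

def build_donation_item(donor_id, ngo_id, items):
    return {
        "donor_id": donor_id,                                  # partition key
        "created_at": datetime.now(timezone.utc).isoformat(),  # sort key
        "donation_id": str(uuid.uuid4()),
        "ngo_id": ngo_id,
        "items": items,                      # e.g. ration kits, care kits
        "status": "PENDING_PICKUP",          # updated once the NGO schedules
    }
```

With boto3 the write would be `boto3.resource("dynamodb").Table("donations").put_item(Item=item)`.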

AWS Lambda:

We are using the AWS Lambda serverless computing platform for CSV file processing (from Amazon S3), and DynamoDB Streams to drive the business use cases, without worrying about infrastructure provisioning, compute, or upfront cost.
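A minimal sketch of such a Lambda handler, assuming an S3 ObjectCreated trigger and a header row such as `name,state,city` (the real CSV layout may differ):

```python
import csv
import io

def parse_ngo_csv(body):
    """Parse the CSV body into a list of NGO dicts keyed by the header row."""
    return list(csv.DictReader(io.StringIO(body)))

def handler(event, context):
    """Triggered by S3 when a new NGO CSV file is uploaded."""
    import boto3  # provided by the Lambda runtime
    s3 = boto3.client("s3")
    rec = event["Records"][0]["s3"]
    obj = s3.get_object(Bucket=rec["bucket"]["name"], Key=rec["object"]["key"])
    ngos = parse_ngo_csv(obj["Body"].read().decode("utf-8"))
    # ...write each NGO row into the DynamoDB table here...
    return {"processed": len(ngos)}
```

Keeping the parsing in its own function makes it testable without invoking S3.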

Amazon SES to send e-mails to donors and NGOs.

Amazon S3 to store static files (the CSV file of NGO details).

Amazon ElastiCache for efficient search functionality.
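For example, the NGO search can follow a cache-aside pattern against ElastiCache. The sketch below assumes a Redis-style client exposing `get`/`setex`; the key format and the one-hour TTL are illustrative choices, not the project's actual settings.

```python
import json

def cached_ngo_search(cache, state, city, fetch_from_db, ttl=3600):
    """Serve NGO search results from the cache, falling back to the database."""
    key = f"ngos:{state.lower()}:{city.lower()}"
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)               # cache hit: skip the database
    ngos = fetch_from_db(state, city)        # cache miss: query the data store
    cache.setex(key, ttl, json.dumps(ngos))  # expire after `ttl` seconds
    return ngos
```

Because repeated searches for the same city are common, this keeps most read traffic off the primary data store.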

Amazon CloudWatch to monitor AWS resources and applications. It natively integrates with most of the AWS services we use (e.g. DynamoDB, S3, Lambda, ECS).

AWS Identity and Access Management (IAM) enables us to manage access to AWS services and resources securely. Using IAM, we can create and manage AWS users and groups, and use permissions to allow or deny their access to AWS resources.

Amazon API Gateway to create RESTful APIs and WebSocket APIs that enable real-time two-way communication applications. It is a fully managed service to create, publish, maintain, monitor, and secure APIs at any scale. 

Amazon Route 53 is a highly available and scalable cloud Domain Name System (DNS) web service. It is designed to give developers and businesses an extremely reliable and cost-effective way to route end users to Internet applications.

AWS Certificate Manager (ACM) to easily provision, manage, and deploy public and private Secure Sockets Layer/Transport Layer Security (SSL/TLS) certificates for use with AWS services and our internal connected resources, such as Elastic Load Balancing and API Gateway.

Infra as code (Terraform):

We are using Terraform infrastructure-as-code templates to provision and manage our cloud infrastructure. This makes infrastructure onboarding faster and more cost-effective.

We chose Terraform over AWS CloudFormation because Terraform is cloud-agnostic.


We are using a Jenkins Pipeline (a suite of plugins) to take code from Git (version control) and build and deploy both the web application and the microservices.

Future scaling options/implications:

  • Currently, we use a CSV file as the data source, processed by an AWS Lambda function; in the future, as the data size grows, we will move to AWS Batch as a scale-up option.
  • Currently, due to time constraints, we limit the service to the web application deployed on Elastic Beanstalk; in the future, a mobile application can be developed to reduce the Elastic Beanstalk cost.


This capstone project is part of Great Learning's post-graduate program in Cloud Computing. It has helped us build practical expertise on various technology stacks: microservices in Java and Python; DevOps tools like Jenkins and Terraform; and AWS IAM, S3, DynamoDB, Lambda, ALB, ECS Fargate, Elastic Beanstalk, and CloudWatch. We have also learnt Django, Python, and HTML to build the UI web app. The AWS CLI helped us connect on-premise infrastructure with cloud services.


Abhishek Sharma: Working as a Lead Product Engineer with Harman Connected Services, with 8.5 years of experience in developing and building scalable solutions using Java, Python, and AWS.

Janit Sachdeva: Working as a Senior Associate with Publicis.Sapient, with 6.6 years of experience in application development in Java/J2EE, Spring, and Spring Boot microservices.

Mohanjay: Working as a Project Manager, managing end-to-end planning and execution of disaster-recovery infrastructure projects for IBM's cloud-managed-service customers within contractual timelines. He guides customers in deploying cost-effective disaster- and system-recovery solutions for customer applications on SAP and Oracle.

Mrutyunjaya Mishra: Working as Software Specialist in SAS R&D India and has 15 years of experience in deploying and working with SAS & Data Management products


