Comparison of MongoDB and PostgreSQL for processing and analyzing Big Data

Software Computing

Student: Nejc Drobnič

Nejc Drobnič is a graduate of the Computer Science - Software Engineering module study program at Academia, College of Short-Cycle Higher Education. He successfully defended his thesis paper in June 2024.


Diploma paper Nejc Drobnič

In recent times, we have witnessed tremendous growth and development in databases, which have become crucial building blocks in processing vast amounts of data, known as big data. Big data not only encompasses enormous volumes of information but often includes diverse types of data, such as text, images, sound, video recordings, and numerous other formats.

Due to this diversity and scale, it is crucial to have appropriate databases that enable efficient storage, management, and analysis of such data. MongoDB and PostgreSQL are among the leading databases used to tackle the challenges of big data, each with its unique features, advantages, and limitations.

This thesis focuses on a comparative analysis of the MongoDB and PostgreSQL databases and their performance and efficiency in processing and analyzing big data. MongoDB, as a representative of non-relational databases, proves to be a handy tool in the big data ecosystem.

Its ability to store unstructured data and easily scale horizontally places MongoDB among the popular choices for organizations dealing with diverse and rapidly growing data. On the other hand, PostgreSQL, a classic relational database, has established itself as a reliable solution for complex data analysis, thanks to its flexibility and the power of the SQL language. Its support for ACID-compliant transactions ensures consistency and reliability in critical environments.
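To make the contrast concrete, the sketch below shows the same nested record as a document database such as MongoDB could store it directly, versus the flat parent/child rows a normalized relational schema in PostgreSQL would require. All field and table names here are invented for illustration only.

```python
# Hypothetical example: the order fields and table layout below are
# invented for this sketch, not taken from any real schema.

def flatten_order(doc):
    """Split a nested document into the parent row and child rows
    a normalized relational schema would store separately."""
    order_row = {"order_id": doc["order_id"], "customer": doc["customer"]}
    item_rows = [
        {"order_id": doc["order_id"], "sku": item["sku"], "qty": item["qty"]}
        for item in doc["items"]
    ]
    return order_row, item_rows

# A document store can keep `doc` as-is, nested items and all;
# a relational store would hold `order_row` and `item_rows` in two tables.
doc = {
    "order_id": 1,
    "customer": "Ada",
    "items": [{"sku": "A-1", "qty": 2}, {"sku": "B-7", "qty": 1}],
}
order_row, item_rows = flatten_order(doc)
```

The document form avoids a join when reading a whole order back, while the relational form enforces a uniform structure on every row.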

Within the framework of this research, we will conduct four key measurements. The first measurement will report the rate of document insertions and queries per second. The second will focus on the speed of data insertion in both databases, allowing us to determine their capacity to handle large volumes of data.
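A minimal timing harness for these throughput measurements might look like the following sketch. The helper name and the in-memory stand-in workload are assumptions for illustration; in the actual benchmark, `op` would be a single insert or query against MongoDB or PostgreSQL.

```python
import time

def ops_per_second(op, n=10_000):
    """Run `op` n times and return the measured operations per second.
    `op` stands in for one insert or one query against either database."""
    start = time.perf_counter()
    for _ in range(n):
        op()
    elapsed = time.perf_counter() - start
    return n / elapsed if elapsed > 0 else float("inf")

# Stand-in workload: appending to an in-memory list instead of a real
# database call, so the harness itself can be shown self-contained.
buffer = []
rate = ops_per_second(lambda: buffer.append(1), n=1000)
```

The same harness can time both databases with identical inputs, which is what makes the resulting operations-per-second figures comparable.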

The third measurement will address the efficiency of transactional insertion, with an emphasis on ensuring data consistency and integrity. The fourth and final measurement will examine the scope, scalability, and memory consumption during insertion into the database.
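The atomicity property being measured in the third test can be sketched with a toy all-or-nothing insert over an in-memory list standing in for a table. The validation rule and names are invented for this example; a real PostgreSQL transaction would provide the same guarantee via BEGIN/COMMIT/ROLLBACK.

```python
def transactional_insert(store, rows):
    """Insert all rows or none: a toy model of the atomicity guarantee
    of ACID transactions. `store` is a plain list standing in for a
    table; the primary-key check is an invented constraint."""
    staged = []
    for row in rows:
        if "id" not in row:  # simulated constraint violation
            raise ValueError("row missing primary key; nothing committed")
        staged.append(row)
    store.extend(staged)  # "commit": rows become visible only if all passed

table = []
transactional_insert(table, [{"id": 1}, {"id": 2}])   # succeeds
try:
    transactional_insert(table, [{"id": 3}, {"bad": True}])  # fails
except ValueError:
    pass
# table still holds only the two rows from the successful batch
```

Measuring this path is slower than plain insertion precisely because the database must be able to undo a partially applied batch.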

Our research aims to clearly present the strengths, limitations, and advantages of MongoDB and PostgreSQL in processing and analyzing big data.




