Posts

Showing posts with the label docker

Using Kafka with Docker and NodeJS

Introduction to Kafka

Kafka is an open-source, distributed messaging system built on a publish/subscribe model. It is widely used by large companies for high-performance, real-time data streaming. Originally developed at LinkedIn and open-sourced in 2011, Kafka has grown into the most popular distributed streaming platform, capable of handling vast amounts of records with high efficiency.

Advantages of Kafka

- Open-source: freely available and continuously improved by a large community.
- High throughput, high frequency: capable of continuously processing large volumes of data across topics.
- Automatic message storage: allows for easy message retrieval and verification.
- Large user community: offers extensive support and shared resources.

Basic Concepts

If you're new to Kafka and message queues, here are some key concepts to understand:

- Producer: creates and sends data to the Kafka server, where data is sent as messages in byte-array format.
- Consumer: one or more consumers subscribe to …
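As a minimal sketch of the setup this post describes, the commands below start a single-node broker and exercise the producer/consumer concepts from the terminal. This assumes the official `apache/kafka` image (recent versions run standalone in KRaft mode, no ZooKeeper needed) and that its CLI scripts live under `/opt/kafka/bin`; `test-topic` is a placeholder name.

```shell
# Start a single-node Kafka broker (KRaft mode) and expose the default port
docker run -d --name kafka -p 9092:9092 apache/kafka

# Create a topic to publish to (hypothetical topic name)
docker exec kafka /opt/kafka/bin/kafka-topics.sh \
  --bootstrap-server localhost:9092 --create --topic test-topic

# Act as a Producer: type messages, one per line (Ctrl+C to stop)
docker exec -it kafka /opt/kafka/bin/kafka-console-producer.sh \
  --bootstrap-server localhost:9092 --topic test-topic

# Act as a Consumer: subscribe and read messages from the beginning
docker exec -it kafka /opt/kafka/bin/kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 --topic test-topic --from-beginning
```

In a NodeJS application, the producer and consumer roles would be played by a Kafka client library connecting to `localhost:9092` instead of the console tools.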

Deploying a NodeJS Server on Google Kubernetes Engine

Introduction to GKE

Google Kubernetes Engine (GKE) is a managed Kubernetes service provided by Google Cloud Platform that makes deploying Docker images simple and efficient. We only need to provide some configuration: the number of nodes, the machine types, and the number of replicas to use.

Some Concepts

Cluster
A Cluster is a collection of Nodes on which Kubernetes can deploy applications. A cluster includes at least one Master Node and multiple Worker Nodes; the Master Node manages the Worker Nodes.

Node
A Node is a server in the Kubernetes Cluster. Nodes can be physical servers or virtual machines. Each Node runs the Kubernetes node agent (the kubelet), which handles communication between the Master Node and the Worker Node and manages the Pods and containers running on it.

Pod
A Pod is the smallest deployable unit in Kubernetes. Each Pod contains one or more containers, typically Docker containers. Containers in the same Pod share a network namespace, meaning they have the same …
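The cluster/node/replica configuration described above can be sketched with the `gcloud` and `kubectl` CLIs. All names here (`my-cluster`, `my-app`, the image path) are hypothetical placeholders, and the machine type is just one inexpensive option:

```shell
# Create a GKE cluster with 2 nodes of a small machine type
gcloud container clusters create my-cluster \
  --num-nodes=2 --machine-type=e2-small

# Deploy a NodeJS server image with 3 replicas (3 Pods)
kubectl create deployment my-app \
  --image=gcr.io/my-project/node-server:latest --replicas=3

# Expose the Deployment behind an external load balancer
# (container listens on 3000, service on 80)
kubectl expose deployment my-app \
  --type=LoadBalancer --port=80 --target-port=3000

# Inspect the Pods scheduled onto the cluster's Nodes
kubectl get pods
```

Each replica becomes a Pod, and Kubernetes schedules those Pods across the Worker Nodes created in the first step.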

Using MongoDB on Docker

Introduction

MongoDB is a widely popular NoSQL database today thanks to its simplicity and several advantages over relational databases. In this guide, you'll learn how to quickly run MongoDB inside Docker without going through many complex installation steps. Note that before starting, you need to have Docker installed on your machine.

Starting MongoDB on Docker

You just need to execute the following command:

docker run -e MONGO_INITDB_ROOT_USERNAME=username -e MONGO_INITDB_ROOT_PASSWORD=password --name mongo -p 27017:27017 -v /data/db:/data/db -d mongo

Explanation of the command:
- `-e MONGO_INITDB_ROOT_USERNAME=username -e MONGO_INITDB_ROOT_PASSWORD=password`: sets environment variables used to initialize the root user. You can replace "username" and "password" with your desired credentials.
- `--name mongo`: sets the name for the container.
- `-p 27017:27017`: exposes the MongoDB port for usage.
- `-v /data/db:/data/db`: mounts a volume from …
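Once the container is up, a quick way to verify it works is to open a shell session against it. This is a sketch assuming the credentials from the command above; recent `mongo` images ship the `mongosh` shell (older ones use the legacy `mongo` binary instead):

```shell
# Open an interactive shell inside the running container,
# authenticating as the root user created by the INITDB variables
docker exec -it mongo mongosh \
  -u username -p password --authenticationDatabase admin
```

From the shell you can run commands such as `show dbs` or `db.test.insertOne({hello: "world"})` to confirm reads and writes work.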

Using Apache Superset, a Powerful and Free Data Analysis Tool

Introduction

Among data analysis tools, Apache Superset, provided as open-source software, is considered one of the best choices for deploying reports at a large scale efficiently and completely free of charge. In this article, I will guide you through installing and configuring Superset and connecting data sources.

The application was started by Maxime Beauchemin (the creator of Apache Airflow) as a hackathon project while he was working at Airbnb, and it joined the Apache Incubator program in 2017. Essentially, Superset's features are quite similar to other data analysis software, including:

- Creating and managing dashboards
- Supporting multiple database types: SQLite, PostgreSQL, MySQL, etc.
- Supporting direct querying

Installation and Configuration

Here, I will guide you through installing Superset using the following Docker command:

docker run -d -p {outside port}:{inside port} --name {container name} apache/superset

Example:

docker run -d -p 8080:8088 --name …
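After the container starts, Superset still needs an admin account and an initialized metadata database before you can log in. A sketch of those steps, assuming a container named `superset` and placeholder `admin`/`admin` credentials (these are the setup commands from Superset's own CLI; adjust names to your container):

```shell
# Create an admin account inside the running container
docker exec -it superset superset fab create-admin \
  --username admin --firstname Admin --lastname User \
  --email admin@example.com --password admin

# Upgrade the metadata database to the latest schema
docker exec -it superset superset db upgrade

# Set up default roles and permissions
docker exec -it superset superset init
```

With the port mapping from the example above, the login page would then be available at http://localhost:8080.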

Using JSONB in PostgreSQL

Introduction

JSONB, short for JSON Binary, is a data type developed from the JSON data type; plain JSON has been supported by PostgreSQL since version 9.2, and JSONB was added in version 9.4. The key difference between JSON and JSONB lies in how they are stored: JSONB uses a binary storage format and resolves the limitations of the JSON data type by optimizing the insert process and supporting indexing. If you want to know how to install PostgreSQL and learn some basic knowledge about it, check out this article.

Defining a Column

The query below will create a table with a column of the JSONB data type, which is very simple:

CREATE TABLE table_name (
  id int,
  name text,
  info jsonb
);

Inserting Data

To insert data into a table with a JSONB column, enclose the content within single quotes ('') like this:

INSERT INTO table_name VALUES (1, 'name', '{"text": "text value", "boolean_value": true, "array_value": [1, 2, 3]}');

We can also insert into an array of o…
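Once data is inserted, JSONB values can be read back with PostgreSQL's JSON operators, and the indexing advantage mentioned above typically means a GIN index. A small sketch against the example `table_name` (keys taken from the insert above):

```sql
-- '->' returns a jsonb value, '->>' returns text
SELECT info -> 'array_value'  AS jsonb_array,   -- jsonb: [1, 2, 3]
       info ->> 'text'        AS text_value     -- text:  'text value'
FROM table_name;

-- A GIN index speeds up containment queries on the whole column
CREATE INDEX idx_table_name_info ON table_name USING GIN (info);

-- '@>' tests whether the column contains the given JSON fragment
SELECT id, name
FROM table_name
WHERE info @> '{"text": "text value"}';
```

The `->`/`->>` distinction matters in WHERE clauses: comparisons against strings need the text form (`->>`), while `@>` works directly on the jsonb value.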

Installing PostgreSQL with Docker

Introduction

In this guide, I'm going to walk you through installing a PostgreSQL database and pgAdmin using Docker. The big advantage here is that it's quick and straightforward: you won't need to go through a long manual installation process (or potentially spend time fixing errors if they arise).

Installing Docker

If you don't already have Docker installed on your machine, you'll need to do that first. The installation process depends on the operating system you're using, so search for instructions that match it; typically, installing Docker on Windows is simpler compared to Ubuntu.

Installing PostgreSQL

Once Docker is set up, the next step is to install the PostgreSQL image. Here, I'm using postgres:alpine, which is a minimal version of PostgreSQL (it's lightweight and includes all the essential components needed to use PostgreSQL).

docker run --name postgresql -e POSTGRES_USER={username} -e POSTGRES_PASSWORD={passwor…
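Since the introduction also mentions pgAdmin, here is a sketch of running it as a second container next to PostgreSQL. The `dpage/pgadmin4` image and its `PGADMIN_DEFAULT_*` variables are the official ones; the email and password are placeholders to replace with your own:

```shell
# Run pgAdmin 4 and map its web UI (port 80 in the container) to port 5050
docker run -d --name pgadmin -p 5050:80 \
  -e PGADMIN_DEFAULT_EMAIL=admin@example.com \
  -e PGADMIN_DEFAULT_PASSWORD=secret \
  dpage/pgadmin4
```

You can then open http://localhost:5050, log in with those credentials, and register a new server pointing at the PostgreSQL container (using the username and password you chose for `POSTGRES_USER` and `POSTGRES_PASSWORD`).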