Posts

About us

  About BlingTechs

  Welcome to BlingTechs, your go-to destination for all things tech! 🚀

  Our Story

  BlingTechs was born out of a passion for innovation and a love for gadgets. Back in 2016, our founder, Viraj, decided to create a space where tech enthusiasts could geek out, learn, and stay updated on the latest breakthroughs. Fast forward to today, and BlingTechs has become a hub for curious minds seeking tech inspiration.

Contact Us

  General Info: house.thilina@gmail.com

Privacy policy

By accessing www.blingtechs.blogspot.com you are agreeing to be legally bound by these terms as modified from time to time (“Terms”). Any use of the site by you after such changes are posted constitutes your agreement to these Terms as modified.

All intellectual property rights, including copyright, in the content displayed on the Website (“Content”) belong to Blingtechs. All rights are hereby reserved. The Website and the Content may only be used for your personal, non-commercial use. You may use the Content online and solely for your personal, non-commercial use, and you may download or print a single copy of any portion of the Content for your personal, non-commercial use, provided you do not remove any trademark, copyright or other notice contained in such Content. No other use is permitted.

Blingtechs obtains the Content from a wide range of sources and it includes facts, views, opinions and information likely to be of interest to users of the Website. While all reasonable

Hash Functions

Hash functions are one-way functions that transform character inputs into a compressed output value. The input can be of arbitrary (infinite) length, but the output always has a finite (fixed) length. This produces a fingerprint of the file/message/data (source: Wikipedia). You cannot reverse a hash function to recover the input value. If you have the content, you can use the hash function to calculate the hash value, but the other way around is not possible: if you have the hash value and the function, you can't use them to get back the input text. Below are sample scenarios in which hash functions may become useless. If the function is reversible, it's not secure; hash values can be exposed to the public, and we don't want someone who sees a hash value to recover the input. The values may not be collision resistant; there may be cases where two distinct input values return the same hash value. Since digital certificates/signatures use hash functions…
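As a quick illustration (this example isn't from the original post), the standard sha256sum tool shows both properties: any input, long or short, maps to a fixed-length digest, and changing a single character gives a completely unrelated fingerprint.

    # Two inputs differing by one character produce unrelated digests,
    # and each digest is always 64 hex characters (256 bits) long.
    echo -n "hello" | sha256sum
    echo -n "hellp" | sha256sum

    # The same command fingerprints a whole file (the path is just an example).
    sha256sum /etc/hosts

Recovering "hello" from its digest would require guessing inputs and hashing each one, which is exactly the one-way property described above.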

How to set up MySQL Master-Slave replication

Hello there! This time I'm going to discuss one of the projects I got involved in recently. The team is expanding operations to a new operator and we need to deploy the platform on new servers. One task is to set up database replication in MySQL. Below is the environment I'm working on:

Master and Slave are CentOS Linux release 7.4 servers.
Master IP address is 192.168.100.1.
Slave IP address is 192.168.100.2.
Master and Slave are on the same LAN network.

Let's install MySQL on the master server and configure it.

1. For CentOS 7 and Red Hat (RHEL) 7: yum localinstall https://dev.mysql.com/get/mysql57-community-release-el7-11.noarch.rpm
2. For CentOS 7.4/6.9 and Red Hat (RHEL) 7.4/6.9: yum install mysql-community-server
3. Start the MySQL server and enable auto-start on boot. The commands below can be used on Fedora 27/26/25, CentOS 7.4, and Red Hat (RHEL) 7.4: systemctl start mysqld.service systemctl enable mysqld.service

But I couldn't log in to MySQL by using the below command…
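The excerpt above covers only the installation, so here is a rough, minimal sketch of the standard MySQL 5.7 master-slave configuration that typically follows. The IP addresses match the environment described; the replication user, password, and binlog file/position values are placeholders rather than values from the post.

    # --- On the master (192.168.100.1) ---
    # Append a [mysqld] block with a unique server ID and binary logging, then restart.
    printf '[mysqld]\nserver-id=1\nlog-bin=mysql-bin\n' >> /etc/my.cnf
    systemctl restart mysqld.service

    # Create an account the slave will replicate with (placeholder user/password).
    mysql -u root -p -e "CREATE USER 'repl'@'192.168.100.2' IDENTIFIED BY 'Repl#Pass123';
                         GRANT REPLICATION SLAVE ON *.* TO 'repl'@'192.168.100.2';"
    # Note the binlog file name and position reported here; the slave needs them.
    mysql -u root -p -e "SHOW MASTER STATUS;"

    # --- On the slave (192.168.100.2) ---
    printf '[mysqld]\nserver-id=2\n' >> /etc/my.cnf
    systemctl restart mysqld.service

    # Point the slave at the master; the log file/position below are placeholders
    # and must be replaced with the values from SHOW MASTER STATUS.
    mysql -u root -p -e "CHANGE MASTER TO MASTER_HOST='192.168.100.1',
                         MASTER_USER='repl', MASTER_PASSWORD='Repl#Pass123',
                         MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=154;
                         START SLAVE;"
    mysql -u root -p -e "SHOW SLAVE STATUS\G"

If Slave_IO_Running and Slave_SQL_Running both show Yes in the last command, replication is working.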

Sqoop incremental import in Cloudera Hadoop

In the last blog post, I described how we can import data from an RDBMS to HDFS using Sqoop. Now we will discuss how to do an incremental import in the Cloudera Hadoop user interface. If you know the basic functionalities of Hadoop, this is a simple task! You need to consider the 'incremental', 'check-column', and 'last-value' options to perform an incremental import in Sqoop. The following syntax is used for the incremental import: --incremental <mode> --check-column <column name> --last-value <last check column value>. Cloudera Hadoop is a commercial distribution of Hadoop. I am using the Oozie workflow UI provided by Cloudera to import data. When you are defining workflows in the Oozie UI, you need to give the correct file path for the JDBC driver as well. If you haven't included the drivers yet, please make sure you place all of them in a folder that can be accessed by everyone. Log in to the Hue UI -> Workflows -> Editors -> Workflows
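As a command-line illustration of those three options (the connection string, credentials, table, column, and last value below are placeholders, not values from the post), an append-mode incremental import looks like this:

    # Import only rows whose 'id' is greater than the last value already imported.
    # The mode can be 'append' (growing key column) or 'lastmodified' (timestamp column).
    sqoop import \
      --connect jdbc:mysql://192.168.100.1:3306/salesdb \
      --username sqoop_user -P \
      --table orders \
      --target-dir /user/hive/warehouse/orders \
      --incremental append \
      --check-column id \
      --last-value 1000

Sqoop prints the new last value at the end of the run; that is what you feed into --last-value (or into the Oozie workflow parameter) the next time the job runs.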

Import relational databases to Hadoop using Sqoop

Hello there! This time we will discuss how to import data into Hadoop from an RDBMS. We are using Sqoop as the import mechanism. What's Sqoop? It's an open-source software product of the Apache Software Foundation. The tool is designed to transfer data between relational databases and Hadoop. It allows users to import data to a target location inside Hadoop and to export data from Hadoop as well. If you are not willing to use Sqoop to transfer data, there are alternatives available such as Spark, but there are some disadvantages; for example, Spark does not work well with some complex data types. Before running the commands to import data, please make sure you have installed Java, Hadoop, and Sqoop on your workstation. (Image source: severalnines.com) When considering the Hadoop file system, there are two types of tables you need to use in the process of importing data.
1. External tables: We do create these tables…
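To make the basic flow concrete (the JDBC URL, credentials, table, and target directory below are illustrative placeholders, not values from the post), a straightforward Sqoop import from MySQL into HDFS looks like this:

    # Pull the whole 'customers' table from MySQL into an HDFS directory,
    # splitting the work across four parallel map tasks.
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/shopdb \
      --username sqoop_user -P \
      --table customers \
      --target-dir /user/hadoop/customers \
      --num-mappers 4

Once the files land in HDFS, a Hive external table can be defined over that target directory, which is where the external vs. managed table distinction mentioned above comes in.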