Now Hiring: Motivated Software Engineers

Careers

Become part of our family, where you can inspire and be inspired by experienced professionals.

Junior Software Engineer - Hadoop, PySpark, Scala, Java, Python

Full time
- Ability to understand existing ETL logic and convert it into Spark/PySpark.
- Knowledge of Unix shell scripting, RDBMS, Hive, the HDFS file system, and HDFS file types.
- Hands-on experience with big data tools such as Spark, PySpark, HDFS, Hive, and Python.
- Advanced working SQL knowledge and experience with relational databases and query authoring.
- Exposure to public cloud platforms (Azure/AWS/GCP) is a major plus.
- Excellent attention to detail and communication/presentation skills.
- High enthusiasm, integrity, ingenuity, results orientation, self-motivation, and resourcefulness in a fast-paced, competitive environment.
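
As a rough illustration of the kind of ETL-to-PySpark conversion work this role involves, here is a minimal sketch that reads a Hive table, applies a simple transformation, and writes Parquet to HDFS. The table, column, and path names (sales_raw, amount, region, /data/sales_clean) are hypothetical placeholders, not part of any actual project.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Start a Spark session with Hive support so existing Hive tables can be read.
    spark = (
        SparkSession.builder
        .appName("etl-conversion-sketch")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Read a hypothetical raw Hive table, apply a typical ETL step
    # (drop rows with missing amounts, aggregate by region), and write
    # the result back to HDFS as Parquet.
    raw = spark.table("sales_raw")
    clean = (
        raw.filter(F.col("amount").isNotNull())
           .groupBy("region")
           .agg(F.sum("amount").alias("total_amount"))
    )
    clean.write.mode("overwrite").parquet("hdfs:///data/sales_clean")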

Software Engineer - Hadoop, PySpark, Scala, Python

Full time
- Hands-on experience with big data tools such as Spark, PySpark, HDFS, Hive, and Python.
- Advanced working SQL knowledge and experience with relational databases and query authoring.
- Exposure to public cloud platforms (Azure/AWS/GCP) is a major plus.
- Excellent attention to detail and communication/presentation skills.
- High enthusiasm, integrity, ingenuity, results orientation, self-motivation, and resourcefulness in a fast-paced, competitive environment.
- Experience with, or an aspiration to learn, Spark and PySpark is an added advantage.
- Experience deploying ETL/data pipelines and workflows on cloud platforms such as Azure and AWS (see the sketch below).
- Ability to work independently with minimal supervision.
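
The cloud-pipeline experience mentioned above typically looks something like the sketch below: a PySpark job that reads raw data from object storage and writes partitioned output back. The bucket paths shown are hypothetical examples only; on Azure the URIs would use abfss://, on AWS s3a://.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("cloud-pipeline-sketch").getOrCreate()

    # Read raw CSV files from cloud object storage (placeholder path).
    events = spark.read.option("header", True).csv("s3a://example-bucket/raw/events/")

    # Stamp each row with a load date and write partitioned Parquet
    # back to object storage (placeholder path).
    (
        events.withColumn("load_date", F.current_date())
              .write.mode("append")
              .partitionBy("load_date")
              .parquet("s3a://example-bucket/curated/events/")
    )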

Senior/Lead Software Engineer - Hadoop, PySpark, Scala, Python

Full time
- Hands-on experience with big data tools such as Spark, PySpark, HDFS, Hive, and Python.
- Advanced working SQL knowledge and experience with relational databases and query authoring.
- Exposure to public cloud platforms (Azure/AWS/GCP) is a major plus.
- Excellent attention to detail and communication/presentation skills.
- High enthusiasm, integrity, ingenuity, results orientation, self-motivation, and resourcefulness in a fast-paced, competitive environment.
- Basic programming experience in Java and Python is required.
- Experience with, or an aspiration to learn, Spark and PySpark is an added advantage.
- Experience deploying ETL/data pipelines and workflows on cloud platforms such as Azure and AWS.