Big Data Engineer
Industry experience: 8 to 12 years
Responsibilities:
- Developing and implementing data ingestion pipelines to collect, process, and store large, complex datasets using technologies such as Azure Data Explorer, Data Lake, Spark, and Kafka.
- Designing and implementing data warehousing and ETL (extract, transform, load) processes to clean and transform data for analysis.
- Developing and maintaining distributed computing systems and databases that can handle large-scale data processing and analysis.
- Collaborating with data scientists and analysts to design and implement data models and algorithms for analysis and machine learning.
- Ensuring the security, availability, and reliability of Big Data systems and infrastructure, and implementing disaster recovery and backup processes as needed.
- Staying up to date with the latest developments in Big Data technologies and integrating new tools and techniques into the workflow.
- Managing and mentoring junior team members and providing guidance on best practices in Big Data engineering.
Requirements:
- Degree in computer science, information technology, or a related field.
- Experience with Big Data technologies such as Data Lake, Azure Data Explorer, Cosmos DB, Spark, and Kafka.
- Proficiency in programming languages such as Scala, Java, or Python, and experience with distributed computing and database systems.
- Strong problem-solving and analytical skills, along with excellent communication and collaboration skills.