Job Description
Must Have Technical/Functional Skills:
- Proficiency in programming languages such as Python, Scala, or Java
- Strong knowledge of Apache Spark and its ecosystem
- Hands-on experience with Databricks notebooks and Delta Lake for data lake management
- Experience with data orchestration tools like Apache Airflow, dbt, or similar
- Strong SQL skills and familiarity with data warehousing concepts
- Familiarity with containerization technologies like Docker and Kubernetes is a plus
- Knowledge of machine learning and artificial intelligence concepts is beneficial
Roles & Responsibilities
- Work on client projects as a Databricks Architect, designing and implementing solutions at scale.
- Architect end-to-end data pipelines that integrate with Databricks and other cloud services.
- Ensure that the data architecture supports both batch and real-time data processing and analytics.
- Leverage Databricks Delta Lake for data storage, management, and processing.
Salary Range: $80,000-$150,000 a year