Data Ingestion and Integration: Responsible for ingesting data from various sources into the data lakehouse. This includes handling structured and unstructured data and ensuring continuous data ingestion to meet real-time business needs.
Data Transformation and Processing: Once the data is ingested, it needs to be transformed and processed to make it ready for use in AI, ML, data science, and analytics. This involves using Databricks to perform data transformations and ensuring the data is clean and usable.
Data Governance and Security: Implementing and managing data governance policies to ensure data quality, consistency, and security. This includes managing role-based access controls (RBAC) and ensuring data encryption at rest and in transit.
Performance Optimization: Ensuring the performance of the data lakehouse environment by optimizing data processing workflows and managing resources efficiently. This includes configuring Azure resources to minimize operational costs while meeting performance requirements.
Collaboration and Communication: Working closely with other teams, such as DevOps, to manage integrations and deployments. Effective communication is crucial for coordinating efforts and ensuring smooth operations.
Maintenance and Support: Providing ongoing maintenance and support for the data lakehouse environment. This includes monitoring system performance, troubleshooting issues, and implementing updates and improvements.
Tier II Support: Providing Tier II support for the data lakehouse.
Other Duties: Perform other duties as assigned.
Requirements
EDUCATION
Bachelor’s degree in a related field, or the equivalent through a combination of education and related work experience.
EXPERIENCE
5+ years related work experience.
Certifications in Databricks and Azure data technologies preferred.
Experience with Databricks core components such as DataFrames, Datasets, Spark SQL, Delta Lake, Databricks Notebooks, DBFS, and Databricks Connect.
Strong proficiency in programming languages such as Python or Scala.
Experience in implementing enterprise-scale data platforms as part of a collaborative team.
Experience working in a fast-paced, collaborative, and team-based project environment.
Extensive experience with SQL Server, including database design, stored procedures, and performance tuning.
Hands-on experience with version control systems such as Git, as well as CI/CD workflows and practices.
Hands-on experience with Azure Data Factory (ADF) for building and deploying data pipelines.
Familiarity with other Azure data services (e.g., Azure Data Lake Storage, Azure Synapse Analytics, Azure Databricks) is preferred.
Strong understanding of data warehousing concepts and dimensional modeling.
Experience with data quality management and data governance principles.
Experience working in a shared service, hybrid environment.
Benefits
The annual salary range is $85,000 to $95,000.
Remote position with 2-3 days per month onsite in Lansing, Michigan; the remainder of the month is work-from-home. Priority will be given to Michigan residents and candidates willing to relocate to the State of Michigan. Fully remote positions are possible for out-of-state staff.
Here are some of our benefits:
● Group Medical, Dental, Life, HSA/FSA, and Vision Insurance
● SIMPLE IRA accounts with an immediately vested 3% company match
● Paid company holidays and personal days
● Partial Internet and mobile phone expense reimbursement
● Professional development opportunities
● Great culture with excellent teams that are collaborative, hardworking, and innovative
Seniority level
Associate
Employment type
Full-time
Job function
Information Technology
Industries
IT Services and IT Consulting