Position Description
Established in 2005 and based in Charlotte, North Carolina, Snap One is a manufacturer and exclusive source of A/V, security, control, networking, and remote management products for professional integrators. An industry leader in the pro-install channel, Snap One helps integrators build their businesses by providing a wide range of high-quality products, easily accessible through an intuitive website and backed by award-winning service and support. With a vast catalogue of today’s most popular brands, Snap One is the premier choice for professional installers across the globe. With 28 pro stores, Snap One blends the benefits of e-commerce with the convenience of local stores. Additional information about Snap One and its products can be found at www.snapone.com.
The Data Warehouse Data Engineer will report to the Data Services Director and will be responsible for leading and contributing to Snap One’s cloud-focused engineering teams to deliver end-to-end cloud transformation. This challenging role requires senior-level knowledge and experience across many facets of cloud technology: application development, hybrid environments and connectivity, and modern data services, along with an understanding of the governance, security, monitoring, and cost management aspects of the entire ecosystem.
The successful candidate will be responsible for the engineering and sustainment of SQL Server-based solutions, ensuring their operational readiness (security, health, and performance), executing data extraction, transformation, and load (ETL) processes, and performing data modeling in support of Snap One’s various application development and data management teams.
Specific Responsibilities:
- Implement modern data solutions with Databricks, SQL Server, Azure Data Factory, and Data Lake
- Work with business partners to deliver end-to-end data analytics solutions using SQL Server, Databricks, Analysis Services, Power BI, and data science tools
- Support data migration efforts from SQL Server to Databricks on Azure
- Design and development with an Enterprise Data Warehouse ecosystem
- Assist other team members and business stakeholders with complex query tuning and schema refinement
- Perform occasional scheduled after-hours tasks related to maintenance or release deployments, and provide after-hours support for critical production systems
- Refine and automate regular processes, track issues, and document changes
- Collaborate with Data Engineers on data collection, data quality, and ETL projects feeding into the Enterprise Data Warehouse
Required Qualifications:
- 3+ years delivering solutions using Databricks on Microsoft Azure with a focus on data solutions and services
- 3+ years’ experience with dimensional modeling
- 5+ years’ SQL experience
- 1+ years’ experience using Python
- 1+ years’ experience with AI/ML models
Preferred Qualifications:
- Experience with the Microsoft SQL Server stack (ADF, SSIS, Power BI, SSAS, ADLS)
- Experience with data replication tools, such as Fivetran
- Azure / Databricks Certifications
- Bachelor’s Degree in Information Systems, Data Management, or Computer Science and 5+ years of solution delivery experience, or 8+ years of relevant IT experience