Overview
Launched in 2017 in New York, N.Y., UPSTACK is transforming the way infrastructure is sourced and sold. Through a powerful combination of the industry's leading advisors, advanced technology, and dedicated customer support resources, UPSTACK uses actionable business intelligence to architect and source customized technology solutions for businesses of all sizes. With UPSTACK, business buyers streamline IT procurement by tapping into a single source for mission-critical technology services from hundreds of proven providers, along with the professional guidance to identify and evaluate the best solutions.
UPSTACK recently secured a substantial equity investment from Berkshire Partners, a firm with over 35 years of investment experience and deep sector expertise in technology and communications. The investment is being used to evolve UPSTACK's intuitive technology platform, expand its product and service portfolio, and accelerate its direct investments in the industry's top-producing sales agencies. To date, the company has acquired over twenty-five independent agencies to become the largest and fastest-growing agency in the technology industry.
About the Team
At UPSTACK, the Engineering team is responsible for building software that helps manage complex business processes and for implementing complex system integrations. The Engineering team also works closely with the Product organization to develop and drive the long-term roadmap for our suite of software.
Engineers at UPSTACK believe that clear is better than clever, sturdy is better than shiny, and done is better than perfect. We write testable, well-documented code that thoughtfully answers the needs of the company. We remember the human, whether that’s the user at the other end of the system or our fellow engineer.
About the Role
The Senior Data Engineer is responsible for connecting data sources and destinations (including, but not limited to, the data warehouse) and for other data engineering projects as needed. This is a full-time role.
Ideal Candidate Profile
You have modeled complex domains by working with non-technical folks to understand their real-world processes and building technical representations of them that are flexible and accurate. You build stable, maintainable pipelines, including both business logic and infrastructure. You have brought together disparate sources of data to form parts of a shared picture, unifying concepts at the appropriate levels. You prefer working across a broad range of tools and processes, rather than spending all your time in a single vertical. You have made both good and bad technical decisions and learned from all of them. You’re excited at the prospect of contributing to engineering best practices as the organization grows.
You are an enthusiastic teammate who engages in collaboration and proactive discussion. You are an effective communicator who can explain technical concepts to technical and non-technical audiences. You work with confidence and without ego. You have deep knowledge and exercise a high degree of accountability in your daily work. You have loosely held, defensible ideas, and advocate for what you believe is right. You can surface your unarticulated assumptions. You are also adept at identifying and evaluating trade-offs, willing to be proven wrong, and quick to support your fellow teammates.
Responsibilities
- Own data projects from design through delivery
- Collaborate with Product on projects, setting expectations and finding solutions
- Understand business domains and develop analytics according to their needs
- Model effective development practices
- Make recommendations for data tools
- Own data systems (pipelines, etc.) throughout their lifetime (maintenance, upgrades, etc.)
Current Stack
- Cloud Technologies
  - AWS
  - Lambda
  - RDS
- Data Storage
  - Postgres
  - Redis
- Data Transformation & Analysis
  - Fivetran
  - dbt
  - Tableau
- DevOps
  - DataDog
  - GitHub
- Languages
  - Python
  - Ruby
  - JavaScript
- Web Frameworks
  - Rails
  - React
Requirements
- Able to work legally in the US
- Willing to work during standard work hours for the US East Coast
- Cloud technology experience (AWS, GCP, or similar)
- Experience with data modeling, domain-specific high-code ETL, and tradeoffs in data storage
- Able to write production-level Python code
- Able to write and optimize queries in SQL (ideally Postgres)
- 3+ years of experience in data engineering
Nice to Haves
- Experience designing and implementing analytic dashboards and visualizations
- IT Infrastructure industry experience
- Experience with Salesforce
- Experience with sales enablement technology
- Experience with any of the following: dbt, Fivetran, Tableau
- Experience with AWS
- B2B and/or enterprise software experience
- Experience with DevOps
- Experience with secure development practices
Education Requirements
- Bachelor's degree or equivalent work experience
What Else We’re Expecting
- History of operating successfully in a fast-paced, high-growth technology organization
- Exceptional core values – not only does the right thing, but does the thing right
- Excellent written and verbal communication skills
- Strong IT background; experience in network, voice and data center implementations a plus
- High attention to detail
- Curious, resilient self-starter with a “can-do” attitude
- Not only adapts to but embraces change
- Collaborative with a willingness to roll up one's sleeves and work on projects and tasks even if they fall outside of stated job responsibilities
- Solutions-oriented problem-solver who is focused on execution
- Entrepreneurial by nature. Not afraid to challenge the status quo in order to find better ways to get the job done
- Data fluent; leverages empirical evidence to inform decisions and opinions
- Demonstrated ability to work across multiple time zones and cultures