Data Architect at BlackLine
BlackLine is currently seeking an experienced Data Architect. The successful candidate will develop architectures and lead the design of complex data services and capabilities that address challenging business problems, while providing technical leadership within the Engineering team. The ideal applicant thrives in a highly collaborative workplace and actively engages in the development process. This is an excellent career opportunity for a professional with a strong data architecture and design background who possesses excellent interpersonal skills.
- Define core data processing architecture for BlackLine’s global accounting services, including data integrations and wrangling, ETL, data streaming and processing, data warehousing, and machine learning data.
- Lead and direct engineering teams toward optimal data architectures in hybrid cloud/on-premise environments.
- Design and build infrastructure and platform components to support complex data pipelines that ingest and process disparate data from multiple internal and external sources.
- Participate in developing machine learning projects.
- Lead enterprise data modeling and governance.
- Provide data-related architecture and design consulting to cross-functional product and engineering teams.
- Manage key architectural data-related standards, best practices, and guidelines, working with leads on cross-functional platforms and applications.
- Provide architectural blueprints and technical leadership to Engineering teams.
- Coach and mentor engineers.
- Evaluate and recommend tools, technologies and processes to simplify and accelerate innovation.
- BS degree in Computer Science, Engineering, or related discipline; MS preferred.
- Extensive knowledge of the software development process and its technologies.
- Knowledge of data-related architectural styles and design patterns.
- 15+ years of technical leadership and architecture experience in software development with a variety of languages, databases, and data processing tools, especially big data and machine learning.
- 6-8+ years' experience designing and delivering large-scale distributed systems, with at least 3-4 years on cloud services.
- Deep knowledge of big data technologies, e.g., Hadoop, Spark, Kafka, Flink.
- Deep knowledge of public cloud technologies (GCP, AWS, Azure).
- Ability to work with all levels within the organization.
- Excellent written and verbal communication skills.
- Familiarity with UI security.
- Familiarity with major ERP systems like SAP, Oracle, Netsuite, Microsoft Dynamics, and Workday.