Alterra Mountain Company


Data Engineer

at Alterra Mountain Company

Posted: 1/20/2019
Job Status: Full Time
Job Reference #: 86308168-66ce-4192-a663-e86fd44c7c45

Job Description

Alterra Mountain Company is a community of 12 iconic year-round destinations, including the world’s largest heli-ski operation. The company owns and operates a range of recreation, hospitality, real-estate development, food and beverage, and retail businesses. Headquartered in Denver, Colorado, with destinations across the continent, we are rooted in the spirit of the mountains and united by a passion for outdoor adventure. Alterra Mountain Company’s family of diverse playgrounds spans five U.S. states and three Canadian provinces: Steamboat and Winter Park Resort in Colorado; Squaw Valley Alpine Meadows, Mammoth Mountain, June Mountain and Big Bear Mountain Resort in California; Stratton in Vermont; Snowshoe in West Virginia; Tremblant in Quebec; Blue Mountain in Ontario; Deer Valley in Utah; and CMH Heli-Skiing & Summer Adventures in British Columbia. We honor each destination’s unique character and authenticity and celebrate the legendary adventures and enduring memories they bring to everyone.


Enterprise Information Management (EIM) team members require strong business acumen, high energy, and strong interpersonal skills. They must combine technical depth with a mind for business process and business strategy, and should bring an entrepreneurial spirit and a drive for innovation. Each must be self-directed, but not rogue, keeping the strategic business goals and EIM program vision in mind.

Employees enjoy many different perks related to the portfolio of companies, including free ski passes, discounted access to CMH and other ski resorts, ECO pass, and a great office located in the heart of Lower Downtown.


Alterra is a distributed organization, and enterprise data capabilities will be a core technology discipline to enable growth and success.  This position will help build the data platform of the future for Alterra, utilizing progressive ideas and strategies for managing the flow and storage of information across the enterprise.

The new company is working to aggregate and unify data to support tactical and strategic business decision making; power innovative guest and employee experience applications; and drive guest & business analytics. You will be responsible for executing the technical architecture and development of data processing, storage, and access capabilities. You will work closely with the Business Intelligence and Enterprise/Cloud Infrastructure teams, as well as various business partners and executive stakeholders across sales, marketing, finance, operations, security and compliance.



  • Provide input to and execute the design, implementation, maintenance, enhancement, monitoring and governance of enterprise data repositories including the Alterra data lake, data warehouse, LOB specific data marts, enterprise master data (customer, product, location), enterprise reference data, and raw data archives.
  • Monitor and provide input into the enterprise IT architecture including cloud infrastructure and connectivity; database architecture and connectivity; external data connectivity (S/FTP, APIs); security, backup and DR as these subject areas relate to the enterprise information architecture.
  • Ensure data ingestion processes catalog and tag arriving data and provide data life-cycle and version management across landing, near-term archive, long-term cold storage, and data destruction events based on corporate security, compliance, and data retention policies.
  • Ensure data pipelines provide appropriate access security, encryption (at rest and in motion) and data masking/stripping based on content and corporate security and compliance guidelines.
  • Ensure proper governance of enterprise data assets including data access at the subject and row level, enforcement of data privacy (PII), protection of financial data (PCI), and country specific treatment and regional storage of data.
  • Provide input to and execute development of design patterns and best practices for data integration and data analysis across the enterprise.
  • Provide input to and execute development and enforcement of naming conventions for enterprise data assets including data models; database, schema, table, view, index, trigger, stored proc/function names; object level storage container names and paths; file names; and ETL scripts.
  • Collaborate with teams that manage operational data masters and lead the design and development of data mastering processes.
  • Provide input to and execute development of logical and physical enterprise data models; enterprise master data and reference data models; metadata models; and data catalogs. Support the governance and stewarding of master and reference data.
  • Lead the ongoing development and code reviews of data acquisition, data movement, data cleansing, data transformation, data mapping, data quality screens, ETL jobs and schedules, and other ETL and data integration activities.
  • Ensure that ETL jobs are scheduled, monitored and generate detailed logs sufficient to support ongoing diagnostics, exception processing, and audit trails for compliance.
  • Ensure data-related design and code assets are tested, documented, and managed in a version control repository. Assets should include models, code, run-books, and infrastructure-as-code scripts.
  • Manage the packaging of code assets, models, configurations, schemas and migration instructions to support updates to development, test and production environments.
  • Coordinate with the change control approval board and operations teams to schedule the migration of code assets, models, configurations and schemas across test and production environments.
  • Mentor and support data integration and business intelligence teams.


EDUCATION & EXPERIENCE REQUIREMENTS


  • Preferred undergraduate major: Business, IS/IT, Computer Science, a related field, and/or equivalent experience


The ideal candidate has experience with:

  • Organizing data at scale including data lakes, data marts, and data warehouses.
  • Data Architecture, Data Management Services, Data Governance, Data Quality processes and Data Lifecycle.
  • Design and construction of information architectures that enable well-integrated transactional, collaborative and analytical systems.
  • Enterprise-level data modeling at the logical and physical levels for 3NF, star schemas, and slowly changing dimensions.
  • Data cataloging and metadata management.
  • Cloud-based data platforms such as Azure and AWS; cloud security (accounts, users, groups, roles); logging and monitoring; lambdas and functions; data versioning and life cycle management; infrastructure-as-code.
  • Commercial ETL tools such as SSIS, Talend, Informatica and/or the use of 3GL languages such as C#/Python to implement ETL services.
  • ETL job scheduling tools/techniques, job control, exception handling, logging, and monitoring.
  • ETL workflow design including change data capture, transformations, mapping, and data quality screens.
  • Code and infrastructure-as-code testing techniques including unit, integration, system, performance/stress, and acceptance tests.
  • SQL databases: DDL, DML, DQL, DCL, and TCL; triggers, stored procedures/functions, indexes, sequences, table schema types, transactions, and replication.
  • Database schema migration techniques.
  • Schema-on-read query paradigm, columnar file formats such as Parquet/AVRO, compression and partitioning techniques.
  • SDLC concepts including application lifecycle management, release management, and optionally continuous delivery.
  • Version management via a version control system/repository.
  • Conceptual understanding of advanced analytic and machine learning data processing requirements.
  • Various data connectivity techniques including FTP and APIs.
  • Master data management using mapping tables and/or commercial MDM tools.
  • Serverless and microservice architectures and techniques.


  • 3 years building cloud-based data platforms (architecture, storage, management, monitoring)
  • 1 year with cloud-based development technologies such as Python, Lambda, and Azure Functions
  • 3 years of experience building information architectures
  • 3 years of experience in data profiling & data mining
  • 3 years of experience building data warehouses and/or data marts
  • Experience with Business Process Management processes and applications
  • Experience with Informatica & SSIS is a plus
  • Experience with metadata applications and solutions
  • Experience with master data management




  • Strong project management and organizational skills.
  • Collaborative and mentoring work style.
  • Strong analytic skills and hands-on attention to detail.
  • Mechanical tendencies and a curiosity to know how things work and how to make them better.
  • Passion for statistics & analytics, process engineering, and information management.
  • Loves being hands-on in a fast-paced, entrepreneurial environment.


  • Some limited travel may be required.


An Equal Opportunity Employer