Sr. Data Engineer (Cloud) - Waltham, MA

An immediate opportunity exists for a Principal/Senior-level Engineer with a passion for creating outstanding products to help take Endurance and our best-in-class brands (Constant Contact, BlueHost, SiteBuilder.com, Domain.com, etc.) to the next level.

Today’s enterprises are leveraging data lake architectures to power analytics, business intelligence, and new product features. At Endurance, we’re using this data lake design to build a Data Platform in the cloud, leveraging the latest AWS services to deliver a cutting-edge, highly scalable, and cost-effective platform.

 

Responsibilities:

  • Be a key leader and contributor to the design and development of a scalable and cost-effective cloud-based data platform based on a data lake design
  • Collaborate with team members, Product Management, Architects, and data producers and consumers throughout the company
  • Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties
  • Build automated pipelines and data services to validate, catalog, aggregate and transform ingested data
  • Build automated data delivery pipelines and services to integrate data from the data lake to internal and external consuming applications and services
  • Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models
  • Keep knowledge and skills current with the latest cloud services, features and best practices

 

Technical Skills:

  • Extensive experience developing big data, business intelligence, marketing automation, and/or other analytics infrastructure or pipelines; data lake experience preferred
  • 10 years’ experience developing and architecting solutions using big data and data warehousing technologies
  • 3+ years’ hands-on experience developing data lake solutions on AWS (certification preferred)
  • Experience with data streaming technologies (Kinesis, Storm, Kafka, Spark Streaming) and real time analytics
  • Working experience and detailed knowledge of Java, JavaScript, or Python
  • Knowledge of ETL, MapReduce, and pipeline tools (Glue, EMR, Spark)
  • Experience with large or partitioned relational databases (Aurora, MySQL, DB2)
  • Experience with NoSQL databases (DynamoDB, Cassandra)
  • Agile development (Scrum) experience
  • Other preferred experience includes working with DevOps practices, SaaS, IaaS, code management (CodeCommit, git), deployment tools (CodeBuild, CodeDeploy, Jenkins, Shell scripting), and Continuous Delivery
  • Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation

Apply Below!