Principal Data Scientist I

Broomfield, Colorado
Nov 19, 2020
Nov 25, 2020
The Principal Data Scientist I will serve as a technical lead for the experimentation data science team. This role will support experiment planning and set-up across Charter product teams, including video, buyflow, and support portals, and will own experimentation strategy, including the validation and development of experimentation plans, frameworks, and metrics. The individual will also be responsible for executing advanced analytic projects with the objectives of creating causal inferences, correlations, predictions, and recommendations for improvements to business processes, along with the metrics to monitor those improvements. The role requires a strong command of statistical and data science techniques and algorithms, as well as a demonstrated practical ability to determine where to invest time, synthesize actionable findings across diverse assignments, and present those findings to audiences with varying levels of background in analytics.

  • Actively and consistently support all efforts to simplify and enhance the customer experience
  • Collaborate with stakeholders to understand product / feature plans for development and translate those plans into experimentation objectives and testable hypotheses
  • Assist product teams with experimental design and analyses, including development of analysis plans and ad hoc reports
  • Develop experimentation strategy and project plans for stakeholders and lead execution of plans involving members across multiple teams
  • Synthesize data from experimentation and other data sources to provide stakeholders with answers to research questions and help determine next steps for a product / feature using a data driven approach
  • Create and present materials communicating quantitative findings and relevant business impacts to internal and external teams as well as an executive audience
  • Supervise and integrate the work of junior data scientists on the experimentation data science team to produce high-quality work products
  • Refine experimentation processes, including leveraging existing data science tools and developing new tools where applicable
  • Identify new methodologies for advancing experimentation program, develop plans to evaluate these methodologies through simulation studies, and work with software and data engineering teams on implementation strategies
  • Leverage multiple data sources to produce intelligent data products that solve business and engineering needs for business intelligence, operational analytics, and descriptive, predictive, diagnostic, and prescriptive models
  • Make technical decisions for the organization
  • Coordinate technical activities across projects and the organization
  • Interact with various stakeholders to understand their business needs, communicate project status and develop relationships to ensure satisfaction
  • Develop and deliver materials on experimentation techniques and tools to a broad set of business-intelligence, data, and analytics professionals with varied backgrounds
  • Exercise thought leadership and discretion in tailoring the tools, approaches, and data used to meet the needs of the particular problem
  • Deliver results using metrics-driven analysis, and communicate the costs and tradeoffs of ideas to stakeholders and top management

Skills/Abilities and Knowledge
  • Ability to read, write, speak and understand English
  • Able to communicate complex technical, statistical and quantitative concepts to individuals with a variety of backgrounds, explain the importance and relevance of data, and suggest solutions
  • Works with minimal guidance to set project plans and timelines and prioritize among multiple competing interests
  • Working knowledge of current industry best practices for online experimentation
  • Command of advanced statistical concepts, including experimental design, hypothesis testing, and both common and more advanced analytic methods
  • Fluency with regression modeling, including the ability to develop and evaluate simple and complex regression models; working knowledge of correlated data analysis and the ability to identify when more sophisticated ML modeling may be warranted over common approaches
  • Knowledge of tools to obtain, transform, and store data on Big Data and Streaming systems
  • Ability to design and implement ETL projects from scratch
  • Knowledge of tools to develop, test, and deploy production code, and to modify CI/CD pipelines and deployment scripts
  • Background with Cable systems and operations
  • Experience with Hadoop, Hive, Spark, and/or Snowflake
  • Knowledge of other relevant tools such as SAS, SPSS, Alteryx, Linux
  • Knowledge of other relevant techniques such as text analysis and text mining
  • Familiarity with the open-source ecosystems surrounding R (CRAN), Python (PyPI), and/or Hadoop
  • Broad experience and a solid theoretical foundation in the modeling process using a variety of algorithms, including:
      • Data profiling, distributions, confidence intervals, hypothesis testing
      • Data pre-processing and exploratory data analysis using a variety of techniques
      • Classification using techniques such as linear models, GLMs, and tree-based methods
      • Interpretation of model results, consideration of causality, treatment of multicollinearity
  • Strong synthesis and presentation skills
  • Ability to communicate results and recommendations to a wide variety of audiences including executive leadership
  • Understanding of data architecture, data warehouse and data marts
  • Demonstrated ability and desire to continually expand skill set, and learn from and teach others
  • Experience with other database and data store technologies, such as NoSQL, key-value, columnar, graph, and document stores
  • Experience in receiving, converting, and cleansing big data
  • Program, product, or project management experience delivering analytics results
  • Strong background in Linux/Unix/CentOS and Windows installation and administration
  • Ability to identify and resolve end-to-end performance, network, server, cloud, and platform issues
  • Pattern recognition and predictive modeling skills
  • Keen attention to detail with the ability to effectively prioritize and execute multiple tasks

Bachelor's degree in statistics, data science, applied math, or a related field.

Related Work Experience
3+ years User and system administration of Linux/Unix/CentOS and Windows
10+ years Statistical analysis
8+ years Business analysis
8+ years SQL/R/SAS programming
2+ years Database design or database modeling

Skills/Abilities and Knowledge

Operations research background, particularly focused on large labor operations such as field ops, technical support, and sales
Background with cable and/or telecommunications

Master's degree in statistics, data science, applied math, or a related field.

Office environment
Charter Technical Engineering Center
Highly collaborative and innovative workspace
Occasional Travel
