Leverage data and business principles to create and drive large-scale Facebook Data Center programs.
Define and develop the program for metrics creation, data collection, modeling, and reporting on the operational performance of Facebook’s data centers.
Work cross-functionally to define problem statements, collect data, build analytical models, and make recommendations.
Be a self-starter, motivated by a passion for developing the best possible solutions to problems.
Identify and implement streamlined processes for data reporting and communication.
Use analytical models to surface insights that drive key decisions across the organization.
Routinely communicate metrics, trends, and other key indicators to leadership.
Provide leadership and mentorship to other members of the team.
Lead and support various ad hoc projects, as needed, in support of Facebook’s Data Center strategy.
Build and maintain data-driven optimization models, experiments, forecasting algorithms, and capacity-constraint models.
Leverage tools such as R, PHP, Python, Tableau, Hadoop, and SQL to drive efficient analytics.
Degree in an analytical field (e.g. Computer Science, Engineering, Mathematics, Statistics, Operations Research, Management Science).
8+ years of experience in a role with emphasis on data analysis and metrics development.
5+ years of hands-on experience analyzing and interpreting data, drawing conclusions, defining recommended actions, and reporting results across stakeholders.
5+ years of experience developing SQL and drafting queries.
5+ years of hands-on project management experience.
5+ years of experience with data visualization tools such as Tableau.
5+ years of experience with statistical packages such as R, SPSS, SAS, or Stata.
3+ years of experience in scripting with languages such as Python or PHP.
Proven track record of leveraging data-driven models to drive business decisions.
Experience using data access tools and building visualizations using large datasets and multiple data sources.
Experience thinking analytically.
Experience distilling and communicating data to all organizational levels.
Experience with packages such as NumPy, SciPy, pandas, scikit-learn, dplyr, and ggplot2.
Knowledge of statistics and optimization techniques.
Hands-on experience with medium-to-large datasets (i.e., data extraction, cleaning, analysis, and presentation).