As a Data Engineer on Amazon's Data Center Global Services team, you will work with data at a scale unique to the AWS data center platform. Collaborating with our research scientists, you will turn this data into information that drives solutions to problems in data center cooling and power distribution optimization.
The right candidate knows how to visualize information and can quickly create effective visualizations that tell a story. Working with ease in a Linux or UNIX environment is a must. The candidate can construct efficient SQL queries, create ETL jobs, and write Python scripts to move and prepare data. They should understand how to pick the right tool for the job, whether that's exploring small data sets in a Jupyter Notebook or tackling big data problems in a MapReduce job. They should have a portfolio of code or visualizations that demonstrates their work. Most importantly, the candidate must possess excellent written and verbal communication skills.
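To give a flavor of the day-to-day work described above, here is a minimal sketch of an ETL-style data-preparation step in Python: raw readings are loaded, aggregated with a SQL query, and returned in a shape ready for visualization. The table, column names, and values are purely hypothetical, not an actual AWS schema.

```python
import sqlite3

# Hypothetical example: tiny ETL step over per-rack temperature
# readings. In practice the source would be a data warehouse or
# log stream rather than an in-memory SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (rack TEXT, temp_c REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("r1", 21.5), ("r1", 22.1), ("r2", 30.4), ("r2", 29.8)],
)

# Aggregate per rack -- a typical transform before charting.
rows = conn.execute(
    "SELECT rack, ROUND(AVG(temp_c), 2) FROM readings "
    "GROUP BY rack ORDER BY rack"
).fetchall()
print(rows)  # [('r1', 21.8), ('r2', 30.1)]
```

The same pattern, pushing aggregation into SQL and keeping Python for orchestration, scales from a laptop notebook up to distributed engines.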