Senior Data Architect, Professional Services – Telco Segment
Job Description
Job summary
Possible locations: Dubai, United Arab Emirates
DESCRIPTION
Are you a Data Analytics specialist? Do you have Data Warehousing and Hadoop/Data Lake experience? Do you like to solve the most complex and high-scale (billions+ records) data challenges in the world today? Do you like to work on-site in a variety of business environments, leading teams through high-impact projects that use the newest data analytics technologies? Would you like a career path that enables you to progress with the rapid adoption of cloud computing?
At Amazon Web Services (AWS), we're hiring highly technical cloud computing architects to collaborate with our customers and partners on key engagements. Our consultants will develop, deliver and implement AI, IoT, and data analytics projects that help our customers leverage their data to develop business insights. These professional services engagements will focus on customer solutions such as machine learning, IoT, batch/real-time data processing, data and business intelligence.
AWS Professional Services is a unique organization. Our customers are large organizations. We build world-class, cloud-native IT solutions for them to solve real business problems, and we help them achieve business outcomes with AWS. Our projects are often unique, one-of-a-kind endeavors that no one has ever done before.
Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status. For individuals with disabilities who would like to request an accommodation, please visit
https://www.amazon.jobs/en/disability/us.
Key job responsibilities
. Expertise – Collaborate with AWS field sales, pre-sales, training and support teams to help partners and customers learn and use AWS services such as Athena, Glue, Lambda, S3, DynamoDB (NoSQL), Relational Database Service (RDS), Amazon EMR and Amazon Redshift.
. Solutions – Deliver on-site technical engagements with partners and customers. This includes participating in pre-sales on-site visits, understanding customer requirements and creating packaged Data & Analytics service offerings.
. Delivery – Engagements include short on-site projects proving the use of AWS services to support new distributed computing solutions that often span private cloud and public cloud services. Engagements will include migration of existing applications and development of new applications using AWS cloud services.
. Insights – Work with AWS engineering and support teams to convey partner and customer needs and feedback as input to technology roadmaps. Share real world implementations and recommend new capabilities that would simplify adoption and drive greater value from use of AWS cloud services.
. Innovate – Engage with the customer's business and technology stakeholders to create a compelling vision of a data-driven enterprise in their environment.
This is a customer-facing role. You will be required to travel to client locations and deliver professional services when needed.
A day in the life
As a Big Data & Analytics Consultant, you will architect, (re)design and build cloud-native, business-critical Big Data solutions with our customers. You will take advantage of the global scale, elasticity, automation and high-availability features of the AWS platform. You will build customer solutions with Amazon Elastic Compute Cloud (EC2), AWS Data Pipeline, AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon EMR, Amazon Kinesis, Amazon Redshift and other AWS services. You will work across a range of areas including web and mobile applications, enterprise applications, HPC, IoT, Big Data and Artificial Intelligence / Machine Learning, and engage with the technical, IT and leadership teams of our customers.
As a member of the AWS Professional Services team, you are joining a team that invests in your success by providing comprehensive learning and mentorship programs. Our team also puts a high value on work-life balance. We offer a flexible work environment to help you balance your work and personal life while still remaining customer obsessed.
You will collaborate across the whole AWS organization, with other consultants, customer teams and partners on proofs of concept, workshops and complex implementation projects. You will innovate and experiment to help customers achieve their business outcomes and deliver production-ready solutions at global scale. You will lead projects independently but also work as a member of a larger team. Your role will be key to earning customer trust.
As an Amazonian you will demonstrate the Amazon Leadership Principles, coaching and mentoring others on best practices, performance and career development.
About the team
Inclusive Team Culture
Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness.
Work/Life Balance
We value work-life harmony. Achieving success at work should never come at the expense of life at home, which is why flexible work hours and arrangements are part of our culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud.
Mentorship & Career Growth
We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.
BASIC QUALIFICATIONS
. Bachelor's degree, or equivalent experience, in Computer Science, Engineering, Mathematics or a related field.
. 6+ years of experience in Data Lake/Hadoop platform implementation, including 3+ years of hands-on experience implementing and performance-tuning Hadoop/Spark deployments.
. 3+ years of data management expertise spanning ETL processes, master data management or data management platforms, and integration in complex environments.
. Understanding of Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro).
. Experience with massively-parallel-processing (MPP) models, real-time processing and analytics, data ingestion (batched and streamed) and data storage solutions.
. Understanding of database and analytical technologies in the industry including MPP and NoSQL databases, Data Warehouse design, ETL, BI reporting and Dashboard development.
. Familiarity with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto).
. Experience developing software in one or more programming languages (e.g. Java, JavaScript, Python, R).
. Hands-on experience delivering large-scale data warehousing and analytics projects.
. Implementation experience using OLAP databases such as Redshift, Teradata, Netezza, Vertica.
. Knowledge of BI / data visualization tools (e.g. Tableau, Spotfire, MicroStrategy, Cognos).
. Experience working directly with customers, partners or third-party vendors.
. Ability to think strategically about business, product, and technical challenges in an enterprise environment.
. Experience identifying the relationships between business services, information, applications and global infrastructure assets.
. Exposure to Agile development methodologies.
. Excellent communication and presentation skills.
. Strong sense of customer focus, ownership, urgency, and drive.
. Strong communication and data presentation skills, including familiarity with data visualization tools.
PREFERRED QUALIFICATIONS
. Master's or PhD in Computer Science, Physics, Engineering or Mathematics.
. Implementation and tuning experience specifically using Amazon Elastic MapReduce (EMR).
. Knowledge of basic AWS services (EC2, ELB, RDS, Route 53, S3, Redshift, Kinesis, Glue).
. Infrastructure automation through DevOps scripting (e.g. shell, Python, Ruby, PowerShell).
. Track record of implementing AWS services in a variety of distributed computing, enterprise environments.
. Experience using machine learning libraries such as scikit-learn, caret, mlr and MLlib.
. Use of AWS services in distributed environments with Microsoft, IBM, Oracle, HP, etc.
. Desire and ability to interact with different levels of the organization, from development teams to C-level executives.
We invest heavily in our team by continuously offering learning opportunities, sharing knowledge internally across all technical teams in AWS, and working on (customer & internal) projects that will broaden and deepen your technical expertise and business acumen.
If you have an entrepreneurial spirit, are eager to deliver results, are deeply technical, highly innovative and a voracious learner, you are who we are looking for.
Job Details
Employment Types:
Full time
Industry:
Internet / E-commerce
Function:
IT
Roles:
Software Engineer / Programmer