Job Description
Face-to-face interview
Mandatory experience with PENTAHO
Responsibilities
- Data Pipeline Development and Management: Design, construct, install, test, and maintain highly scalable data management systems. Develop and optimize ETL/ELT pipelines using PySpark and Databricks to process large volumes of structured and unstructured data.
- Cloud Infrastructure: Utilize AWS services for data storage, computation, and orchestration, ensuring a reliable and efficient data infrastructure.
- Data Analysis and Insights: Collaborate with business stakeholders to understand customer experience challenges and opportunities. Analyze complex datasets to identify trends, patterns, and insights related to customer behavior, network performance, product usage, and churn.
- Business Use Case Analysis: Apply your analytical skills to various customer experience use cases, including:
  - Churn Prediction: Develop models to identify customers at risk of leaving and understand the underlying drivers.
  - Network Experience: Analyze network performance data to identify and address areas of poor customer experience.
  - Personalization: Enable data-driven personalization of marketing communications, offers, and customer support interactions.
  - Billing and Service Inquiries: Analyze inquiry data to identify root causes of customer confusion and drive improvements in billing and service clarity.
- Reporting and Visualization: Create compelling and insightful reports and dashboards using Tableau or Power BI to communicate findings to both technical and non-technical audiences.
- Data Governance and Quality: Ensure data accuracy, completeness, and consistency across all data platforms. Implement data quality checks and best practices.
- Collaboration and Mentorship: Work closely with cross-functional teams, including product, marketing, and engineering, to deliver data-driven solutions. Mentor junior team members and promote a culture of data-driven decision-making.
Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Engineering, Statistics, or a related quantitative field.
- Experience: 5+ years of experience in a data engineering or data analyst role, with a proven track record of working with large-scale data ecosystems.
- Technical Skills:
  - Expert-level proficiency in Python and SQL.
  - Hands-on experience with PySpark for big data processing.
  - In-depth knowledge of the Databricks platform.
  - Strong experience with AWS cloud services (e.g., S3, EC2, Redshift, EMR).
  - Demonstrated expertise in data visualization and reporting with Tableau or Power BI.
- Analytical Skills:
  - Strong analytical and problem-solving skills with the ability to translate business requirements into technical solutions.
  - Experience in the telecommunications industry with a focus on customer experience is highly desirable.
  - Familiarity with statistical analysis and machine learning concepts is a plus.
- Soft Skills:
  - Excellent communication and presentation skills with the ability to articulate complex technical concepts to a non-technical audience.
  - Proven ability to work independently and as part of a collaborative team in a fast-paced environment.
  - A strong sense of curiosity and a passion for using data to drive business impact.