Rogers Communications is a leading technology company in Canada that connects millions of Canadians through innovative solutions. The company is seeking a Data Analyst who will transform large volumes of network telemetry into impactful data products, collaborating with teams across the organization to solve complex operational challenges.
Design, model, and build analytical datasets using Azure Databricks and Spark‑based distributed processing
Develop basic ingestion and transformation pipelines (batch and near‑real‑time) using modern data‑stack practices
Create data‑quality, lineage, and observability frameworks that promote trust, transparency, and reliability across all data layers
Build analytical views, dimensional models, and feature sets that support ML workflows, predictive analytics, and anomaly detection
Design KPI and metric layers that translate complex network telemetry into clear, consumable data products
Optimize large‑scale datasets for performance, incremental refresh, and multi‑audience consumption (curated, semantic, ML‑ready)
Analyze high‑volume telemetry across Wi‑Fi, DOCSIS, PON, LTE, TR‑069/TR‑181, and IP service platforms to surface degradation patterns and service‑impacting anomalies
Apply statistical and correlation techniques to identify root causes, trends, and cross‑domain performance drivers
Use advanced methods such as clustering, time‑series decomposition, and seasonality analysis to detect irregular or emerging behavior
Develop and maintain cloud‑native analytics solutions using Azure Databricks, Power BI, Splunk, Snowflake, and Spark clusters
Implement CI/CD practices for analytics and ML pipelines, including version control, automated testing, orchestration, and deployment controls
Build reusable data components and frameworks that promote consistency, accessibility, and shared ownership
Collaborate with data engineers on schema design, performance tuning, and scalable architecture
Partner with product owners and network teams to translate business and operational needs into clear, inclusive technical requirements
Contribute to Agile ceremonies and planning discussions, helping shape solutions from idea to production
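To illustrate the kind of statistical anomaly detection named in the responsibilities above (surfacing irregular behavior in telemetry), here is a minimal sketch using a rolling z‑score over synthetic latency samples. The function name, window size, threshold, and data are all illustrative assumptions, not part of the role description:

```python
from statistics import mean, stdev

def rolling_zscore_anomalies(values, window=20, threshold=3.0):
    """Flag indices whose value deviates from the trailing window
    mean by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(values)):
        ref = values[i - window:i]          # trailing reference window
        mu, sigma = mean(ref), stdev(ref)
        if sigma > 0 and abs(values[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Synthetic latency telemetry: a steady periodic baseline with one injected spike.
telemetry = [10.0 + 0.1 * (i % 5) for i in range(100)]
telemetry[60] = 25.0  # simulated degradation event
print(rolling_zscore_anomalies(telemetry))  # → [60]
```

In practice the role would apply richer methods (clustering, seasonal decomposition) over Spark‑scale data, but the core idea of comparing each observation against a local statistical baseline is the same.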
Qualifications
Required
A Bachelor's degree in Computer Science, Data Science, Engineering, Mathematics, or a related field—or equivalent practical experience
2+ years of experience in data analytics, data engineering, or building cloud‑native data products
Proficiency in SQL (CTEs, window functions, performance tuning, distributed queries)
Proficiency in Python (pandas, PySpark, automation, ML libraries)
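As a small, self‑contained illustration of the SQL skills listed above (CTEs and window functions), here is a sketch combining both via Python's bundled sqlite3 module. The table, column names, and values are hypothetical, and window functions require SQLite 3.25 or newer (standard in recent Python builds):

```python
import sqlite3

# Hypothetical per-device latency readings; schema and values are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (device TEXT, ts INTEGER, latency_ms REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [("modem-a", 1, 12.0), ("modem-a", 2, 14.0), ("modem-a", 3, 40.0),
     ("modem-b", 1, 9.0), ("modem-b", 2, 11.0)],
)

# A CTE plus a window function: pick the latest reading per device.
query = """
WITH ranked AS (
    SELECT device, ts, latency_ms,
           ROW_NUMBER() OVER (PARTITION BY device ORDER BY ts DESC) AS rn
    FROM readings
)
SELECT device, latency_ms FROM ranked WHERE rn = 1 ORDER BY device
"""
rows = conn.execute(query).fetchall()
print(rows)  # → [('modem-a', 40.0), ('modem-b', 11.0)]
```

The same latest‑row‑per‑group pattern carries over directly to Spark SQL and pandas, which the role uses at much larger scale.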