DWH Data Engineer
Date: Nov 21, 2024
Location: Kingston, JM, WI
Company: Digicel
About Digicel
Enabling customers to live, work, play and flourish in a connected world, Digicel’s world-class LTE and fibre networks deliver state-of-the-art mobile, home and business solutions.
Serving 10 million consumer and business customers in 25 markets in the Caribbean and Central America, its investments of over US$5 billion and a commitment to its communities through its Digicel Foundations in Haiti, Jamaica and Trinidad & Tobago have contributed to positive outcomes for over 2 million people to date.
With the Better Connected ethos at the heart of everything, its 5,000 employees worldwide work together to make that a powerful reality for customers, communities and countries day in, day out.
Digicel also delivers news, sports broadcasting, digital media and financial services in several of its markets.
Visit www.digicelgroup.com for more.
Primary objective of the job:
The Data Warehouse (DWH) Data Engineer role demands a robust technical foundation, strong analytical skills, and outstanding collaborative abilities. The position involves designing and maintaining efficient, scalable, and secure data warehouses. Working closely with cross-functional teams, the DWH Data Engineer integrates data from diverse sources, ensuring it meets data quality standards as well as organizational requirements and objectives. The role calls for broad software development and programming expertise, including a deep understanding of physical database design principles and the system development life cycle. Drawing on proficiency in data analysis and a comprehensive grasp of end-user and business requirements, DWH Engineers translate business needs into technical solutions.
Data Engineers collaborate closely with customers and Business Intelligence (BI) Analysts to transform data into actionable insights that drive informed business decisions. Employing both business acumen and analytical skills, they delve into data, ensuring its accuracy, coherence, reliability, and accessibility.
Main Duties and Responsibilities:
- Business Requirements & Analysis:
- Collaborate closely with BI & Business Analysts to deeply comprehend business requirements, providing alternative data insights and modeling relevant business scenarios
- Conduct thorough analysis of business and functional requirements, translating them into robust, scalable, and operable solutions
- Offer strategic recommendations on data collection, integration, and retention requirements, incorporating business needs and industry best practices
- Development / Implementation:
- Implement data structures using best practices in data modeling, processes, and technologies to ensure efficiency and scalability
- Deploy tools and frameworks for automating report generation, identifying data-quality issues, and enforcing data governance
- Lead integration efforts for merging BI platforms with enterprise systems and applications
- Explore data to uncover patterns, meaningful relationships, anomalies, and trends, performing programming analyses across various data formats and platforms
- Develop innovative and effective approaches to solve analytics problems and communicate results and methodologies
- Leverage domain expertise in advanced statistical and predictive modeling to design and implement complex cross-functional data analysis
- Business Partnerships:
- Cultivate deep relationships with business partners and the Data Analytics Center of Excellence (DA CoE) to ensure the delivery of effective technical solutions
- Identify opportunities for closer integration between DA CoE and business units to drive alignment and collaboration
- Project Planning:
- Lead workstream planning, from inception to delivery of data solutions, ensuring timelines align with business objectives
- Anticipate service demand and efficiently manage resources to optimize project outcomes
- Data Quality Assurance:
- Advocate for data quality, maintaining standards and ensuring the consistency of the data warehouse
- Recommend quality metrics, document and track them effectively, and troubleshoot ETL performance issues, assisting in performance tuning (see the sketch after this list)
- Address reported data reconciliation inconsistencies in data models and/or reports, and implement improvements to enhance data warehouse performance
- Research & Evaluation:
- Stay updated on emerging data warehouse products/tools and industry best practices
- Research and recommend optimal BI products and services, aligning BI technologies with strategic initiatives
- Policies, Standards & Procedures:
- Recommend and implement data standards, policies, and procedures in collaboration with Data Stewards, ensuring compliance, access control, and continuous improvement initiatives are upheld
- Training and Documentation:
- Design and deliver end-user training and training materials, ensuring users, including executives, can effectively transform data into actionable information
- Mentor teams to foster a culture of continuous learning and innovation, educating the organization on new approaches and principles from both IT and business perspectives, driving organizational buy-in
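For illustration only, the following is a minimal sketch of the kind of automated data-quality check the Data Quality Assurance duties describe, assuming a PySpark environment such as Azure Databricks. The table name, column names, and thresholds are hypothetical and would be replaced by the warehouse's own quality metrics.

```python
# Hypothetical example: a small PySpark data-quality check of the kind the
# Data Quality Assurance duties describe. Table, columns, and thresholds are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dwh-data-quality-check").getOrCreate()

# Assumed staging table produced by an upstream ETL load.
df = spark.table("staging.customer_transactions")

total_rows = df.count()

# Metric 1: share of rows missing a customer identifier.
null_customer_ids = df.filter(F.col("customer_id").isNull()).count()

# Metric 2: duplicate transaction identifiers, which would break reconciliation.
duplicate_txn_ids = (
    df.groupBy("transaction_id").count().filter(F.col("count") > 1).count()
)

metrics = {
    "total_rows": total_rows,
    "null_customer_id_pct": (null_customer_ids / total_rows * 100) if total_rows else 0.0,
    "duplicate_transaction_ids": duplicate_txn_ids,
}
print(metrics)

# Fail the run if the (arbitrarily chosen) quality thresholds are breached.
if metrics["null_customer_id_pct"] > 1.0 or metrics["duplicate_transaction_ids"] > 0:
    raise ValueError(f"Data-quality thresholds breached: {metrics}")
```

A check of this kind would typically run as a step in the load pipeline so that threshold breaches surface before curated tables are refreshed.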
Academic Qualifications and Experience Required:
- Bachelor's degree in Computer Science, Information Systems, Business Management, Mathematics, or a related field; a Master's degree is preferred
- Minimum of 7 years of technical experience, with at least 5 years working on a major system
Functional Skills:
- Proficient in deriving solutions from loosely defined business challenges, demonstrating adaptability and problem-solving acumen
- Exceptional team leadership and mentoring abilities, fostering collaboration and professional growth within teams
- Skilled in architecting and implementing medium- to large-scale BI solutions on Azure, utilizing a comprehensive suite of Azure Data Platform services
- Capable of assessing the current production state of applications and evaluating the impact of new implementations on existing business processes
- Proficient in developing cost-effective architectures in Azure, providing recommendations to optimize data infrastructure
- Experienced in designing, setting up, and maintaining Azure services, ensuring smooth operations of critical components
- Expertise in executing Extract, Transform, and Load (ETL) processes from diverse source systems to Azure Data Storage services, employing a variety of tools and languages
- Competent in developing and managing pipelines in Azure Data Factory, facilitating seamless data extraction, transformation, and loading from multiple sources
- Possesses a strong understanding of Spark architecture and effectively develops Spark applications for data extraction, transformation, and aggregation (see the sketch after this list)
- Takes responsibility for estimating cluster sizes and for monitoring and troubleshooting Databricks Spark clusters, optimizing performance parameters for efficiency
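As a purely illustrative sketch of the kind of Spark application referenced above, the example below reads raw usage records, standardises them, aggregates daily usage, and writes a curated table. The storage path, account name, column names, and output table are hypothetical, and the example assumes a Databricks workspace with access to Azure Data Lake Storage already configured; it is not a prescribed implementation.

```python
# Hypothetical sketch of a Spark extract-transform-aggregate job of the kind
# described above. Paths, storage account, and table names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dwh-daily-usage-aggregation").getOrCreate()

# Extract: read raw usage records landed in Azure Data Lake Storage (path assumed).
raw = spark.read.parquet(
    "abfss://raw@examplestorageaccount.dfs.core.windows.net/usage/2024/11/"
)

# Transform: standardise types and drop records that fail basic validity rules.
clean = (
    raw.withColumn("event_date", F.to_date("event_timestamp"))
       .withColumn("data_mb", F.col("data_bytes") / (1024 * 1024))
       .filter(F.col("subscriber_id").isNotNull() & (F.col("data_bytes") >= 0))
)

# Aggregate: daily usage per subscriber and market, ready for BI reporting.
daily_usage = (
    clean.groupBy("event_date", "market", "subscriber_id")
         .agg(
             F.sum("data_mb").alias("total_data_mb"),
             F.countDistinct("session_id").alias("session_count"),
         )
)

# Load: write the curated result to a Delta table consumed by the warehouse layer.
(
    daily_usage.write.format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("curated.daily_subscriber_usage")
)
```

In practice a job like this would be parameterised by load date and orchestrated from an Azure Data Factory pipeline or Databricks workflow rather than hard-coding the input path.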
DISCLAIMER:
This job description indicates the general nature and level of work expected of the incumbent. It is not designed to cover or contain a comprehensive listing of the activities, duties or responsibilities required of the incumbent. The incumbent may, and probably will, be asked to perform other duties as required. Each employee, regardless of classification, is required to maintain a safe, orderly and clean workplace, using safety precautions and observing safety rules at all times.