Job scope:
The Data Engineer will lead enterprise-wide data transformation projects. The role involves developing and automating data processing pipelines for financial and investment data modelling, analysis, and reporting from various source data systems. The primary responsibility of this position is to establish the enterprise data lake architecture on Microsoft Azure Data Factory, Databricks, and Synapse, and to lead a team delivering data-driven solutions.
Primary responsibilities:
- Lead the architecture, design, development, documentation, and implementation of end-to-end data pipelines and data-driven solutions.
- Define a roadmap to transform the data architecture, focusing on scalability, performance, and stability across the entire data lifecycle.
- Build data flows for data acquisition, aggregation, and modelling using both batch and streaming paradigms.
- Perform data analysis, data profiling, data cleansing, data lineage, data mapping, and data transformation.
- Develop high-quality code for the core data stack, including the data integration hub, data warehouse, and data pipelines, on Azure services.
- Recommend, execute, and deliver best practices in data management and data lifecycle processes, including modular development of data processes, coding and configuration standards, error handling and notification standards, auditing standards, and data archival standards.
- Implement security measures and standards, and document technical specifications and operating procedures.
- Collaborate with developers as part of a Scrum team, ensuring collective team productivity.
- Provide technical support for any data issues, with recommendations and resolutions.
TECHNICAL SKILLS AND EXPERIENCE REQUIREMENTS
- 2-3 years of professional experience as a data engineer, software engineer, data analyst, data scientist, or in a related role.
- Experience with the Microsoft Azure data integration stack (Azure Data Lake Storage Gen2, Azure Data Factory, Delta Lake, SSIS, SQL Server, Azure SQL Data Warehouse), Databricks, and Spark.
- Working experience in the investment or real estate industry, preferably with business and functional knowledge.
- Expertise in building ETL and data pipelines on Databricks using Python and SQL on Azure.
- Knowledge of and hands-on experience with Python and SQL.
- Proven experience with all aspects of the data pipeline (data sourcing, transformations, data quality, etc.).
- Experience with visual modelling tools, including UML.
- Proficient in data visualization tools such as Power BI and Workiva, and in standard office tools such as Excel.
- Familiar with DevOps and Agile methodologies.