
Intermediate Data Engineer

Posted 4 days ago
General Details
Location:
Advertised By: Agency
Company Name: Executive Placements
Job Type: Full-Time
Description

We are seeking an exceptionally talented, hands-on Intermediate Data Engineer with deep technical expertise in data pipelines, data modelling, and integration.

This role is accountable for designing and delivering scalable, reliable, and high-performance data solutions that support both analytics and application use cases. The successful candidate will play a key role in shaping the organization’s data architecture, ensuring that data is structured, governed, and accessible in a way that supports business decision-making and product capabilities.

The ideal candidate is a self-starter who values clean, maintainable data solutions, understands trade-offs between normalized and analytical models, and continuously improves the quality and performance of the data platform.

What you'll do:

  • Data Pipeline Development & Integration
    • Design, develop, and maintain scalable data pipelines using Azure Data Factory (ADF), including pipelines, data flows, triggers, and parameterization.
    • Integrate data from APIs, flat files, databases, and cloud/on-prem systems.
    • Implement robust ingestion patterns for structured and semi-structured data (JSON, XML, CSV).
    • Ensure reliable, efficient, and secure movement of data across systems.
  • Data Modelling & Transformation
    • Design and maintain both normalized (OLTP-aligned) and denormalized (analytical / reporting) data models.
    • Apply best practices in dimensional modelling (fact/dimension tables) as well as normalized relational design.
    • Implement transformations using SQL (T-SQL), stored procedures, and data flows to prepare analytics-ready datasets.
    • Ensure data models are scalable, reusable, and aligned with business requirements.
    • Manage historical data tracking, including slowly changing dimensions and auditability.
  • Performance, Reliability & Scalability
    • Optimize SQL queries, ETL pipelines, and data storage for large datasets (millions+ rows).
    • Implement indexing strategies, partitioning, and efficient data access patterns.
    • Ensure pipelines are resilient with proper error handling, retry logic, and monitoring.
    • Design solutions that minimize impact on transactional systems (clear separation of OLTP and reporting workloads).
    • Proactively identify and resolve performance bottlenecks.
  • Application & API Integration
    • Collaborate closely with backend (.NET) teams to support data access patterns and integration with application services.
    • Design and deliver aggregated datasets and data structures optimized for API consumption.
    • Support frontend (e.g., Vue.js) data requirements by enabling efficient querying, filtering, and pagination.
    • Contribute to embedded analytics and application-driven reporting use cases.
  • Collaboration & Continuous Improvement
    • Work closely with BI developers.
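The historical-tracking bullet above mentions slowly changing dimensions. As a minimal illustration (not part of the ad — the field names `valid_from`, `valid_to`, and `is_current` are assumptions), a Type 2 update over in-memory rows might look like:

```python
from datetime import date

def apply_scd2(dimension, incoming, key, today=None):
    """Apply Slowly Changing Dimension Type 2 logic.

    dimension: list of dicts carrying 'valid_from', 'valid_to', 'is_current'.
    incoming:  list of dicts with the latest attribute values.
    key:       natural-key field name shared by both.
    """
    today = today or date.today()
    current = {row[key]: row for row in dimension if row["is_current"]}
    for new_row in incoming:
        old = current.get(new_row[key])
        changed = old is None or any(
            old.get(k) != v for k, v in new_row.items() if k != key
        )
        if changed:
            if old is not None:
                # Expire the superseded version instead of overwriting it,
                # preserving history for auditability.
                old["is_current"] = False
                old["valid_to"] = today
            dimension.append({**new_row, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension
```

In production this same pattern is usually expressed as a T-SQL `MERGE` (or staged `UPDATE`/`INSERT`) against the dimension table rather than in application code.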
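The reliability bullets call for error handling and retry logic. A generic retry wrapper with exponential backoff, of the kind an ADF activity's retry policy also provides declaratively, could be sketched as (illustrative only; the helper name and defaults are assumptions):

```python
import time

def with_retries(operation, max_attempts=3, base_delay=1.0, sleep=time.sleep):
    """Run `operation`, retrying on failure with exponential backoff.

    operation: any zero-argument callable; its result is returned on success.
    sleep is injectable so tests can skip real waiting.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts:
                # Exhausted: surface the error so monitoring/alerting sees it.
                raise
            sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...
```

The injectable `sleep` keeps the backoff testable, and re-raising on the final attempt ensures failures are never silently swallowed.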
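The frontend-support bullet mentions efficient querying, filtering, and pagination. One common approach is keyset (seek) pagination, shown here over in-memory rows as a sketch (the function and field names are assumptions, not from the ad):

```python
def keyset_page(rows, last_id=None, page_size=2):
    """Return one page of `rows` (assumed sorted by 'id') plus a cursor.

    Keyset pagination resumes from the last seen id instead of using
    OFFSET, so the cost of fetching a page stays flat as the dataset
    grows; the predicate maps directly to `WHERE id > @last_id` in T-SQL.
    """
    if last_id is not None:
        rows = [r for r in rows if r["id"] > last_id]
    page = rows[:page_size]
    # A full page implies there may be more; a short page ends the scan.
    next_cursor = page[-1]["id"] if len(page) == page_size else None
    return page, next_cursor
```

An API would hand `next_cursor` back to the client (e.g., a Vue.js frontend) to request the following page.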