Top ads in Jobs
Vacancy for a Web Administrator/Customer Service Agent.
A contract-to-permanent post for a Web Administrator/Customer Service Agent, required for the handling of customers and administration of websites for a growing website company based in Midrand. This is a great starter position.
The position entails editing photos and uploading them to the website. Photoshop skills would be advantageous but are not a requirement; training will be provided. You will also be required to take calls from customers and make changes to websites via an admin system, as well as updating of information and data capture.
Must be open-minded, customer service oriented, focused on details and pedantic. Very good people skills, as you will be dealing with difficult clients. Very good telephonic skills. Good computer skills and internet savvy.
Strictly Midrand applicants only.
Starting salary R7000 (Monday to Saturday). Salary for experienced candidates can be negotiated.
Please email your CV and cover letter to ashley.ishwarbhai@gmail.com or apply on this post.
Midrand
We urgently need someone who can take on a contract job for 1 month (possible extension) at a client in Johannesburg from 1 February 2026.
The skill sets/experience required is basically as follows:
- Information systems audit
- Cybersecurity
- Advisory, governance, risk and compliance
- ICT audit / advisory engagements
- IT risk and controls assessments
- Data analytics
Please share your CV and supporting documents to info@tsholocs.co.za.
Fourways
Results for "data" in Jobs in South Africa
Overview:
We are seeking a seasoned Data Governance and Information Management professional with extensive experience in consulting and enterprise data transformation. The ideal candidate will have a proven track record of building and leading data management capabilities that effectively bridge strategic intent with operational delivery, ensuring compliance with both global and regional regulations, including POPIA and BCBS 239.
Key Responsibilities:
- Design and implement robust data governance frameworks to support business objectives and regulatory requirements.
- Lead data quality remediation initiatives and embed data literacy and culture across the organisation.
- Establish and manage operating models and cross-functional data councils, aligning data strategy with business value creation.
- Provide subject matter expertise in data governance, data quality, and privacy enablement, particularly within Relationship Banking.
- Oversee the implementation of data management capabilities to support compliance and operational excellence.
- Advise system and process owners on data retention, destruction, and anonymisation requirements under POPIA.
- Support ongoing data lineage and remediation activities to strengthen data integrity across core platforms.
- Establish and lead the Data Management Function across multiple markets, including the creation of operating models and governance frameworks.
- Define and implement data ownership and stewardship structures, chairing data management councils comprising CROs, CDOs, and Heads of Data.
- Introduce and oversee data quality and metadata management solutions to ensure consistency and compliance across all operations.
- Conduct data maturity assessments in various jurisdictions, identifying and prioritising key areas for improvement.
- Lead the development of Master Data Management requirements and manage vendor evaluation processes.
- Drive data literacy and culture programmes to enhance data accountability across the organisation.
Qualifications and Experience:
- 5+ years' experience in data governance, information management, and enterprise data transformation.
- Demonstrated expertise in regulatory compliance (POPIA, BCBS 239) and data management best practices.
- Strong leadership skills with experience managing cross-functional teams and councils.
- Proven ability to deliver data management solutions across multiple markets and jurisdictions.
- Excellent communication and stakeholder management skills.
Contract role, hybrid (Midrand).
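Purely for illustration (not part of the advert above): a minimal Python sketch of the kind of retention/anonymisation rule that the POPIA advisory work described here might translate into. The column names and retention period are hypothetical assumptions.

```python
# Illustrative sketch only: mask direct identifiers on records past an
# assumed retention period, so aggregates remain usable (POPIA-style rule).
from datetime import datetime, timedelta

import pandas as pd

RETENTION_DAYS = 5 * 365  # assumed retention period, for illustration only


def apply_retention(df: pd.DataFrame) -> pd.DataFrame:
    """Anonymise personal fields on records older than the retention cutoff."""
    cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
    expired = df["last_activity_date"] < cutoff
    df.loc[expired, ["full_name", "id_number", "email"]] = "REDACTED"
    return df


if __name__ == "__main__":
    records = pd.DataFrame(
        {
            "full_name": ["A N Other"],
            "id_number": ["8001015009087"],
            "email": ["a@example.com"],
            "last_activity_date": [datetime(2015, 1, 1)],
        }
    )
    print(apply_retention(records))
```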
https://www.executiveplacements.com/Jobs/D/Data-Governance--Information-Management-Specialis-1252493-Job-Search-1-16-2026-7-03-23-AM.asp?sid=gumtree
2d
Executive Placements
Data Lead (Engineer) to take ownership of the organisation's data infrastructure, cloud environments, and IT services. This pivotal role will ensure that their data ecosystem is secure, scalable, reliable, and future-ready, supporting both operational and analytical needs. The successful candidate will work closely with the Analytics and Technology teams to design and maintain data systems that empower decision-making, enhance performance, and enable innovation across the business.
Key Responsibilities:
Data Infrastructure & Cloud Management
- Own and manage the organisation's overall data infrastructure and cloud environments.
- Design, implement, and maintain scalable and secure data platforms that meet evolving business requirements.
- Oversee data storage, integration, and access strategies across on-premise and cloud systems.
- Monitor and optimise infrastructure performance, cost efficiency, and reliability.
Data Governance & Security
- Implement and uphold best practices for data governance, integrity, and compliance with relevant standards and legislation (e.g. POPIA, GDPR).
- Ensure strong data security protocols are embedded across all systems and services.
- Develop and maintain documentation of data architecture, pipelines, and system processes.
Collaboration & Stakeholder Management
https://www.executiveplacements.com/Jobs/D/Data-Lead-Engineer-Hybrid-1232227-Job-Search-1-16-2026-4-41-16-AM.asp?sid=gumtree
2d
Executive Placements
Required Skills & Competencies:
- Strong proficiency in SQL and experience with relational and NoSQL databases.
- Expertise in ETL tools and frameworks (e.g., Apache Airflow, Talend).
- Knowledge of big data technologies (e.g., Hadoop, Spark) and data warehousing solutions.
- Familiarity with cloud data platforms (AWS Redshift, Azure Synapse, Google BigQuery).
- Experience with programming languages (Python, Java, or Scala) for data processing.
- Understanding of data modeling, data architecture, and performance optimization.
- Knowledge of data security and compliance best practices.
- Excellent analytical and problem-solving skills.
- Telecom domain knowledge is an advantage.
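As a rough illustration of the "ETL tools and frameworks (e.g., Apache Airflow)" item, a minimal Airflow 2.x DAG sketch; the DAG id, schedule, and task bodies are placeholder assumptions, not anything specified by the advert.

```python
# Illustrative Airflow 2.x DAG: a skeleton extract-transform-load pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    ...  # e.g. pull rows from a source system into a staging area


def transform(**_):
    ...  # e.g. clean and conform the staged data


def load(**_):
    ...  # e.g. bulk-load the conformed data into the warehouse


with DAG(
    dag_id="daily_sales_etl",          # hypothetical pipeline name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Ordering: load only runs after a successful transform.
    t_extract >> t_transform >> t_load
```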
https://www.executiveplacements.com/Jobs/D/Data-Engineer-1251174-Job-Search-01-13-2026-00-00-00-AM.asp?sid=gumtree
2d
Executive Placements
ENVIRONMENT:
YOUR passion for leading teams, driving Business Intelligence and Microsoft Fabric expertise is sought to fill the role of an MS Fabric Data Engineer wanted by a provider of tailored Financial Solutions. You will lead the initial phases of a warehouse rework project and support the ongoing expansion of a newly created data warehouse within the Microsoft Fabric platform. You will also build and lead a Data team, playing a critical role in designing, implementing, and maintaining data pipelines and ensuring high-quality data movement. The ideal candidate should preferably have Azure Data Engineer Associate or Fabric Analytics Engineer Associate/Data Engineer Associate certifications, with 5+ years' experience working with complementary Azure data technologies (Azure Synapse, Azure Data Factory, Azure Databricks, Azure SQL, etc.). You also need proficiency with T-SQL, ETL/ELT tools, PySpark and Spark SQL, and proven experience using the Pipelines, Lakehouse and Warehouse capabilities of the Microsoft Fabric platform.
DUTIES:
- Maintain and update existing reports and dashboards in Power BI to reflect evolving business requirements.
- Assist with data extractions and execute defined SQL queries.
- Perform data validation and reconciliations between reporting layers.
- Work closely with the Analytics and Engineering teams to resolve data discrepancies.
- Support documentation of reports, dashboards, and data definitions.
- Participate in data testing and user feedback processes.
- Lead the design, development, and maintenance of scalable data pipelines and infrastructure within Microsoft Fabric.
- Architect and optimize the data warehouse within the Microsoft Fabric platform.
- Oversee data ingestion, transformation, and integration processes to ensure efficient data workflows.
- Implement and maintain ETL/ELT processes, data governance, and security best practices.
- Work closely with the Analyst and business stakeholders to translate requirements into technical solutions.
- Develop the data model to be implemented to achieve the business' analytics objectives.
- Ensure data quality, reliability, and performance across all pipelines.
- Build and lead a Data team, initially managing the existing Analyst and gradually hiring additional Engineers and Analysts as needed.
- Provide mentorship, training, and career development opportunities for team members.
- Stay updated on emerging technologies in data engineering, analytics, and Microsoft Fabric advancements.
REQUIREMENTS:
Preferred Qualifications - one of the following Microsoft certifications:
- Azure Data Engineer Associate (DP-203)
- Fabric Analytics Engineer Associate (DP-600)
- Fabric Data Engineer Associate (DP-700)
Experience/Skills:
- 5+ years' experience
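For orientation only (not the employer's code): a small PySpark sketch of the kind of Lakehouse transformation step such Fabric pipelines involve. The table names are hypothetical; in a Fabric notebook the `spark` session is normally pre-provisioned, and is created explicitly here only to keep the sketch self-contained.

```python
# Illustrative PySpark step: read a raw Lakehouse table, conform it,
# and write a curated table for warehouse/reporting use.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.table("lh_raw.transactions")            # hypothetical source

conformed = (
    raw.withColumn("trade_date", F.to_date("trade_date"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropDuplicates(["transaction_id"])
)

conformed.write.mode("overwrite").saveAsTable("lh_curated.fact_transactions")  # hypothetical target
```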
https://www.executiveplacements.com/Jobs/M/MS-Fabric-Data-Engineer-JHBCPT-1249321-Job-Search-01-08-2026-02-00-16-AM.asp?sid=gumtree
1d
Executive Placements
Main purpose of the job:
To lead the data processing and management activities of routine databases and research projects, such as developing standard operating procedures, overseeing data capturing and research databases, analysis output, monitoring and data quality control, according to regulatory and compliance requirements.
Location: Johannesburg, Hillbrow
Key performance areas:
- Lead the development of the project database for the collection and management of project data in REDCap
- Provide technical input into data collection and data quality and the creation of databases for analysis and interpretation of data
- Develop and implement data management work plans
- Develop, implement and maintain all data-related SOPs
- Monitor and evaluate captured patient or participant data on REDCap and other database systems in a timeous and accurate manner, including quality control of entered data
- Oversee the management and quality assurance of all project data, in accordance with ethical and GCP requirements and SOPs
- Monitor and evaluate progress of data management
- Compile monthly/quarterly/annual progress reports, as required
- Oversee the maintenance of participant files and archiving
- Participate in, and represent the data management team at, meetings as required
- Provide support and training on databases
- Raise and resolve data queries with the relevant team member
- Respond to data queries from project staff
- Provide data support
- Process and produce accurate data reports within required timeframes
- Develop and maintain research and programmatic databases as requested by staff
- Analyse programmatic data to influence adaptive programming and evaluate effectiveness of programmatic interventions
- Contribute to or lead publications
- Participate in capacity development initiatives with a focus on data management and analysis
- Provide coaching and mentoring for team members on data quality assurance and data management
- Support the provision of training and mentoring on data analysis
- Ensure training and mentoring on SOPs related to data management
- Take ownership and accountability for tasks and demonstrate effective self-management
- Follow through to ensure that quality and productivity standards of own work are consistently and accurately maintained
- Maintain a positive attitude and respond openly to feedback
- Take ownership for driving own career development by participating in ongoing training and development activities such as forums, conferences, policy-setting workshops, etc.
Required minimum education and training:
- Honours in IT or Statistics or a Biomedical field, or equivalent
- Certification in data
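As a hedged illustration of routine REDCap data-quality work (not part of the advert): exporting project records via the REDCap API and flagging incomplete entries. The URL, token, and field name are placeholders.

```python
# Illustrative sketch: export REDCap records and run a simple completeness check.
import requests

REDCAP_API_URL = "https://redcap.example.org/api/"  # hypothetical instance
API_TOKEN = "REPLACE_ME"                             # per-project API token


def export_records() -> list[dict]:
    payload = {
        "token": API_TOKEN,
        "content": "record",
        "format": "json",
        "type": "flat",
    }
    response = requests.post(REDCAP_API_URL, data=payload, timeout=60)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    records = export_records()
    # Flag records missing a (hypothetical) consent_date field for follow-up.
    missing = [r for r in records if not r.get("consent_date")]
    print(f"{len(missing)} of {len(records)} records missing consent_date")
```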
https://www.executiveplacements.com/Jobs/D/Data-Manager-WITS-RHI-1202959-Job-Search-07-14-2025-10-34-41-AM.asp?sid=gumtree
6mo
Executive Placements
Job & Company Description:
I'm looking to connect with Junior Data Engineers who have a solid foundation in data processing, analytics, and cloud technologies. By partnering with me, you'll be considered for multiple upcoming roles where you can grow your technical skills while working alongside experienced data professionals.
Key Responsibilities:
- Assist in building and maintaining data pipelines and ETL processes.
- Support data integration, transformation, and validation tasks.
- Work with senior engineers to optimise data solutions.
- Monitor data jobs and assist with troubleshooting issues.
- Contribute to documentation and data engineering best practices.
Job Experience and Skills Required:
Education:
- Degree in Computer Science, IT or related fields.
Experience:
- 2 years' experience in a Data Engineering, Data Analyst, or related role.
- Exposure to ETL tools such as SSIS/SSRS or cloud-based ETL solutions.
- Experience with AWS Glue, Redshift, and CloudWatch.
- Experience with Python, Scala, SQL, and PySpark.
- Experience with data visualisation tools such as Power BI and Excel.
- Understanding of cloud-based data pipelines and workflows.
Apply now!
https://www.jobplacements.com/Jobs/J/Junior-Data-Engineer-1250891-Job-Search-01-13-2026-04-13-12-AM.asp?sid=gumtree
5d
Job Placements
Company and Job Description:
Our client is a leading player in the electrification and automation space, committed to innovation and sustainability. This role offers you the opportunity to work on advanced data engineering projects that enable data-driven decision-making across the organisation. Be part of a progressive company shaping the future of technology, working in a collaborative environment with minimal travel and standard business hours, while enjoying competitive compensation aligned with industry benchmarks. You'll gain exposure to modern data platforms, cutting-edge tools, and have the opportunity to influence strategic business reporting.
Key Responsibilities:
- Design, build, and maintain scalable data pipelines for ingestion, transformation, and distribution.
- Manage data warehouse architecture and schemas for efficient analytics.
- Collaborate with stakeholders to translate business requirements into reliable data models.
Job Experience and Skills Required:
- Degree in Computer Science, Information Systems, Engineering, or related technical field.
- 5-8+ years in data engineering, database development, or BI architecture.
- Proven expertise in GCP and Big Data technologies with strong proficiency in database querying languages and RDBMS.
- Experience with scripting languages for data manipulation and automation.
- Solid understanding of ETL/ELT methodologies and data modelling techniques.
- Analytical mindset with ability to troubleshoot complex data integration issues.
- Exposure to cloud-based data platforms and modern integration tools.
- Ensure data quality, security, and compliance across the infrastructure.
Apply Now!
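Illustrative only: a minimal sketch of querying a GCP BigQuery warehouse from Python, the sort of task implied by the GCP/Big Data requirement. The project, dataset, and table names are assumptions.

```python
# Illustrative BigQuery query: monthly order totals for a business report.
from google.cloud import bigquery


def monthly_order_totals(project_id: str = "my-project") -> None:
    client = bigquery.Client(project=project_id)
    sql = """
        SELECT DATE_TRUNC(order_date, MONTH) AS month,
               SUM(order_value)              AS total_value
        FROM `my-project.sales.orders`        -- hypothetical table
        GROUP BY month
        ORDER BY month
    """
    for row in client.query(sql).result():
        print(row.month, row.total_value)


if __name__ == "__main__":
    monthly_order_totals()
```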
https://www.executiveplacements.com/Jobs/D/Data-Engineer-1251395-Job-Search-01-14-2026-04-14-16-AM.asp?sid=gumtree
4d
Executive Placements
The Data Scientist is responsible for driving the analytical, statistical, and programming interpretation of data to support decision-making and drive business results. The Data Scientist supports product teams with insights gained from analysing company and customer data to provide business predictions, proposals, and recommendations to improve business outcomes.
The role of the Data Scientist is to leverage internal and, where applicable, external datasets to build and evolve analytical models, using best-practice methodologies and statistical techniques and advancing to AI methodologies for the Group. The Data Scientist drives measurable business outcomes by turning data into automated decisions, building active tools that trigger automated actions. The Data Scientist is also accountable for executing and building the next-generation foundational data infrastructure and enabling the delivery of actionable insights for value-generating outcomes.
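To make "turning data into automated decisions" concrete, a minimal, hypothetical sketch (not the Group's methodology): a propensity model whose score triggers an automated action. The features, data, and threshold are invented for illustration.

```python
# Illustrative propensity-to-action sketch with scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic training data standing in for customer features (spend, tenure, activity).
rng = np.random.default_rng(0)
X = rng.normal(size=(1_000, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1_000) > 0).astype(int)

model = LogisticRegression().fit(X, y)


def decide(customer_features: np.ndarray, threshold: float = 0.7) -> str:
    """Return an automated action based on the model's score."""
    score = model.predict_proba(customer_features.reshape(1, -1))[0, 1]
    return "send_retention_offer" if score >= threshold else "no_action"


print(decide(np.array([1.2, 0.4, -0.3])))
```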
https://www.executiveplacements.com/Jobs/M/Manager-Data-Science-Fintech-1237514-Job-Search-1-15-2026-7-34-56-AM.asp?sid=gumtree
3d
Executive Placements
Own the design and implementation of scalable data solutions using Azure and Databricks. You'll be the go-to expert for optimising pipelines, integrating AI tools, and ensuring data flows like the Atlantic breeze. Collaborate with cross-functional teams and shape the future of data engineering.
Skills & Experience:
- Python and SQL expertise
- Azure Data Factory and Databricks mastery
- Kafka event streaming wizardry
- CI/CD for data pipelines
- Stakeholder communication that inspires confidence
Qualification:
- Degree in Computer Science, Data Engineering, or related field
Contact DYLAN MAWONA on dmawona
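A minimal sketch of Kafka event streaming from Python (illustrative, not the client's stack): consuming a topic with the kafka-python library and landing events for a downstream batch pipeline. The topic, broker, and file path are assumptions.

```python
# Illustrative Kafka consumer: read JSON events and append them as JSON lines.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders.events",                              # hypothetical topic
    bootstrap_servers="broker.example.com:9092",  # hypothetical broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

with open("/tmp/orders_events.jsonl", "a", encoding="utf-8") as sink:
    for message in consumer:
        # Each event becomes one JSON line for a downstream job to pick up.
        sink.write(json.dumps(message.value) + "\n")
```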
https://www.executiveplacements.com/Jobs/S/Senior-Data-Engineer-1247845-Job-Search-12-29-2025-16-13-14-PM.asp?sid=gumtree
5d
Executive Placements
Key Responsibilities:
- Maintain and update existing reports and dashboards in Power BI to reflect evolving business requirements
- Assist with data extraction and execute defined SQL queries
- Perform data validation and reconciliations between reporting layers
- Work closely with the analytics and engineering teams to resolve data discrepancies
- Support the documentation of reports, dashboards and data definitions
- Participate in data testing and user feedback processes
Experience:
- 5+ years' experience with Microsoft Fabric
- Experience overseeing data ingestion, transformation and integration processes to ensure efficient data workflows
- Experience implementing and maintaining ETL/ELT processes, governance and security practices
- Experience building and leading a data team
- Experience with T-SQL in SQL Server (on-premises or cloud) using stored procedures and functions
- Experience with Azure DevOps and GitHub for CI/CD deployments and use of Power BI deployment pipelines
- Background in financial services, sales optimisation or regulatory reporting (advantageous)
If you are interested in this opportunity, please apply directly.
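For illustration only, a small pandas sketch of "reconciliations between reporting layers": comparing per-day totals from two layers and surfacing differences above a tolerance. Column names, values, and the tolerance are assumptions.

```python
# Illustrative reconciliation: compare per-day totals from two reporting layers.
import pandas as pd


def reconcile(dw_totals: pd.DataFrame, pbi_totals: pd.DataFrame) -> pd.DataFrame:
    merged = dw_totals.merge(pbi_totals, on="business_date", suffixes=("_dw", "_pbi"))
    merged["difference"] = merged["total_dw"] - merged["total_pbi"]
    return merged[merged["difference"].abs() > 0.01]  # allow for rounding


dw = pd.DataFrame({"business_date": ["2026-01-15"], "total": [1_000_000.00]})
pbi = pd.DataFrame({"business_date": ["2026-01-15"], "total": [999_950.00]})
print(reconcile(dw, pbi))  # any non-empty result is a discrepancy to investigate
```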
https://www.executiveplacements.com/Jobs/D/Data-Engineer-Microsoft-Fabric-1252751-Job-Search-01-16-2026-10-13-51-AM.asp?sid=gumtree
1d
Executive Placements
Requirements:
- An appropriate post-graduate qualification (BSc, Engineering, or similar)
- Relevant programming qualifications and/or certifications
- Relevant Agile certification is preferable
- 10-12 years' experience in a Data Engineering role building and optimizing data pipelines, architectures and data sets
- Constructing data acquisition, warehousing and reporting solutions
- Advanced working SQL knowledge and 5 years' experience working with relational databases and query authoring (SQL)
- Experience with a variety of databases, technologies, languages and visualisation engines (Power BI, Tableau, SAS)
- 3-4 years' experience building analytics tools that utilize the data pipeline to provide actionable insights into customer management, operational efficiency and other key business performance metrics
Responsibilities:
- Design and implement data strategies and systems, to create and maintain the data architecture that will drive various initiatives across the organisation
- Build infrastructure to automate extremely high volumes of data delivery and creatively solve data volume and scaling challenges; contribute to the design and architecture of innovative solutions to difficult problems
- Work with the team and stakeholders to continually assess and redefine the data technology stack to support changing data patterns and business use cases, and bridge the gaps between Data teams and Business by constantly collaborating with all parties to understand data needs
- Build the infrastructure with IT which is required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and big data technologies
- Collaborate with IT to source and load a wide range of data across our business into the data lake so that it can be used by analysts and developers to develop data solutions for the business
- Develop and enhance the data ingestion framework using specified toolsets, continuously seeking techniques to ingest data and ensuring a high degree of quality and confidence
- Assemble large, complex data sets that meet functional/non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Liaise between technical teams and specialists and business stakeholders, fostering inter-departmental coordination and cooperation
- Data Governance (Quality, Accessibility, Ownership and Security): engage with stakeholders to obtain an understanding of their data practices to contract, manage and meet expectations
- Identify client data quality concerns, conduct root cause analysis and provide feedback to management
- Become
https://www.executiveplacements.com/Jobs/D/Data-Engineer-1202451-Job-Search-07-11-2025-04-37-22-AM.asp?sid=gumtree
6mo
Executive Placements
PBT Group is looking for a skilled Python Developer with strong AWS experience to join one of our delivery teams on a contract-to-perm basis. This role is well suited to a developer who enjoys working in cloud-native environments and contributing to data-driven platforms. Experience working within a data lake or modern analytics environment will be highly beneficial. The successful candidate will work closely with data engineers, analysts, and platform teams to build, maintain, and optimise scalable solutions in AWS.
Key Responsibilities:
- Design, develop, and maintain Python-based applications and services in AWS.
- Build and support data processing workflows within a data lake environment.
- Develop and maintain APIs, batch jobs, and data integration components.
- Work with cloud services such as S3, Lambda, Glue, EC2, and IAM.
- Collaborate with data engineers and analytics teams to enable reliable data ingestion and processing.
- Monitor, troubleshoot, and optimise cloud workloads for performance and cost efficiency.
- Follow best practices for security, logging, monitoring, and version control.
- Participate in Agile delivery processes, including sprint planning and reviews.
Required Skills & Experience:
- Strong proficiency in Python (mid to senior level).
- Hands-on experience working in AWS environments.
- Exposure to data lake architectures and cloud-based data platforms.
- Experience working with structured and semi-structured data.
- Solid understanding of SQL and data processing concepts.
- Experience with Git and CI/CD pipelines.
- Ability to work collaboratively in cross-functional teams.
Nice to Have:
- Experience with AWS Glue, Athena, Redshift, or EMR.
- Exposure to data engineering or analytics engineering workloads.
- Familiarity with containerisation (Docker) or orchestration tools.
- Experience in financial services or enterprise data environments.
Why Join PBT Group?
- Opportunity to work on high-impact, data-driven projects.
- Exposure to modern cloud and analytics platforms.
- Contract-to-perm pathway for long-term career growth.
- Collaborative culture with strong technical leadership.
- Projects across leading clients in financial services and beyond.
* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
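A minimal, hypothetical sketch (not PBT's codebase) of the S3/Lambda style of work described: a Lambda handler that reads a newly landed raw object, filters it, and writes a curated copy. The bucket names and CSV layout are assumptions.

```python
# Illustrative AWS Lambda handler for an S3 "ObjectCreated" event.
import csv
import io

import boto3

s3 = boto3.client("s3")
CURATED_BUCKET = "example-data-lake-curated"   # hypothetical target bucket


def handler(event, context):
    # Locate the object that triggered the event.
    record = event["Records"][0]["s3"]
    bucket, key = record["bucket"]["name"], record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = [r for r in csv.DictReader(io.StringIO(body)) if r.get("amount")]
    if not rows:
        return {"rows_written": 0}

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)

    s3.put_object(Bucket=CURATED_BUCKET, Key=f"curated/{key}", Body=out.getvalue())
    return {"rows_written": len(rows)}
```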
https://www.executiveplacements.com/Jobs/P/Python-Developer-AWS--Data-Platforms-1252339-Job-Search-01-16-2026-02-00-15-AM.asp?sid=gumtree
2d
Executive Placements
PBT Group, leaders in Business Intelligence, has a vacancy for a Microsoft SQL Data Engineer.
Duties:
- Plan and analyse complex business requirements and implement technology-enabled solutions to address multi-discipline business opportunities/problems.
- Conduct planning, analysis and design activities in conjunction with other development specialists.
- Participate in analysis of complex business opportunities/problems to deliver designs meeting requirements.
- Participate in estimation of tasks and assist in the development of project plans.
- Code or make modifications to programs of high complexity, according to specifications.
- Conduct medium- to high-complexity evaluations for product releases, stand-alone products, etc.
- Conduct walkthroughs and quality reviews of deliverables.
- Knowledge of designing and developing end-to-end data acquisition processes to be used in population of data warehouses/data marts and/or in the creation of interfaces.
- Provide guidance and mentoring on business intelligence technology and systems in general, especially in the area of ETL processes.
- Participate in the formulation of standards to support the data acquisition development process.
- Design, develop and execute complex data acquisition or interface routines using an ETL tool, ensuring that business and technical requirements are met.
- Ensure compliance with established policies, standards and methodologies.
Required Skills:
- Strong MS SQL Data Engineering experience.
- Solid SSIS (SQL Server Integration Services) experience.
- Solid SSRS (SQL Server Reporting Services) experience.
- Ability to analyse and define requirements.
- Database design.
- Intimate knowledge of source systems as well as a basic understanding of dimensional models.
- Conventional database and data warehouse modeling skills, in order to understand the data warehouse data models.
- A sound knowledge of the programming language used to write the data staging programs or ETL tool.
- A sound knowledge of SQL, or the language used to access the source databases and the data warehouse from the data staging programs or ETL tool.
- A sound knowledge of the capabilities of the ETL tools, to know what their capabilities and shortcomings are, in order to exploit or avoid those aspects in the data staging programs.
- Pride of work, thoroughness and attention to detail.
Required Qualifications / Training:
- Course on the ETL / related toolset.
- Relevant data warehouse and BI solution training is essential.
- B.Sc. or related degree is advantageous.
- 2+ years' programming experience.
* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database.
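Purely as an illustration of the kind of data-acquisition routine described (not PBT's actual ETL): a sketch that runs a T-SQL staging-to-dimension MERGE from Python via pyodbc. The server, database, and table names are hypothetical.

```python
# Illustrative staging-to-warehouse load step executed via pyodbc.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlserver.example.local;DATABASE=EDW;Trusted_Connection=yes;"
)

MERGE_SQL = """
MERGE dbo.DimCustomer AS tgt
USING staging.Customer AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED THEN
    UPDATE SET tgt.CustomerName = src.CustomerName, tgt.UpdatedAt = SYSUTCDATETIME()
WHEN NOT MATCHED THEN
    INSERT (CustomerID, CustomerName, UpdatedAt)
    VALUES (src.CustomerID, src.CustomerName, SYSUTCDATETIME());
"""

with pyodbc.connect(CONN_STR) as conn:
    with conn.cursor() as cur:
        cur.execute(MERGE_SQL)   # upsert staged rows into the dimension table
    conn.commit()
```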
https://www.executiveplacements.com/Jobs/S/SQL-Data-Engineer-1252336-Job-Search-01-16-2026-02-00-15-AM.asp?sid=gumtree
2d
Executive Placements
A recent professional profile photo is to accompany your application.
EMPLOYMENT TYPE: Permanent
SECTOR: Research
BASIC SALARY: Market related
START DATE: A.S.A.P / Immediate
REQUIREMENTS:
- Grade 12 (Matric).
- Diploma or Degree in Information Systems, Quality Management, Data Science, or Food Technology.
- 3-5 years' experience in data management, master data, or specification control, preferably in food or beverage manufacturing.
- Knowledge of BRCGS / IFS data traceability and specification control.
- Prior exposure to laboratory data, ERP, or client portal systems.
- LIMS administration or data management systems.
- ERP experience (SAGE/SAP) and QMS integration.
- Advanced Microsoft Excel and database management.
- Extensive laboratory knowledge.
DUTIES:
1. Product Specification Management:
- Maintain master product specifications (ingredients, allergens, analytical targets, packaging details).
- Ensure all changes are documented, version-controlled, and approved by relevant departments.
- Upload and verify specification data on client/retailer portals and in internal databases.
- Perform regular reviews to confirm specification accuracy across documents and systems.
- Create QR codes.
2. Portal Administration:
- Manage client and retailer portals (e.g., Tesco, Aldi, Ahold, GB, IPW, Fairtrade).
- Upload declarations, technical data sheets, certificates, and COAs.
- Track submission SLAs and renewal dates using a controlled portal calendar.
- Conduct RFT (Right First Time) verification on each submission.
3. LIMS Administration:
- Maintain sample IDs, test requests, and analytical result uploads in the LIMS.
- Ensure linkage of laboratory data to ERP batch numbers and specifications.
- Generate, verify, and archive COA reports and dashboards from LIMS data.
- Manage user access, permissions, and system housekeeping.
4. Data Integrity & System Alignment:
- Conduct monthly master data reconciliations between ERP (SAGE), QMS, LIMS, and portals.
- Identify and correct mismatches or obsolete entries.
- Implement data validation rules and periodic accuracy checks.
5. Change Control & Version Management:
- Manage data change control (new SKUs, label specs, blends, analytical targets).
- Maintain master data library and controlled access per department.
- Archive old versions in line with document control procedures.
6. Health, Safety & Housekeeping:
- Comply with company H&S policies and procedures; keep the work area
https://www.executiveplacements.com/Jobs/Q/Quality-Data-Specialist-Wine-industry-1242509-Job-Search-01-16-2026-00-00-00-AM.asp?sid=gumtree
2d
Executive Placements
We're looking for a strong Senior Data Engineer to join a long-term contract engagement, working primarily across Microsoft data technologies in an enterprise environment.
Key responsibilities:
- Design, build, and maintain robust data pipelines and architectures
- Integrate data from multiple sources, ensuring data quality, integrity, and performance
- Develop and maintain Data Warehouses and Operational Data Stores (ODS)
- Perform data modelling and optimise storage and retrieval
- Build and support reports using SSRS and Power BI
- Work with cloud platforms (Azure primarily; AWS exposure beneficial)
- Collaborate closely with business and technical stakeholders
- Provide ongoing support and maintenance
- Work within Agile teams using Azure DevOps
Tech stack:
- SQL Server (T-SQL)
- SSIS, SSRS
- Power BI
- SharePoint
- Power Apps (advantageous)
- Azure (AWS exposure beneficial)
- Azure DevOps
Nice to have:
- Experience with Dynamics 365
- Strong background in enterprise Microsoft ecosystems
Working model: Remote currently, with the possibility of office-based work in Kempton Park in the future.
* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
https://www.executiveplacements.com/Jobs/S/Senior-SharePoint-Data-Engineer-1252840-Job-Search-01-17-2026-02-00-14-AM.asp?sid=gumtree
12h
Executive Placements
In return, you'll enjoy a competitive salary, strong benefits, and genuine opportunities for career growth in an innovative environment.
What You'll Do:
- Analyze large datasets to uncover trends and opportunities.
- Build and maintain dashboards, reports, and data visualizations.
- Perform statistical analysis and predictive modeling.
- Collaborate with cross-functional teams to deliver data-driven solutions.
- Ensure data integrity and optimize data processes.
What We're Looking For:
- Degree in Statistics, Mathematics, Computer Science, Data Science, or a related field.
- Proven experience as a Data Analyst.
- Must-have skills: SAS, SQL, R, and Python.
- Strong understanding of data modeling and relational databases.
- Excellent problem-solving and communication skills.
- Experience with visualization tools (e.g., Tableau and Power BI) is a plus.
How to Apply
https://www.executiveplacements.com/Jobs/D/Data-Analyst-1249449-Job-Search-01-08-2026-00-00-00-AM.asp?sid=gumtree
5d
Executive Placements
We are looking for a highly organised and deadline-driven Data Capturer to support our operations by capturing, validating, and maintaining accurate information across company databases. This role requires strong attention to detail and the ability to work efficiently in a fast-paced environment.
Minimum Requirements:
- Grade 12 / Matric (essential)
- Strong attention to detail and accuracy
- Basic computer literacy (MS Excel, Word, email)
- Good typing speed and data entry skills
- Ability to work under pressure and meet deadlines
- Proven experience in data capturing or administration (advantageous)
https://www.jobplacements.com/Jobs/D/Data-Capturer-1252852-Job-Search-01-17-2026-02-00-15-AM.asp?sid=gumtree
12h
Job Placements
REQUIREMENTS
Minimum education (essential):
- BSc in Computer Science, Engineering or relevant field
Minimum applicable experience (years):
- 2-4 years
Required nature of experience:
- Experience with SQL Server and Azure Synapse Analytics/Microsoft Fabric for query writing, indexing, performance tuning and schema design.
- Hands-on experience developing ETL pipelines, including data extraction from REST/SOAP APIs, databases and flat files.
- Proficiency in data transformation using Python and Azure-native tools.
- Experience with data warehousing.
- Background in data modelling, including dimensional modelling, schema evolution and versioning.
- Practical knowledge of cloud-based data storage and processing using Azure Blob Storage.
- Familiarity with pipeline optimisation, fault tolerance, monitoring and security best practices.
- Experience developing web applications using C# and the .NET platform.
- Experience with front-end development using Blazor, React.js, JavaScript/TypeScript, HTML, CSS/SCSS.
Skills and Knowledge (essential):
- SQL Server, Azure Synapse Analytics, Azure Blob Storage, Microsoft Fabric
- Python
- REST/SOAP APIs, Data Extraction, Transformation, Loading (ETL)
- Azure Data Factory, Pipeline Orchestration
- Dimensional Modelling, Schema Evolution, Data Warehousing
- Power BI
- Performance Optimisation, Indexing, Query Tuning
- Cloud Data Processing, Backups
- C#, .NET, Blazor
- JavaScript/TypeScript, HTML, CSS/SCSS
Other:
- Proficient in Afrikaans and English
- Own transport and license
KEY PERFORMANCE AREAS AND OBJECTIVES
ETL and Pipeline Development:
- Design, build, and orchestrate efficient ETL pipelines using Azure Synapse for both batch and near-real-time data ingestion.
- Extract data from a variety of structured and unstructured sources including REST APIs, SOAP APIs, databases, and flat files.
- Apply robust data transformation logic using Python and native Azure Synapse transformation tools.
- Optimise data flows for performance, scalability, and cost-effectiveness.
- Implement retry mechanisms, logging and monitoring within pipelines to ensure data integrity and fault tolerance.
Data Architecture and Management:
- Design and manage scalable and efficient data architectures using Microsoft SQL Server and Azure services, including Synapse Analytics/Microsoft Fabric and Blob Storage.
- Develop robust schema designs, indexes and query strategies to support analytical and operational workloads.
- Support schema evolution and version control, ensuring long-term
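Illustrative only: one extract-and-land step of the kind of ETL pipeline described, pulling records from a REST API and writing them to Azure Blob Storage with the azure-storage-blob SDK. The API URL, container, and connection string are placeholders.

```python
# Illustrative extract step: REST API -> JSON blob in Azure Blob Storage.
import json
from datetime import date

import requests
from azure.storage.blob import BlobServiceClient

API_URL = "https://api.example.com/v1/orders"          # hypothetical source API
BLOB_CONN_STR = "<storage-account-connection-string>"  # placeholder
CONTAINER = "raw"


def extract_to_blob() -> str:
    response = requests.get(API_URL, timeout=60)
    response.raise_for_status()

    blob_name = f"orders/{date.today():%Y/%m/%d}/orders.json"
    service = BlobServiceClient.from_connection_string(BLOB_CONN_STR)
    blob = service.get_blob_client(container=CONTAINER, blob=blob_name)
    blob.upload_blob(json.dumps(response.json()), overwrite=True)
    return blob_name


if __name__ == "__main__":
    print(f"Landed {extract_to_blob()}")
```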
https://www.executiveplacements.com/Jobs/D/Data-Engineer-1227925-Job-Search-01-13-2026-00-00-00-AM.asp?sid=gumtree
5d
Executive Placements
Key Responsibilities:
- Analyze business data to deliver actionable insights that support strategic decision-making.
- Design, build, and maintain Power BI dashboards and reports, including paginated reports using Report Builder.
- Develop complex SQL queries for data extraction, transformation, and analysis.
- Use Microsoft Fabric, Azure Synapse Analytics, and Data Factory to manage, transform, and orchestrate data pipelines.
- Build and optimize data models for reporting and analytics.
- Implement and maintain Data Lake/Delta Lake solutions.
- Collaborate with Data Engineers and business stakeholders to ensure data quality, accuracy, and consistency.
- Use DevOps for version control, change management, and deployment of BI assets.
- Apply Python for predictive analytics and forecasting.
- Work with Databricks to process, transform, and analyze large datasets.
- (Preferred) Use Azure Machine Learning for advanced analytics and AI-driven insights.
Job Experience and Skills Required:
- Degree / Diploma or industry certification in Data Science, Computer Science, or Information Systems
- 5 years of experience in data analysis and business intelligence
- Strong experience with Microsoft Cloud BI Platform, SQL for querying and data manipulation, and Power BI, including DAX and Power Query
- Hands-on experience with Microsoft Fabric, Data Factory and Synapse Analytics
- Working knowledge of Data Lake / Delta Lake architecture
- Experience with Databricks for big data processing
Apply now!
https://www.executiveplacements.com/Jobs/S/Senior-BI-Developer-1248652-Job-Search-01-06-2026-04-13-07-AM.asp?sid=gumtree
5d
Executive Placements
Job Responsibilities:
- Work closely with the Business Intelligence (BI) Manager to transform raw, multi-source data into reliable, timely, and actionable insights.
- On-time delivery of dashboards/reports against SLAs.
- Data quality metrics: completeness, accuracy, reconciliation rates.
- Stakeholder satisfaction scores and adoption of dashboards.
- Measured impact of AI-assisted insights (time saved, decision quality).
Job Requirements:
- Grade 12
- Tertiary qualification (Degree/Diploma in Data Analytics, Information Systems, Statistics, Computer Science or related field)
- 2-4 years in a Data Analyst role supporting BI/reporting, ideally in operations or service environments.
Skills:
- SQL (intermediate to advanced): joins, aggregations, etc.
- Power BI: data modelling, DAX, Power Query; dashboard UX and performance optimisation.
- Excel (advanced): Pivot Tables, Power Query, data cleaning techniques.
Nice to have:
- AI tool proficiency: Copilot/ChatGPT or similar for insight generation; prompt engineering best practices and result validation.
- Analytical storytelling: turning data into clear narratives and recommendations.
- Awareness of data governance and privacy (e.g., POPIA).
- Exposure to Python or suitable alternatives to enable report automation.
https://www.executiveplacements.com/Jobs/D/Data--BI-Support-Analyst-1252149-Job-Search-01-15-2026-04-35-29-AM.asp?sid=gumtree
3d
Executive Placements
