Sorry, the offer is not available,
but you can perform a new search or explore similar offers:

Software Engineer III, Machine Learning, Google Research at Google

Minimum qualifications: Bachelor's degree or equivalent practical experience. 2 years of experience with software development in Python or other programming ...


Google - Nairobi Area

Published a month ago

Director ICT at Kaimosi Friends University College (KAFUCO)

KAFU/302/ICT/11/24 The applicant to this position shall: A Doctorate degree in Computer Science, Information Technology or its equivalent from a recognized U...


Kaimosi Friends University College (KAFUCO) - Nairobi Area

Published a month ago

ICT Trainer at St. Kizito Vocational Centre Utawala

As an ICT Trainer, you will conduct training needs assessments, design and deliver training programs, and continuously update and improve training materials....


St. Kizito Vocational Centre Utawala - Nairobi Area

Published a month ago

Specialist, Integrations Developer at Save the Children

Team and Job Purpose Purpose of the team is to lead on the strategic planning, design and delivery of digital and data technology solutions used across SCA. ...


Save the Children - Nairobi Area

Published a month ago

Senior Database Engineer at Ezra

Details of the offer

Key Responsibilities
Design, implement, and maintain robust database solutions.
Plan capacity in line with the infrastructure, and design and implement databases that can scale.
Optimize and tune database performance to ensure efficient data processing and retrieval.
Develop and maintain ETL (Extract, Transform, Load) processes for data integration and migration.
Ensure data integrity, consistency, and security across all database systems.
Collaborate with software engineers, data scientists, and other stakeholders to define data requirements and develop solutions.
Monitor and troubleshoot database issues, ensuring minimal downtime and quick resolution.
Automate database management tasks using tools such as Ansible, Terraform, and Bash scripting.
Implement backup and recovery strategies to safeguard critical data; develop, manage, and test backup and recovery plans (a minimal sketch follows this list).
Monitor performance and manage parameters to provide fast query responses to front-end users.
Refine the logical design so that it can be translated into a specific data model.
Maintain data standards, including adherence to the Data Protection Act.
Write database documentation, including data standards, procedures, and definitions for the data dictionary (metadata).
Control access permissions and privileges; establish the needs of users and monitor user access and security.
Ensure that storage, archiving, backup, and recovery procedures are functioning correctly.
Work directly with development and infrastructure teams to enhance the performance and observability of database services through monitoring solutions (Grafana, ELK).
Build data integrations using both API- and file-based protocols.
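
For illustration only, here is a minimal Python sketch of the kind of backup automation and recovery testing described in the responsibilities above. The database name, backup directory, and retention period are hypothetical placeholders rather than details from this posting, and a production setup would add credential handling, off-site copies, and scheduled restore drills.

#!/usr/bin/env python3
"""Minimal sketch of a scheduled PostgreSQL backup job (illustrative only).

The database name, backup directory, and retention period below are
hypothetical placeholders, not values taken from the job posting.
"""
import datetime
import subprocess
from pathlib import Path

DB_NAME = "appdb"                            # hypothetical database
BACKUP_DIR = Path("/var/backups/postgres")   # hypothetical location
RETENTION_DAYS = 14                          # hypothetical retention policy


def take_backup() -> Path:
    """Dump the database in pg_dump's custom format."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    dump_file = BACKUP_DIR / f"{DB_NAME}_{stamp}.dump"
    subprocess.run(
        ["pg_dump", "--format=custom", "--file", str(dump_file), DB_NAME],
        check=True,
    )
    return dump_file


def verify_backup(dump_file: Path) -> None:
    """Sanity-check the dump by listing its table of contents."""
    subprocess.run(
        ["pg_restore", "--list", str(dump_file)],
        check=True,
        stdout=subprocess.DEVNULL,
    )


def prune_old_backups() -> None:
    """Delete dumps older than the retention window."""
    cutoff = datetime.datetime.now() - datetime.timedelta(days=RETENTION_DAYS)
    for dump in BACKUP_DIR.glob(f"{DB_NAME}_*.dump"):
        if datetime.datetime.fromtimestamp(dump.stat().st_mtime) < cutoff:
            dump.unlink()


if __name__ == "__main__":
    dump = take_backup()
    verify_backup(dump)
    prune_old_backups()

Running a script like this from cron and alerting on non-zero exit codes keeps backup failures visible; a fuller recovery plan would also periodically restore the dump into a scratch instance to confirm it is usable.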

Key Requirements
BSc degree in Computer Science, Business Administration, Information Technology, or a related field preferred
4-5 years of IT operations experience with a strong understanding of database structures, theories, principles, and practices
4-5 years of PostgreSQL database administration experience
5+ years of experience in database engineering or a similar role.
Understanding of, and experience with, server-client computing and relational database environments
Experience with data management and data processing flowcharting techniques
Knowledge of reporting and query tools and practices
Proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
Strong knowledge of data engineering concepts and ETL processes.
Extensive experience with Linux operating systems and Bash scripting.
Familiarity with cloud-based database solutions (e.g., AWS RDS, Google Cloud SQL, Azure SQL Database).
Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
Strong problem-solving skills and the ability to work independently and as part of a team.
Excellent communication skills and the ability to convey complex technical concepts to non-technical stakeholders.
Understanding of big data technologies (Apache Hadoop, Spark) and data warehouse solutions (Google BigQuery, Snowflake, Azure Synapse Analytics)
Knowledge of Python would be an added advantage


Nominal Salary: To be agreed

Source: Myjobmag_Co
