Forwarded from Kseniia Fedotova
#Limassol #Cyprus #vacancy #relocate #DataEngineer #DE #вакансия
Vacancy with relocation to Cyprus!
Hi everyone! An international company that builds reliable, up-to-date IT solutions for the B2C and B2B segments is looking for a Data Engineer:
What you will be doing:
- Building new data models and optimizing existing ones;
- Supporting the data analytics and ML teams with clean, well-structured database access code;
- Developing functionality to automate data collection, processing, and storage;
- Extracting and storing features from data for use in machine learning algorithms;
- Building data marts for data analysts and data scientists;
- Writing and maintaining accompanying documentation and data specifications, and developing and maintaining the knowledge base on working with data.
What we expect from you:
- Solid command of Python;
- Strong SQL skills (data model design, query optimization);
- Experience with different databases (ClickHouse, PostgreSQL, MySQL) and building data marts;
- Experience building ETL pipelines (Airflow, etc.); see the sketch below.
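As a rough illustration of the ETL work the requirements refer to (not the employer's actual code), here is a minimal Airflow DAG sketch: it extracts yesterday's orders from a source database and loads them into a reporting mart. The connection IDs, table names, and SQL are hypothetical placeholders.

```python
# Hypothetical example (Airflow 2.x): connection IDs, tables, and SQL are placeholders.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_orders_mart():
    @task
    def extract():
        # Pull yesterday's orders from the source OLTP database.
        src = PostgresHook(postgres_conn_id="source_db")
        return src.get_records(
            "SELECT order_id, user_id, amount::float AS amount FROM orders "
            "WHERE created_at >= current_date - interval '1 day'"
        )

    @task
    def load(rows):
        # Write the extracted rows into a reporting data mart table.
        dwh = PostgresHook(postgres_conn_id="dwh")
        dwh.insert_rows(
            table="mart.daily_orders",
            rows=rows,
            target_fields=["order_id", "user_id", "amount"],
        )

    load(extract())


daily_orders_mart()
```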
Nice to have:
- Experience with dataset versioning tools;
- Experience with GitLab CI/CD;
- Experience with Git;
- Experience with Docker;
- Experience with monitoring tools (Grafana, Kibana);
- Experience with the Hadoop ecosystem: HDFS, Hive, Kafka, Spark, Impala;
- Experience contributing to a DWH build-out.
We offer:
- An official ("white") salary that matches your professional skills and expectations;
- Relocation assistance to Limassol, Cyprus (visa support, medical insurance for the employee and their family members);
- Interesting tasks, promising projects, and advanced technologies;
- Work in a Russian-speaking team.
A modern approach to managing processes, time, and tasks. No unnecessary bureaucracy, and the opportunity to implement your own ideas. For us an employee is not just an executor, but a thinking, ambitious person who cares about results.
Send your CV via DM to @yammmey
Forwarded from Екатерина
#vacancy #data #analytics #dataengineer #limassol #remote
🧑💻Role: Data Engineer
🇨🇾 Location: Limassol
🕘Job type: Full-time
👍Experience: middle+/senior
The company ALMUS creates its own mobile applications that are popular with millions of users around the world.
📋 Key Responsibilities:
1) Airflow Optimization:
- Evaluate and improve the performance and reliability of Apache Airflow workflows.
- Develop and maintain Airflow DAGs to automate data pipelines.
- Ensure efficient scheduling and execution of tasks to meet business requirements.
2) Alerting Implementation:
- Design and implement alerting mechanisms to monitor data pipelines and systems.
- Set up real-time notifications for pipeline failures and performance issues.
- Develop strategies for proactive monitoring and incident management.
3) Adding New Data Sources:
- Integrate new data sources such as PayPal, Stripe, Google, and Facebook into the data infrastructure.
- Ensure seamless data ingestion, transformation, and storage from diverse sources.
- Maintain data quality and consistency across all integrated platforms.
4) Handling Requests from Analytics and Other Departments:
- Collaborate with analytics and other departments to understand their data needs and requirements.
- Develop and maintain data solutions that support ad-hoc and scheduled data requests.
- Ensure timely and accurate delivery of data to support decision-making processes.
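For the "Alerting Implementation" responsibility above, here is a minimal sketch of one common approach (not ALMUS's actual setup): an Airflow on_failure_callback that posts a short message to a webhook. The endpoint URL and payload format are hypothetical.

```python
# Hypothetical example: the webhook URL and message format are placeholders.
import json
import urllib.request


def notify_failure(context):
    """Airflow on_failure_callback: post a short alert about the failed task."""
    ti = context["task_instance"]
    payload = {
        "text": f"DAG {ti.dag_id}, task {ti.task_id} failed (run {context['run_id']})"
    }
    req = urllib.request.Request(
        "https://hooks.example.com/data-alerts",  # placeholder endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10)


# Attach the callback to every task in a DAG, e.g. via default_args:
default_args = {"on_failure_callback": notify_failure, "retries": 1}
```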
💼 Required Skills:
1) SQL:
- Advanced proficiency in SQL for querying, manipulating, and managing large datasets.
- Ability to optimize SQL queries for performance and efficiency.
2) Airflow:
- Strong knowledge of Apache Airflow for orchestrating complex data workflows.
- Experience in designing, implementing, and optimizing Airflow DAGs.
3) Git:
- Proficient in using Git for version control and collaboration on codebases.
- Ability to manage branches, merges, and code reviews effectively.
4) Python:
- Advanced skills in Python for developing data pipelines and automation scripts.
- Experience with relevant libraries and frameworks for data engineering.
5) Databricks:
- Experience with Databricks for big data processing and analytics.
🔥Additional Qualifications:
- Proven experience in a Data Engineer or similar role.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills, with the ability to work collaboratively with cross-functional teams.
- Ability to manage multiple tasks and priorities in a fast-paced environment.
- Knowledge of cloud platforms and services (e.g., AWS, GCP, Azure) is a plus.
🍩 We offer:
- A supportive atmosphere, working on inspiring products in a team of great professionals
- In-office, hybrid/remote work opportunities
- A competitive salary package based on your unique expertise, skillset and impact on the product
- Professional equipment and access to the necessary programs
- Flexible working hours
- 21 days of annual leave, 3 sick days
- Reimbursement of professional courses and trainings
- Free lunches in the office
- Medical Insurance
- A variety of corporate events and team-building activities.
Interested? 😉
Then send your CV to TG @nikita_pitalenko @Sir_Alejandro📝
Forwarded from Anastasiya Lobova
#vacancy #dataengineer #iOS #Android #CRM #fulltime
Location: Limassol
Work format: office / remote / hybrid
Employment: full-time
🔸Senior Android developer
🔸Senior CRM Manager
🔸Senior iOS Developer
🔸Data Engineer
Simple App is a native mobile application in the Health & Fitness category.
Our international team works to improve users' eating habits and raise their quality of life.
What our colleagues get:
🥑 A strong product that improves people's quality of life;
🥑 An international project;
🥑 Modern equipment (Mac) and all the software needed for work.
✉️ lobova@simple.life
💬 @Tastyparty
Forwarded from Anastasiya Lobova
#vacancy #dataengineer #CRM #fulltime
Location: Limassol
Work format: office / remote / hybrid
Employment: full-time
🔸Senior CRM Manager
🔸Data Engineer
Simple App is a native mobile application in the Health & Fitness category.
Our international team works to improve users' eating habits and raise their quality of life.
What our colleagues get:
🥑 A strong product that improves people's quality of life;
🥑 An international project;
🥑 Modern equipment (Mac) and all the software needed for work.
✉️ lobova@simple.life
💬 @Tastyparty
Forwarded from Anastasiya Lobova
#vacancy #dataanalyst #dataengineer #accountant
Location: Limassol
Work format: office / remote / hybrid
🔸Data Analyst
🔸Data Engineer
🔸NetSuite Accountant (Part-time)
Simple App is a native mobile application in the Health & Fitness category.
Our international team works to improve users' eating habits and raise their quality of life.
What our colleagues get:
🥑 A strong product that improves people's quality of life;
🥑 An international project;
🥑 Modern equipment (Mac) and all the software needed for work.
✉️ lobova@simple.life
💬 @Tastyparty
Forwarded from Svetlana Kirichenko
#vacancy #вакансия #job #opportunity #fulltime #Limassol #Cyprus #office #DataEngineer #DWH #SQL #PostgreSQL #MySQL #Airflow #Airbyte #Kafka #Debezium #ClickHouse
Hi guys! Here’s another amazing opportunity from Brainsome!
🔥Position: DATA ENGINEER
🔥Company: Brainsome
🔥Location: Limassol, office near The Limassol Zoo
Project Description:
Our product is a streamlined platform for managing and optimizing digital advertising, designed to handle high traffic volumes and complex data processing. We manage a system that processes thousands of requests per second, with a total data volume reaching tens of terabytes. The platform includes tools for ad publishers to submit traffic, a traffic exchange engine that enables seamless buying and selling of traffic, and a CRM-like platform for managing advertisers, ad publishers, and key settings within the traffic exchange process. We specialize in solving complex challenges in processing and analyzing large-scale data.
Your Experience:
✔️ 4+ years of experience in data engineering
✔️ Strong proficiency in SQL and experience with databases such as ClickHouse, PostgreSQL, MySQL
✔️ Hands-on experience with tools such as Airflow, Airbyte, Kafka, and Debezium
✔️ Programming skills in Python for data processing
✔️ Experience with version control systems (e.g., Git)
✔️ Understanding of data security and data protection best practices
✔️ A results-oriented approach with attention to detail and problem-solving skills
Your Responsibilities:
🔹Designing, developing, and maintaining data warehouses (DWH) and storage schemas
🔹Creating and maintaining data pipelines to transfer data between source databases and the DWH
🔹Working with various data sources, including relational databases, message queues, external APIs, and Google Sheets, and developing custom connectors for these sources
🔹Collaborating with data analysts to design and maintain data marts and integrate them with BI tools such as Tableau
🔹Contributing to data architecture decisions and assisting in the selection of appropriate technologies for projects
🔹Optimizing the performance and scalability of existing databases and data warehouses
🔹Handling large datasets, ensuring their availability, reliability, and integrity
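To illustrate the kind of pipeline described above (message queue into the DWH), here is a minimal sketch using kafka-python and clickhouse-connect. The broker, topic, table, and columns are hypothetical placeholders, not Brainsome's actual setup.

```python
# Hypothetical example: broker, topic, table, and columns are placeholders.
import json

import clickhouse_connect
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                                   # source topic
    bootstrap_servers="kafka:9092",
    group_id="dwh-loader",
    enable_auto_commit=False,
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
ch = clickhouse_connect.get_client(host="clickhouse", port=8123)

batch = []
for msg in consumer:
    batch.append((msg.value["event_id"], msg.value["user_id"], msg.value["amount"]))
    if len(batch) >= 1000:
        # Flush an accumulated batch into the DWH, then commit Kafka offsets.
        ch.insert("dwh.events", batch, column_names=["event_id", "user_id", "amount"])
        consumer.commit()
        batch.clear()
```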
We offer:
🔸Top rate pay
🔸Relocation Assistance
🔸Sports compensation
🔸Language classes compensation
🔸Medical insurance
🔸Free lunches and snacks at the office
🔸Access to mental health service
🔸Cozy office in the center of Limassol
🔸Flexible working hours
🔸Senior-level team
🔸Team buildings and parties
📫Contact: @Svetlana_Kirichenko_ITRecruiter
Forwarded from Svetlana Kirichenko
#vacancy #вакансия #job #opportunity #fulltime #Limassol #Cyprus #office #DataEngineer #DWH #SQL #PostgreSQL #MySQL #Airflow #Airbyte #Kafka #Debezium #ClickHouse
Hi guys! Here’s another amazing opportunity from Brainsome!
🔥Position: DATA ENGINEER
🔥Company: Brainsome
🔥Location: Limassol, office near The Limassol Zoo
Project Description:
Our product is a streamlined platform for managing and optimizing digital advertising, designed to handle high traffic volumes and complex data processing. We manage a system that processes thousands of requests per second, with a total data volume reaching tens of terabytes. The platform includes tools for ad publishers to submit traffic, a traffic exchange engine that enables seamless buying and selling of traffic, and a CRM-like platform for managing advertisers, ad publishers, and key settings within the traffic exchange process. We specialize in solving complex challenges in processing and analyzing large-scale data.
Your Experience:
✔️ 4+ years of experience in data engineering
✔️ Strong proficiency in SQL and experience with databases such as ClickHouse, PostgreSQL, MySQL
✔️ Hands-on experience with tools such as Airflow, Airbyte, Kafka, and Debezium
✔️ Programming skills in Python for data processing
✔️ Experience with version control systems (e.g., Git)
✔️ Understanding of data security and data protection best practices
✔️ A results-oriented approach with attention to detail and problem-solving skills
Your Responsibilities:
🔹Designing, developing, and maintaining data warehouses (DWH) and storage schemas
🔹Creating and maintaining data pipelines to transfer data between source databases and the DWH
🔹Working with various data sources, including relational databases, message queues, external APIs, and Google Sheets, and developing custom connectors for these sources
🔹Collaborating with data analysts to design and maintain data marts and integrate them with BI tools such as Tableau
🔹Contributing to data architecture decisions and assisting in the selection of appropriate technologies for projects
🔹Optimising the performance and scalability of existing databases and data warehouses
🔹Handling large datasets, ensuring their availability, reliability, and integrity
We offer:
🔸Top rate pay
🔸Relocation Assistance
🔸Sports compensation
🔸Language classes compensation
🔸Medical insurance
🔸Free lunches and snacks at the office
🔸Access to mental health service
🔸Cozy office in the center of Limassol
🔸Flexible working hours
🔸Senior-level team
🔸Team buildings and parties
📫Contact: @Svetlana_Kirichenko_ITRecruiter
Forwarded from FxPro Careers
#vacancy #dataengineer #Limassol #fxpro #hiring #applynow
FxPro BI team is growing!
FxPro is a leading online broker, with 115+ international awards and 22+ years in the industry 🚀
🔥 Currently, we have open positions for our BI team.
As a Senior Data Engineer, you will be responsible for designing and optimizing efficient big data structures and implementing data cleansing techniques.
➡️ Senior Data Engineer
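As a generic illustration of the data cleansing mentioned above (not FxPro's actual code), here is a small pandas sketch; the column names and rules are made up for the example.

```python
# Hypothetical example: column names and cleansing rules are illustrative only.
import pandas as pd


def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["email"] = out["email"].str.strip().str.lower()            # normalize text
    out["amount"] = pd.to_numeric(out["amount"], errors="coerce")  # bad values -> NaN
    out = out.dropna(subset=["email", "amount"])                   # drop unusable rows
    out = out.drop_duplicates(subset=["trade_id"], keep="last")    # deduplicate
    return out
```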
What we provide
• Excellent compensation package 💸
• Medical and life insurance 🏥
• Provident fund 💼
• In-house gym with a personal trainer 💪
• Free daily lunch catering, snacks, and beverages 🍽️
• Company discount card for various products & services 🎟️
• 50% discount on language courses 📝
• 21 days of annual leave and 10 days of sick leave annually 🌴
• Shuttle bus service from Limassol 🚌
• Birthday certificate program 🎉
Click the link to start your journey with us!
Forwarded from Dina Ve
#DataEngineer #vacancy #вакансия #ETL #удаленка #DataPlatform #EUJobs #remote #fulltime
🚀 Finom is Hiring: Senior Data Engineer 🚀
Join Finom as a Senior Data Engineer and play a key role in building and optimizing our data infrastructure across our company hubs in Cyprus, Spain, and Poland! If you’re experienced in ETL development, Databricks, and are proactive with a start-up mindset, this role is for you.
🔹 What You’ll Do:
Develop and standardize ETL processes for data marts on the Databricks Data Platform using Spark
Build real-time data marts to support anti-fraud detection and prevention models
Troubleshoot data-related issues, ETL failures, and provide technical support
Establish monitoring systems for ETL processes and data quality on the Databricks platform
Collaborate with data scientists, backend, and BI engineers on cross-functional projects
Identify and implement opportunities for improving the Data Warehouse (DWH) to boost efficiency
Assemble functional and non-functional datasets, including real-time data gathering and updates
Provide mentorship and guidance to middle-level data engineers
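To illustrate the real-time data mart work described above, here is a minimal Spark Structured Streaming sketch (hypothetical, not Finom's code): it reads payment events from Kafka and appends them to a Delta table that anti-fraud models could query. The broker, topic, schema, and paths are placeholders.

```python
# Hypothetical example: broker, topic, schema, and paths are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("payments_stream").getOrCreate()

schema = StructType([
    StructField("payment_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")
    .option("subscribe", "payments")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Append the parsed events to a Delta table that anti-fraud models can read.
query = (
    events.writeStream.format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/checkpoints/payments_rt")
    .start("/mnt/delta/payments_rt")
)
query.awaitTermination()
```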
🔹 Who You Are:
5+ years of experience as an ETL Developer or Data Engineer
Proficient in SQL-based programming and experienced with Python for data manipulation, pipeline development, and backend systems
Skilled in cloud-based data platforms, especially Databricks, Spark, and Spark Streaming
Strong analytical and problem-solving skills for resolving complex data issues
Startup-oriented, proactive, and results-driven, with the ability to work independently
Familiar with stream-processing platforms (Apache Kafka, Amazon Kinesis)
Full working proficiency in English
🔹 Why Join Us?
Collaborative, innovative work environment with growth opportunities
Hybrid work setup, with relocation support provided for Cyprus
Ready to make an impact? Send your CV to @Dina_itrecruiter and become a part of our team!
Forwarded from Svetlana Kirichenko
#vacancy #вакансия #job #opportunity #fulltime #Limassol #Cyprus #office #DataEngineer #DWH #SQL #PostgreSQL #MySQL #Airflow #Airbyte #Kafka #Debezium #ClickHouse
Hi guys! Here’s another amazing opportunity from Brainsome!
🔥Position: DATA ENGINEER
🔥Company: Brainsome
🔥Location: Limassol, office near The Limassol Zoo
Project Description:
Our product is a streamlined platform for managing and optimizing digital advertising, designed to handle high traffic volumes and complex data processing. We manage a system that processes thousands of requests per second, with a total data volume reaching tens of terabytes. The platform includes tools for ad publishers to submit traffic, a traffic exchange engine that enables seamless buying and selling of traffic, and a CRM-like platform for managing advertisers, ad publishers, and key settings within the traffic exchange process. We specialize in solving complex challenges in processing and analyzing large-scale data.
Your Experience:
✔️ 4+ years of experience in data engineering
✔️ Strong proficiency in SQL and experience with databases such as ClickHouse, PostgreSQL, MySQL
✔️ Hands-on experience with tools such as Airflow, Airbyte, Kafka, and Debezium
✔️ Programming skills in Python for data processing
✔️ Experience with version control systems (e.g., Git)
✔️ Understanding of data security and data protection best practices
✔️ A results-oriented approach with attention to detail and problem-solving skills
Your Responsibilities:
🔹Designing, developing, and maintaining data warehouses (DWH) and storage schemas
🔹Creating and maintaining data pipelines to transfer data between source databases and the DWH
🔹Working with various data sources, including relational databases, message queues, external APIs, and Google Sheets, and developing custom connectors for these sources
🔹Collaborating with data analysts to design and maintain data marts and integrate them with BI tools such as Tableau
🔹Contributing to data architecture decisions and assisting in the selection of appropriate technologies for projects
🔹Optimising the performance and scalability of existing databases and data warehouses
🔹Handling large datasets, ensuring their availability, reliability, and integrity
We offer:
🔸Top rate pay
🔸Relocation Assistance
🔸Sports compensation
🔸Language classes compensation
🔸Medical insurance
🔸Free lunches and snacks at the office
🔸Access to mental health service
🔸Cozy office in the center of Limassol
🔸Flexible working hours
🔸Senior-level team
🔸Team buildings and parties
📫Contact: @Svetlana_Kirichenko_ITRecruiter
Forwarded from Michalis Sotiriadis
#vacancy #onsite #cyprus #IT #technology
Title: Data Engineer
Location: Cyprus (Limassol)
We are looking to hire a Data Engineer to join our global team. This position is essential in ensuring that our data infrastructure meets the demands of our stakeholders and end-users. You will rely on your expertise in AWS, data engineering, and development practices to build robust data pipelines and architectures.
Responsibilities:
• Design, develop, and maintain scalable data pipelines and architectures to support Business Intelligence and data analytics.
• Ensure the integration of data from various sources, transforming raw data into usable formats for analysis and reporting.
• Collaborate with business stakeholders to understand their data requirements and ensure their needs are met through effective data solutions.
• Optimize data delivery, re-designing infrastructure for greater scalability and performance where necessary.
• Implement data quality processes to ensure the accuracy and reliability of the data.
• Maintain relationships with stakeholders across various departments, educating them on technological advancements in data engineering and exploring potential opportunities.
• Serve as the main link between stakeholders and internal teams, ensuring clear and effective communication.
• Establish testing and sandboxing environments to ensure reliable and stable product development and deployment.
• Establish proper logging, monitoring, and alerting for our data systems to maintain high system reliability and performance.
• Enable practices of multi-instance environments with load balancing, ensuring redundancy, stability, and scalability.
• Ensure that our systems and processes adhere to the highest industry standards and best practices in data engineering and continuous delivery.
• Stay informed about technology trends and requirements, collaborating with the team to adapt and improve based on this knowledge.
Qualifications:
• Extensive experience with data engineering methodologies and best practices, including ETL (Extract, Transform, Load) processes.
• Advanced skills in AWS services such as S3, Redshift, RDS, Glue, and other related AWS tools.
• Strong knowledge of data modelling, data warehousing, and big data technologies.
• Solid understanding of IT infrastructure, including fundamental troubleshooting capabilities and familiarity with common software suites.
• A talent for identifying and addressing data requirements through a user-growth perspective, ensuring scalability and user satisfaction.
• Proficiency in various analytic tools to monitor, measure, and improve system performance and reliability.
• Good understanding of the infrastructure and data lifecycle, from planning and development to monitoring and maintenance.
• Exceptional ability to prioritize tasks efficiently, managing multiple priorities and projects effectively.
• Minimum of 3 years of experience in a data engineering role, preferably with a focus on AWS technologies.
• Solid knowledge of Python, with experience in data-related libraries and frameworks.
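As a generic illustration of an AWS-based ETL step like those described above (not the employer's actual pipeline), here is a small boto3 + pandas sketch; the bucket names, keys, and transformation are hypothetical.

```python
# Hypothetical example: bucket names, keys, and the transformation are placeholders.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Extract: read a raw CSV export from the landing bucket.
obj = s3.get_object(Bucket="raw-landing-bucket", Key="exports/users.csv")
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Transform: keep active users and normalize column names.
df = df[df["status"] == "active"].rename(columns=str.lower)

# Load: write Parquet to the curated zone for Redshift Spectrum / Athena / Glue.
buf = io.BytesIO()
df.to_parquet(buf, index=False)
s3.put_object(Bucket="curated-bucket", Key="users/users.parquet", Body=buf.getvalue())
```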
APPLY HERE: https://tototheo.bamboohr.com/careers/175
#vacancy #dataengineer #technology #IT
Forwarded from Svetlana Kirichenko
#vacancy #вакансия #job #opportunity #fulltime #Limassol #Cyprus #office #DataEngineer #DWH #SQL #PostgreSQL #MySQL #Airflow #Airbyte #Kafka #Debezium #ClickHouse
Hi guys! Here’s another amazing opportunity from Brainsome!
🔥Position: DATA ENGINEER
🔥Company: Brainsome
🔥Location: Limassol, office near The Limassol Zoo
Project Description:
Our product is a streamlined platform for managing and optimizing digital advertising, designed to handle high traffic volumes and complex data processing. We manage a system that processes thousands of requests per second, with a total data volume reaching tens of terabytes. The platform includes tools for ad publishers to submit traffic, a traffic exchange engine that enables seamless buying and selling of traffic, and a CRM-like platform for managing advertisers, ad publishers, and key settings within the traffic exchange process. We specialize in solving complex challenges in processing and analyzing large-scale data.
Your Experience:
✔️ 4+ years of experience in data engineering
✔️ Strong proficiency in SQL and experience with databases such as ClickHouse, PostgreSQL, MySQL
✔️ Hands-on experience with tools such as Airflow, Airbyte, Kafka, and Debezium
✔️ Programming skills in Python for data processing
✔️ Experience with version control systems (e.g., Git)
✔️ Understanding of data security and data protection best practices
✔️ A results-oriented approach with attention to detail and problem-solving skills
Your Responsibilities:
🔹Designing, developing, and maintaining data warehouses (DWH) and storage schemas
🔹Creating and maintaining data pipelines to transfer data between source databases and the DWH
🔹Working with various data sources, including relational databases, message queues, external APIs, and Google Sheets, and developing custom connectors for these sources
🔹Collaborating with data analysts to design and maintain data marts and integrate them with BI tools such as Tableau
🔹Contributing to data architecture decisions and assisting in the selection of appropriate technologies for projects
🔹Optimising the performance and scalability of existing databases and data warehouses
🔹Handling large datasets, ensuring their availability, reliability, and integrity
We offer:
🔸Top rate pay
🔸Relocation Assistance
🔸Sports compensation
🔸Language classes compensation
🔸Medical insurance
🔸Free lunches and snacks at the office
🔸Access to mental health service
🔸Cozy office in the center of Limassol
🔸Flexible working hours
🔸Senior-level team
🔸Team buildings and parties
📫Contact: @Svetlana_Kirichenko_ITRecruiter
Forwarded from Svetlana Kirichenko
#vacancy #вакансия #office #DataEngineer #DWH #SQL #PostgreSQL #MySQL #Airflow #Airbyte #Kafka #Debezium #ClickHouse
Hi guys! We are looking for talented experts for Brainsome!
🔥Position: DATA ENGINEER
🔥Company: Brainsome
🔥Location: Limassol, office near The Limassol Zoo
Project Description:
Our product is a streamlined platform for managing and optimizing digital advertising, designed to handle high traffic volumes and complex data processing. We manage a system that processes thousands of requests per second, with a total data volume reaching tens of terabytes. The platform includes tools for ad publishers to submit traffic, a traffic exchange engine that enables seamless buying and selling of traffic, and a CRM-like platform for managing advertisers, ad publishers, and key settings within the traffic exchange process. We specialize in solving complex challenges in processing and analyzing large-scale data.
Your Experience:
✔️ 4+ years of experience in data engineering
✔️ Strong proficiency in SQL and experience with databases such as ClickHouse, PostgreSQL, MySQL
✔️ Hands-on experience with tools such as Airflow, Airbyte, Kafka, and Debezium
✔️ Programming skills in Python for data processing
✔️ Experience with version control systems (e.g., Git)
✔️ Understanding of data security and data protection best practices
✔️ A results-oriented approach with attention to detail and problem-solving skills
Your Responsibilities:
🔹Designing, developing, and maintaining data warehouses (DWH) and storage schemas
🔹Creating and maintaining data pipelines to transfer data between source databases and the DWH
🔹Working with various data sources, including relational databases, message queues, external APIs, and Google Sheets, and developing custom connectors for these sources
🔹Collaborating with data analysts to design and maintain data marts and integrate them with BI tools such as Tableau
🔹Contributing to data architecture decisions and assisting in the selection of appropriate technologies for projects
🔹Optimising the performance and scalability of existing databases and data warehouses
🔹Handling large datasets, ensuring their availability, reliability, and integrity
We offer:
🔸Top rate pay
🔸Relocation Assistance
🔸Sports compensation
🔸Language classes compensation
🔸Medical insurance
🔸Free lunches and snacks at the office
🔸Cozy office in the center of Limassol
🔸Flexible working hours
🔸Senior-level team
🔸Team buildings and parties
📫Contact: @Svetlana_Kirichenko_ITRecruiter
Forwarded from Marina Godunova
#cyprus #onsite #vacancy #job #dataengineer
We are looking for a Data Engineer for our company.
📍Location: Cyprus
Work type: on-site
✅️Fluency in spoken and written English, additional languages are an asset.
✅️Experience in handling ambiguity, simplifying requirements, and distilling use cases down to deliver solutions.
✅️Good problem-solving, debugging, and troubleshooting skills.
✅️5+ years of data analysis experience.
✅️High proficiency in data visualization tools such as Excel, Tableau, Power BI, and extensive experience with SQL and NoSQL databases.
✅️Advanced analytical skills with the ability to apply statistical methods effectively, elevating the organization's data maturity and understanding.
✅️Strong fundamentals of working with databases (SQL, NoSQL, etc.).
✅️Ability to apply statistical methods to analyze and interpret data, providing meaningful insights.
If you feel like the right fit, please DM
@Marinarecruiter25
Forwarded from Marina Godunova
#cyprus #onsite #vacancy #job #dataengineer
We are looking for a Data Engineer for our company.
📍Location: Cyprus
Work type: on-site
✅️Fluency in spoken and written English, additional languages are an asset.
✅️Experience in handling ambiguity, simplifying requirements, and distilling use cases down to deliver solutions.
✅️Good problem-solving, debugging, and troubleshooting skills.
✅️5+ years of data analysis experience.
✅️High proficiency in data visualization tools such as Excel, Tableau, Power BI, and extensive experience with SQL and NoSQL databases.
✅️Advanced analytical skills with the ability to apply statistical methods effectively, elevating the organization's data maturity and understanding.
✅️Strong fundamentals of working with databases (SQL, NoSQL, etc.).
✅️Ability to apply statistical methods to analyze and interpret data, providing meaningful insights.
If you feel like the right fit, please DM
@Marinarecruiter25
Forwarded from Svetlana Kirichenko
#vacancy #вакансия #office #DataEngineer #DWH #SQL #PostgreSQL #MySQL #Airflow #Airbyte #Kafka #Debezium #ClickHouse #ML #MachineLearning
Hi guys! We are looking for talented experts for Brainsome!
🔥Position: DATA ENGINEER+ML
🔥Company: Brainsome
🔥Location: Limassol, office near The Limassol Zoo
Project Description:
Our product is a streamlined platform for managing and optimizing digital advertising, designed to handle high traffic volumes and complex data processing. We manage a system that processes thousands of requests per second, with a total data volume reaching tens of terabytes. The platform includes tools for ad publishers to submit traffic, a traffic exchange engine that enables seamless buying and selling of traffic, and a CRM-like platform for managing advertisers, ad publishers, and key settings within the traffic exchange process. We specialize in solving complex challenges in processing and analyzing large-scale data.
Your Experience:
✔️ 4+ years of experience in data engineering
✔️ Strong proficiency in SQL and experience with databases such as ClickHouse, PostgreSQL, MySQL
✔️ Hands-on experience with tools such as Airflow, Airbyte, Kafka, and Debezium
✔️ Programming skills in Python for data processing
✔️ Experience with version control systems (e.g., Git)
✔️ Understanding of data security and data protection best practices
✔️ A results-oriented approach with attention to detail and problem-solving skills
Your Responsibilities:
🔹Data Collection & Ingestion: Acquire and ingest data from diverse sources, including OLTP/OLAP databases, event streams, logs, third-party APIs, and file/object storage.
🔹Connector Configuration: Set up and manage data source connectors to ensure seamless data flow into the Data Lake.
🔹Data Modeling & Documentation: Design, document, and maintain data models to support efficient storage, retrieval, and processing.
🔹ETL/ELT Pipeline Development: Build, optimize, and maintain batch and streaming data pipelines for efficient data processing.
🔹Data Warehousing & Lake Management: Organize and manage Data Warehouses and Data Lakes to ensure scalability, security, and performance.
🔹Data Preparation for BI & ML: Transform and curate data to support Business Intelligence and Machine Learning workflows.
🔹Data Quality & Observability: Implement testing, monitoring, and observability frameworks to ensure data accuracy, reliability, and consistency.
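For the "Data Quality & Observability" item above, here is a minimal illustrative check using clickhouse-connect. The table, column names, and thresholds are hypothetical, not Brainsome's actual code.

```python
# Hypothetical example: table, columns, and thresholds are placeholders.
import clickhouse_connect


def check_events_quality(host: str = "clickhouse") -> None:
    ch = clickhouse_connect.get_client(host=host)

    # Freshness: at least one event in the last hour.
    fresh = ch.query(
        "SELECT max(event_time) >= now() - INTERVAL 1 HOUR FROM dwh.events"
    ).result_rows[0][0]

    # Completeness: share of rows with a missing user_id.
    null_rate = ch.query(
        "SELECT countIf(user_id IS NULL) / count() FROM dwh.events"
    ).result_rows[0][0]

    if not fresh:
        raise ValueError("dwh.events is stale: no rows in the last hour")
    if null_rate > 0.01:
        raise ValueError(f"user_id null rate too high: {null_rate:.2%}")
```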
We offer:
🔸Top rate pay
🔸Relocation Assistance
🔸Sports compensation
🔸Language classes compensation
🔸Medical insurance
🔸Free lunches and snacks at the office
🔸Cozy office in the center of Limassol
🔸Flexible working hours
🔸Senior-level team
🔸Team buildings and parties
📫Contact: @Svetlana_Kirichenko_ITRecruiter
Forwarded from DZIMA
#vacancy #вакансия #DataEngineer #igaming #Cyprus #Middle
Company: Tribe
Location and work format: from Cyprus only (we help with visas, relocation, etc.)
Position: Middle Data Engineer
Compensation: at or above market, discussed individually :)
Product and tasks: a large, exciting online entertainment product from the largest global iGaming holding.
🔻You:
- 3+ years of experience as a Data Engineer
- Experience writing tests
- Experience with Airflow / Dagster
- Advanced skills in writing and optimizing SQL queries
- Experience with MySQL / PostgreSQL
- Experience with ClickHouse / Greenplum
- Experience with Apache Kafka / RabbitMQ
- Experience with dbt
- Excellent soft skills
- Positivity, humor, and rock :)
🔻What you will do:
- Integrate new ETL/ELT processes (see the sketch below)
- Design and evolve the data warehouse architecture
- Develop and evolve internal tooling
- Take part in reviewing colleagues' code
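As a rough illustration of the ETL/ELT and internal tooling work above (not Tribe's actual code), here is a minimal Dagster asset sketch that rebuilds a small reporting mart in ClickHouse; the table names and SQL are hypothetical.

```python
# Hypothetical example: table names and SQL are placeholders.
import clickhouse_connect
from dagster import Definitions, asset


@asset
def daily_bets_mart() -> None:
    """Rebuild a small reporting mart on top of raw bet events."""
    ch = clickhouse_connect.get_client(host="clickhouse")
    ch.command(
        "INSERT INTO mart.daily_bets "
        "SELECT toDate(bet_time) AS day, count() AS bets, sum(amount) AS turnover "
        "FROM raw.bets GROUP BY day"
    )


defs = Definitions(assets=[daily_bets_mart])
```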
🔻We offer:
- An annual performance-based bonus
- Apartment rent compensation of up to 750 EUR
- Relocation and visa assistance for you and your family members
- An interest-free installment plan from the company for buying a car
- Paid vacation in accordance with Cyprus law
- Paid sports activities
- Partial coverage of English classes
- Meals at the office (breakfasts and lunches)
- An office gym
- Paid attendance of exhibitions and conferences
- Technically challenging tasks and the opportunity to grow both vertically and horizontally
📌Contact: @DzmitryS6