#DataEngineer #ContractPosition #Remote #SQL #BigData #FinancialData #Python #BigQuery #Looker #Snowflake
We are looking for a #DataEngineer for contract work with a large American venture capital firm.
The contract runs 6 months, with the possibility of renewal.
The ability to work in their time zones is preferred, but other arrangements are possible.
Tech stack: GCP, ETL, Snowflake, BigQuery, Python, Looker (full-stack coverage required).
English at B2 or above is mandatory.
Working from outside Russia and Belarus is mandatory.
Salary: $5000 – 6500 NET.
Contact: https://t.me/Tary_bird
Description of the Data Engineer contract position:
Location: Preferably San Francisco Bay Area, or remotely in the Pacific or Central Time zone.
Company:
A large venture company with assets of over $11 billion and employees in Austin, London, Menlo Park, and San Francisco.
What you will be doing:
As a data engineer, you will report to the head of data and analytics and help create the entire data structure and infrastructure supporting operations.
Responsibilities:
Design, create, and maintain the data infrastructure needed for optimal extraction, transformation, and loading of data from a variety of sources using SQL, NoSQL, and big data technologies.
Develop and implement data collection systems that integrate various sources, including company proprietary data and third-party data sources.
Create an automated process for collecting and visualizing user engagement data from the CRM/UI (a minimal pipeline sketch follows this list).
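To make the scope concrete, here is a minimal, hypothetical sketch of the kind of pipeline this role would own: a daily Airflow DAG that pulls user engagement events from a CRM endpoint and appends them to BigQuery. Every name in it (endpoint URL, table, DAG id) is illustrative, not a detail of the actual role.

```python
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def extract_engagement(**context):
    # Pull one day of engagement events from a hypothetical CRM endpoint.
    resp = requests.get(
        "https://crm.example.com/api/engagement",  # placeholder URL
        params={"date": context["ds"]},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def load_to_bigquery(**context):
    # Append the extracted rows to a hypothetical BigQuery table.
    rows = context["ti"].xcom_pull(task_ids="extract_engagement")
    errors = bigquery.Client().insert_rows_json("analytics.user_engagement", rows)
    if errors:
        raise RuntimeError(f"BigQuery load failed: {errors}")


with DAG(
    dag_id="user_engagement_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_engagement",
                             python_callable=extract_engagement)
    load = PythonOperator(task_id="load_to_bigquery",
                          python_callable=load_to_bigquery)
    extract >> load
```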
What we are looking for:
Qualifications:
• At least 3 years of experience as a data engineer or full-stack engineer in data warehousing, data monitoring, and building and maintaining ETL pipelines.
• Hands-on experience with the Google Cloud Platform (GCP).
• Deep experience with data pipeline and workflow management tools (e.g., Airflow).
• Solid knowledge and experience with database design, setup, and maintenance.
• Proven ability to work in highly dynamic environments with high product velocity.
• Strong proficiency in Python.
• Strong proficiency in SQL.
• Familiarity with data visualization tools (Looker).
• Experience with Snowflake.
• Experience with BigQuery.
• Strong communication skills, both orally and in writing.
• Familiarity with CRMs (Affinity, Salesforce) and automation tools (Zapier).
Bonus points:
• Experience in venture capital data operations/working with financial data.
• Bachelor's or master's degree in computer science, database management, or a related field.
#DataEngineer #ContractPosition #Remote #GCP #ThoughtSpot #BigData #Affinity #Slack #Looker #Snowflake
We are looking for a #DataEngineer for contract work with a large American venture capital firm.
The contract runs 6 months, with the possibility of renewal.
The ability to work in their time zones is preferred, but other arrangements are possible.
Tech stack: GCP, ETL, Snowflake, Affinity CRM, SQL, Airflow, ThoughtSpot (preferred) or Looker, Python (full-stack coverage required!)
English at B2 or above is mandatory.
Working from outside Russia and Belarus is mandatory.
Salary: $5000 – 6500 NET.
For the most attentive, who actually read the job description: please apply only if you cover the full stack, and send your resume in Word format.
Contact: https://t.me/Tary_bird
Description of the Data Engineer contract position:
Location: Preferably San Francisco Bay Area, or remotely in the Pacific or Central Time zone.
Company:
A large venture company with assets of over $11 billion and employees in Austin, London, Menlo Park, and San Francisco.
What to expect:
As a data engineer, you will report to the head of the data and analytics department and help build the data structures and infrastructure needed to support operations.
Responsibilities:
Developing, creating, and maintaining data infrastructure for optimal extraction, transformation, and loading of data from various sources using SQL, NoSQL, and big data technologies.
Creating and implementing data collection systems that integrate various sources, including company proprietary data and external sources.
Automating the process of collecting and visualizing user engagement data from CRM/UI.
Developing and supporting ETL (Extract, Transform, Load) processes on the Google Cloud platform and in the Snowflake system for efficient data processing.
Extracting data from the Affinity CRM system, ensuring its correctness and relevance.
Integrating notifications into Slack to improve communication within the team (a minimal sketch of this Affinity-to-Slack flow follows this list).
When needed, developing and supporting analytical reports and dashboards in BI tools such as ThoughtSpot (preferred) or Looker to support data-driven decisions.
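As an illustration only (this is not the company's actual integration), the Affinity-to-Slack flow described above could look roughly like this in Python; the API key, webhook URL, and list ID are placeholders:

```python
import os

import requests

AFFINITY_API = "https://api.affinity.co"
API_KEY = os.environ["AFFINITY_API_KEY"]          # placeholder secret
SLACK_WEBHOOK = os.environ["SLACK_WEBHOOK_URL"]   # placeholder secret


def fetch_list_entries(list_id: int) -> list[dict]:
    # The Affinity v1 API uses HTTP Basic auth: empty user, API key as password.
    resp = requests.get(
        f"{AFFINITY_API}/lists/{list_id}/list-entries",
        auth=("", API_KEY),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def notify_slack(message: str) -> None:
    # Slack incoming webhooks accept a simple JSON payload with a "text" field.
    resp = requests.post(SLACK_WEBHOOK, json={"text": message}, timeout=30)
    resp.raise_for_status()


if __name__ == "__main__":
    entries = fetch_list_entries(list_id=12345)  # hypothetical list ID
    notify_slack(f"Affinity sync complete: {len(entries)} entries refreshed.")
```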
What we are looking for:
Qualifications:
• At least 3 years of experience as a data engineer or full-stack engineer in data warehousing, data monitoring, and building and maintaining ETL pipelines, including experience with Google Cloud and Snowflake.
• Deep experience with data pipeline and workflow management tools (Airflow).
• Strong proficiency in SQL and Python.
• Experience with BigQuery.
• Experience extracting data from Affinity CRM and integrating notifications back into Slack.
• Solid knowledge and experience with database design, setup, and maintenance.
• Proven ability to work in highly dynamic environments with high product velocity.
• Strong communication skills, both orally and in writing.
Nice to have:
• Experience with BI tools such as ThoughtSpot (preferred) or Looker.
• Bachelor's or master's degree in computer science, database management, or a related field.
For those who pay close attention and read job descriptions thoroughly: please apply only if you cover the full stack, and send your resume in Word format.
#DataEngineer #ContractPosition #Remote #GCP #ThoughtSpot #BigData #Affinity #Slack #Looker #Snowflake
We are looking for a Data Engineer for contract work with a large American venture capital firm.
The contract runs 6 months, with the possibility of renewal.
The ability to work in their time zones is preferred, but other arrangements are possible.
Tech stack: GCP, ETL, Snowflake, Affinity CRM, SQL, Airflow, ThoughtSpot (preferred) or Looker, Python (full-stack coverage required!!!)
English at B2 or above is mandatory.
Working from outside Russia and Belarus is mandatory.
Salary: $5000 – 6500 NET.
For the most attentive, who actually read the job description, we ask that you: apply only if you cover the full stack, and send your resume in Word format.
Contact: https://t.me/Tary_bird
Description of the Data Engineer contract position:
Location: Preferably San Francisco Bay Area, or remotely in the Pacific or Central Time zone.
Company:
A large venture company with assets of over $11 billion and employees in Austin, London, Menlo Park, and San Francisco.
What to expect:
As a data engineer, you will report to the head of the data and analytics department and help build the data structures and infrastructure needed to support operations.
Responsibilities:
Developing, creating, and maintaining data infrastructure for optimal extraction, transformation, and loading of data from various sources using SQL, NoSQL, and big data technologies.
Creating and implementing data collection systems that integrate various sources, including company proprietary data and external sources.
Automating the process of collecting and visualizing user engagement data from CRM/UI.
Developing and supporting ETL (Extract, Transform, Load) processes on the Google Cloud platform and in the Snowflake system for efficient data processing.
Extracting data from the Affinity CRM system, ensuring its correctness and relevance.
Integrating notifications into Slack to improve communication within the team.
When needed, developing and supporting analytical reports and dashboards in BI tools such as ThoughtSpot (preferred) or Looker to support data-driven decisions.
What we are looking for:
Qualifications:
• At least 3 years of experience as a data engineer or full-stack engineer in data warehousing, data monitoring, and building and maintaining ETL pipelines, including experience with Google Cloud and Snowflake.
• Deep experience with data pipeline and workflow management tools (Airflow).
• Strong proficiency in SQL and Python.
• Experience with BigQuery.
• Experience extracting data from Affinity CRM and integrating notifications back into Slack.
• Solid knowledge and experience with database design, setup, and maintenance.
• Proven ability to work in highly dynamic environments with high product velocity.
• Strong communication skills, both orally and in writing.
Nice to have:
• Experience with BI tools such as ThoughtSpot (preferred) or Looker.
• Bachelor's or master's degree in computer science, database management, or a related field.
#DataEngineer #ContractPosition #Remote #GCP #Snowflake #dbt #Fintech #API #Airflow #GitHub
We are looking for a Data Engineer for contract work with a large American venture capital firm.
The contract runs 6 months, with the possibility of renewal.
The ability to work in their time zones is preferred; a minimum of 4 hours of overlap is required.
Tech stack: GCP, Snowflake, dbt, Airflow, GitHub, API/SFTP, Python, SQL.
English at B2 or above is mandatory.
Fintech/banking experience is mandatory.
Working from outside Russia and Belarus is mandatory.
Salary: $5000 – 7000 NET.
For the most attentive, who actually read the job description:
• Please apply only if you have the required experience across the entire stack (GCP, Snowflake, dbt, Airflow, GitHub, Python and SQL, API/SFTP), as well as fintech/banking experience.
• Send your resume in Word format.
Thank you!
Contact: https://t.me/Tary_bird
____________________________________
Description of the Data Engineer contract position:
Location: Preferably Pacific Time Zone, with at least 4 hours of overlap with their working hours.
Company:
A large venture company with assets of over $11 billion and employees in Austin, London, Menlo Park, and San Francisco.
What to expect:
As a data engineer, you will report to the head of the data and analytics department and help build the data structures and infrastructure needed to support operations in the fintech/banking sector.
Responsibilities:
• Developing, creating, and maintaining data infrastructure for optimal extraction, transformation, and loading of data from various sources using SQL and big data technologies.
• Creating and implementing data collection systems that integrate various sources, including company proprietary data and external sources.
• Automating the process of collecting and visualizing user engagement data.
• Developing and supporting data processes on the Google Cloud platform and in the Snowflake system for efficient data processing.
• Extracting data via API/SFTP and ensuring its correctness and relevance (a minimal intake sketch follows this list).
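For illustration, here is a minimal sketch of the API/SFTP intake path, assuming a daily CSV drop staged into Snowflake; hosts, paths, credentials, and table names are all placeholders, not details from the role:

```python
import os

import paramiko
import snowflake.connector

SFTP_HOST = "sftp.partner-bank.example.com"  # placeholder host


def download_daily_file(remote_path: str, local_path: str) -> None:
    # Fetch the partner's daily extract over SFTP.
    transport = paramiko.Transport((SFTP_HOST, 22))
    transport.connect(username=os.environ["SFTP_USER"],
                      password=os.environ["SFTP_PASSWORD"])
    try:
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.get(remote_path, local_path)
    finally:
        transport.close()


def stage_into_snowflake(local_path: str) -> None:
    # PUT the file to the table's internal stage, then COPY it into the table.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database="RAW", schema="PARTNER",  # placeholder database/schema
    )
    try:
        cur = conn.cursor()
        cur.execute(f"PUT file://{local_path} @%daily_transactions")
        cur.execute(
            "COPY INTO daily_transactions "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()


if __name__ == "__main__":
    download_daily_file("/outbound/transactions_2024-01-01.csv",
                        "/tmp/transactions.csv")
    stage_into_snowflake("/tmp/transactions.csv")
```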
What we are looking for:
Qualifications:
• Fintech/Bank working experience (must have).
• Minimum 6 years of professional experience as a data engineer/data analyst in the fintech/banking sector.
• Deep knowledge of GCP, Snowflake, dbt, Airflow, and GitHub (see the DAG sketch after this list).
• Strong proficiency in Python and SQL.
• Experience in data intake via API/SFTP.
• Attention to detail and strong communication skills, both orally and in writing.
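As a rough sketch of how these stack pieces typically fit together (an assumption about the setup, not a description of the team's actual repo), Airflow can schedule a dbt project deployed from GitHub; the paths and DAG id are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_nightly",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # The dbt project is assumed to be checked out from GitHub to this path.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --profiles-dir .",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --profiles-dir .",
    )
    dbt_run >> dbt_test
```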
Nice to have:
• Bachelor's or master's degree in computer science, database management, or a related field.
Please send the completed application form together with your CV.
• How many years of experience do you have with Google Cloud Platform (GCP)?
• How many years of experience do you have with Snowflake?
• How many years of experience do you have with dbt?
• How many years of experience do you have with Airflow?
• How many years of experience do you have with GitHub?
• Do you have experience working with data intake through API/SFTP? If yes, please describe.
• How many years of experience do you have with Python?
• How many years of experience do you have with SQL?
• What is your expected salary in USD?