It's an exciting time to join Metrika! A Series A-funded startup in growth mode with teammates across the US, Canada, UK, and Europe, we are building the world's premier operational intelligence platform for blockchain networks. Metrika partners with blockchain protocols, foundations, and node runners to help them and their community members analyze individual and network-wide metrics of their Distributed Ledger Technology (DLT) networks to maintain and improve their performance, security, and reliability.

These are the early days of our platform, and as a Senior Data Engineer you will contribute to, influence, and take ownership of significant parts of our systems. Our goal is to build a very high-performance platform, capable of analyzing thousands of transactions across multiple blockchain networks in real time.

If you are a Senior Data Engineer with a solid understanding of data lakes, data warehouses, ETL, and distributed systems, a passion for your work, and a desire to work with a geographically distributed team in an emerging industry, join us! No prior blockchain experience is necessary, but an interest in learning and being deeply immersed is.

What you'll be doing:

* Designing, implementing and maintaining data processing pipelines. This includes ingestion, clean-up, transformation, aggregation, and batch and streaming jobs, as well as managing the data lifecycle to ensure affordable and performant long-term storage across our data stores and data lake. You will work closely with our software engineers, SREs and our Analytics team to make sure data flows smoothly across Metrika and beyond, to our customers and users.

* Working under a Scrum or Kanban framework.

* Owning your work. This means being proud of your work, actively striving for excellence, observing the best practices of your craft, and always aiming to improve your skills.

* Understanding, participating and contributing to the company goals, regardless of your role. Metrika is a small company with a very inclusive culture. We are looking for people who share those values with us.

Please note: Our Engineering team is predominantly based in Europe and the eastern United States. This position is currently open to those resident and currently able to work in the European Economic Area (EU, Norway, Liechtenstein), Switzerland, the UK, as well as the eastern United States/Canada (UTC-4/UTC-5 time zones).

Metrika Inc. is an Equal Opportunity employer. All applicants will be considered without regard to race, color, national origin, ethnicity, gender, disability, sexual orientation, gender identity, or religion.

We are looking for individuals with:

* A Bachelor's degree in Computer Science, Electrical Engineering, Physics or Mathematics. Master's or higher degrees preferred.

* Multi-year experience in data engineering in large-scale production environments.

* Familiarity with our data stack. At Metrika we mostly use Python for data processing; most of our ETL/data processing jobs are written in Python. You will need some familiarity with scheduling systems (e.g. Airflow, Luigi), data transformation tools (e.g. dbt), distributed compute frameworks (e.g. Apache Spark, Apache Flink, Ray), and a solid understanding of the concepts of data governance and data lineage/provenance.

* An excellent understanding of TDD, agile development methodology and version control.

* The ability to function autonomously to solve problems and deliver working software. Our remote environment and geographic distribution require people who can work well on their own.

* The ability to communicate well with your team, both interactively and asynchronously, and to be a positive, constructive team member.

You'll be a great fit if you have:

* Worked in and contributed to a Big Data production environment handling multiple GB of data per day.

* Good knowledge of Python.

* Experience with Apache Spark, Apache Flink, Ray and Airflow.

* Experience with using and building CI/CD pipelines.

* Experience with Docker/Kubernetes or serverless environments.

* Experience with SQS/SNS, Apache Kafka, RabbitMQ or other message brokers.

* Experience with public cloud providers, e.g. AWS, GCP, Azure, DigitalOcean.

* Experience with blockchain systems.

Once you submit your application, you will receive an automated email from the recruitee.com domain within a few minutes acknowledging that we have received your application. If you do not receive this email within a few minutes, please check your spam folder or other filtered folders. To ensure our future communications reach you, please add emails from the recruitee.com domain to your safe list.

#Salary and compensation
No salary data was published by the company, so we estimated a salary range based on similar Python, Serverless, Cloud, Node, Senior and Engineer jobs:
$65,000 — $110,000/year
#Benefits
* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)
#Location
Remote job
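The pipeline stages this role describes (ingestion, clean-up, transformation, aggregation) can be sketched as a minimal batch job in plain Python. This is an illustrative sketch only; the sample records, field names, and functions are hypothetical, not Metrika's actual pipeline:

```python
# Illustrative batch ETL sketch: ingest -> clean -> aggregate.
# All names and sample data are hypothetical, for illustration only.
from collections import defaultdict


def ingest():
    # Stand-in for reading raw transaction metrics from a data lake.
    return [
        {"network": "A", "latency_ms": "120"},
        {"network": "A", "latency_ms": None},  # dirty record
        {"network": "B", "latency_ms": "95"},
    ]


def clean(rows):
    # Drop records with missing fields and cast string values to int.
    return [
        {"network": r["network"], "latency_ms": int(r["latency_ms"])}
        for r in rows
        if r["latency_ms"] is not None
    ]


def aggregate(rows):
    # Compute average latency per network.
    samples = defaultdict(list)
    for r in rows:
        samples[r["network"]].append(r["latency_ms"])
    return {net: sum(v) / len(v) for net, v in samples.items()}


metrics = aggregate(clean(ingest()))
print(metrics)  # {'A': 120.0, 'B': 95.0}
```

In production, each stage would typically be a separate task in a scheduler such as Airflow, so failures can be retried per stage rather than rerunning the whole job.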
Please mention that you found the job on Remote OK; this helps us get more companies to post here. Thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Also never pay for trainings you have to do. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name for the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, be careful! Read more to avoid scams. When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.
CryptoQuant operates an easy and standardized Web3 data analysis service to provide institutional and retail investors in 150 countries around the world with the data analysis tools necessary for investing in Web3 assets.

The CryptoQuant Tech team collects, processes, and analyzes data on transactions of blockchain assets such as BTC and ETH, as well as various types of financial activities (e.g., DeFi) that take place on the blockchain, and makes it easy for investors to understand the data.

We are looking for a data engineer who wants to work with a large amount of real-time blockchain transaction data and experience the fascination of blockchain data at a deeper level than anyone else.

Roles & Responsibilities

* Develop data pipelines to collect and process blockchain data
* Extract raw data from various data sources
* Build data pipelines to process and load data
* Manage cloud infrastructure for data pipeline operation
* Develop CI/CD and monitoring systems for building automated pipelines
* Build a data warehouse for blockchain data analysis
* Develop blockchain domain data marts
* Manage data quality, such as data consistency verification

Possible data engineering work includes:

* Building data pipelines for Bitcoin, EVM (e.g. Ethereum), and non-EVM blockchains
* Developing data marts for blockchain data analysis
* Developing data APIs for serving blockchain data

While carrying out these tasks, you can expect the following growth:

* You can accumulate experience operating large-scale data warehouses while developing TB-scale or greater Web3 data.
* You can directly participate in the design and implementation of large-scale parallel processing and efficient workflows while designing high-throughput batch processing and low-latency real-time streaming ETL infrastructure.
* You can quickly increase your overall understanding of the crypto and Web3 ecosystem by collaborating with globally leading Web3 data scientists and analysts.
* With a customer-centric mindset, you can actively execute projects with project ownership.

Requirements

* Fits well with CryptoQuant culture (https://bit.ly/3r4TH1v)
* Experience with data processing using Python and SQL
* Experience with workflow management platforms (e.g. Apache Airflow)
* Experience building and operating data marts for various purposes
* Experience with Docker containers or Kubernetes
* Experience in building CI/CD
* Experience with Git/GitHub
* Experience with Python web frameworks (e.g. Flask, FastAPI)

Nice to have

* Experience in designing, building, and operating real-time data processing (e.g. Kafka) or micro-batch data processing pipelines in a lead role
* Experience building, developing, and operating Kubernetes clusters at work
* Experience working with data warehousing solutions (e.g. Hadoop, Spark)
* Experience working with data discovery platforms (e.g. DataHub, Amundsen)
* At least 3 years of relevant experience with RDBs, NoSQL, etc.
* Experience using and managing cloud services (e.g. AWS, GCP) or data platforms (e.g. Snowflake)
* Knowledge of various blockchain concepts and structures
* Experience with blockchain node (e.g. Bitcoin, Ethereum) operations or on-chain data
* Strong communication skills and good written and spoken English

Work location and working hours

Location: Remote
Working hours: 40 hours per week (1–3 core hours required for collaboration)

Hiring Process

Review application > Take-home assignment > Technical interview > Cultural-fit interview > Reference check > Job offer > Hiring

Other Notices

* Individual notifications regarding the results of each stage and the interview schedule will be sent via email.
* Results of document submissions will be communicated within a week after submission. If there is any delay compared to the timeline announced in the advertisement, we will contact you individually.
* Pre-tasks will be sent in a passing email upon document approval and must be submitted within 7 days.
* Steps following the pre-tasks will be conducted through schedule coordination.
* Job interviews will be based on the information provided in your submitted documents and the results of pre-tasks.
* Cultural-fit interviews assess the cultural fit of applicants based on the corporate culture of CryptoQuant.

#Salary and compensation
No salary data was published by the company, so we estimated a salary range based on similar Web3, Design, Crypto, Python, Docker, Cloud, Git, Node, API and Engineer jobs:
$72,500 — $117,500/year
#Location
Worldwide
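The data-quality work this posting mentions ("data consistency verification") can be illustrated with a minimal check of the kind commonly run over ingested chain data: verifying that collected block heights are contiguous, with no gaps. The function and sample heights are hypothetical, for illustration only:

```python
# Hypothetical data-consistency check: find missing block heights
# in an ingested range, e.g. to detect gaps before loading a data mart.
def find_gaps(heights):
    """Return block heights missing between min(heights) and max(heights)."""
    present = set(heights)
    lo, hi = min(present), max(present)
    return [h for h in range(lo, hi + 1) if h not in present]


# Height 102 was never ingested, so it is reported as a gap.
assert find_gaps([100, 101, 103, 104]) == [102]
# A contiguous range has no gaps.
assert find_gaps([7, 8, 9]) == []
```

A real pipeline would run a check like this as a validation task after each ingestion batch and alert or re-fetch when gaps are found.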
Voltron Data is hiring a Remote Senior C++ Software Engineer, Data Engines
We are looking for a highly motivated Senior C++ Software Engineer for Data Engines. You'll have the opportunity to work directly on Theseus, the accelerator-native data processing engine built for composability. You will work closely with Voltron Data development teams to build, optimize and maintain our data execution framework: adding new features, making it run faster and scale further, and even contributing to new core architectural components that will enable the engine to run at petabyte scale.

Why work at Voltron Data?

* We are Going for Impact: We are a Series A, venture-backed startup assembling a global team to build a new foundation for data analytics with Apache Arrow. This foundation will usher in a wave of innovation in data processing that can take full advantage of the speed and efficiency offered by modern hardware.

* We are Committed to Bridging Open Source Communities: We are a collection of open source maintainers who have been driving open source ecosystems over the last 15 years, particularly in the C++, Python, and R programming ecosystems.

* We are Building a Diverse, Inclusive Company: We are creating a representative, equitable, and respectful workplace that prioritizes employee growth. Everyone at Voltron Data is bought into the company's success; all voices are critical to shaping the organization's future.

Timeline:

Below is a rough timeline of where you can expect to be at different points during your career path starting in this position.

Upon joining:

* Spending time learning about Apache Arrow, the compute primitives we use in Theseus, the query parser and optimizer, and other foundational components.

* Diving into the data processing engine architecture: how all the different components interact with each other and how data flows through the compute graph.

* Understanding memory management mechanics, including spilling memory from GPU to host and disk.

* Learning and embracing the software development culture at Voltron Data.

Within a month:

* Profiling single-node and distributed query executions and analyzing the engine telemetry to better understand how the engine works and how to solve distributed engine issues.

* Diving deep into the various distributed relational algebra algorithms to understand how they work and how they can be improved.

* Working with the team on fixing bugs, implementing simple optimizations, or code refactoring projects.

Within 6 months:

* Building new relational algebra components to expand SQL coverage or DataFrame functionality coverage.

* Making small improvements to more sophisticated engine components such as resource management, task scheduling, and fault tolerance.

Within 12 months:

* Proposing and implementing core architecture improvements to the engine.

* Working on challenging tasks such as language-agnostic user-defined functions, multi-query concurrency, and multi-tenancy.

* Integrating the engine with components and features developed by other teams in the company to provide enterprise-grade customer experiences.

Previous experience that could be helpful:

* Experience with data processing engines or frameworks

* Experience in distributed and multi-threaded systems

* Experience in hardware resource management, including memory and thread pools

* Working with SQL and non-SQL systems and their computational abstractions

* Developing in C++, especially modern C++

* Developing for multiple types of hardware (i.e. CPU, GPU)

US Compensation: The salary range for this role is between $171,000.00 and $210,000.00. We have a global market-based pay structure which varies by location.
Please note that the base pay range is a guideline; for candidates who receive an offer, the exact base pay will vary based on factors such as actual work location and the skills and experience of the candidate. This position is also eligible for additional incentives such as equity awards.