Filecoin is hiring a Remote Marketing and Events Summer Intern
About Filecoin Foundation
Filecoin Foundation (FF) is an independent organization that facilitates governance of the Filecoin network, funds critical development projects, supports the growth of the Filecoin ecosystem, and advocates for Filecoin and the decentralized web. In 2017, the creators of Filecoin envisioned that an independent Filecoin Foundation would serve as the long-term governance body for the Filecoin ecosystem. They gave the Foundation the mandate to "grow an open ecosystem for decentralized storage" and to "give developers an open and sustainable platform to build, enhance and monetize those services." They wanted the Foundation to be modeled on foundations for other open source projects, such as the Apache Software Foundation, the Mozilla Foundation, and the Linux Foundation. The Filecoin Foundation operates independently of Protocol Labs, the organization that designed and built the Filecoin network. As a member of our early-stage team, you will have the opportunity to help define our growth as the organization scales. Filecoin Foundation is a fully remote organization and supports a remote, collaborative, and inclusive working culture from anywhere in the world.

Filecoin Foundation is looking for a talented Marketing and Events Intern. The ideal candidate has sharp written and verbal communication skills, attention to detail, and curiosity about careers in Web3 communications, marketing, and events.

You will work alongside the Events and Marketing team at Filecoin Foundation as we scale our community event program. You will support our participation at industry events with duties such as logistics management, onsite staffing, and social/marketing needs. You will also help create engaging and informative content, including blog posts, social posts, and external messaging, to educate our audience about the benefits of decentralized storage on the Filecoin network.

This role is non-technical and requires keen attention to detail, prioritization, communication, and follow-through. Our team is diverse and interdisciplinary, and we welcome your interesting, non-traditional, and/or early-career experiences. The kind of person who loves writing, solving puzzles, and having their hands in multiple projects at one time is perfect for this role!

Responsibilities

* Support day-to-day operations of events, including (but not limited to): coordinating end-to-end workflow for programs, managing event calendaring for the team, managing the event website and keeping it up to date, overseeing logistics and shipping, and keeping trackers and reports up to date
* Research and create compelling content such as blog posts, articles, whitepapers, and social media posts to educate our audience about the benefits of decentralization and the Filecoin network
* Assist in the coordination and execution of content marketing campaigns to increase brand awareness and engagement
* Maintain and organize event productivity spaces (Notion/Airtable/Google Drive/Slack) and team calls/notes continuously
* Support various event aspects such as staffing, runbooks, registration, and post-event reconciliation
* Research industry calendars of major Crypto/Web3/Web2 events and update our databases
* Manage web and marketing content to support the growth of the Filecoin Ecosystem Explorer page
* Support all Orbit community program administrative duties such as invoicing, quarterly planning, event approvals, etc.
* Maintain daily communication with the events, communications, and marketing teams on event updates
* Create post-event registration and attendee reports
* Organize and manage vendor and partner contacts in the CMS
* Stay up to date with industry trends to ensure content is current and relevant
* Help manage and maintain content calendars and publishing schedules
* Support the translation of technical concepts into easily understandable content

Your Profile

* Current or recent enrollment in a graduate or undergraduate degree program, with an interest and aptitude to work in event planning, customer service, marketing, communications, PR, hospitality, or Web3
* Strong written and verbal communication skills and interpersonal skills
* Exceptional organization skills and a keen attention to detail
* Flexible and adaptable; able to seamlessly switch priorities as needed and balance short-term deliverables with long-term strategic goals
* Team player who thrives in a high-energy, collaborative work environment
* Bridge-builder within and between organizations
* Ability to keep track of multiple workstreams
* Ability to see the big picture even while operating deep in the weeds
* Event logistics and project management experience a plus
* Knowledge of Customer Relationship Management systems and event registration tools a plus

A reasonable hourly rate estimate of the current range for this position is $20/hr - $22/hr.

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar Web3 and Marketing jobs:

$82,500 — $145,000/year
#Benefits

* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)
#Location
Worldwide
Please mention that you found the job on Remote OK; this helps us get more companies to post here. Thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Also never pay for trainings you have to do. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name for the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, be careful! Read more to avoid scams. When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.
It's an exciting time to join Metrika! A Series A-funded startup with teammates across the US, Canada, UK, and Europe, we are building the world's premier risk management and compliance platform for digital assets. Metrika works with financial institutions and regulators to help them identify, measure, manage, and monitor any risk related to blockchain networks and digital assets.

As a Senior Data Engineer you will be able to contribute, influence, and take ownership of significant parts of our systems. Our goal is to build a very high-performance platform, capable of analyzing thousands of data points across multiple blockchain networks in real time.

If you are a Senior Data Engineer with a solid understanding of data lakes, data warehouses, ETL, and distributed systems, have passion for your work, and would love to work with a geographically distributed team in an emerging industry, join us! No prior experience in blockchain/digital assets is necessary, but an interest in learning and being deeply immersed is.

What you'll be doing:

* Designing, implementing and maintaining data processing pipelines. This includes ingestion, clean-up, transformation, aggregation, and batch and streaming jobs, as well as managing the data lifecycle to ensure affordable and performant long-term storage across our data stores and data lake. You will work closely with our software engineers, SREs, and our Analytics team to make sure data flows smoothly across Metrika and beyond, to our customers and users.
* Working under a Scrum or Kanban framework.
* Owning your work. This means being proud of your work, actively striving for excellence, observing the best practices of your craft, and always aiming to improve your skill.
* Understanding, participating in, and contributing to the company goals, regardless of your role. Metrika is a small company with a very inclusive culture. We are looking for people who share those values with us.

Please note: Our Engineering team is predominantly based in Europe. This position is currently open to those resident and currently able to work in the European Economic Area (EU, Norway, Liechtenstein), Switzerland, and the UK.

Metrika Inc. is an Equal Opportunity employer. All applicants will be considered without regard to race, color, national origin, ethnicity, gender, disability, sexual orientation, gender identity, or religion.

We are looking for individuals with:

* A Bachelor's degree in Computer Science, Electrical Engineering, Physics or Mathematics. Master's or higher degrees preferred.
* Multi-year experience in data engineering in large-scale production environments.
* A solid understanding of the concepts of data governance and data lineage/provenance. At Metrika we mostly use Python for data processing: most of our ETL/data processing jobs are written in Python.
* Prior experience with scheduling systems (e.g., Airflow).
* Proven experience with databases (SQL and NoSQL, such as Postgres and MongoDB), distributed query engines (e.g., Trino), and distributed computing frameworks (e.g., Ray).
* An excellent understanding of TDD, agile development methodology, and version control.
* The ability to function autonomously to solve problems and deliver working software. Our remote environment and geographic distribution require people who can work well on their own.
* The ability to communicate well with your team, both interactively and asynchronously, and to be a positive, constructive team member.

You'll be a great fit if you have:

* Worked in and contributed to a Big Data production environment handling multiple GB of data per day.
* Good knowledge of Python.
* Experience with Trino and Airflow.
* Experience with using and building CI/CD pipelines.
* Experience with Docker/Kubernetes or serverless environments.
* Experience with SQS/SNS, Apache Kafka, RabbitMQ, or other brokers.
* Experience with public cloud providers, e.g. AWS, GCP, Azure, DigitalOcean, etc.

Perks & Benefits

* Competitive salary and equity compensation
* Medical insurance (based on location)
* All-remote
* Metrika offers a generous budget for your home office
* Supported attendance at blockchain conferences
* Budget to meet other Metrikaers in your locality, if applicable

Once you submit your application, you will receive an automated email from the recruitee.com domain within a few minutes acknowledging we have received your application. If you do not receive this email within a few minutes, please check your spam folder or other filtered folders. And to ensure our future communications reach you, please add emails from the recruitee.com domain to your safe list.

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar Python, Serverless, Cloud, NoSQL, and Senior Engineer jobs:

$60,000 — $110,000/year
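The pipeline work this posting describes (ingestion, clean-up, transformation, aggregation) can be sketched as a toy batch job in plain Python. This is purely illustrative and not from the posting; the record fields and values here are hypothetical:

```python
# Illustrative only: a toy batch ETL mirroring the ingestion -> clean-up ->
# transformation -> aggregation stages named in the posting.
from collections import defaultdict

def ingest():
    # Stand-in for reading raw events from a blockchain data source.
    return [
        {"network": "ethereum", "value": "12.5"},
        {"network": "ethereum", "value": None},  # dirty record
        {"network": "polygon", "value": "3.0"},
    ]

def clean(records):
    # Drop records with missing values.
    return [r for r in records if r["value"] is not None]

def transform(records):
    # Parse string values into floats.
    return [{**r, "value": float(r["value"])} for r in records]

def aggregate(records):
    # Sum values per network.
    totals = defaultdict(float)
    for r in records:
        totals[r["network"]] += r["value"]
    return dict(totals)

def run_pipeline():
    return aggregate(transform(clean(ingest())))

print(run_pipeline())  # {'ethereum': 12.5, 'polygon': 3.0}
```

In production, stages like these would typically run as separate tasks in a scheduler such as Airflow (which the posting mentions), rather than as chained function calls.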
#Location
Remote job
It's an exciting time to join Metrika! A Series A-funded startup in growth mode with teammates across the US, Canada, UK, and Europe, we are building the world's premier operational intelligence platform for blockchain networks. Metrika partners with blockchain protocols, foundations, and node runners to help them and their community members analyze individual and network-wide metrics of their Distributed Ledger Technology (DLT) networks to maintain and improve their performance, security, and reliability.

These are the early days of our platform, and as a Senior Data Engineer you will be able to contribute, influence, and take ownership of significant parts of our systems. Our goal is to build a very high-performance platform, capable of analyzing thousands of transactions across multiple blockchain networks in real time.

If you are a Senior Data Engineer with a solid understanding of data lakes, data warehouses, ETL, and distributed systems, have passion for your work, and would love to work with a geographically distributed team in an emerging industry, join us! No prior experience in blockchain is necessary, but an interest in learning and being deeply immersed is.

What you'll be doing:

* Designing, implementing and maintaining data processing pipelines. This includes ingestion, clean-up, transformation, aggregation, and batch and streaming jobs, as well as managing the data lifecycle to ensure affordable and performant long-term storage across our data stores and data lake. You will work closely with our software engineers, SREs, and our Analytics team to make sure data flows smoothly across Metrika and beyond, to our customers and users.
* Working under a Scrum or Kanban framework.
* Owning your work. This means being proud of your work, actively striving for excellence, observing the best practices of your craft, and always aiming to improve your skill.
* Understanding, participating in, and contributing to the company goals, regardless of your role. Metrika is a small company with a very inclusive culture. We are looking for people who share those values with us.

Please note: Our Engineering team is predominantly based in Europe and the eastern United States. This position is currently open to those resident and currently able to work in the European Economic Area (EU, Norway, Liechtenstein), Switzerland, and the UK, as well as the eastern United States/Canada (UTC-4/UTC-5 timezone).

Metrika Inc. is an Equal Opportunity employer. All applicants will be considered without regard to race, color, national origin, ethnicity, gender, disability, sexual orientation, gender identity, or religion.

We are looking for individuals with:

* A Bachelor's degree in Computer Science, Electrical Engineering, Physics or Mathematics. Master's or higher degrees preferred.
* Multi-year experience in data engineering in large-scale production environments.
* Some familiarity with scheduling systems (e.g. Airflow, Luigi, etc.), data transformation tools (e.g. dbt), and distributed compute frameworks (e.g. Apache Spark, Apache Flink, Ray, etc.), plus a solid understanding of the concepts of data governance and data lineage/provenance. At Metrika we mostly use Python for data processing; most of our ETL/data processing jobs are written in Python.
* An excellent understanding of TDD, agile development methodology, and version control.
* The ability to function autonomously to solve problems and deliver working software. Our remote environment and geographic distribution require people who can work well on their own.
* The ability to communicate well with your team, both interactively and asynchronously, and to be a positive, constructive team member.

You'll be a great fit if you have:

* Worked in and contributed to a Big Data production environment handling multiple GB of data per day.
* Good knowledge of Python.
* Experience with Apache Spark, Apache Flink, Ray, and Airflow.
* Experience with using and building CI/CD pipelines.
* Experience with Docker/Kubernetes or serverless environments.
* Experience with SQS/SNS, Apache Kafka, RabbitMQ, or other brokers.
* Experience with public cloud providers, e.g. AWS, GCP, Azure, DigitalOcean, etc.
* Experience with blockchain systems.

Once you submit your application, you will receive an automated email from the recruitee.com domain within a few minutes acknowledging we have received your application. If you do not receive this email within a few minutes, please check your spam folder or other filtered folders. And to ensure our future communications reach you, please add emails from the recruitee.com domain to your safe list.

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar Python, Serverless, Cloud, Node, and Senior Engineer jobs:

$65,000 — $110,000/year
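The scheduling systems this posting names (Airflow, Luigi) share one core idea: run tasks in dependency order. As a toy illustration (not from the posting; the pipeline graph below is hypothetical), that ordering is a topological sort:

```python
# Illustrative only: the core of schedulers like Airflow/Luigi is running
# tasks in dependency order. This toy resolver topologically sorts a
# hypothetical pipeline graph and detects cycles.
def topo_order(deps):
    """deps maps task -> set of tasks it depends on."""
    order, done = [], set()

    def visit(task, stack=()):
        if task in done:
            return
        if task in stack:
            raise ValueError(f"dependency cycle at {task!r}")
        for dep in deps.get(task, set()):
            visit(dep, stack + (task,))
        done.add(task)
        order.append(task)

    for task in deps:
        visit(task)
    return order

# Hypothetical pipeline: each stage depends on the previous one.
pipeline = {
    "ingest": set(),
    "clean": {"ingest"},
    "transform": {"clean"},
    "aggregate": {"transform"},
}
print(topo_order(pipeline))  # ['ingest', 'clean', 'transform', 'aggregate']
```

Real schedulers add retries, backfills, and distributed execution on top of this ordering, but the dependency-resolution step is the same shape.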
#Location
Remote job
Wealthsimple is hiring a Remote Analytics Developer, Market Risk DSE
Your career is an investment that grows over time!

Wealthsimple is on a mission to help everyone achieve financial freedom by reimagining what it means to manage your money. Using smart technology, we take financial services that are often confusing, opaque and expensive and make them transparent and low-cost for everyone. We're the largest fintech company in Canada, with over 3 million users who trust us with more than $30 billion in assets.

Our teams ship often and make an impact with groundbreaking ideas. We're looking for talented people who keep it simple and value collaboration and humility as we continue to create inclusive and high-performing teams where people can be inspired to do their best work.

About the team

The Risk Data Science & Engineering team consists of data scientists and analytics developers with diverse experiences and educational backgrounds. The team is responsible for delivering high-quality measurement tools and predictive analytics to support data-driven decision making across all risk stripes. The team is staffed with seasoned data and modeling experts and is well-supported by product managers, software developers, and risk strategists.

We are hiring an Analytics Developer on the Market Risk DSE team. In this role you will be responsible for building robust, efficient, and integrated data models that enable our risk analytics and machine learning. These data models will serve as the foundation of our risk strategies, allowing the team to monitor and detect risk trends, build both traditional and machine learning models, and launch new financial products. As an Analytics Developer, you will lead the charge on metric definition and provide risk strategy inputs. A successful Analytics Developer is able to blend business acumen and software engineering best practices while effectively communicating with stakeholders.

In this role, you will have the opportunity to:
* Build data models in the cloud data warehouse that will be used as the source of truth for analytics and machine learning
* Develop comprehensive reports and dashboards to support data-driven decision making
* Implement and maintain robust data controls to ensure data integrity and accuracy
* Apply software engineering best practices like version control and continuous integration to the analytics code base
* Ideate and implement new models and analytics
* Translate business requirements into data models that will help stakeholders answer key business questions
* Ensure models are well tested, documented, and maintained
* Believe that simple is better; Occam's razor is your friend
* Take ownership and ship it; release incrementally and iteratively
* Teach and learn from your teammates. We value making others successful!

Skills we are looking for:
* Strong technical skills in manipulating large data sets with complex SQL and Python
* Keen interest in financial markets
* Experience with market data, market risk models, trading risk management, or trading operations would be an asset
* Experience with dbt
* Expertise in data visualization. Experience with data visualization tools on the cloud (e.g. Apache Superset / QuickSight) is a plus
* Comfortable with software engineering best practices like version control and using Git
* Experience with cloud data warehousing (e.g. Snowflake, BigQuery, Redshift)
* Proficient understanding of data warehousing methodologies and concepts (Kimball, Inmon, etc.)
* Excellent communication skills, in order to translate business requirements into data models and maintain clear documentation
* Able to build and maintain multi-functional relationships with various teams across the business
* Experience with graph databases is a plus

Why Wealthsimple?
* Competitive salary with top-tier health benefits and life insurance
* Retirement savings matching plan using Wealthsimple Work
* 20 vacation days per year and unlimited sick and mental health days
* Up to $1,500 per year towards each of the wellness and professional development budgets
* 90 days away program: employees can work internationally in eligible countries for up to 90 days per calendar year
* A wide variety of peer- and company-led Employee Resource Groups (e.g. Rainbow, Women of Wealthsimple, Black @ WS)
* Company-wide wellness days off scheduled throughout the year

We're a remote-first team, with over 1000 employees coast to coast in North America. Be a part of our Canadian success story and help shape the financial future of millions. Join us!

Read our Culture Manual and learn more about how we work.

DEI Statement
At Wealthsimple, we are building products for a diverse world and we need a diverse team to do that successfully. We strongly encourage applications from everyone regardless of race, religion, colour, national origin, gender, sexual orientation, age, marital status, or disability status.

Accessibility Statement
Wealthsimple provides an accessible candidate experience. If you need any accommodations or adjustments throughout the interview process and beyond, please let us know, and we will work with you to provide the necessary support and make reasonable accommodations to facilitate your participation. We are continuously working to improve our accessibility practices and welcome any feedback or suggestions on how we can better accommodate candidates with accessibility needs.

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar Cloud jobs:

$45,000 — $80,000/year
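The data-modeling work this posting describes (warehouse tables that serve as the source of truth for analytics) can be sketched as a dbt-style transformation: materialize a query over a raw layer as a table that dashboards read. This toy uses SQLite for illustration only; the table and column names are hypothetical, not from the posting:

```python
# Illustrative only: a dbt-style "model" materializes a query as a table
# that downstream reports treat as the source of truth. Toy SQLite example;
# all table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_trades (account_id TEXT, notional REAL);
    INSERT INTO raw_trades VALUES
        ('a1', 100.0), ('a1', 250.0), ('a2', 50.0);

    -- The "model": per-account exposure, rebuilt from the raw layer.
    CREATE TABLE fct_account_exposure AS
    SELECT account_id, SUM(notional) AS total_notional
    FROM raw_trades
    GROUP BY account_id;
""")

rows = conn.execute(
    "SELECT account_id, total_notional FROM fct_account_exposure"
    " ORDER BY account_id"
).fetchall()
print(rows)  # [('a1', 350.0), ('a2', 50.0)]
```

In dbt itself the model would be a version-controlled SQL file with tests and documentation attached, which is where the posting's "software engineering best practices" point lands.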
#Location
Canada
Axoni is building the next generation of capital markets technology. Our solutions are used by the world's leading banks, asset managers, hedge funds, and infrastructure providers. Our diverse team focuses every day on our goal of building products that will change and improve how our clients and the markets interact.

We are seeking talented, motivated professionals who want to be part of this once-in-a-career opportunity to not only see, but also drive, the incredible changes coming to global capital markets. We are building a culture where our team feels valued and everyone is given an opportunity to grow and succeed. We try to live by our Core Values and demonstrate what we believe represents the kind of company we are working to build. These Values are: Delivery is everything; Choose Kindness; Be better every day.

Axoni is looking for Java Software Engineers who will be responsible for software development for our biggest client initiative. Our projects vary across multiple industries, including Bond Issuance, Securities Lending, and Equity Swaps, to deliver a seamless, optimized experience all the way to the end user. You will work directly with our clients to understand and solve their largest pain points.

*Selected Hiring Hubs Include: New York, New Jersey, Pennsylvania, Connecticut, DC Area, North Carolina, Florida, Texas, and England*

You will:
* Use Java to develop cloud-hosted, API-first microservices and applications
* Handle end-to-end development, including coding, testing, debugging and reviewing code
* Interact with users and development teams to gather and define requirements and analyze user stories for validity and feasibility
* Work within the team on iterative development that delivers high-quality, stable services
* Engineer effective, defect-free configurations and code that meet business requirements and team standards
* Interact with messaging systems like Apache Kafka, MQ, etc.
* Work in a scrum team and follow Agile and Test-Driven Development best practices
* Work with containerization/orchestration tools such as Docker or Kubernetes

Qualifications:
* 5+ years of professional software development experience using Java
* Experience designing distributed enterprise software
* Experience working with DevOps tools such as Kubernetes/Helm, Terraform, Docker, etc.
* Experience deploying and supporting production workloads
* Experience building REST services and/or microservices
* Strong database experience, preferably with PostgreSQL, MySQL, Oracle, or DB2
* Familiarity with tools and frameworks in the Java ecosystem (Spring, Spring Boot, Vert.x, etc.)
* Experience with AWS infrastructure
* Experience writing concurrent and multi-threaded Java applications

Bonus Points:
* Experience with SaaS
* Capital markets and fintech experience

Individuals seeking employment at Axoni are considered without regard to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, or sexual orientation. You are being given the opportunity to provide the following information in order to help us comply with federal and state Equal Employment Opportunity/Affirmative Action record keeping, reporting, and other legal requirements.

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar Docker, DevOps, Java, Legal, and Engineer jobs:

$70,000 — $120,000/year
#Location
England
Match Group is hiring a Remote Senior Software Engineer Backend
\nHinge is the dating app designed to be deleted\n\n\nIn today's digital world, finding genuine relationships is tougher than ever. At Hinge, we're on a mission to inspire intimate connection to create a less lonely world. We're obsessed with understanding our users' behaviors to help them find love, and our success is defined by one simple metric: setting up great dates. With tens of millions of users across the globe, we've become the most trusted way to find a relationship, for all.\n\n\nCollaborate with a cross-disciplinary team to build features and work with other engineers to plan out projects. Design and build backend systems with an emphasis on quality and scalability. Own complex projects end-to-end and effectively communicate to stakeholders. Work in a cloud-native tech stack: Kubernetes, AWS, Go web services, Postgres, Redis, Kafka. Be a thought partner for backend team strategy and technical direction. Create and maintain feedback cycles with your peers and manager. Operate and maintain production systems. Support and mentor junior developers. Assist with team hiring and learning. Use strong communication skills (written and verbal) to provide product and project ideas that contribute to trust and safety goals. Telecommuting may be permitted. When not telecommuting, must report to 809 Washington St, New York, NY 10014. Salary: $169,229 - $220,000 per year.\n \n \nMinimum Requirements: Bachelor's degree or U.S. equivalent in Electrical Engineering, Computer Science, Computer Engineering, Software Engineering, Information Technology, or a related field, plus 5 years of professional experience as a Software Engineer, Software Developer, or any occupation/position/job title involving building backend infrastructures. In lieu of a Bachelor's degree plus 5 years of experience, the employer will accept a Master's degree or U.S. 
equivalent in Electrical Engineering, Computer Science, Computer Engineering, Software Engineering, Information Technology, or a related field, plus 3 years of professional experience as a Software Engineer, Software Developer, or any occupation/position/job title involving building backend infrastructures. Must also have the following: 3 years of professional experience building backend infrastructures for consumer-facing features (Business to Consumer) built on iOS and Android; 3 years of professional experience handling large volumes (millions daily) of data within AWS using Python and Golang scripting languages and handling cloud-based containers including Docker and Kubernetes; 3 years of professional experience handling data and event streaming using Apache Spark and handling data storage using relational databases including MySQL and PostgreSQL and NoSQL databases including Redis; 3 years of professional experience performing and employing software engineering best practices for the full software development life cycle (including coding standards, code reviews, source control management, build processes, testing, and operations); 2 years of professional experience performing backend software engineering (including leading and collaborating with Internal Tooling, Bad Actor Detection, Privacy & Compliance and Safety Product teams across web and apps) and developing backend infrastructures to drive systems that support the trust and safety of users with microservices written in Golang; 2 years of professional experience leading and creating project roadmaps of deployments for B2C web applications and mobile apps (including iOS and Android) and breaking down steps to delegate to peers; and 2 years of professional experience reviewing peer code and mentoring junior engineers.\n \nPlease send resume to: [email protected]. 
Please specify ad code [WLLL].\n\n\n\n\n$169,229 - $220,000 a year. Factors such as scope and responsibilities of the position, candidate's work experience, education/training, job-related skills, internal peer equity, as well as market and business considerations may influence base pay offered. This salary range is reflective of a position based in New York, New York.\n\n#LI-DNI\n\n\nAs a member of our team, you'll enjoy:\n\n\n401(k) Matching: We match 100% of the first 10% of pre-tax 401(k) contributions you make, up to a maximum of $10,000 per year.\n\n\nProfessional Growth: Get a $3,000 annual Learning & Development stipend once you've been with us for three months. You also get free access to Udemy, an online learning and teaching marketplace with over 6,000 courses, starting your first day.\n\n\nParental Leave & Planning: When you become a new parent, you're eligible for 100% paid parental leave (20 paid weeks for both birth and non-birth parents).\n\n\nFertility Support: You'll get easy access to fertility care through Carrot, from basic treatments to fertility preservation. We also provide $10,000 toward fertility preservation. You and your spouse/domestic partner are both eligible.\n\n\nDate Stipend: All Hinge employees receive a $100 monthly stipend for epic dates, romantic or otherwise. Hinge Premium is also free for employees and their loved ones.\n\n\nERGs: We have eight Employee Resource Groups (ERGs): Asian, Unapologetic, Disability, LGBTQIA+, Vibras, Women/Nonbinary, Parents, and Remote. They hold regular meetings, host events, and provide dedicated support to the organization & its community.\n\n\nAt Hinge, our core values are:\n\n\nAuthenticity: We share, never hide, our words, actions and intentions.\n\n\nCourage: We embrace lofty goals and tough challenges.\n\n\nEmpathy: We deeply consider the perspective of others.\n\n\nDiversity inspires innovation\n\n\nHinge is an equal-opportunity employer. 
We value diversity at our company and do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. We believe success is created by a diverse workforce of individuals with different ideas, strengths, interests, and cultural backgrounds.\n\n\nIf you require reasonable accommodation to complete a job application, pre-employment testing, or a job interview or to otherwise participate in the hiring process, please contact [email protected]. \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated a salary range based on similar Design, Python, Docker, Cloud, NoSQL, Mobile, Senior, Junior, Golang, Engineer and Backend jobs:\n\n
$60,000 — $110,000/year\n
\n\n#Location\nNew York, New York
Daniel J Edelman Holdings is hiring a Remote Platform Engineer
\nWe are seeking a skilled Platform Engineer to join our team. They will be instrumental in developing and optimizing our cloud architecture as well as managing our development and analytics tooling. The ideal candidate has a proven track record of maintaining and developing cloud-based architecture which supports large-scale data initiatives for data science and software development teams. \n \nThe Platform Engineer will be responsible for designing, implementing, and managing cloud architectures to meet business requirements. The ideal candidate possesses expertise in translating business needs into scalable cloud solutions while ensuring security, reliability, and cost-effectiveness.\n\n\n\nResponsibilities:\n* Collaborate with business stakeholders to understand the product requirements and translate them into scalable and resilient cloud architectures.\n* Collaborate closely with Data Engineering, Data Science and Software Development teams to contribute to the design of the cloud solutions for our products.\n* End-to-end implementation of the optimized and secure cloud-based/cloud-native architecture. 
\n* Develop and document cloud architecture designs, ensuring alignment with industry best practices.\n* Provide expertise in cloud and platform engineering to the Product Data team, ensuring alignment with the company's strategic goals.\n* Contribute to the selection and integration of cloud-based vendors, tools and frameworks.\n* Keep up with emerging trends in cloud engineering and introduce new technologies or practices that can benefit the organization.\n* Implement security measures to safeguard cloud environments, including identity and access management, encryption, and compliance controls.\n* Conduct regular security assessments and address vulnerabilities promptly.\n* Monitor and optimize cloud infrastructure for performance, cost, and reliability.\n* Implement performance tuning strategies to enhance overall system efficiency.\n* Continuously improve and innovate on cloud and platform engineering practices.\n* Implement and manage the provisioning of cloud resources based on project requirements.\n* Maintain and support cloud-based and cloud-native architecture, including access controls, security and networking.\n* Configure and fine-tune cloud infrastructure components for optimal performance.\n* Perform audits and assessments of cloud environments to ensure compliance with security and regulatory standards.\n* Provide recommendations for continuous improvement and adherence to best practices.\n* Lead the deployment of applications onto our cloud platform, ensuring seamless integration and functionality.\n* Manage and monitor cloud applications to maintain performance, availability, and scalability.\n\n\n\nQualifications:\n* 3-5 years of proven experience as a Platform Engineer or similar role in designing, implementing, and managing cloud architectures.\n* Expertise in constructing, installing, and maintaining large-scale cloud-native and cloud-based architecture.\n* Database management expertise: Postgres, Snowflake, 
Lucene-based search engines (Apache Solr/AWS OpenSearch/Elasticsearch)\n\n\n* Cloud-native tooling expertise: Amazon S3, AWS EMR, Amazon EC2, Amazon RDS, Amazon SageMaker, Amazon ECS, Amazon ECR, Amazon VPC, AWS IAM (*alternatives from other cloud providers are acceptable) \n* Cloud-based application tooling: Databricks administration\n* Strong communication in English, with the ability to explain technical concepts to a non-technical audience.\n* Cloud certifications from cloud providers (AWS, GCP, Azure) \n* Experience with streaming technologies such as Apache Kafka and AWS Kinesis\n* Experience productionizing ML-based cloud solutions.\n\n\n\n\n\n#LI-RT9\n\n\nEdelman Data & Intelligence (DXI) is a global, multidisciplinary research, analytics and data consultancy with a distinctly human mission.\n\n\nWe use data and intelligence to help businesses and organizations build trusting relationships with people: making communications more authentic, engagement more exciting and connections more meaningful.\n\n\nDXI brings together and integrates the necessary people-based PR, communications, social, research and exogenous data, as well as the technology infrastructure to create, collect, store and manage first-party data and identity resolution. DXI is comprised of over 350 research specialists, business scientists, data engineers, behavioral and machine-learning experts, and data strategy consultants based in 15 markets around the world.\n\n\nTo learn more, visit: https://www.edelmandxi.com \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated a salary range based on similar Design, Cloud and Engineer jobs:\n\n
$70,000 — $100,000/year\n
Daniel J Edelman Holdings is hiring a Remote Senior Data Engineer
\nWe are currently seeking a Senior Data Engineer with 5-7 years' experience. The ideal candidate is able to work independently within an Agile working environment and has experience working with cloud infrastructure, leveraging tools such as Apache Airflow, Databricks, dbt and Snowflake. Familiarity with real-time data processing and AI implementation is advantageous. \n\n\n\nResponsibilities:\n* Design, build, and maintain scalable and robust data pipelines to support analytics and machine learning models, ensuring high data quality and reliability for both batch & real-time use cases.\n* Design, maintain, and optimize data models and data structures in tooling such as Snowflake and Databricks. \n* Leverage Databricks for big data processing, ensuring efficient management of Spark jobs and seamless integration with other data services.\n* Utilize PySpark and/or Ray to build and scale distributed computing tasks, enhancing the performance of machine learning model training and inference processes.\n* Monitor, troubleshoot, and resolve issues within data pipelines and infrastructure, implementing best practices for data engineering and continuous improvement.\n* Diagrammatically document data engineering workflows. \n* Collaborate with other Data Engineers, Product Owners, Software Developers and Machine Learning Engineers to implement new product features by understanding their needs and delivering on time. \n\n\n\nQualifications:\n* Minimum of 3 years' experience deploying enterprise-level scalable data engineering solutions.\n* Strong examples of independently developed data pipelines end-to-end, from problem formulation and raw data to implementation, optimization, and results.\n* Proven track record of building and managing scalable cloud-based infrastructure on AWS (incl. S3, DynamoDB, EMR). 
\n* Proven track record of implementing and managing the AI model lifecycle in a production environment.\n* Experience using Apache Airflow (or equivalent), Snowflake, and Lucene-based search engines.\n* Experience with Databricks (Delta format, Unity Catalog).\n* Advanced SQL and Python knowledge with associated coding experience.\n* Strong experience with DevOps practices for continuous integration and continuous delivery (CI/CD).\n* Experience wrangling structured & unstructured file formats (Parquet, CSV, JSON).\n* Understanding and implementation of best practices within ETL and ELT processes.\n* Data quality best-practice implementation using Great Expectations.\n* Real-time data processing experience using Apache Kafka (or equivalent) will be advantageous.\n* Work independently with minimal supervision.\n* Takes initiative and is action-focused.\n* Mentor and share knowledge with junior team members.\n* Collaborative with a strong ability to work in cross-functional teams.\n* Excellent communication skills with the ability to communicate with stakeholders across varying interest groups.\n* Fluency in spoken and written English.\n\n\n\n\n\n#LI-RT9\n\n\nEdelman Data & Intelligence (DXI) is a global, multidisciplinary research, analytics and data consultancy with a distinctly human mission.\n\n\nWe use data and intelligence to help businesses and organizations build trusting relationships with people: making communications more authentic, engagement more exciting and connections more meaningful.\n\n\nDXI brings together and integrates the necessary people-based PR, communications, social, research and exogenous data, as well as the technology infrastructure to create, collect, store and manage first-party data and identity resolution. 
DXI is comprised of over 350 research specialists, business scientists, data engineers, behavioral and machine-learning experts, and data strategy consultants based in 15 markets around the world.\n\n\nTo learn more, visit: https://www.edelmandxi.com \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated a salary range based on similar Python, DevOps, Cloud, Senior, Junior and Engineer jobs:\n\n
$60,000 — $110,000/year\n
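The Senior Data Engineer posting above calls for data-quality best practice using Great Expectations. As a rough, library-free illustration of the underlying idea (declarative expectations evaluated row by row), here is a hypothetical sketch; the function names are invented for this example and are not Great Expectations' actual API, which is far richer (expectation suites, checkpoints, data docs):

```python
# Minimal sketch of declarative data-quality checks, in the spirit of
# Great Expectations. All names below are invented for illustration.

def expect_not_null(column):
    # An expectation is just a predicate over a row.
    return lambda row: row.get(column) is not None

def expect_between(column, lo, hi):
    return lambda row: row.get(column) is not None and lo <= row[column] <= hi

def validate(rows, expectations):
    """Return the number of rows failing at least one expectation."""
    failures = 0
    for row in rows:
        if not all(check(row) for check in expectations):
            failures += 1
    return failures

rows = [
    {"user_id": 1, "age": 34},
    {"user_id": 2, "age": None},   # fails expect_not_null("age")
    {"user_id": 3, "age": 210},    # fails expect_between("age", 0, 120)
]
checks = [expect_not_null("age"), expect_between("age", 0, 120)]
print(validate(rows, checks))  # 2
```

In a real pipeline a step like this would typically run as a gate before loading data into the warehouse, failing the job (or quarantining rows) when expectations are violated.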
\nAbout the Role: \n\nWe are building our documentation team and are looking for a Technical Writer to help lead the end-to-end vision, strategy, and implementation of documentation for Redpanda products. Reporting to the Technical Writing Manager, you will document multiple functional areas of Redpanda products, including all aspects of security, data balancing, and cluster configuration. You will advocate for the user experience of our products (including the core streaming platform and cloud SaaS areas), spearhead a user-centric approach, and help lead product strategy. You will be a foundational member of the documentation team, helping to build processes and drive strategy. This is an opportunity to work closely with product and engineering leadership and teams at a fast-growing, well-funded startup.\n\nYou Will:\n\n\n* Collaborate with engineering teams to discuss technologies and workflows, and create top-notch documentation\n\n* Write, format, and edit content, including security configurations, cluster configuration, and Redpanda Cloud content\n\n* Contribute to web content design (a content design background is a plus)\n\n* Facilitate knowledge sharing throughout our engineering, product management, and design teams and help produce accurate, well-written communications\n\n* Partner with content specialists to define documentation standards and best practices\n\n* Participate in assessing and proposing software and methods to continuously improve our documentation processes\n\n* Write and revise content/test features with the customer experience as the primary driver\n\n* Engage in community discussions with OSS, prospective, and paying customers\n\n* Be part of a diverse remote distributed team\n\n\n\n\nYou Have:\n\n\n* 4+ years of experience writing technical documentation for an external developer audience\n\n* Familiarity with basic software security fundamentals \n\n* Ability to distill complex technical concepts into clear, comprehensive, developer-focused 
documentation\n\n* Recent experience working in a docs-as-code environment \n\n* Understanding of and hands-on experience with Git, open-source development, AsciiDoc/Antora, and front-end web development\n\n* Experience working with open source technologies and communities\n\n* Ability to read and write code in languages such as Java, Go, or Python, as well as YAML; familiarity with data streaming and/or Apache Kafka a plus\n\n* Excellent information architecture and information design skills\n\n* Familiarity with Apache Kafka® (event streaming), databases, or distributed systems\n\n* Rabid desire to help our customers get the information they need quickly and easily\n\n* Excellent writing, editing, and verbal communication skills\n\n\n\n\n \n\nU.S. base salary range for this role is $144,000 - $170,000 (CA, NY, WA) and $125,000 - $148,000 (other US locations). Our salary ranges are determined by role, level, and location. As a remote-first company, we strive to consider each candidate's job-related skills, location, experience, and relevant education or training to determine individual base salary. Your talent partner will share more about the specific salary range for your preferred location during the hiring process. \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated a salary range based on similar Design, SaaS, Writer, Education, Cloud and Golang jobs:\n\n
$65,000 — $95,000/year\n
\n\n#Location\nSan Francisco, California, United States
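The Technical Writer posting above centers on a docs-as-code workflow (Git, AsciiDoc/Antora). A typical task in such a pipeline is an automated lint step in CI; the sketch below is a simplified, hypothetical example that scans AsciiDoc text for cross-references (`<<anchor>>`) pointing at anchors (`[[anchor]]`) that were never defined. A real Antora/Asciidoctor build resolves xrefs across whole modules; this only handles a single document:

```python
import re

# Hypothetical docs-as-code lint step: flag AsciiDoc cross-references
# (<<anchor>>) whose target anchor ([[anchor]]) is not defined in the text.

def broken_xrefs(text):
    defined = set(re.findall(r"\[\[([\w-]+)\]\]", text))
    referenced = re.findall(r"<<([\w-]+)>>", text)
    return [ref for ref in referenced if ref not in defined]

doc = """
[[install]]
== Installing

See <<config>> for tuning options and <<install>> for setup.
"""
print(broken_xrefs(doc))  # ['config']
```

Wired into CI, a non-empty result would fail the build, which is the main practical payoff of treating docs like code.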
\nAbout Us\n\nAt People Data Labs, we're committed to democratizing access to high-quality B2B data and leading the emerging DaaS economy. We empower developers, engineers, and data scientists to create innovative, compliant data products at scale with our clean, easy-to-use datasets of resume, company, location, and education data consumed through our suite of APIs. \n\nPDL is an innovative, fast-growing, global team backed by world-class investors, including Craft Ventures, Flex Capital, and Founders Fund. We scour the world for people hungry to improve, curious about how things work, and willing to challenge the status quo to build something new and better.\n\nRoles & Responsibilities:\n\n\n* Analyze data using statistical techniques and tools to identify anomalous data, clean data, and derive meaningful insights and trends. \n\n* Ensure data integrity, accuracy, and completeness throughout the analysis process.\n\n* Generate, maintain, and update dashboard reports using our business intelligence tool, highlighting key findings and trends.\n\n* Develop and maintain databases, data systems, and data analytics pipelines within database management systems.\n\n* Work with stakeholders in Engineering, Product, and Revenue to assist with data-related technical issues and support their data infrastructure and analytics needs.\n\n* Ensure the integrity and consistency of database schemas, including managing updates, version control, and documenting schema changes to support data analysis and reporting requirements.\n\n* Assist with and further develop our quality assurance process.\n\n\n\n\nTechnical Requirements\n\n\n* 3-5+ years of industry experience with clear examples of strategic and analytical technical problem solving and implementation\n\n* Strong software development and analytics fundamentals\n\n* Expertise with SQL & Python\n\n* Experience with Apache Spark or PySpark\n\n* Experience with data cleaning and data processing (e.g., 
cleaning, transformation) using SQL and Python.\n\n* Knowledge of modern data design and storage patterns (e.g., incremental updating, partitioning and segmentation, rebuilds and backfills)\n\n* Experience working in Databricks (including Delta Live Tables, data lakehouse patterns, etc.)\n\n* Experience with cloud computing services (AWS preferred; GCP, Azure, or similar)\n\n* Experience with data warehousing (e.g., Databricks, Snowflake, Redshift, BigQuery, or similar)\n\n* Understanding of modern data storage formats and tools (e.g., Parquet, ORC, Avro, Delta Lake)\n\n\n\n\nProfessional Requirements\n\n\n* Must thrive in a fast-paced environment and be able to work independently\n\n* Can work effectively remotely (able to be proactive about managing blockers, proactive about reaching out and asking questions, and participating in team activities)\n\n* Strong written communication skills on Slack/chat and in documents\n\n* Experienced in writing data design docs (pipeline design, dataflow, schema design)\n\n* Can scope and break down projects, and communicate progress and blockers effectively with your manager, team, and stakeholders\n\n* Experience collaborating with Product and Engineering teams\n\n\n\n\nNice To Haves:\n\n\n* Degree in a quantitative discipline such as computer science, mathematics, statistics, or engineering\n\n* Experience working with business intelligence tools and dashboard creation\n\n* Experience working with data acquisition / data integration\n\n* Expertise with Python and the Python data stack (e.g., numpy, pandas, PySpark)\n\n* Experience evaluating data quality and maintaining consistently high data standards across new feature releases (e.g., consistency, accuracy, validity, completeness)\n\n\n\n\nOur Benefits\n\n\n* Stock\n\n* Competitive salaries\n\n* Unlimited paid time off\n\n* Medical, dental, & vision insurance \n\n* Health, fitness, and office stipends\n\n* The permanent ability to work wherever and 
however you want\n\n\n\n\nNo C2C, 1099, or Contract-to-Hire. Recruiters need not apply.\n\nPeople Data Labs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. \n\n#Salary and compensation\n
No salary data was published by the company, so we estimated a salary range based on similar Design, Python, Education and Cloud jobs:\n\n
$65,000 — $95,000/year\n
\n\n#Location\nSan Francisco, California, United States
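The People Data Labs posting above asks for data cleaning and transformation using SQL and Python. As a small, self-contained illustration of what such a cleaning pass can look like, here is a sketch using Python's built-in sqlite3 module as a stand-in for a real warehouse (Snowflake, Redshift, Databricks); the table and column names are invented for the example:

```python
import sqlite3

# Toy SQL + Python cleaning pass: trim whitespace, lower-case emails, and
# drop rows with no usable email. sqlite3 stands in for a real warehouse;
# the `people` table and its columns are invented for this sketch.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO people VALUES (?, ?)",
    [("Ada ", " Ada@Example.COM "), ("Grace", None), ("Linus", "linus@example.org")],
)

# The cleaning itself is expressed in SQL: normalize, then filter out
# null/empty emails.
cleaned = conn.execute(
    """
    SELECT TRIM(name) AS name, LOWER(TRIM(email)) AS email
    FROM people
    WHERE email IS NOT NULL AND TRIM(email) <> ''
    ORDER BY name
    """
).fetchall()
print(cleaned)  # [('Ada', 'ada@example.com'), ('Linus', 'linus@example.org')]
```

Pushing the normalization into SQL keeps the transformation declarative and portable to warehouse engines, with Python only orchestrating the connection and the load.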
Filecoin is hiring a Remote Marketing and Events Summer Intern
\n\n* Maintain daily communication with events, communications, and marketing teams on event updates\n\n* Create post-event registration and attendee reports \n\n* Organize and manage vendor and partner contacts in CMS\n\n* Stay up-to-date with industry trends to ensure content is current and relevant\n\n* Help manage and maintain content calendars and publishing schedules\n\n* Support the translation of technical concepts into easily understandable content\n\n\n\n\nYour Profile\n\n\n* Current or recent enrollment in a graduate or undergraduate degree program with an interest and aptitude to work in event planning, customer service, marketing, communications, PR, hospitality, or Web3\n\n* Strong written and verbal communication skills and interpersonal skills\n\n* Exceptional organization skills and a keen attention to detail\n\n* Flexible and adaptable; able to seamlessly switch priorities as needed and balance short-term deliverables with long-term strategic goals \n\n* Team player who thrives in a high-energy, collaborative work environment\n\n* Bridge-builder within and between organizations\n\n* Ability to keep track of multiple workstreams\n\n* Ability to see the big picture even while operating deep in the weeds\n\n* Event logistics and project management experience a plus\n\n* Knowledge of Customer Relationship Management systems and event registration tools a plus\n\n\n\n\nA reasonable hourly rate estimate of the current range for this position is $20/hr - $22/hr.\n\n\n \n\n#Salary and compensation\n
No salary data published by company so we estimated salary based on similar jobs related to Web3, Marketing and Linux jobs that are similar:\n\n
$80,000 — $125,000/year\n
\n\n#Benefits\n
401(k)\n\nDistributed team\n\nAsync\n\nVision insurance\n\nDental insurance\n\nMedical insurance\n\nUnlimited vacation\n\nPaid time off\n\n4 day workweek\n\n401k matching\n\nCompany retreats\n\nCoworking budget\n\nLearning budget\n\nFree gym membership\n\nMental wellness budget\n\nHome office budget\n\nPay in crypto\n\nPseudonymous\n\nProfit sharing\n\nEquity compensation\n\nNo whiteboard interview\n\nNo monitoring system\n\nNo politics at work\n\nWe hire old (and young)\n\n
\n\n#Location\nWorldwide
Please reference you found the job on Remote OK, this helps us get more companies to post here, thanks!
\nVerana Health, a digital health company that delivers quality drug lifecycle and medical practice insights from an exclusive real-world data network, recently secured a $150 million Series E led by Johnson & Johnson Innovation – JJDC, Inc. (JJDC) and Novo Growth, the growth-stage investment arm of Novo Holdings. \n\nExisting Verana Health investors GV (formerly Google Ventures), Casdin Capital, and Brook Byers also joined the round, as well as notable new investors, including the Merck Global Health Innovation Fund, THVC, and Breyer Capital.\n\nWe are driven to create quality real-world data in ophthalmology, neurology and urology to accelerate quality insights across the drug lifecycle and within medical practices. Additionally, we are driven to advance the quality of care and quality of life for patients. DRIVE defines our internal purpose and is the galvanizing force that helps ground us in a shared corporate culture. DRIVE is: Diversity, Responsibility, Integrity, Voice-of-Customer and End-Results. Click here to read more about our culture and values. \n\nOur headquarters are located in San Francisco and we have additional offices in Knoxville, TN and New York City with employees working remotely in AZ, CA, CO, CT, FL, GA, IL, LA, MA, NC, NJ, NY, OH, OR, PA, TN, TX, UT, VA, WA, WI. All employees are required to have permanent residency in one of these states. Candidates who are willing to relocate are also encouraged to apply. \n\nJob Title: Data Engineer\n\nJob Intro:\n\nAs a Data/Software Engineer at Verana Health, you will be responsible for extending a set of tools used for data pipeline development. You will bring strong hands-on experience in the design and development of cloud services, a deep understanding of data quality metadata management, data ingestion, and curation, and will generate software solutions using Apache Spark, Hive, Presto, and other big data frameworks. 
You will analyze systems and requirements to provide the best technical solutions with regard to flexibility, scalability, and reliability of the underlying architecture, and will document and improve software testing and release processes across the entire data team.\n\nJob Duties and Responsibilities:\n\n\nArchitect, implement, and maintain scalable data architectures to meet data processing and analytics requirements utilizing AWS and Databricks\n\nTroubleshoot complex data issues and optimize pipelines, taking into consideration data quality, computation, and cost\n\nCollaborate with cross-functional teams to understand and translate data needs into effective data pipeline solutions\n\nDesign solutions to problems related to ingestion and curation of highly variable data structures in a highly concurrent cloud environment\n\nRetain metadata tracking execution details to support reproducibility and provide operational metrics\n\nCreate routines to add observability and alerting for the health of pipelines\n\nEstablish data quality checks and ensure data integrity and accuracy throughout the data lifecycle\n\nResearch, perform proofs-of-concept, and leverage performant database technologies (like Aurora Postgres, Elasticsearch, Redshift) to support end-user applications that need sub-second response time\n\nParticipate in code reviews\n\nStay proactively updated with industry trends and emerging technologies in data engineering\n\nDevelop data services using RESTful APIs that are secure (OAuth/SAML), scalable (containerized using Docker), observable (using monitoring tools like Datadog, ELK stack), and documented using OpenAPI/Swagger, built with frameworks in Python/Java and deployed via automated CI/CD using GitHub Actions\n\nDocument data engineering processes, architectures, and configurations\n\n\n\n\nBasic Requirements:\n\n\nA minimum of a BS degree in computer science, software engineering, or related scientific discipline.\n\nA minimum of 3 
years of experience in software development\n\nStrong programming skills in languages such as Python/PySpark, SQL\n\nExperience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT)\n\nExperience with data pipeline orchestration tools - Airflow, Databricks Workflows\n\n1 year of experience working in AWS cloud computing environment, preferably with Lambda, S3, SNS, SQS\n\nUnderstanding of Data Management principles (governance, security, cataloging, life cycle management, privacy, quality)\n\nGood understanding of relational databases.\n\nDemonstrated ability to build software tools in a collaborative, team-oriented environment that are product and customer driven.\n\nStrong communication and interpersonal skills\n\nUtilizes source code version control.\n\nHands-on experience with Docker containers and container orchestration.\n\n\n\n\nBonus:\n\n\nHealthcare and medical data experience is a plus.\n\nAdditional experience with modern compiled programming languages (C++, Go, Rust)\n\nExperience building HTTP/REST APIs using popular frameworks\n\nBuilding out extensive automated test suites\n\n\n\n\nBenefits:\n\nWe provide health, vision, and dental coverage for employees\n\nVerana pays 100% of employee insurance coverage and 70% of family\n\nPlus an additional monthly $100 individual / $200 HSA contribution with HDHP\n\nSpring Health mental health support\n\nFlexible vacation plans\n\nA generous parental leave policy and family building support through the Carrot app\n\n$500 learning and development budget\n\n$25/wk in Doordash credit\n\nHeadspace meditation app - unlimited access\n\nGympass - 3 free live classes per week + monthly discounts for gyms like Soulcycle\n\n\n\n\nFinal note:\n\nYou do not need to match every listed expectation to apply for this position. 
Here at Verana, we know that diverse perspectives foster the innovation we need to be successful, and we are committed to building a team that encompasses a variety of backgrounds, experiences, and skills.\n\n \n\n \n\n \n\n \n\n#Salary and compensation\n
No salary data published by company so we estimated salary based on similar jobs related to Design, Docker, Testing, Cloud and Engineer jobs that are similar:\n\n
$70,000 — $100,000/year\n
\n\n#Benefits\n
People Data Labs is hiring a Remote Senior Data Engineer
\nAbout Us\n\nAt People Data Labs, we're committed to democratizing access to high-quality B2B data and leading the emerging DaaS economy. We empower developers, engineers, and data scientists to create innovative, compliant data products at scale with our clean, easy-to-use datasets of resume, company, location, and education data consumed through our suite of APIs. \n\nPDL is an innovative, fast-growing, global team backed by world-class investors, including Craft Ventures, Flex Capital, and Founders Fund. We scour the world for people hungry to improve, curious about how things work, and willing to challenge the status quo to build something new and better.\n\nRoles & Responsibilities:\n\n\n* Build infrastructure for ingestion, transformation, and loading of an exponentially increasing volume of data from a variety of sources using Spark, SQL, AWS, and Databricks\n\n* Build an organic entity resolution framework capable of correctly merging hundreds of billions of individual entities into a number of clean, consumable datasets\n\n* Develop CI/CD pipelines and anomaly detection systems capable of continuously improving the quality of data we're pushing into production\n\n* Devise solutions to largely undefined data engineering and data science problems\n\n* Work with stakeholders in Engineering and Product to assist with data-related technical issues and support their infrastructure needs\n\n\n\n\nTechnical Requirements\n\n\n* 5-7+ years industry experience with clear examples of strategic technical problem solving and implementation\n\n* Strong software development fundamentals\n\n* Experience with Python\n\n* Expertise with Apache Spark (Java, Scala, and/or Python-based)\n\n* Experience with SQL\n\n* Experience building scalable data processing systems (e.g., cleaning, transformation) from the ground up\n\n* Experience using developer-oriented data pipeline and workflow orchestration tools (e.g., Airflow (preferred), dbt, Dagster, or similar)\n\n* Knowledge of 
modern data design and storage patterns (e.g., incremental updating, partitioning and segmentation, rebuilds and backfills)\n\n* Experience working in Databricks (including Delta Live Tables, data lakehouse patterns, etc.)\n\n* Experience with cloud computing services (AWS (preferred), GCP, Azure, or similar)\n\n* Experience with data warehousing (e.g., Databricks, Snowflake, Redshift, BigQuery, or similar)\n\n* Understanding of modern data storage formats and tools (e.g., Parquet, ORC, Avro, Delta Lake)\n\n\n\n\nProfessional Requirements\n\n\n* Must thrive in a fast-paced environment and be able to work independently\n\n* Can work effectively remotely (able to be proactive about managing blockers, proactive about reaching out and asking questions, and participating in team activities)\n\n* Strong written communication skills on Slack/chat and in documents\n\n* Experienced in writing data design docs (pipeline design, dataflow, schema design)\n\n* Can scope and break down projects, and communicate and collaborate on progress and blockers effectively with your manager, team, and stakeholders\n\n\n\n\nNice To Haves:\n\n\n* Degree in a quantitative discipline such as computer science, mathematics, statistics, or engineering\n\n* Experience working with entity data (entity resolution / record linkage)\n\n* Experience working with data acquisition / data integration\n\n* Expertise with Python and the Python data stack (e.g., NumPy, pandas)\n\n* Experience with streaming platforms (e.g., Kafka)\n\n* Experience evaluating data quality and maintaining consistently high data standards across new feature releases (e.g., consistency, accuracy, validity, completeness)\n\n\n\n\nOur Benefits\n\n\n* Stock\n\n* Competitive Salaries\n\n* Unlimited paid time off\n\n* Medical, dental, & vision insurance \n\n* Health, fitness, and office stipends\n\n* The permanent ability to work wherever and however you want\n\n\n\n\nNo C2C, 1099, or Contract-to-Hire. 
Recruiters need not apply.\n\nPeople Data Labs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. \n\n#Salary and compensation\n
No salary data published by company so we estimated salary based on similar jobs related to Design, Python, Education, Cloud, Senior and Engineer jobs that are similar:\n\n
$65,000 — $105,000/year\n
\n\n#Benefits\n
\n\n#Location\nSan Francisco, California, United States
About Us \nAurora Labs is the development company behind Aurora, the EVM blockchain that runs on the NEAR Protocol. We are also the developers of, and integration partner behind, Aurora Cloud, a suite of products that allow Web2 companies to capture the value of Web3.\nWe invite you to be a part of our team of smart, professional, result-oriented and fun individuals. Join us to help ensure that our background processes run smoothly while we are striving to become the best in the industry.\nAbout the team\nOur infrastructure team is responsible for building and supporting critical systems required for running and accessing NEAR and Aurora networks. That includes everything on the path of RPC requests before they hit the blockchain, and block production and event delivery once transactions are executed.\nLoad balancing, caching, queueing, transaction simulation and block production are handled by the services written and maintained by the infrastructure team. These services operate at large scale and process terabytes of data. 
The platform is based on open-source software, such as Kubernetes, NATS, JetStream, Blockscout, Grafana, Postgres and Near-core, alongside a few internally developed services.\nAll internally developed services are written in Go and implement core pieces of functionality such as mempool management, NEAR chunk distribution, transaction pre-processing and simulation.\nAbout the position\nThis role is split between two responsibilities: software engineering (80%) and site reliability (20%).\nSoftware Engineering projects include:\n- Shield - a security service to protect users from making errors or executing malicious transactions.\n- Mempool - a system to store/reorder transactions before they can hit the blockchain.\n- Relayer - translates RPC calls on the read and write path from the end user.\n- Explorer - a Blockscout-based system that provides a user interface.\n- Aurora Cloud - a system to automatically provision multiple infrastructure stacks for Aurora Engine.\n- CLI tools for pubsub and streaming infrastructure operations.\n- Indexers and blockchain event aggregation pipelines for monitoring purposes.\nReliability Engineering includes:\n- Automating configuration and maintenance of software components such as K8s, NATS, InfluxDB, Postgres and Cloudflare using e.g. Ansible, Terraform, Helm and Kubernetes operators.\n- Design and implementation of cloud-agnostic solutions without exclusively relying on specific cloud vendors.\n- Optimizing the latency and throughput of the pub-sub infrastructure.\n- Incident management, troubleshooting, monitoring, distributed tracing and recovery automation.\nAbout you\nYou are a software engineer with experience creating and maintaining backend systems. You are familiar with the entire Linux stack and can easily find a bottleneck in a distributed system. 
You have developed CLI tools and backend services before and are comfortable applying your software development skills to automate your daily operations or to create a microservice on the request path of the end users.\nKey Qualifications\n- Experience with DevOps or SRE as an engineering subject area, with proficiency in Golang.\n- Successful track record and proven experience as a backend internet services software developer.\n- Knowledge of SDLC, including continuous integration and testing methodologies.\n- Understanding of base internet infrastructure services including DNS, HTTP, server virtualization, and server monitoring in critical, large-scale distributed systems.\n- Understanding of SRE principles, including monitoring, alerting, error budgets, fault analysis, and other common reliability engineering concepts, with a keen eye for opportunities to eliminate toil through code and process improvements.\n- Excellent verbal and written communication skills in English.\nDesired skills\n- Deep familiarity with Go or other system-oriented programming languages.\n- Experience with development within the Kubernetes ecosystem, including the operator framework, controllers and CRDs.\n- Experience with streaming and pubsub systems such as NATS, Apache Kafka, Apache Pulsar.\n- Automating operations processes via services and tools.\n- Configuration management and fleet orchestration via Puppet, Chef, Ansible, or others.\n- Cloud Services (AWS S3/EC2/CloudFront or equivalent).\nJoin our dedicated team of blockchain industry professionals. Please apply today; we're standing by for your resume!\nIn applying to this job, I confirm and acknowledge that I read and understood the Privacy Notice published at https://auroralabs.dev/privacy. \n\n#Salary and compensation\n
No salary data published by company so we estimated salary based on similar jobs related to Design, Testing, DevOps, Cloud, Engineer, Linux and Backend jobs that are similar:\n\n
$70,000 — $100,000/year\n
\n\n#Benefits\n
\n\n#Location\nWorldwide
\nWho We Are:\n\nSmithRx is a rapidly growing, venture-backed Health-Tech company. Our mission is to disrupt the expensive and inefficient Pharmacy Benefit Management (PBM) sector by building a next-generation drug acquisition platform driven by cutting-edge technology, innovative cost-saving tools, and best-in-class customer service. With hundreds of thousands of members onboarded since 2016, SmithRx has a solution that is resonating with clients all across the country.\n\nWe pride ourselves on our mission-driven and collaborative culture that inspires our employees to do their best work. We believe that the U.S. healthcare system is in need of transformation, and we come to work each day dedicated to making that change a reality. At our core, we are guided by our company values:\n\n\n* Integrity: Do the right thing. Especially when it's hard.\n\n* Courage: Embrace the challenge.\n\n* Together: Build bridges and lift up your colleagues.\n\n\n\n\nJob Summary:\n\nSmithRx is innovating in Pharmacy Benefits Management (PBM) with a next-gen platform, transforming how businesses manage pharmacy benefits. Our advanced technology offers real-time insights for cost efficiency, improved clinical services, and an enhanced customer experience. As part of SmithRx's product & engineering organization, the data engineering team is committed to creating a scalable and reliable data ecosystem, a vital foundation for delivering excellent service and operational superiority to our customers.\n\nWe are currently seeking a highly motivated Senior Data Engineer to join our fast-paced data team. The ideal candidate will work closely with cross-functional teams to develop scalable data pipelines, optimize data workflows, and ensure data quality and reliability. 
The ideal candidate will also have a strong background in data engineering, with expertise in data modeling, ETL processes, and cloud technologies.\n\nWhat you will do:\n\n\n* Design and implement scalable data models in an enterprise data warehouse to support the company's analytical and reporting needs.\n\n* Develop and optimize ETL processes to ingest, transform, and load data from various sources into a data warehouse.\n\n* Collaborate with internal stakeholders, including the Data Analytics team, to understand data requirements and translate them into technical solutions.\n\n* Build and maintain data warehouses, data lakes, and other data storage solutions to store and manage large volumes of structured and unstructured data.\n\n* Implement and enforce data governance policies to ensure PII/PHI protection, security, and compliance.\n\n* Monitor and optimize ETL jobs, database performance and data warehouse queries.\n\n* Document data engineering processes, data models, and design for knowledge sharing and reference.\n\n* Mentor junior data engineers and provide technical guidance and support to team members.\n\n\n\n\nWhat you will bring to SmithRx:\n\n\n* Bachelor's degree or above in Computer Science, Information Technology, or a related field.\n\n* 5+ years of related experience in data engineering or software engineering, including proven experience as a Data Engineer with expertise in data warehouse technologies.\n\n* Strong programming skills, particularly in languages such as Python and Java. Proficiency in SQL and PySpark.\n\n* Solid understanding of data modeling concepts and database design principles.\n\n* Hands-on experience with ETL tools and frameworks (e.g., Apache Spark, Apache Airflow, dbt, Looker)\n\n* Strong problem-solving abilities and attention to detail.\n\n* Excellent communication and collaboration skills.\n\n* Positivity; a non-dogmatic, team-first attitude\n\n* Flexibility; someone who is responsive and comfortable with ambiguity\n\n* Start-up or healthcare experience is highly desirable\n\n\n\n\nWhat SmithRx Offers You: \n\n\n* Total Rewards package that includes incentive bonus and stock options\n\n* Highly competitive wellness benefits including Medical, Pharmacy, Dental, Vision, and Life Insurance and AD&D Insurance\n\n* Flexible Spending Benefits \n\n* 401(k) Retirement Savings Program \n\n* Short-term and long-term disability\n\n* Discretionary Paid Time Off \n\n* 12 Paid Holidays\n\n* Wellness Benefits\n\n* Commuter Benefits \n\n* Paid Parental Leave benefits\n\n* Employee Assistance Program (EAP)\n\n* Well-stocked kitchen in office locations\n\n* Professional development and training opportunities\n\n\n \n\n#Salary and compensation\n
No salary data published by company so we estimated salary based on similar jobs related to Design, Cloud, Senior and Engineer jobs that are similar:\n\n
$65,000 — $110,000/year\n
\n\n#Benefits\n
\n\n#Location\nSan Francisco, California, United States
SADA India is hiring a Remote Senior Data Engineer
\nJoin SADA India as a Senior Data Engineer!\n\nYour Mission \n\nAs a Sr. Data Engineer on the Enterprise Support service team at SADA, you will reduce customer anxiety about running production workloads in the cloud by implementing and iteratively improving observability and reliability. You will have the opportunity to engage with our customers in a meaningful way by defining, measuring, and improving key business metrics; eliminating toil through automation; inspecting code, design, implementation, and operational procedures; enabling experimentation by helping create a culture of ownership; and winning customer trust through education, skill sharing, and implementing recommendations. Your efforts will accelerate our customers' cloud adoption journey and we will be with them through the transformation of their applications, infrastructure, and internal processes. You will be part of a new social contract between customers and service providers that demands shared responsibility and accountability: our partnership with our customers will ensure we are working towards a common goal and share a common fate.\n\nThis is primarily a customer-facing role. You will also work closely with SADA's Customer Experience team to execute their recommendations to our customers, and with Professional Services on large projects that require PMO support.\n\nPathway to Success \n\n#MakeThemRave is at the foundation of all our engineering. Our motivation is to provide customers with an exceptional experience in migrating, developing, modernizing, and operationalizing their systems in the Google Cloud Platform.\n\nYour success starts by positively impacting the direction of a fast-growing practice with vision and passion. 
You will be measured bi-yearly by the breadth, magnitude, and quality of your contributions, your ability to estimate accurately, customer feedback at the close of projects, how well you collaborate with your peers, and the consultative polish you bring to customer interactions.\n\nAs you continue to execute successfully, we will build a customized development plan together that leads you through the engineering or management growth tracks.\n\nExpectations\n\nCustomer Facing - You will interact with customers on a regular basis, sometimes daily, other times weekly/bi-weekly. Common touchpoints occur when qualifying potential opportunities, at project kickoff, throughout the engagement as progress is communicated, and at project close. You can expect to interact with a range of customer stakeholders, including engineers, technical project managers, and executives.\n\nOnboarding/Training - The first several weeks of onboarding are dedicated to learning and will encompass learning materials/assignments and compliance training, as well as meetings with relevant individuals.\n\nJob Requirements\n\nRequired Credentials:\n\n\n* Google Professional Data Engineer Certified or able to complete within the first 45 days of employment \n\n* A secondary Google Cloud certification in any other specialization\n\n\n\n\nRequired Qualifications: \n\n\n* 5+ years of experience in Cloud support\n\n* Experience in supporting customers preferably in 24/7 environments\n\n* Experience working with Google Cloud data products (CloudSQL, Spanner, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Bigtable, BigQuery, Dataprep, Composer, etc)\n\n* Experience writing software in at least two or more languages such as Python, Java, Scala, or Go\n\n* Experience in building production-grade data solutions (relational and NoSQL)\n\n* Experience with systems monitoring/alerting, capacity planning, and performance tuning\n\n* Experience with BI tools like Tableau, Looker, etc will be an advantage\n\n* 
Consultative mindset that delights the customer by building good rapport with them to fully understand their requirements and provide accurate solutions\n\n\n\n\nUseful Qualifications:\n\n\n* \n\n\n* Mastery in at least one of the following domain areas:\n\n\n\n\n\n* \n\n\n* Google Cloud DataFlow: building batch/streaming ETL pipelines with frameworks such as Apache Beam or Google Cloud DataFlow and working with messaging systems like Pub/Sub, Kafka, and RabbitMQ; Auto scaling DataFlow clusters, troubleshooting cluster operation issues\n\n* Data Integration Tools: building data pipelines using modern data integration tools such as Fivetran, Striim, Data Fusion, etc. Must have hands-on experience configuring and integrating with multiple Data Sources within and outside of Google Cloud\n\n* Large Enterprise Migration: migrating entire cloud or on-prem assets to Google Cloud including Data Lakes, Data Warehouses, Databases, Business Intelligence, Jobs, etc. Provide consultations for optimizing cost, defining methodology, and coming up with a plan to execute the migration.\n\n\n\n\n\n\n\n\n\n* Experience with IoT architectures and building real-time data streaming pipelines\n\n* Experience operationalizing machine learning models on large datasets\n\n* Demonstrated leadership and self-direction -- a willingness to teach others and learn new techniques\n\n* Demonstrated skills in selecting the right statistical tools given a data analysis problem\n\n* Understanding of Chaos Engineering\n\n* Understanding of PCI, SOC2, and HIPAA compliance standards\n\n* Understanding of the principle of least privilege and security best practices\n\n* Understanding of cryptocurrency and blockchain technology\n\n\n\n\n \n\n#Salary and compensation\n
No salary data published by the company, so we estimated the salary based on similar Cloud, Senior and Engineer jobs:

$60,000 — $110,000/year

#Benefits

* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)

#Location
Thiruvananthapuram, Kerala, India
Please mention that you found the job on Remote OK; this helps us get more companies to post here. Thanks!
Typeform is hiring a Remote Senior Data Engineer

About the team

At Typeform, the Data Engineering team is making data our #1 asset, and we know how important that is, as our product helps people collect information in the best way possible. We do that by making the user experience as human and conversational as possible. Typeform's data needs are growing, and we need to find new technical solutions to respond to these needs.

About the role

We're looking for a Sr. Data Engineer embedded in the broader Engineering organization to support all departments at Typeform (Product, Marketing, Customer Success, and Software Engineering) and provide them with the data they need to drive the business forward and push our product to the next level.

Things you will do

* Be part of the team responsible for the near-real-time transfer of data from source systems to the Data Lake, between systems within Typeform, and to/from our external tools.
* Collaborate in designing, engineering, developing, and delivering an information technology infrastructure supporting the entire Data Team.
* Help develop a scalable data architecture with a team of data engineers who populate and maintain Typeform's Data Lake.
* Lead and support Typeform's engineering efforts in designing, developing, and rolling out data systems and services.
* Partner and learn with system architects, functional managers, and program managers to deliver mission-critical data wherever it is needed.
* Help build infrastructure and services that have an immediate impact on our business and customers.
* Mentor and upskill junior team members.

What you already bring to the table:

* 4+ years of experience in the data engineering field, with a proven track record of technical ability
* Previous experience using an event-driven architecture with Kafka
* Excellent stakeholder management and communication skills
* Experience with Scala, Python, and SQL for data pipelines
* Experience with modern cloud data warehouses (AWS Redshift, GCP BigQuery, Azure Synapse, or Snowflake)
* Strong problem-solving skills
* Strong mentorship skills

Extra awesome:

* Experience with Apache Spark (both batch and streaming)
* Experience with a job orchestrator (Airflow, Google Cloud Composer, Flyte, Prefect, Dagster)
* Experience with dbt

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar Cloud, Senior and Engineer jobs:

$60,000 — $110,000/year

#Benefits

* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)

#Location
Dublin, Dublin, Ireland
Waabi is hiring a Remote Senior Software Engineer, Maps Data Pipeline
Waabi, founded by AI pioneer and visionary Raquel Urtasun, is an AI company building the next generation of self-driving technology. With a world-class team and an innovative approach that unleashes the power of AI to "drive" safely in the real world, Waabi is bringing the promise of self-driving closer to commercialization than ever before. Waabi is backed by best-in-class investors across the technology, logistics, and Canadian innovation ecosystems.

With offices in Toronto and San Francisco, Waabi is growing quickly and looking for diverse, innovative, and collaborative candidates who want to impact the world in a positive way. To learn more, visit: www.waabi.ai

You will...
- Contribute to Waabi's cutting-edge AV stack and data-driven simulator.
- Build and own robust, petabyte-scale ETL pipelines for ingesting and aggregating multi-sensor data to produce the maps leveraged both by Waabi World and by our autonomy software.
- Be part of a team of multidisciplinary engineers, research scientists, and product managers using an AI-first approach to enable safe self-driving at scale.
- Interact with all areas of autonomy and simulation, many of which will be direct customers of maps.
- Have the chance to learn about, and integrate, numerous cutting-edge ML models into various stages of Waabi's data pipelines: semantic segmentation, automated map annotation, 3D surface reconstruction, etc.
- Collaborate closely with other teams, including research scientists, ML and software engineers, and system engineers, to understand use cases and deliver features that improve our overall data ecosystem.
- Bring your expertise to provide technical leadership and mentorship to other engineers, and contribute to org-wide data architecture decision-making.
- Assist in project roadmap planning, prioritization, and delivery.

Qualifications:
- 5+ years of experience developing and maintaining high-performance production data pipelines, including a deep understanding of cloud infrastructure and cloud storage services like AWS S3, Google Cloud Storage, and Azure Blob Storage.
- Solid coding proficiency and knowledge in Python and a compiled language like C++ or Rust.
- Solid understanding of cloud job orchestration, monitoring, and instrumentation best practices.
- Open-minded and collaborative team player with the willingness to help others.
- Passionate about self-driving technologies, solving hard problems, and creating innovative solutions.

Bonus/nice to have:
- Experience writing production software in Rust.
- Experience with MapReduce frameworks (Apache Hadoop/Spark) or orchestration frameworks like Apache Beam, Apache Airflow, or Google Dataflow.
- Familiarity with robotic sensor (LiDAR, camera) data.
- Good documentation and technical writing skills.
- Experience working in an Agile/Scrum environment.

The US yearly salary range for this role is $129,000 - $238,000 USD, in addition to competitive perks & benefits. Waabi (US) Inc.'s yearly salary ranges are determined based on several factors in accordance with the Company's compensation practices. The salary base range reflects the minimum and maximum target for new hire salaries for the position across all US locations. Note: The Company provides additional compensation for employees in this role, including equity incentive awards and an annual performance bonus.

Perks/Benefits:
- Competitive compensation and equity awards.
- Health and wellness benefits encompassing Medical, Dental, and Vision coverage (for full-time employees only).
- Unlimited vacation.
- Flexible hours and work-from-home support.
- Daily drinks, snacks, and catered meals (when in office).
- Regularly scheduled team-building activities and social events on-site, off-site, and virtually.
- As we grow, this list continues to evolve!

Waabi is an equal opportunity employer that celebrates diversity and is committed to creating a supportive, inclusive, and accessible environment for all employees. We seek applicants of all backgrounds and identities, across race, color, ethnicity, national origin or ancestry, age, citizenship, religion, sex, sexual orientation, gender identity or expression, military or veteran status, marital status, pregnancy or parental status, caregiver status, disability, or any other characteristic protected by law. We make workplace accommodations for qualified individuals with disabilities as required by applicable law. If reasonable accommodation is needed to participate in the job application or interview process, please let our recruiting team know.

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar Python, Cloud, Senior and Engineer jobs:

$60,000 — $110,000/year

#Benefits

* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)

#Location
Toronto, CAN, San Francisco, CA & Remote - US & Canada
Voltron Data is hiring a Remote Senior C++ Software Engineer, Data Engines
We are looking for a highly motivated Senior C++ Software Engineer for Data Engines. You'll have the opportunity to work directly on Theseus, the accelerator-native data processing engine built for composability. You will work closely with Voltron Data development teams to build, optimize, and maintain our data execution framework: adding new features, making it run faster and scale further, and even contributing to new core architectural components that will enable the engine to run at petabyte scale.

Why work at Voltron Data?

* We are Going for Impact: We are a Series A, venture-backed startup assembling a global team to build a new foundation for data analytics with Apache Arrow. This foundation will usher in a wave of innovation in data processing that can take full advantage of the speed and efficiency offered by modern hardware.
* We are Committed to Bridging Open Source Communities: We are a collection of open source maintainers who have been driving open source ecosystems over the last 15 years, particularly in the C++, Python, and R programming ecosystems.
* We are Building a Diverse, Inclusive Company: We are creating a representative, equitable, and respectful workplace that prioritizes employee growth. Everyone at Voltron Data is bought into the company's success; all voices are critical to shaping the organization's future.

Timeline:

Below is a rough timeline of where you can expect to be at different points during your career path, starting in this position.

Upon joining:

* Spending time learning about Apache Arrow, the compute primitives we use in Theseus, the query parser and optimizer, and other foundational components.
* Diving into the data processing engine architecture: how all the different components interact with each other and how data flows through the compute graph.
* Understanding memory management mechanics, including spilling memory from GPU to host and disk.
* Learning and embracing the software development culture at Voltron Data.

Within a month:

* Profiling single-node and distributed query executions and analyzing engine telemetry to better understand how the engine works and how to solve distributed engine issues.
* Diving deep into the various distributed relational algebra algorithms to understand how they work and how they can be improved.
* Working with the team on fixing bugs, implementing simple optimizations, or code refactoring projects.

Within 6 months:

* Building new relational algebra components to expand SQL coverage or DataFrame functionality coverage.
* Making small improvements to more sophisticated engine components such as resource management, task scheduling, and fault tolerance.

Within 12 months:

* Proposing and implementing core architecture improvements to the engine.
* Working on challenging tasks such as language-agnostic user-defined functions, multi-query concurrency, and multi-tenancy.
* Integrating the engine with components and features developed by other teams in the company to provide enterprise-grade customer experiences.

Previous experience that could be helpful:

* Experience with data processing engines or frameworks
* Experience in distributed and multi-threaded systems
* Experience in hardware resource management, including memory and thread pools
* Working with SQL and non-SQL systems and their computational abstractions
* Developing in C++, especially modern C++
* Developing for multiple types of hardware (i.e., CPU, GPU)

US Compensation - The salary range for this role is between $171,000.00 and $210,000.00. We have a global market-based pay structure which varies by location. Please note that the base pay range is a guideline; for candidates who receive an offer, the exact base pay will vary based on factors such as actual work location and the skills and experience of the candidate. This position is also eligible for additional incentives such as equity awards.

#LISM1

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar Node, Senior and Engineer jobs:

$65,000 — $110,000/year

#Benefits

* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)
Memora Health is hiring a Remote Senior Data Engineer
Memora Health works with leading healthcare organizations to make complex care journeys simple for patients and clinicians so that care is more accessible, actionable, and always-on. Our team is rapidly growing as we expand our programs to reach more health systems and patients, and we are excited to bring on a Senior Data Engineer.

In this role, you will be responsible for driving the architecture, design, and development of our data warehouse and analytics solutions, alongside APIs that allow other internal teams to interact with our data. The ideal candidate will be able to collaborate effectively with Memora's Product Management, Engineering, QA, TechOps, and business stakeholders.

This role will work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions. Ideal candidates will be driven not only by the problem we are solving but also by the innovative approach and technology that we are applying to healthcare, looking to make a significant impact on healthcare delivery. We're looking for someone with exceptional curiosity and enthusiasm for solving hard problems.

Primary Responsibilities:

* Collaborate with the Technical Lead, fellow engineers, Product Managers, QA, and TechOps to develop, test, secure, iterate, and scale complex data infrastructure, data models, data pipelines, APIs, and application backend functionality.
* Work closely with cross-functional teams to understand customer pain points and identify, prioritize, and implement maintainable solutions.
* Promote product development best practices, supportability, and code quality, both by leading by example and by mentoring other software engineers.
* Manage and pay down technical debt, escalating to the Technical Lead and Engineering Manager as needed.
* Establish best practices for designing, building, and maintaining data models.
* Design and develop data models and transformation layers to support reporting, analytics, and AI/ML capabilities.
* Develop and maintain solutions to enable self-serve reporting and analytics.
* Build robust, performant ETL/ELT data pipelines.
* Develop data quality monitoring solutions to raise data quality standards and metrics accuracy.

Qualifications (Required):

* 3+ years of experience shipping, maintaining, and supporting enterprise-grade software products
* 3+ years of data warehousing / analytics engineering
* 3+ years of data modeling experience
* Disciplined in writing readable, testable, and supportable code in JavaScript, TypeScript, Node.js (Express), Python (Flask, Django, or FastAPI), or Java
* Expertise in writing and consuming RESTful APIs
* Experience with relational or NoSQL databases (PostgreSQL, MySQL, MongoDB, Redis, etc.)
* Experience with data warehouses (BigQuery, Snowflake, etc.)
* Experience with analytical and reporting tools, such as Looker or Tableau
* Inclination toward test-driven development and test automation
* Experience with scrum methodology
* Excels at mentoring junior engineers
* B.S. in Computer Science or another quantitative field, or related work experience

Qualifications (Bonus):

* Understanding of DevOps practices and technologies (Docker, Kubernetes, CI/CD, test coverage and automation, branch and release management)
* Experience with security tooling in the SDLC and Security by Design principles
* Experience with observability and APM tooling (Sumo Logic, Splunk, Sentry, New Relic, Datadog, etc.)
* Experience with an integration framework (Mirth Connect, Mule ESB, Apache NiFi, Boomi, etc.)
* Experience with healthcare data interoperability frameworks (FHIR, HL7, CCDA, etc.)
* Experience with healthcare data sources (EHRs, claims, etc.)
* Experience working at a startup

What You Get:

* An opportunity to work on a rapidly scaling care delivery platform, engaging thousands of patients and care team members and growing 2-3x annually
* A highly collaborative environment and the fun challenges of scaling a high-growth startup
* Work alongside world-class clinical, operational, and technical teams to build and scale Memora
* Shape how leading health systems and plans think about modernizing the care delivery experience for their patients and care teams
* Improve the way care is delivered for hundreds of thousands of patients
* Gain deep expertise in healthcare transformation and direct customer exposure with the country's most innovative health systems and plans
* Ownership over your success and the ability to significantly impact the growth of our company
* Competitive salary and equity compensation, with benefits including health, dental, and vision coverage, flexible work hours, paid maternity/paternity leave, bi-annual retreats, a MacBook, and a 401(k) plan

#Salary and compensation
No salary data published by the company, so we estimated the salary based on similar Design, Python, DevOps, NoSQL, Senior, Engineer and Backend jobs:

$60,000 — $110,000/year

#Benefits

* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)