Verana Health, a digital health company that delivers quality drug lifecycle and medical practice insights from an exclusive real-world data network, recently secured a $150 million Series E led by Johnson & Johnson Innovation - JJDC, Inc. (JJDC) and Novo Growth, the growth-stage investment arm of Novo Holdings.

Existing Verana Health investors GV (formerly Google Ventures), Casdin Capital, and Brook Byers also joined the round, along with notable new investors including the Merck Global Health Innovation Fund, THVC, and Breyer Capital.

We are driven to create quality real-world data in ophthalmology, neurology, and urology to accelerate quality insights across the drug lifecycle and within medical practices. We are also driven to advance the quality of care and quality of life for patients. DRIVE defines our internal purpose and is the galvanizing force that grounds us in a shared corporate culture. DRIVE stands for: Diversity, Responsibility, Integrity, Voice-of-Customer, and End-Results.

Our headquarters are in San Francisco, and we have additional offices in Knoxville, TN and New York City, with employees working remotely in AZ, CA, CO, CT, FL, GA, IL, LA, MA, NC, NJ, NY, OH, OR, PA, TN, TX, UT, VA, WA, and WI. All employees are required to have permanent residency in one of these states. Candidates who are willing to relocate are also encouraged to apply.

Job Title: Data Engineer

Job Intro:

As a Data/Software Engineer at Verana Health, you will be responsible for extending a set of tools used for data pipeline development. You will need strong hands-on experience in the design and development of cloud services, and a deep understanding of data quality metadata management, data ingestion, and curation. You will generate software solutions using Apache Spark, Hive, Presto, and other big data frameworks.
You will analyze systems and requirements to provide the best technical solutions with regard to the flexibility, scalability, and reliability of the underlying architecture, and you will document and improve software testing and release processes across the entire data team.

Job Duties and Responsibilities:

Architect, implement, and maintain scalable data architectures to meet data processing and analytics requirements, using AWS and Databricks

Troubleshoot complex data issues and optimize pipelines, taking into account data quality, computation, and cost

Collaborate with cross-functional teams to understand data needs and translate them into effective data pipeline solutions

Design solutions to problems related to the ingestion and curation of highly variable data structures in a highly concurrent cloud environment

Retain metadata that tracks execution details, to support reproducibility and provide operational metrics

Create routines that add observability and alerting on the health of pipelines

Establish data quality checks and ensure data integrity and accuracy throughout the data lifecycle

Research, build proofs of concept with, and leverage performant database technologies (such as Aurora Postgres, Elasticsearch, and Redshift) to support end-user applications that need sub-second response times

Participate in code reviews

Proactively stay up to date with industry trends and emerging technologies in data engineering

Develop data services as RESTful APIs that are secure (OAuth/SAML), scalable (containerized using Docker), observable (using monitoring tools such as Datadog or the ELK stack), and documented using OpenAPI/Swagger, built with Python/Java frameworks and deployed via automated CI/CD using GitHub Actions

Document data engineering processes, architectures, and configurations

Basic Requirements:

A minimum of a BS degree in computer science, software engineering, or a related scientific discipline

A minimum of 3
years of experience in software development

Strong programming skills in languages such as Python/PySpark and SQL

Experience with Delta Lake, Unity Catalog, Delta Sharing, and Delta Live Tables (DLT)

Experience with data pipeline orchestration tools such as Airflow and Databricks Workflows

1 year of experience working in an AWS cloud computing environment, preferably with Lambda, S3, SNS, and SQS

Understanding of data management principles (governance, security, cataloging, lifecycle management, privacy, quality)

Good understanding of relational databases

Demonstrated ability to build product- and customer-driven software tools in a collaborative, team-oriented environment

Strong communication and interpersonal skills

Experience using source code version control

Hands-on experience with Docker containers and container orchestration

Bonus:

Healthcare and medical data experience is a plus

Additional experience with modern compiled programming languages (C++, Go, Rust)

Experience building HTTP/REST APIs using popular frameworks

Experience building out extensive automated test suites

Benefits:

We provide health, vision, and dental coverage for employees

Verana pays 100% of employee insurance coverage and 70% of family coverage

Plus an additional monthly $100 individual / $200 HSA contribution with an HDHP

Spring Health mental health support

Flexible vacation plans

A generous parental leave policy and family-building support through the Carrot app

$500 learning and development budget

$25/wk in DoorDash credit

Headspace meditation app - unlimited access

Gympass - 3 free live classes per week, plus monthly discounts for gyms like SoulCycle

Final note:

You do not need to match every listed expectation to apply for this position.
Here at Verana, we know that diverse perspectives foster the innovation we need to be successful, and we are committed to building a team that encompasses a variety of backgrounds, experiences, and skills.

#Salary and compensation
No salary data was published by the company, so we estimated a salary based on similar Design, Docker, Testing, Cloud, and Engineer jobs:

$70,000 — $100,000/year
#Benefits

401(k)
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Paid time off
4 day workweek
401k matching
Company retreats
Coworking budget
Learning budget
Free gym membership
Mental wellness budget
Home office budget
Pay in crypto
Pseudonymous
Profit sharing
Equity compensation
No whiteboard interview
No monitoring system
No politics at work
We hire old (and young)
Please reference that you found the job on Remote OK; this helps us get more companies to post here. Thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Also never pay for trainings you have to do. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name for the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, be careful! Read more to avoid scams. When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.
About Us

Aurora Labs is the development company behind Aurora, the EVM blockchain that runs on the NEAR Protocol. We are also the developers of, and integration partner behind, Aurora Cloud, a suite of products that allow Web2 companies to capture the value of Web3.

We invite you to be a part of our team of smart, professional, result-oriented, and fun individuals. Join us to help ensure that our background processes run smoothly while we strive to become the best in the industry.

About the team

Our infrastructure team is responsible for building and supporting the critical systems required for running and accessing the NEAR and Aurora networks. That includes everything on the path of RPC requests before they hit the blockchain, plus block production and event delivery once transactions are executed.

Load balancing, caching, queueing, transaction simulation, and block production are handled by services written and maintained by the infrastructure team. These services operate at large scale and process terabytes of data.
The platform is based on open-source software, such as Kubernetes, NATS, JetStream, Blockscout, Grafana, Postgres, and nearcore, alongside a few internally developed services.

All internally developed services are written in Go and implement core pieces of functionality such as mempool management, NEAR chunk distribution, and transaction pre-processing and simulation.

About the position

This role is split between two responsibilities: software engineering (80%) and site reliability (20%).

Software engineering projects include:
- Shield - a security service to protect users from making errors or executing malicious transactions.
- Mempool - a system to store/reorder transactions before they can hit the blockchain.
- Relayer - translates RPC calls on the read and write path from the end user.
- Explorer - a Blockscout-based system that provides a user interface.
- Aurora Cloud - a system to automatically provision multiple infrastructure stacks for Aurora Engine.
- CLI tools for pub-sub and streaming infrastructure operations.
- Indexers and blockchain event aggregation pipelines for monitoring purposes.

Reliability engineering includes:
- Automating the configuration and maintenance of software components such as K8s, NATS, InfluxDB, Postgres, and Cloudflare using e.g. Ansible, Terraform, Helm, and Kubernetes operators.
- Designing and implementing cloud-agnostic solutions that do not rely exclusively on specific cloud vendors.
- Optimizing the latency and throughput of the pub-sub infrastructure.
- Incident management, troubleshooting, monitoring, distributed tracing, and recovery automation.

About you

You are a software engineer with experience creating and maintaining backend systems. You are familiar with the entire Linux stack and can easily find a bottleneck in a distributed system.
You have developed CLI tools and backend services before, and you are comfortable applying your software development skills to automate your daily operations or to create a microservice on the request path of end users.

Key Qualifications

- Experience with DevOps or SRE as an engineering subject area, with proficiency in Golang.
- A successful track record and proven experience as a backend internet services software developer.
- Knowledge of the SDLC, including continuous integration and testing methodologies.
- Understanding of base internet infrastructure services, including DNS, HTTP, server virtualization, and server monitoring in critical, large-scale distributed systems.
- Understanding of SRE principles, including monitoring, alerting, error budgets, fault analysis, and other common reliability engineering concepts, with a keen eye for opportunities to eliminate toil through code and process improvements.
- Excellent verbal and written communication skills in English.

Desired skills

- Deep familiarity with Go or other systems-oriented programming languages.
- Experience with development within the Kubernetes ecosystem, including the operator framework, controllers, and CRDs.
- Experience with streaming and pub-sub systems such as NATS, Apache Kafka, or Apache Pulsar.
- Automating operations processes via services and tools.
- Configuration management and fleet orchestration via Puppet, Chef, Ansible, or others.
- Cloud services (AWS S3/EC2/CloudFront or equivalent).

Join our dedicated team of blockchain industry professionals. Please apply today; we're standing by for your resume!

In applying for this job, I confirm and acknowledge that I have read and understood the Privacy Notice published at https://auroralabs.dev/privacy.

#Salary and compensation
No salary data was published by the company, so we estimated a salary based on similar Design, Testing, DevOps, Cloud, Engineer, Linux, and Backend jobs:

$70,000 — $100,000/year
#Location
Worldwide
Upshot is hiring a Remote Senior Distributed Systems Engineer
Upshot's mission is, and always has been, to enable the creation of efficient financial markets for anything. To realize this, we're developing the Upshot Machine Intelligence Network, a network designed to crowdsource financial alpha produced by machine learning models and powered by our new Proof of Alpha scoring mechanism. We believe building at the intersection of crypto and AI is the best way to achieve our mission and usher in a new era of efficient financial markets.

As crypto natives and ML experts with experience across leading Web3 projects, large finance companies, and large tech companies, our team combines deep blockchain knowledge with world-class technical capabilities. We're backed by top crypto venture funds and angel investors who share our vision for the future of digital ownership.

At Upshot, we nurture our people just as much as our products, providing an environment where top talent can thrive at the intersection of crypto and AI. Together, we're building the infrastructure to enable financial markets for anything. Join us as we shape the next frontier.

Backend technical stack: libp2p, Go, NodeJS, TypeScript, Python, Redis, Ethers, Postgres, Docker, Kafka, AWS, Apache Flink, Apache Airflow

Responsibilities

* Work with various teams and squads to bring features and products to life

* Build high-quality and well-tested code

* Own the entire lifecycle of our backend services, from defining a roadmap and proposing designs to implementing a final product

* Enhance the software development lifecycle to enable rapid learning, including ideation, technical design, implementation, and testing of product features and tech debt

* Produce high-level internal and external documentation

* Collaborate with, mentor, and support other team members

Requirements

* 3+ years of experience building infrastructure in an adversarial p2p environment, e.g.
creating or maintaining blockchain nodes

* 5+ years of experience coding in any of the following languages: Node, TypeScript, Python, Go, or Rust

* 5+ years of experience with SQL and NoSQL systems

* Prior experience with the Web3 technical stack

* Highly knowledgeable in the Web3/crypto/DeFi and AI space, with a solid grasp of recent global trends and use cases

* Experience building public-facing APIs used in a production setting and serving at least four orders of magnitude of requests per day

* Ability to solve problems and comfort with removing blockers in an ambiguous environment

* Ability to collaborate effectively with at least four multi-disciplinary teams at once

* Ability to move quickly, adjusting course when necessary, in a fast-paced startup environment (an individual involved in a startup as it pivoted would be well qualified in this respect)

Nice to have

* Prior experience building open-source developer tooling and interacting with open-source communities

* Prior experience with Kafka or another distributed event streaming platform

* Prior experience building production-grade smart contracts

This Organization Participates in E-Verify

This employer participates in E-Verify and will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S.
If E-Verify cannot confirm that you are authorized to work, this employer is required to give you written instructions and an opportunity to contact the Department of Homeland Security (DHS) or the Social Security Administration (SSA) so you can begin to resolve the issue before the employer can take any action against you, including terminating your employment. Employers can only use E-Verify once you have accepted a job offer and completed the Form I-9. For more information on E-Verify, or if you believe that an employer has violated its E-Verify responsibilities, please contact DHS at 888-897-7781 or visit E-Verify.gov.

Upshot is an equal opportunity employer. We value diversity at our company and do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

#Salary and compensation
No salary data was published by the company, so we estimated a salary based on similar Web3, Crypto, Testing, Finance, NoSQL, Senior, Engineer, and Backend jobs:

$60,000 — $105,000/year
#Location
Worldwide