Senior & Principal Software Engineers

Location: Durham, NC
Type: Full-time

Genesys is building the data platform of the future with a small team that has a startup feel and the financial stability of an industry leader. We’re experiencing hypergrowth, and scalability is a key focus as our customers generate hundreds of terabytes of data. Our Analytics team is exposed to a wide variety of technology, with the chance to move around and work on something new practically every day. Working remotely from the US or Canada is fine.

As a member of the team, you will:

  • Develop and deploy highly available, fault-tolerant software that drives improvements to the features, reliability, performance, and efficiency of the Genesys Cloud Analytics platform.

  • Actively review code, mentor, and provide peer feedback.

  • Collaborate with engineering teams to identify and resolve pain points as well as evangelize best practices.

  • Partner with various teams to transform concepts into requirements and requirements into services and tools.

  • Engineer efficient, adaptable, and scalable architecture for all stages of the data lifecycle (ingest, streaming, structured and unstructured storage, aggregation) in support of a variety of data applications.

  • Build, deploy, maintain, and automate large global deployments in AWS.

  • Troubleshoot production issues and come up with solutions as required.

This may be the perfect job for you if:

  • You have a strong engineering background with the ability to design software systems from the ground up.

  • You have expertise in Java, Python, or similar programming languages.

  • You have experience in web-scale data and large-scale distributed systems on a cloud infrastructure.

  • You have a product mindset. You are energized by building things that will be heavily used.

  • You have engineered scalable software using big data technologies.

  • You have worked on and understand messaging/queueing/stream-processing systems.

  • You have deep knowledge of Kafka, Flink, or Druid.

  • You design not just with a mind for solving a problem, but also with maintainability, testability, monitorability, and automation as top concerns.

Technologies we use and practices we hold dear:

  • The right tool for the job over “we’ve always done it this way.”

  • We pick the language and frameworks best suited for specific problems. This usually translates to Java for developing services and applications and Python for tooling.

  • Packer and Ansible for immutable machine images.

  • AWS for cloud infrastructure.

  • Infrastructure (and everything, really) as code.

  • Automation for everything. CI/CD, testing, scaling, healing, etc.

  • Apache Flink and Apache Kafka for stream processing.

  • Amazon DynamoDB, Postgres, Amazon S3, and Redis for query and storage.
