Full Stack Developer

What will you be doing?

As a full stack engineer, you should be proficient with relational and non-relational databases and comfortable interacting with APIs and the outside world. You must have substantial knowledge of every stage of a product, from initial concept to finished implementation. Responsibilities include:

  • Work as part of a cross-functional team alongside product, operations, design and development
  • Build elegant code across multiple parts of our systems
  • Solve business problems with a smart, “hacker-like” mindset
  • Ship code both collaboratively and independently

Who are you?

You love a good challenge, are curious about everything (food, science, art, trading – whatever), and have a passion for building smart solutions in short user-feedback cycles. You are a well-rounded, generalist software engineer who can tackle any problem that comes your way. You take pride in your work, care about the little details as well as the bigger picture, and have a keen eye for perfection.

What should you have?

  • 2+ years of experience building software
  • You have a keen ability to understand business requirements and design and build the products, tools and processes that will support the current and future needs of our business
  • In-depth knowledge of the MERN stack (MongoDB, Express, React, Node.js)

Back-End Developer

What will you be doing?

  • Ensure the technical feasibility of our API and Backend infrastructure
  • Implement smart and elegant solutions to solve product challenges
  • Build reusable code and libraries for future use
  • Own your work as a team member and a colleague
  • Be proactive in designing solutions with the product and data teams
  • Optimize our applications for maximum speed and reliability

Who are you?

We are looking for a passionate Node.js developer to join our backend team. The ideal candidate will be an experienced JavaScript developer with a sense of leadership who wishes to take their career to the next level.

What should you have?

  • 3+ years of Node.js development experience and substantial JavaScript experience
  • 2+ years experience working with the Express.js framework
  • 2+ years experience working with non-relational databases such as MongoDB, Firebase or similar technologies
  • Understanding of the Agile/Scrum development cycle
  • Proficiency in a Linux environment
  • Working knowledge of Git
  • Excellent communication and time management skills; able to work independently or as part of an agile team
  • Experience in test automation and test-driven development
  • Able to work in a start-up environment: a proactive, can-do attitude, the ability to work in a dynamic environment, and the ability to collaborate with a global team
  • A true team player who is willing and excited to contribute to our team’s and company’s success
  • Proven ability to build relationships and to work both in a team and independently; productive when working remotely
  • Excellent verbal and written communication skills in English

Machine Learning Engineer

What will you be doing?

  • Architect, design, and implement infrastructure and tools to enable AI research and deployment within the AWS ecosystem
  • Develop batch and streaming pipelines that fuel machine learning services
  • Architect complex jobs and orchestrate them on managed services
  • Establish and maintain disaster recovery procedures to safeguard our data
  • Integrate techniques to continuously improve data reliability and quality

Who are you?

A seasoned software engineer who is experienced in all layers of the data hierarchy – from database design, to data collection and storage techniques, to a deep understanding of data transformation tools and methodologies, to provisioning and managing analytical databases, to building infrastructure that brings machine learning capabilities into production.

What should you have?

  • 2+ years of experience developing real-time stream processing solutions using Apache Kafka or Amazon Kinesis – Must
  • 2+ years of experience developing infrastructure that brings machine learning services to production using Amazon SageMaker and AWS EMR – Must
  • Demonstrated experience orchestrating containerized applications in the AWS ecosystem using AWS ECS and ECR – Must
  • 3+ years of experience writing production-grade Python code and working with both relational and non-relational databases
  • Solution-oriented, ‘can-do’ attitude with a sense of ownership and accountability
  • Bachelor’s Degree in Computer Science, Engineering or a similar computational discipline
  • High proficiency in English
  • Bonus: Experience in modeling and developing machine learning models
  • Bonus: Experience working with graph databases such as Neo4j or Amazon Neptune
  • Bonus: Experience with Snowflake Data Warehouse

DevOps Engineer

What will you be doing?

  • Architect, design, and implement infrastructure and tools to enable AI research and deployment within the AWS ecosystem
  • Implement & manage all infrastructure stacks for various company applications
  • Implement & manage CI/CD pipelines for various company applications
  • Implement & manage automation processes for various applications and tasks
  • Implement & manage integrations between company systems and third parties
  • Administer third party applications
  • Administer relational & non-relational databases in the company
  • Monitor all infrastructure resources & applications to make sure they are healthy, operational 24/7, and secure against cyber attacks

Who are you?

An experienced DevOps engineer who is passionate about building infrastructure that fuels cutting-edge applications and workloads at scale. The ideal candidate is experienced in building secure, high-performing, resilient, and efficient infrastructure using current technologies. You’ll work in collaboration with the engineering and data teams, help automate and streamline our operations and processes, and troubleshoot issues in our development, test, and production environments.

What should you have?

  • 3+ years of experience in a similar role
  • 2+ years of proven experience in the AWS cloud ecosystem
  • 2+ years of proven experience in UNIX/Linux system administration
  • 2+ years of proven experience in CI/CD processes and tools such as Jenkins
  • 1+ years of proven experience in container technologies, Docker and Kubernetes
  • 1+ years of proven experience with NoSQL database administration (namely MongoDB)
  • 1+ years of proven experience with infrastructure scripting solutions using Bash or Python
  • 1+ years of proven experience with system monitoring tools such as Datadog
  • Good knowledge of cloud security and data security practices

Advantage:

  • Experience provisioning and managing machine learning infrastructure and tools
  • Good knowledge of provisioning and managing data streaming services such as Kafka or Kinesis
  • Good knowledge of the agile software development life cycle

Data Scientist

What will you be doing?

  • Invent, design, and develop state-of-the-art machine learning models to tackle fascinating challenges in both data-rich and data-scarce environments
  • Enhance our product line with recommendation, personalization, anomaly detection, structured prediction, and classification capabilities using machine and deep learning algorithms
  • Work with developers, product managers, and other stakeholders to translate business needs into data-driven technological solutions
  • Deliver ongoing business impact by writing production-quality code while implementing your own ideas

Who are you?

We are looking for a passionate and curious individual to join our Data team to help tackle fascinating challenges and take part in shaping the company’s core technology. As a data scientist in the team, you’ll have a central role in generating business impact by applying sophisticated data-driven approaches, taking projects end to end, from inception to production.

What should you have?

  • 3+ years of experience as a Data Scientist in environments that combine versatile data sources such as textual, relational, unstructured transactional, graph, and stream data
  • Deep understanding of both the theory and application of statistical modeling, machine learning, and deep learning
  • Experience with state-of-the-art representation learning approaches in NLP and their application in a production environment
  • Experience with machine learning lifecycle platforms such as MLflow and Amazon SageMaker
  • Bachelor’s Degree in Computer Science, Engineering or a similar computational discipline
  • High proficiency in English

Advantage

  • M.Sc. or Ph.D. involving machine-learning related research
  • Experience writing production-level code for ML services
  • Experience working with the AWS ecosystem
  • Experience with knowledge graphs, recommendation models, and graph databases
  • Prior leadership experience

Senior Data Engineer

What will you be doing?

  • Serve as the central focal point for data architecture activity within the company
  • Design, implement and optimize sophisticated ETL/ELT processes and data streaming pipelines
  • Work closely with data scientists, ML engineers, and infrastructure engineers to build production-grade data-intensive pipelines to fuel machine learning models
  • Develop monitoring and validation tools over data infrastructure to continuously improve data quality
  • Integrate and translate business requirements and product features into technical data solutions

Who are you?

An experienced data engineer with a deep understanding of the analytics stack at a cutting-edge startup, who is passionate about promoting great analytics capabilities and data infrastructure that enable a data-driven culture. You are a team player with excellent collaboration and communication skills, possessing a “can-do” approach and strong business acumen.

What should you have?

  • 4+ years of experience in similar roles
  • 4+ years of experience working with both relational and non-relational databases
  • 2+ years of experience administering and designing cloud-based data warehousing solutions such as Snowflake or Amazon Redshift
  • 2+ years of experience working with unstructured data, complex data sets, and data modeling
  • 4+ years of SQL experience, 2+ years of Python experience
  • BSc in Computer Science or a similar computational discipline
  • High proficiency in English