Niine 11, Tallinn
Published: 25/01/2018 Deadline: 25/02/2018

In August 2017, Planet OS, the Estonian-American big data startup that pioneered how large-scale geospatial and energy data is processed, became a part of Intertrust Technologies.

"We built the Planet OS data platform to give enterprises, government organizations, and researchers the ability to understand and act on the enormous amounts of data coming from Internet connected sensors, with the thesis that knowledge is key to informed decisions." - Rainer Sternfeld, Founder of Planet OS, GM of Intertrust Data Platforms.
Our main products are Powerboard and Datahub, which are used to operate some of the biggest wind farms in the world. Powerboard visualizes data from various sources, while Datahub aggregates and stores enormous amounts of data from a huge number of sources. The open-data-based Datahub is publicly accessible to everyone.

Intertrust is the inventor of DRM (digital rights management) and a pioneer in trusted computing. Founded in 1990 by inventor-entrepreneur Victor Shear, the company has made fundamental contributions in the areas of computer security and digital trust and holds hundreds of patents that are key to DRM, internet security, trust, privacy management components of operating systems, trusted mobile code, network operating environments, web services, and cloud computing.


You'll be working for Intertrust Data Platforms, which provides big data infrastructure to help the renewable energy sector transform the way data is used in their organizations. From operators in control rooms to executives in boardrooms, our specialized applications integrate, exchange, and visualize data for all stakeholders, making renewable energy more competitive.

Your role:

The data processing pipeline that powers the products is implemented as a series of components communicating through message queues, each separately scalable. Together they form a pipeline that can ingest, index, and make available via the API a varied set of data types, essentially forming a novel type of database for multidimensional arrays of data. The fundamentals of the system are built using Scala and the Akka actor framework. You'll join a growing team in Tallinn to develop the software powering and scaling this pipeline.
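To illustrate the ingest → index → serve shape described above, here is a minimal, hypothetical sketch in plain Scala. All names and types (`RawReading`, `PipelineSketch`, the CSV-like input format) are invented for illustration and are not Planet OS or Intertrust internals; the real system uses Akka actors and message queues between separately scalable components, whereas here each stage is just a function so the pipeline structure is visible in a few lines.

```scala
// Hypothetical record types -- not the actual Datahub data model.
case class RawReading(sensorId: String, value: Double)

object PipelineSketch {
  // Stage 1: ingest -- parse raw input lines into typed readings,
  // dropping malformed records instead of failing the whole batch.
  def ingest(lines: Seq[String]): Seq[RawReading] =
    lines.flatMap { line =>
      line.split(",") match {
        case Array(id, v) => Some(RawReading(id, v.toDouble))
        case _            => None // malformed record
      }
    }

  // Stage 2: index -- build a lookup structure keyed by sensor id.
  def index(readings: Seq[RawReading]): Map[String, Double] =
    readings.map(r => r.sensorId -> r.value).toMap

  // Stage 3: serve -- answer a point query, as an API endpoint would.
  def serve(store: Map[String, Double], key: String): Option[Double] =
    store.get(key)
}

object Demo extends App {
  val store = PipelineSketch.index(
    PipelineSketch.ingest(Seq("turbine-1,12.5", "turbine-2,9.8", "garbage")))
  println(PipelineSketch.serve(store, "turbine-1")) // Some(12.5)
}
```

In the production system each stage would be an independently deployed, independently scaled component consuming from and publishing to message queues, rather than function calls in one process.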

Your main responsibilities:

•    Designing and building back-end systems
•    Focusing on efficient storage and processing infrastructure for large datasets
•    Writing code using Scala and Akka
•    Developing deployment automation
•    Developing automated testing
•    Ensuring robustness and scalability of components

Your profile:

•    5+ years of back-end software development in Scala or Java
•    Knowledge of distributed systems fundamentals
•    Experience in functional programming
•    Experience with JVM-based languages
•    Ability to learn and adapt quickly
•    Creative thinker

Bonus points:

•    Experience in Scala, Akka, HBase, Elasticsearch
•    Knowledge of scientific data formats (NetCDF, HDF, various others)
•    Experience with environmental and sensor data

Our offer:

•    Competitive salary and flexible working hours
•    Independence in making decisions and selecting software tools
•    Opportunity to work closely with the teams in Estonia as well as in Silicon Valley
•    You will have the chance to build something significant for the energy industry 
•    Stock options in a growing company

To apply, please email us your resume in English. For any questions you may have, please email us or call 699 0557.
