Data Operations Engineer

  • Janes
  • Bengaluru, KA
Published
February 18, 2026
Location
Bengaluru, India

Job Description

Janes Introduction:

Janes empowers military, government, and defence leaders to act with confidence in an increasingly complex world. Our trusted defence, security, and geopolitical information, delivered through seamless digital platforms and system integrations, turns overwhelming data into clear, actionable intelligence and insight. By filling critical information gaps, Janes helps customers analyse threats, accelerate decisions, and stay ahead of emerging challenges.

Job purpose: 

The Data Operations Engineer works within the Janes RD&A (Research, Data and Analysis) Data Acquisition, Operations & Exploitation team, which supports the Janes global data operations and analytics function by delivering last-mile data and analytics products to Janes customers. The role involves developing and maintaining the data engineering process flows and ETL (extract, transform, load) tools that are used for core service delivery. It will also contribute significantly to automating the data engineering function using Python-based programming and Janes internal APIs as data sources, within AWS cloud-native tools and technologies. The role demands a highly logical and analytical mindset to deliver results in an engineering discipline.

The successful applicant will work closely with the Janes RD&A, customer support, and product management teams on data and analytics delivery to Janes offline and GPS-based customers, and on data engineering process automation.

Experience with Python, data engineering tools, Power BI, the Postgres database, and AWS cloud-native data engineering tools is desirable for this role.

How you will contribute at Janes: 

  • Data Operations: Develop and maintain Python-based data processing scripts to extract, transform, and load data (from both open-source and internal API sources) to produce the last-mile deliveries for the Janes data & insights products – GPS (Global Platforms and Systems) and offline data products. Develop Python-based data models on raw data sets and process the data into relational database tables and file-based outputs. Maintain all process workflows for data processing and delivery, and design internal tools to track the data preparation, modelling, and delivery pipeline from source to destination
  • Process automation & data scraping: Create automated processes for data processing, data scraping, and configuration to automate manual workflows within the Janes systems, using AWS native tools and Python programming to optimise operational work and reduce manual effort
  • Analytics data sets: Maintain and create data sets for analytics products using Tableau dashboards and associated datasets, ensuring that subscribers are provided with the latest available data from the Janes data sources (Janes APIs, RDBMS data sources)
  • Data wrangling & analytics: Transform data sets into more efficient formats for use by other analysts or as a basis for new analytics offerings built on API data sources. Maintain Tableau-based analytics dashboards
  • Ad-hoc projects: Undertake custom data delivery and analytics projects as required; sourcing, restructuring, transforming, and visualising data to address customer requirements on demand
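For illustration only, the extract-transform-load pattern described in the first responsibility might be sketched in Python roughly as follows. The payload fields and table name are hypothetical, and the stdlib sqlite3 module stands in for a production relational database such as Postgres:

```python
import json
import sqlite3

# Extract: in practice this step would call an internal API or read an
# open-source feed; here we simulate the response with a small JSON payload
# (the field names are invented for this sketch).
raw = json.loads(
    '[{"platform": "Ship A", "country": "uk", "year": "2020"},'
    ' {"platform": "Ship B", "country": "us", "year": "2021"}]'
)

# Transform: normalise types and casing before loading.
records = [(r["platform"], r["country"].upper(), int(r["year"])) for r in raw]

# Load: write the cleaned records into a relational table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE platforms (name TEXT, country TEXT, year INTEGER)")
conn.executemany("INSERT INTO platforms VALUES (?, ?, ?)", records)

# Downstream consumers (dashboards, file-based outputs) query the loaded table.
rows = conn.execute("SELECT name, year FROM platforms ORDER BY year").fetchall()
```

In a real pipeline each stage would be parameterised and scheduled (for example with AWS-native orchestration), but the extract/transform/load separation stays the same.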

The ideal skills and experience for this role are: 

  • 4–5 years of hands-on experience as a Python programmer in a data engineering role, including implementation experience with NumPy, pandas, PySpark, and BeautifulSoup
  • Deep understanding of using and configuring different APIs in an enterprise data environment is a must
  • Business analytics and/or Tableau certification is an added advantage
  • Must have experience with AWS native data and analytics tools, and exposure to working in a cloud-infrastructure data engineering environment
  • Good knowledge of data engineering best practices – ETL, ELT
  • Can demonstrate an innovative, dynamic, and creative approach to problem-solving
  • A willingness to learn and grow in the role

The desirable skills & qualifications for this role are:

  • Experience working within a data engineering environment (Python, analytics, databases, data transformation)
  • Bachelor's / B.Tech / MCA in computer science engineering or equivalent
  • Experience with other technologies, including Power BI, PostgreSQL, SQL, and Tableau
  • Good verbal and written communication skills for communicating with internal and external stakeholders

Benefits:

India

  • Annual health checkup
  • National Pension scheme
  • Meal Card (Sodexo)
  • Healthy Half (0.5 day leave every 6 months for wellbeing) 
  • 22 days of annual leave  
  • Transportation allowance
  • Day care / Creche Reimbursement
  • Group Medical Claim
  • Access to LinkedIn Learning

Life at Janes

We believe Janes is truly a great place to work. Our values and leadership code drive everything we do, and we understand that the right behaviours and culture will always result in the best outcomes for our customers, our colleagues, our shareholders, and our business. We provide a supportive, stretching, and dynamic environment with the ability for you to grow rapidly, both personally and professionally.

Janes is an inclusive and equal opportunities employer and encourages applications regardless of age, race, disability, religion / belief, sexual orientation, gender reassignment, marriage or civil partnership, pregnancy/maternity, or gender.

Although this role is advertised as full time, Janes believes that flexibility at work can provide many significant benefits to both our colleagues and the business. We already work in a hybrid style across all offices and regions, can support different ways of working, and offer a range of flexible working arrangements. So, if you are interested and have any requirements or needs in the way you would like to work, please apply and speak to us about this. We will always consider part-time or flexible applications.

Job Applicants - Privacy Policy

Know your rights document

Key Skills
Apache Spark, Airflow, Tableau, NumPy, PySpark, ELT, Product Management, Leadership, Service Delivery, Python, ETL, SQL, AWS
