BlueRose Technologies

www.bluerose-tech.com

6 Jobs

522 Employees

About the Company

BlueRose Technologies is a leading global provider of technology services and digital solutions. Our mission is to empower our clients worldwide to thrive in today's rapidly evolving landscape. With a presence in 25+ countries, we go above and beyond to support our clients' digital transformation journey.

At BlueRose, we harness cutting-edge technology to drive enterprise-wide business transformations on a global scale. We firmly believe that success stems from a combination of talent, teamwork, and unwavering focus. Our track record of delivering exceptional results, together with our deep industry knowledge, comprehensive services, and innovative solutions, enables us to add significant value to our clients' mission-critical projects and strategic initiatives. We have strategically positioned ourselves at the forefront of emerging and niche technologies, and this positioning is a key driver of our success.

BlueRose is a fast-growing IT product and services company. We partner with leading COTS product vendors and provide services in Cloud, ERP, RPA, and CRM. With our in-depth and business-centric approach, we provide complete solutions across multiple industries.

* Led by industry veterans with Cloud, CRM, ERP & IT Services expertise.
* Team of 1200+ experienced resources across multiple products and domains.
* Operational in 25+ countries.
* Strategic partnerships with Oracle, SAP, Pega, ServiceNow, Appian, AA, and Inceptum.
* Experience in greenfield and brownfield system implementations and in providing managed services.
* Delivered 100+ engagements for customers across the globe.
* Flexible and blended engagement model.
* Industry-focused in-house CoE, Verticals, and best practices.

Listed Jobs

Company Name
BlueRose Technologies
Job Title
Dotnet Developer
Job Description
**Job Title:** .NET Developer

**Role Summary:** Design, develop, and maintain enterprise-grade applications in ASP.NET Core 9.0 with C#, leveraging PostgreSQL, Azure services, and containerized deployments. Focus on robust RESTful APIs, real-time features, and clean architecture principles for energy-trading and compliance systems.

**Expectations:** 8+ years of .NET development experience; proven expertise in ASP.NET Core, PostgreSQL, Azure AD, Docker/OpenShift, and API design. Strong problem-solving skills, adherence to SOLID and Clean Architecture principles, and the ability to deliver maintainable code in a hybrid on-premise/remote environment.

**Key Responsibilities:**
- Develop, test, and maintain large-scale .NET applications using ASP.NET Core 9.0 and C#.
- Design and implement RESTful APIs for biogas trading, inventory management, regulatory reporting, and carbon credit calculations (RINs, LCFS).
- Implement business logic, background processing (Quartz.NET), and real-time features (SignalR).
- Persist data with PostgreSQL and Entity Framework Core; maintain schema integrity and performance.
- Integrate authentication and authorization via Azure AD, OAuth2, and JWT.
- Configure email workflows, automated reports, and Microsoft Graph API integration for SharePoint document handling.
- Package applications in Docker containers and deploy to OpenShift/Kubernetes.
- Maintain CI/CD pipelines with Azure DevOps; automate testing (xUnit, FluentAssertions) and documentation (OpenAPI/Swagger).

**Required Skills:**
- Core C# and ASP.NET Core 9.0 development.
- Entity Framework Core, PostgreSQL, and advanced SQL.
- RESTful API design and OpenAPI/Swagger documentation.
- OAuth2, JWT, and Azure AD authentication.
- Containerization: Docker, Kubernetes/OpenShift.
- Background job frameworks (Quartz.NET).
- CI/CD and Azure DevOps pipelines.
- SOLID principles and Clean Architecture.
- Unit and integration testing: xUnit, FluentAssertions.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Microsoft Certified: Azure Developer Associate (preferred).
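The authentication item above (OAuth2/JWT via Azure AD) can be illustrated with a minimal sketch. This is a hypothetical, library-free example of decoding a JWT payload and checking its expiry; it deliberately skips signature verification, which any real Azure AD integration must perform (normally via a dedicated JWT library):

```python
import base64
import json
import time

def decode_jwt_claims(token: str) -> dict:
    """Decode the payload segment of a JWT.

    Illustration only: no signature verification is done here, so this
    must never be used on its own to trust a token in production.
    """
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64url padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def is_expired(claims: dict, now=None) -> bool:
    """Check the standard `exp` claim (seconds since the Unix epoch)."""
    return claims.get("exp", 0) <= (now if now is not None else time.time())
```

A bearer token extracted from an `Authorization` header could be passed through `decode_jwt_claims` to inspect `sub`, `aud`, and `exp` before any further handling.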
London, United Kingdom
Hybrid
Mid level
29-10-2025
Company Name
BlueRose Technologies
Job Title
Senior Data Scientist
Job Description
Job Title: Senior Data Scientist

Role Summary: Lead end-to-end analytics initiatives, designing and deploying production-grade data science solutions and collaborating with cross-functional teams to transform complex datasets into actionable insights.

Expectations:
- Drive data strategy and architecture across large, distributed data environments.
- Mentor junior analysts and data engineers.
- Own the full lifecycle of analytical projects: acquisition, engineering, modeling, deployment, monitoring, and continuous improvement.

Key Responsibilities:
- Build scalable data pipelines and statistical models in Python, Pandas, and SQL for production deployment.
- Develop, test, and maintain analytical codebases using Jupyter notebooks, Spark, Git, and cloud services (AWS).
- Design and implement monitoring frameworks for model performance and data drift.
- Translate business requirements into analytical solutions, presenting findings to stakeholders.
- Ensure data quality, governance, and adherence to best practices.

Required Skills:
- Advanced proficiency in Python, Pandas, and SQL; familiarity with Spark and Git.
- Strong statistical and machine learning fundamentals.
- Experience with data engineering concepts and distributed processing.
- Knowledge of cloud (AWS) services for data storage, compute, and deployment.
- Excellent problem-solving, communication, and collaboration abilities.

Required Education & Certifications:
- Master's degree in Statistics, Mathematics, Economics, Computer Science, or a related quantitative field, *or* a Bachelor's degree with 7-12 years of industry experience.
- Certifications in advanced data science or cloud platforms (e.g., AWS Certified Data Analytics) preferred.
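One responsibility named above is monitoring for data drift. A common drift signal is the Population Stability Index (PSI); the sketch below is a hypothetical, dependency-free illustration of the idea, not a description of any particular monitoring stack:

```python
import math

def psi(expected: list, actual: list, bins: int = 10) -> float:
    """Population Stability Index between a baseline sample and a new
    sample: values near 0 suggest no drift; larger values suggest drift.
    Simplified sketch using equal-width bins over the combined range.
    """
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0  # degenerate case: all values equal

    def bin_fractions(sample):
        counts = [0] * bins
        for x in sample:
            counts[min(int((x - lo) / width), bins - 1)] += 1
        n = len(sample)
        return [max(c / n, 1e-6) for c in counts]  # floor to avoid log(0)

    e, a = bin_fractions(expected), bin_fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

In practice the baseline would be the model's training distribution and the new sample a recent production window, with an alert raised when PSI crosses a chosen threshold.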
United States
Remote
Senior
26-11-2025
Company Name
BlueRose Technologies
Job Title
TMF/OSS Consultant
Job Description
**Job Title:** TMF/OSS Consultant

**Role Summary:** Deliver end-to-end consulting services for Telecom Management Framework (TMF) and Operations Support Systems (OSS). Design, integrate, and analyze solutions that leverage REST APIs and modern development stacks to optimize telco operations.

**Expectations:**
- Apply deep telco expertise in TMF/OSS architecture.
- Translate business requirements into technical designs.
- Lead integration of services using REST, Java, and related technologies.
- Provide analytical insights to drive operational efficiency.

**Key Responsibilities:**
- Configure and customize TMF/OSS modules to meet client specifications.
- Develop and integrate RESTful services with existing OSS components.
- Conduct system analysis, mapping, and documentation.
- Collaborate with development teams to build and deploy Java/JSP microservices.
- Implement front-end components using Angular/HTML/CSS.
- Manage source control (Git/Maven) and collaborate on CI/CD pipelines.
- Deploy applications to container platforms (OpenShift) and cloud environments (Azure).
- Troubleshoot performance issues and recommend improvements.

**Required Skills:**
- Telecommunications industry experience (TMF, OSS).
- Proficiency in TMF/OSS frameworks and standards.
- Strong REST API design and consumption skills.
- Analytical thinking and requirements elicitation.
- *Preferred*: Java, SQL, Git/Maven, JSP, Angular, JavaScript, Azure, HTML/CSS, Spring Boot, Spring Data, JPA/Hibernate, OpenShift.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Information Technology, Telecommunications, or a related field.
- Relevant telecom certifications (e.g., TMF Certification) are a plus but not mandatory.
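The REST consumption skills called out above map onto TM Forum's Open API conventions, which expose resources with query-parameter attribute filters and a `fields` selector. Below is a minimal sketch of composing such a query URL; the endpoint, resource name, and parameter values are hypothetical:

```python
from urllib.parse import urlencode

def build_resource_url(base: str, resource: str, *, fields=None, **filters) -> str:
    """Compose a REST query URL in the attribute-filter style used by
    TMF Open APIs (hypothetical endpoint and parameters, for illustration).

    Each keyword filter becomes `name=value`; `fields` limits the
    attributes returned in the response.
    """
    params = dict(filters)
    if fields:
        params["fields"] = ",".join(fields)
    query = urlencode(params)
    return f"{base.rstrip('/')}/{resource}" + (f"?{query}" if query else "")
```

For example, fetching in-progress service orders with only `id` and `state` populated would use `build_resource_url(base, "serviceOrder", fields=["id", "state"], state="inProgress")`.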
Brussels region, Belgium
Hybrid
29-12-2025
Company Name
BlueRose Technologies
Job Title
Data Architect
Job Description
**Job Title:** Data Architect

**Role Summary:** Design and govern end-to-end data product architecture across streaming, batch, and lakehouse environments, ensuring scalability, reliability, and compliance with data-mesh principles.

**Expectations:**
- Deliver production-ready data pipelines and architectures in a contract, hybrid setting.
- Own data quality, governance, and cost monitoring for large-scale event streams and analytics workloads.

**Key Responsibilities:**
- Architect data products using Kafka (Confluent), AWS MSK, Kinesis, and EventBridge; enforce ordering, replay, and semantic guarantees.
- Manage schema evolution with Avro/Protobuf in a Schema Registry; define subject strategy and compatibility rules.
- Build streaming pipelines: Kinesis → S3 → Glue (batch and streaming) → Athena/Redshift; implement DLQ and back-pressure patterns.
- Apply lakehouse strategies (Iceberg) and partitioning for performance and cost efficiency.
- Design payment processing flows (ISO 20022 PAIN/PACS/CAMT) and reconciliation logic for APIs, files, and SWIFT.
- Create observability dashboards, alerts, and FinOps KPIs; troubleshoot latency and throughput at scale.
- Produce high-quality code, conduct reviews, automate tests, and embed security best practices.
- Model logical data in 1NF/BCNF and physical schemas for OLTP/OLAP; evaluate denormalization, star, and Data Vault designs.
- Implement CQRS/ES patterns where appropriate; enforce domain-driven design, bounded contexts, and event contracts.

**Required Skills:**
- Proven data product architecture experience.
- Proficiency with Kafka (Confluent), AWS MSK, Kinesis, and EventBridge.
- Schema registry expertise (Avro/Protobuf) and event semantics.
- AWS data stack: S3, Glue, Athena, Redshift, Step Functions, Lambda, Iceberg.
- Payment domain knowledge: ISO 20022, PAIN/PACS/CAMT, reconciliation.
- Data-mesh mindset: ownership, SLAs, lineage, retention.
- Observability (dashboards, alerts) and FinOps (cost KPIs).
- Strong coding in Python/Java/Scala; automated testing and CI/CD pipelines.
- Logical/physical data modeling (ER, normalization, denormalization, SCD).
- CQRS, Event Sourcing, Saga, and domain-driven design fundamentals.
- Experience with QuickSight/Tableau (nice to have).

**Required Education & Certifications:**
- Bachelor's (or higher) in Computer Science, Data Engineering, or a related field.
- Relevant certifications (e.g., AWS Certified Solutions Architect, Confluent Certified Developer) preferred.
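The schema-evolution responsibility above hinges on registry compatibility rules. As a rough illustration of the BACKWARD rule (a consumer on the new schema can read data written with the old one), here is a simplified check over Avro-style record definitions that looks only at field names and defaults; real registries also perform full type resolution:

```python
def backward_compatible(old_schema: dict, new_schema: dict) -> bool:
    """Simplified BACKWARD compatibility check for Avro-style records.

    A field the new reader expects must either exist in data written
    with the old schema or carry a default the reader can fall back on.
    Sketch only: ignores type promotion, aliases, and nested records.
    """
    old_fields = {f["name"] for f in old_schema["fields"]}
    return all(
        f["name"] in old_fields or "default" in f
        for f in new_schema["fields"]
    )
```

Under this rule, adding a field with a default is a safe evolution, while adding a required field without one breaks consumers reading old events.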
London, United Kingdom
Hybrid
30-12-2025