Block

block.xyz

9 Jobs

12,949 Employees

About the Company

Block is one company built from many blocks, all united by the same purpose of economic empowerment. The blocks that form our foundational teams (People, Finance, Counsel, Hardware, Information Security, Platform Infrastructure Engineering, and more) provide support and guidance at the corporate level. They work across business groups and around the globe, spanning time zones and disciplines to develop inclusive People policies, forecast finances, give legal counsel, safeguard systems, nurture new initiatives, and more. Every challenge creates possibilities, and we need different perspectives to see them all. Bring yours to Block.

Listed Jobs

Company Name
Block
Job Title
Data Scientist - Model Strategy Management
Job Description
**Job Title:** Data Scientist – Model Strategy Management

**Role Summary:** Develop, implement, and scale AI-driven predictive models that inform strategic investment, revenue, and customer-lifecycle decisions across Finance, Marketing, Sales, Product, and Risk. Serve as the bridge between engineering and business teams, translating requirements into modelable solutions and ensuring models meet business-impact goals.

**Expectations:**
- Deliver high-impact data-science projects on schedule.
- Prioritize work based on strategic value and business impact.
- Communicate technical concepts clearly to both technical and non-technical stakeholders.
- Collaborate across functions and drive adoption of model insights.
- Explore productivity enhancements using LLMs and emerging tools.

**Key Responsibilities:**
- Design and deploy advanced predictive and forecasting models (e.g., revenue flow, CLV, prospect value).
- Manage project timelines, milestones, and deliverables for data-science initiatives.
- Conduct early-stage exploratory analysis to identify key drivers and features.
- Build and maintain end-to-end data pipelines with ML engineering partners.
- Implement post-deployment performance testing and monitoring frameworks.
- Align parallel modeling efforts with business expectations and reporting standards.
- Gather and refine business requirements, ensuring models address stakeholder needs.
- Investigate and prototype analytics improvements leveraging LLMs and other productivity tools.

**Required Skills:**
- Strong foundation in forecasting, predictive modeling, and value estimation (statistical and ML methods).
- Advanced Python proficiency; experience with scikit-learn, XGBoost, LightGBM, pandas, and numpy.
- SQL for data extraction and manipulation.
- Familiarity with modern MLOps tools (e.g., Streamlit, Prefect, GitHub, Databricks).
- Excellent written and verbal communication; ability to explain complex concepts to diverse audiences.
- Proven teamwork and cross-functional collaboration skills.

**Required Education & Certifications:**
- Bachelor's degree and 5+ years of relevant experience, or Master's degree and 3+ years, in finance, economic analysis, data science, or a related field.
- No specific certifications required; demonstrated expertise in applied data science and business analytics is essential.
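The modeling stack named above (Python, scikit-learn, pandas) makes the day-to-day work easy to picture. Below is a minimal sketch of one responsibility, predicting a customer-value target with a gradient-boosted regressor; the file path, feature names, and target column are hypothetical placeholders, not Block's actual schema:

```python
# Minimal sketch: customer lifetime value (CLV) prediction with a
# gradient-boosted regressor. All data names below are illustrative.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical training table: one row per customer with engineered features.
df = pd.read_parquet("customer_features.parquet")  # placeholder path
features = ["tenure_months", "avg_monthly_spend", "txn_count_90d", "product_count"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["ltv_12m"], test_size=0.2, random_state=42
)

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_train, y_train)

# The post-deployment monitoring the role describes would track this metric
# over time; here we just report holdout error once.
preds = model.predict(X_test)
print(f"holdout MAE: {mean_absolute_error(y_test, preds):,.2f}")
```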
San Francisco, United States
On site
Junior
19-12-2025
Company Name
Block
Job Title
Staff Software Engineer, FinPlat, Ledgering
Job Description
**Job Title:** Staff Software Engineer, Ledgering

**Role Summary:** Lead the design and evolution of Cash App's core ledgering system, ensuring reliable, scalable processing of millions of daily transactions. Drive architectural decisions, technical excellence, and AI integration across a globally distributed engineering team.

**Expectations:**
- Deliver high-quality, impactful technical contributions that align with organizational goals.
- Lead strategic, multi-quarter initiatives and guide the team through complex technical challenges.
- Mentor and influence peers, establishing technical standards and best practices.
- Serve as a trusted advisor to senior leadership on technology strategy.
- Foster a people-first culture, encouraging collaboration and diverse viewpoints.

**Key Responsibilities:**
- Design, develop, and maintain durable, scalable ledgering services using Java/Kotlin on AWS.
- Ensure data integrity and zero discrepancies in high-volume transaction processing.
- Architect event-driven pipelines with Kafka, gRPC, and Protocol Buffers.
- Resolve technical obstacles, perform root-cause analysis, and propose long-term solutions.
- Embed AI/machine-learning capabilities into ledgering workflows to improve accuracy and efficiency.
- Collaborate across global teams (USA, Australia, Europe) and with internal engineering groups.
- Mentor junior engineers, review code, and enforce quality standards (JUnit, Hibernate, Guice).

**Required Skills:**
- 10+ years of experience designing and delivering complex, distributed systems.
- Proficiency in Java, Kotlin, MySQL/Vitess, and AWS (EC2, RDS, Lambda, SQS, SNS).
- Strong background in event-driven architectures (Kafka, message queues).
- Expertise in HTTP, JSON, gRPC, and Protocol Buffers.
- Experience with testing frameworks (JUnit), ORM (Hibernate), and dependency injection (Guice).
- Deep understanding of scalability, high availability, and fault tolerance.
- Excellent communication, leadership, and empathetic collaboration skills.
- Curiosity, a problem-solving mindset, and the ability to thrive in ambiguous environments.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field.
- Desirable certifications: AWS Certified Solutions Architect, Oracle Java SE Certified Developer, or equivalent.
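The production stack here is Java/Kotlin, but the central invariant of a ledgering service is language-agnostic and worth making concrete. Below is a minimal Python sketch, with hypothetical account names, of the zero-discrepancy rule described above: every transaction is a balanced set of double-entry postings, stored in integer minor units to avoid floating-point drift:

```python
# Minimal sketch of double-entry posting. Account names are illustrative;
# a real service would also persist postings durably and idempotently.
from dataclasses import dataclass

@dataclass(frozen=True)
class Entry:
    account: str
    amount_cents: int  # positive = debit, negative = credit

def post_transaction(ledger: list[Entry], entries: list[Entry]) -> None:
    # Zero-discrepancy rule: debits and credits must net to exactly zero,
    # so an unbalanced transaction is rejected before it touches the ledger.
    if sum(e.amount_cents for e in entries) != 0:
        raise ValueError("unbalanced transaction rejected")
    ledger.extend(entries)

ledger: list[Entry] = []
post_transaction(ledger, [
    Entry("customer:alice:cash", -2_500),    # credit Alice $25.00
    Entry("merchant:bob:receivable", 2_500), # debit Bob $25.00
])
assert sum(e.amount_cents for e in ledger) == 0  # global invariant holds
```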
San Francisco, United States
On site
Senior
23-12-2025
Company Name
Block
Job Title
Staff Data Scientist - Financial Platform
Job Description
**Job Title:** Staff Data Scientist – Financial Platform

**Role Summary:** Lead data-science initiatives for Block's Financial Platform (FinPlat), driving end-to-end data products that enable payment processing, transaction-cost optimization, and financial reporting across Square, Cash App, and Afterpay. Partner with product, engineering, and operations leaders to translate raw event data into actionable insights, machine-learning models, and evidence-based product decisions.

**Expectations:**
- 12+ years of data-science experience with a Bachelor's degree, 8+ years with a Master's, or 5+ years with a PhD.
- Deep expertise in payment-rail analytics (cards, ACH, wire, direct deposit).
- Proven ability to solve ambiguous, high-impact problems and influence product strategy.
- Strong hypothesis-testing, causal-inference, and statistical-modeling skills.
- Hands-on experience building data pipelines, architecture, and ML models.
- Excellent communication and stakeholder-management skills across cross-functional teams.

**Key Responsibilities:**
1. Build comprehensive data products from raw event streams to deliver actionable analyses and predictive models.
2. Conduct hypothesis testing, causal inference, and A/B experimentation to evaluate product changes, authorization rates, and transaction costs.
3. Collaborate with Engineering to design and prototype data pipelines and scalable analytics architectures.
4. Drive data-driven decision making in product roadmaps, working closely with FinOps and product managers.
5. Evangelize analytical solutions within the AI, Data & Analytics community, improving the insightfulness and actionability of financial analytics.
6. Report on key financial metrics, maintaining accuracy and alignment with the central FinPlat ledgering system.

**Required Skills:**
- Advanced proficiency in SQL, Python, and data-visualization tools (e.g., Looker).
- Expertise in hypothesis testing, causal inference, and experimental design.
- Strong statistical and machine-learning modeling experience.
- Knowledge of payment processing, chargebacks, and financial-transaction analytics.
- Ability to architect data pipelines and manage large event-driven data sets.
- Clear communication, presentation, and stakeholder-management skills.

**Required Education & Certifications:**
- Bachelor's degree in Engineering, Computer Science, Physics, Mathematics, or a related technical field (or equivalent experience).
- Master's degree or PhD in a technical or quantitative discipline preferred.
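One of the responsibilities above, A/B experimentation on authorization rates, reduces to a two-proportion comparison. Below is a minimal sketch using statsmodels; the transaction counts are made up purely for illustration:

```python
# Minimal sketch: two-proportion z-test comparing payment authorization rates
# between a control and a treatment routing strategy. Counts are fabricated.
from statsmodels.stats.proportion import proportions_ztest

approved = [41_250, 41_990]   # approved transactions: control, treatment
attempted = [50_000, 50_000]  # attempted transactions in each arm

stat, p_value = proportions_ztest(count=approved, nobs=attempted)
lift = approved[1] / attempted[1] - approved[0] / attempted[0]
print(f"auth-rate lift: {lift:+.3%}, z = {stat:.2f}, p = {p_value:.4f}")
```

In practice the experiment design (randomization unit, guardrail metrics, sequential-testing corrections) matters as much as the final test statistic, which is why the role pairs experimentation with causal-inference skills.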
San Francisco, United States
On site
Senior
25-12-2025
Company Name
Block
Job Title
Senior Software Engineer, Data Operations Platform
Job Description
**Job Title:** Senior Software Engineer, Data Operations Platform

**Role Summary:** Design, build, and maintain scalable, reliable core components of the company's Lakehouse platform, built on Databricks and Snowflake. Drive framework enhancements, data-ingestion pipelines, governance standards, and performance optimizations to support analytics, machine learning, and AI initiatives across the organization.

**Expectations:**
- 8+ years of software-engineering or data-platform development experience.
- Demonstrated ability to design distributed data systems, implement best-practice frameworks, and collaborate with cross-functional teams.
- Strong focus on data quality, governance, cost efficiency, and performance.

**Key Responsibilities:**
- Engineer core Lakehouse platform components and API frameworks that enable data ingestion, transformation, and storage.
- Build and extend standardized data-ingestion pipelines for internal teams, ensuring robustness, governance, and scalability.
- Partner with product, data-engineering, ML, and AI stakeholders to translate data requirements into architecture and implementation plans.
- Define and promote Lakehouse architecture best practices, including data modeling, ETL/ELT processes, cost optimization, and query performance.
- Contribute to the long-term technical vision for the data platform, identifying innovation opportunities and shaping future infrastructure direction.
- Maintain, test, and document platform components to support high availability and reliability.

**Required Skills:**
- Proficiency in Python, Java, or Go for data-framework and service development.
- Hands-on experience with Apache Spark and distributed data processing.
- Deep knowledge of Databricks, Snowflake, and Lakehouse architecture.
- Strong grasp of data modeling, ETL/ELT workflows, and data-lake concepts.
- Experience with cloud data ecosystems (AWS, GCP, or Azure).
- Expertise in data governance, data quality, cost optimization, and query-performance tuning.
- Excellent communication and collaboration skills for cross-team engagement.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Software Engineering, or a related technical field (or equivalent professional experience).
- No specific certifications required; additional credentials in cloud data platforms or big-data technologies are a plus.
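The standardized ingestion pipelines this role describes are easiest to picture as a small Spark job. Below is a minimal sketch of a bronze-layer ingestion step on a Databricks-style Lakehouse; the bucket paths, field names, and event schema are hypothetical, and the `delta` format assumes a Databricks runtime or the delta-spark package is available:

```python
# Minimal sketch: standardized raw-event ingestion into a governed Delta table.
# All paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")  # placeholder source

# Standardize: stamp ingestion metadata, derive a partition column, and
# deduplicate so re-runs are idempotent -- the kind of convention a shared
# ingestion framework enforces for every team.
curated = (
    raw.withColumn("ingested_at", F.current_timestamp())
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

(curated.write.format("delta")
        .mode("append")
        .partitionBy("event_date")
        .save("s3://example-bucket/lakehouse/bronze/events/"))
```

Partitioning by event date and appending to a Delta table keeps downstream queries prunable and the pipeline restartable, which is where the cost-optimization and query-performance concerns in the skills list come in.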
San Francisco, United States
On site
Senior
25-12-2025