
Closed
Published
Paid on delivery
We require a robust, from-scratch backend system capable of continuously ingesting live exchange data streams — including price ticks, trade volumes, and full order-book depth — from multiple major cryptocurrency exchanges. The system must normalize heterogeneous exchange formats into a unified schema and store them in a scalable, time-series optimized architecture that can grow seamlessly with increasing traffic and trading pair coverage. On top of this storage layer, we require a clean, well-structured internal API that powers real-time dashboards and analytics without exposing raw databases to the front-end layer.

Architecture Expectations

The backend must include:
• Low-latency WebSocket-based ingestion pipelines
• Lossless, fault-tolerant streaming architecture
• Schema normalization across exchanges
• Time-series optimized storage with rollups and retention policies
• Horizontal scalability by design

Analytics Requirements

Analytics are central to the build. The system must support:
• Trend analysis (technical indicators, momentum, moving averages)
• Liquidity and order-flow analytics (spread, imbalance, depth metrics)
• Volatility and risk calculations (realized volatility, drawdowns, regime detection)
• Sentiment overlays derived from derivatives data (funding rates, open interest)
• Predictive modelling modules (probabilistic signal generation — not guaranteed forecasting)

Analytics must compute in near-real time and be accessible exclusively through the internal API. The frontend must never query raw databases directly.

Technology Stack

We are open to the stack, including but not limited to:
• Python, Go, Node.js, Rust
• Kafka or equivalent streaming broker
• WebSockets
• Redis
• TimescaleDB, InfluxDB, or ClickHouse
• REST or gRPC
• Docker and Kubernetes

You may recommend the stack you can confidently deliver and support.
Primary priorities:
• Low-latency ingestion
• Fault tolerance
• Horizontal scalability
• Clean modular code structure
• Maintainability and clear documentation

Deliverables

• Streaming ingestion engine connected to at least three major exchanges
• Normalized historical time-series database with retention and roll-up policies
• Analytics modules implementing the metrics described above
• Authenticated internal API with documented endpoints
• Containerized deployment (Docker) and basic orchestration scripts
• Monitoring hooks and logging setup
• Brief operational run-book for DevOps handover

Acceptance Criteria

• Tick-level parity between exchange data and stored records over a continuous 24-hour validation run.
• Internal API endpoints returning computed analytics in <200 ms for queries covering the most recent 60 minutes of data.
• Clean, well-documented code pushed to our private repository.
• Architecture walkthrough session covering system design, CI/CD, scaling strategy, and monitoring.

Closing

This is an end-to-end systems engineering challenge. We are looking for a developer or team who can take ownership of architecture, implementation, and handover. If you're confident in building high-performance streaming data systems at scale, let's discuss timelines, milestones, and execution strategy.
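To make the schema-normalization requirement concrete, here is a minimal sketch of mapping two hypothetical exchange payload shapes into one unified tick schema. The field names in the raw messages are illustrative assumptions, not any exchange's documented format:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tick:
    """Unified tick schema shared by all exchange connectors."""
    exchange: str
    symbol: str   # normalized pair name, e.g. "BTC-USDT"
    price: float
    size: float
    ts_ms: int    # event time, Unix milliseconds

def normalize_exchange_a(msg: dict) -> Tick:
    # Hypothetical compact payload: {"s": "BTCUSDT", "p": "...", "q": "...", "T": ...}
    return Tick("exchange_a", msg["s"].replace("USDT", "-USDT"),
                float(msg["p"]), float(msg["q"]), int(msg["T"]))

def normalize_exchange_b(msg: dict) -> Tick:
    # Hypothetical verbose payload with an already-hyphenated product id
    return Tick("exchange_b", msg["product_id"],
                float(msg["price"]), float(msg["size"]), int(msg["time_ms"]))
```

Downstream storage and analytics then depend only on `Tick`, never on per-exchange payloads, which is what lets new exchanges be added as isolated connector modules.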
Project ID: 40263182
36 proposals
Remote project
Active 12 days ago
36 freelancers are bidding an average of $7,656 USD for this job

⭐⭐⭐⭐⭐ Build a Scalable Backend for Live Crypto Exchange Data

❇️ Hi My Friend, hope you are doing well. I've reviewed your project requirements and see you are looking for a robust backend system for live cryptocurrency data. You have no need to look any further as Zohaib is here to help you! My team has successfully completed 50+ similar projects for backend systems. We will create a system that normalizes data from various exchanges and stores it efficiently, ensuring it can grow with your needs.

➡️ Why Me? I can easily build your backend system as I have 5 years of experience in backend development, focusing on data ingestion, API design, and real-time analytics. My expertise includes Python, Node.js, and database management. Additionally, I have a strong grip on technologies like Docker and Kubernetes, ensuring a robust architecture for your project.

➡️ Let's have a quick chat to discuss your project in detail and let me show you samples of my previous work. I look forward to discussing this with you in our chat.

➡️ Skills & Experience: ✅ Python Development ✅ Node.js Programming ✅ WebSocket Integration ✅ Data Normalization ✅ Time-Series Databases ✅ API Development ✅ Streaming Architecture ✅ Docker & Kubernetes ✅ Redis Management ✅ Kafka Integration ✅ Analytics Implementation ✅ Fault-Tolerant Systems

Waiting for your response! Best Regards, Zohaib
$6,000 USD in 2 days
7.8

I understand the importance of building a robust, high-performance Advanced Crypto Analytics Platform that meets your specific requirements. Your need for a from-scratch backend system capable of ingesting live exchange data streams from multiple major cryptocurrency exchanges, along with real-time analytics and a scalable architecture, is crucial for your project's success.

With over 5 years of experience in blockchain and Web3 projects, I have successfully implemented similar solutions, including the normalization of heterogeneous data formats, real-time analytics, and scalable storage architectures. My expertise in Python, Node.js, and Kafka aligns perfectly with the technology stack you have outlined, ensuring a smooth and efficient development process.

I have a proven track record of delivering high-performance streaming data systems and analytics modules, and I am confident that I can meet and exceed your expectations for this project. Let's discuss timelines, milestones, and execution strategy to ensure a successful implementation.
$8,000 USD in 60 days
7.3

⭐⭐⭐⭐⭐ CnELIndia proposes a high-performance, event-driven architecture built with Rust/Go for low-latency WebSocket ingestion and Python for analytics. We will implement exchange-specific connectors for three major exchanges, stream data through Kafka with exactly-once semantics, and normalize schemas into a unified model. Time-series storage will use ClickHouse or TimescaleDB with partitioning, rollups, and retention policies to ensure scalable historical depth. A modular analytics engine will compute indicators, liquidity/order-flow metrics, volatility, derivatives sentiment, and probabilistic signal models in near real time. An authenticated internal REST/gRPC API layer (sub-200 ms for last-60-minute queries) will isolate storage from dashboards. Dockerized microservices with Kubernetes orchestration, CI/CD pipelines, structured logging, Prometheus monitoring, and a DevOps run-book will ensure fault tolerance and horizontal scalability. Under the leadership of Raman Ladhani, CnELIndia will own architecture, implementation, validation (24-hour tick parity), documentation, and walkthrough—delivering a clean, maintainable, production-ready system end-to-end.
$7,500 USD in 7 days
5.9

Interesting project. I will build the streaming ingestion layer in Go with WebSocket connections to Binance, Coinbase, and Kraken, normalize ticks into a unified schema, and store them in TimescaleDB with continuous aggregates for automatic rollups. Analytics modules will compute indicators, order-flow metrics, volatility, and sentiment overlays through a gRPC API responding in under 200 ms. I will use Kafka partitioned by trading pair so each exchange consumer runs independently: a failure on one does not stall the others, and adding a new exchange is a single consumer deployment with no pipeline changes.

Questions:
1) How many trading pairs per exchange at launch?
2) Do you need historical backfill from exchange REST APIs, or only live forward data?
3) Is there a frontend team already, or does that come later?

Looking forward to your response. Best regards, Kamran
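The per-pair partitioning described above can be sketched with a deterministic hash-to-partition function (a minimal illustration; the MD5-based scheme here is an assumption, not Kafka's default partitioner):

```python
import hashlib

def partition_for(pair: str, num_partitions: int) -> int:
    """Stable partition assignment keyed by trading pair: every tick for
    a given pair always lands on the same partition, which preserves
    per-pair ordering while letting partitions be consumed in parallel."""
    digest = hashlib.md5(pair.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions
```

Because the mapping depends only on the pair name and partition count, any producer instance computes the same assignment, and one stalled consumer affects only the pairs on its own partitions.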
$6,000 USD in 25 days
6.3

As a seasoned professional in the realm of secure blockchains, my skill set dovetails perfectly with your requirements for an advanced crypto analytics platform. Having already developed and deployed both decentralized and centralized exchanges, I am highly experienced in normalizing heterogeneous data sources into a unified schema.

My specialization also extends to extensive data analysis: technical trend analysis, liquidity assessment, volatility calculation, and probabilistic signal generation. Moreover, I have a thorough understanding of risk assessment protocols and of detecting market regimes from historical data. This ability will be invaluable, as your project requires near-real-time computations that I can assure you I can deliver.

Scaling is something I heavily factor into all my projects, and you mentioned that this was a primary priority for yours as well. Alongside proficiency in Python, Rust and Node.js, I am also well-versed in DevOps tools like Docker and Kubernetes. This ensures that not only will we manifest your vision in efficient code - we'll
$10,000 USD in 90 days
5.1

This looks like a great fit. We will build the ingestion engine in Go with WebSocket feeds from Binance, Coinbase, and Kraken, normalize tick and order-book data into a unified schema, and pipe it through Kafka into TimescaleDB. Analytics modules will deliver indicators, liquidity metrics, volatility, and sentiment data through an authenticated gRPC API.

One thing we will set up from day one: materialized continuous aggregates at 1-second, 1-minute, and 1-hour granularities so the API reads pre-computed rollups instead of scanning raw ticks. This is how you hold the sub-200 ms target consistently as data volume grows.

Questions:
1) Which exchanges are highest priority beyond the initial three?
2) Should predictive modules run across all trading pairs, or a curated watchlist?
3) What is the expected concurrent user count on the dashboards?

Send me a message and we can discuss further. Best regards, STALLYONS Technologies
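The pre-aggregation idea above can be illustrated with a minimal in-memory sketch that buckets raw ticks into OHLCV bars of a chosen width (Python for brevity; in a real deployment this work would be done by the database's continuous aggregates rather than application code):

```python
def rollup(ticks, bucket_ms):
    """Aggregate (ts_ms, price, size) ticks into OHLCV bars of bucket_ms
    width, keyed by bucket start time, so queries read a handful of bars
    instead of scanning every raw tick in the window."""
    bars = {}
    for ts, price, size in sorted(ticks):  # sort by timestamp for correct open/close
        key = ts - ts % bucket_ms          # start of the bucket this tick falls in
        bar = bars.get(key)
        if bar is None:
            bars[key] = {"open": price, "high": price, "low": price,
                         "close": price, "volume": size}
        else:
            bar["high"] = max(bar["high"], price)
            bar["low"] = min(bar["low"], price)
            bar["close"] = price
            bar["volume"] += size
    return bars
```

Chaining the same aggregation (1 s bars feeding 1 m bars feeding 1 h bars) is what keeps query cost roughly constant as raw tick volume grows.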
$5,000 USD in 30 days
4.7

Hi, This is a systems-engineering problem, not just an API build, and I'm comfortable owning it end-to-end.

I would design the ingestion layer in Go or Rust for deterministic low-latency WebSocket handling, feeding a Kafka (or Redpanda) broker to guarantee ordered, fault-tolerant streaming across exchanges. A normalization service would transform heterogeneous payloads into a unified schema before persisting into a time-series-optimized backend such as ClickHouse (for ultra-fast analytical queries) or TimescaleDB, depending on your retention and aggregation patterns.

Analytics would run as modular stream processors: real-time indicator computation (EMA, VWAP, volatility), liquidity and order-flow metrics (spread, imbalance, depth skew), derivatives overlays (funding, OI), and probabilistic signal modules — all exposed exclusively through an authenticated internal API (REST or gRPC). Redis would be used for hot-window caching to guarantee <200 ms responses on recent data.

The system would be containerized (Docker) with Kubernetes-ready manifests, structured logging, Prometheus metrics, and alerting hooks. I focus heavily on clean modular boundaries (ingestion, normalization, storage, analytics, API) so horizontal scaling and maintenance remain straightforward.

Happy to walk through a concrete architecture diagram and scaling plan tailored to your expected throughput.
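Two of the indicators named above, EMA and VWAP, are simple enough to sketch directly (a minimal illustration of the math, not production stream-processor code):

```python
def ema(prices, period):
    """Exponential moving average with smoothing factor k = 2/(period+1);
    each output blends the new price with the previous EMA value."""
    k = 2 / (period + 1)
    out = [prices[0]]  # seed with the first price
    for p in prices[1:]:
        out.append(p * k + out[-1] * (1 - k))
    return out

def vwap(trades):
    """Volume-weighted average price over (price, size) trades:
    total notional divided by total volume."""
    notional = sum(price * size for price, size in trades)
    volume = sum(size for _, size in trades)
    return notional / volume
```

In a streaming setting both update incrementally per tick (EMA keeps one value, VWAP keeps two running sums), which is why they fit the stream-processor model well.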
$10,000 USD in 7 days
4.2

Hi, I am an IIT grad, a PMP-certified professional, ex-BFSI, and have worked at Fortune 500 companies. I will make it a reality for you. As a backend developer, I will build a robust backend system using Node.js, PostgreSQL, and Apache Kafka to continuously ingest live exchange data streams, normalize formats, and store them in a scalable, time-series-optimized architecture using InfluxDB and TimescaleDB for seamless growth. Kindly click on the chat button so we can discuss and get started. I will share my prior projects and my resume too. I have been freelancing since 2019 and have worked at top MNCs in both the USA and India. Let's connect.
$5,000 USD in 7 days
2.7

I’ve built low-latency streaming systems before that continuously ingest and normalize live market data from multiple sources. In one project, I aggregated heterogeneous financial feeds into a unified schema with fault-tolerant Kafka pipelines and stored the data in a TimescaleDB cluster optimized for time-series queries and rollups. This architecture scaled horizontally and maintained sub-second query response even under heavy load. For your platform, I suggest using Go for ingestion services with WebSocket clients feeding Kafka topics to ensure lossless, fault-tolerant streaming. TimescaleDB fits well for your storage needs with retention policies and rollups. The internal API can be built with gRPC for performance and strong typing, ensuring the frontend never hits raw data stores. To design fault tolerance, would you prefer active-active ingestion pipelines, or is active-passive failover acceptable? Also, how granular should the rollups be configured to balance real-time speed and historical depth? I’m ready to start architecting and building the full pipeline, from ingestion to analytics modules and internal API, and deliver tested, documented code along with a run-book for smooth handover.
$5,000 USD in 7 days
2.9

Thanks for sharing the details. I’ve reviewed your requirement and would be glad to discuss it further. I’m Prabhath, an experienced MQL4/MQL5, Pine Script, Python, and C++ developer specializing in automated trading systems and institutional-grade algorithmic solutions. I develop Expert Advisors, indicators, dashboards, data tools, and custom trading utilities for MT4/MT5, TradingView, and standalone platforms. Along with MQL5 systems, I also build fully automated trading software in Python and C++ for Indian stock markets and global exchanges (US, EU, and others). These solutions can be tailored for stocks, indices, futures, forex, and crypto based on project needs. As an active trader, I work with ICT, SMT, market structure, liquidity models, order blocks, FVGs, VWAP, and volume-based logic, ensuring each strategy follows the client’s trading methodology. My expertise includes institutional-grade EA and indicator development, ICT/SMT-based trading systems, Pine Script automation, Python and C++ systems for Indian and global markets, backtesting, paper trading and live trade integration, strategy optimization, and low-latency execution. I also fix, optimize, and enhance existing trading systems to make them stable and production-ready. Where permitted, I can share demos or walkthroughs of previously completed projects while respecting client confidentiality. Thank you for your time and consideration.
$5,000 USD in 15 days
2.7

Hi, I have carefully reviewed your Advanced Crypto Analytics Platform project. You require a low-latency, fault-tolerant streaming backend that ingests multi-exchange tick/order-book data, normalizes schemas, stores time-series data efficiently, and serves near-real-time analytics via a secure internal API. I propose a high-performance architecture using Rust or Go for WebSocket ingestion, Kafka for durable streaming, ClickHouse for time-series storage with rollups/retention, Redis for hot caching, and a Python analytics layer (NumPy/Pandas/FastAPI). gRPC/REST will expose authenticated internal endpoints, fully containerized via Docker with Kubernetes-ready scaling. The system will guarantee tick-level parity, <200 ms analytics response (last 60 mins), and modular, well-documented code with monitoring and CI/CD support. Ready to discuss architecture and execution strategy. Daniel
$5,000 USD in 7 days
2.4

I understand you require a robust backend system to ingest live exchange data streams with low latency and fault tolerance, normalizing heterogeneous formats into a unified schema stored in a time-series optimized database. You also need a clean internal API serving real-time analytics like trend analysis and volatility metrics, without exposing raw databases to the frontend. With over 15 years of experience and 200+ projects completed, I specialize in full stack development using Python, Node.js, and API integration, combined with strong DevOps skills including Docker and Linux environments. My background includes building scalable streaming pipelines and time-series data architectures tailored to financial and crypto markets. For your project, I will design a fault-tolerant WebSocket ingestion pipeline using Python or Node.js, normalize data into a TimescaleDB or ClickHouse backend with retention and roll-up policies, and build a modular internal API delivering analytics within 200 ms. Containerized deployment with Docker and basic orchestration scripts will ensure maintainability. A 6-8 week timeline is realistic for delivery and thorough testing. Let’s discuss how to align milestones and ensure smooth handover for your advanced crypto analytics platform.
$5,500 USD in 7 days
2.0

Hello, Now Meta is pleased to offer our expertise in Matching Job Skills for your project, "Advanced Crypto Analytics Platform." We have carefully reviewed the project requirements and are excited to propose our approach. Our team will follow a systematic process by implementing low-latency WebSocket-based ingestion pipelines, fault-tolerant streaming architecture, schema normalization across exchanges, time-series optimized storage, and horizontal scalability. We will focus on trend analysis, liquidity and order-flow analytics, volatility and risk calculations, sentiment overlays, and predictive modeling modules in near-real time accessible through the internal API. We recommend utilizing Python, Kafka, Redis, TimescaleDB, and Docker for this project. Our primary priorities include low-latency ingestion, fault tolerance, scalability, clean code structure, and maintainability. We invite you to open a chat for a more personalized discussion to move this project forward. Regards, Now Meta
$5,000 USD in 7 days
0.0

Hello there, We have around 8 years of rich experience in real-time data engineering, streaming architectures, and production ML systems. Your requirement for tick-level parity over a 24-hour validation run — combined with <200 ms analytics on 60-minute windows — tells us this needs careful architecture choices, not just good code.

For the predictive modelling modules and sentiment overlays, we'd build a structured pipeline: raw market signals (funding rates, open interest, order-flow imbalance) feed into feature engineering, then into probabilistic signal generation using lightweight gradient-boosted models for regime detection and volatility forecasting. No LLM is needed here — but we'd use structured output validation and fallback logic similar to what we apply in AI pipelines: confidence scoring on every signal, automatic degradation to simpler heuristics when model drift is detected, and strict schema validation on all analytics outputs before they hit the internal API.

Cost and compute stay lean because we'd run inference on pre-aggregated rollups from TimescaleDB — not raw tick data. Batch retraining happens on historical windows, with real-time scoring on compressed features. Redis handles caching for repeated dashboard queries, cutting redundant computation by 80%+ on common time ranges.

Naveen, Brainstack Technologies
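The confidence-scoring-with-fallback pattern described above can be sketched as follows (a minimal illustration; the threshold, the momentum heuristic, and all names are assumptions, not a specific drift-detection library):

```python
def score_signal(model_prob, drift_score, drift_threshold=0.3, last_return=0.0):
    """Attach a confidence score to every signal, and degrade to a
    simple momentum heuristic when measured feature drift exceeds
    the threshold (values and heuristic are illustrative)."""
    if drift_score > drift_threshold:
        # Model inputs have drifted: fall back to sign-of-momentum
        direction = "long" if last_return > 0 else "short"
        return {"direction": direction, "confidence": 0.5, "source": "heuristic"}
    direction = "long" if model_prob >= 0.5 else "short"
    confidence = abs(model_prob - 0.5) * 2  # 0 at a coin flip, 1 at certainty
    return {"direction": direction, "confidence": confidence, "source": "model"}
```

Because every output carries a `source` and `confidence` field, downstream schema validation can reject malformed signals and dashboards can display how much the model was trusted for each one.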
$9,500 USD in 70 days
0.0

Delhi, India
Payment method verified
Member since March 9, 2024