Data Integration is the process of combining data from multiple sources into one common format so it can be analyzed and used in meaningful ways. A Data Integration Expert is an individual with experience in merging, cleansing and transforming data from multiple sources into unified views. They understand the complexities of physical and conceptual data architectures, as well as the challenge of capturing heterogeneously sourced units of information in unified datasets. They also work with data ingestion, validation, transformation and conversion architectures to manipulate data responsibly.
Here are some projects that our Data Integration Experts have made real:
Our Data Integration Professionals will partner with you, using their expertise to create effective integrations and solutions that solve your business pain points. They understand the importance of planning, architecture, cleansing and enrichment of datasets to derive actionable insights quickly. Whether you are looking for professional consulting services or custom development solutions, and no matter the size of your project, Freelancer.com is the right place for you. Here, you will be matched with highly skilled experts who are ready to transform your ideas into reality. So don't hesitate: post your project now! Hire a Freelancer Data Integration Expert and take your project to the next level!
From 32,556 reviews, clients rate our Data Integration Experts 4.83 out of 5 stars.
Federal Data Operations Specialist (Canonical Company Lists, Upstream of CRM) Engagement Type: Contract / Part-time, Remote, ongoing or project-based depending on workload. Engagement: Fixed-price, per-deliverable (project-based). Role Objective: Graviton builds outbound and CRM systems on top of canonical company lists. This role exists to create those datasets upstream of CRM, by correctly merging multiple raw data sources into clean, deterministic, company-level tables using stable identifiers; correctness and discipline matter more than speed. Core Responsibilities You will: Merge multiple datasets with different levels of granularity into a single canonical table Enforce a primary identifier (e.g., UEI, CAGE, or equivalent) as the identity anchor Aggregate transaction-level data to the entity...
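For illustration, here is a minimal sketch of the merge pattern this brief describes, assuming pandas and illustrative file and column names (uei, registration_date, obligated_amount); the real source schemas are not specified in the posting.

```python
import pandas as pd

registrations = pd.read_csv("registrations.csv")  # one row per registration (illustrative)
transactions = pd.read_csv("transactions.csv")    # many rows per entity (illustrative)

# Enforce the primary identifier as the identity anchor: drop rows without
# it and collapse duplicate registrations deterministically (latest wins).
registrations = (
    registrations.dropna(subset=["uei"])
    .sort_values("registration_date")
    .drop_duplicates(subset="uei", keep="last")
)

# Aggregate transaction-level data up to the entity level.
entity_totals = (
    transactions.dropna(subset=["uei"])
    .groupby("uei", as_index=False)
    .agg(total_obligated=("obligated_amount", "sum"),
         transaction_count=("obligated_amount", "size"))
)

# One canonical, company-level table: exactly one row per UEI.
canonical = registrations.merge(entity_totals, on="uei", how="left")
assert canonical["uei"].is_unique
```

The deterministic dedup rule (sort, then keep-last) is what makes reruns reproducible, which matters more than speed in this kind of work.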
I'm looking for an experienced Python developer to create an algorithmic trading bot based on my own unique strategy for options trading. The bot should incorporate the following key functionalities: - Backtesting: Ability to test the strategy against historical data. - Real-time data feed: Integrate with reliable data sources for live market updates. - Automatic order execution: Execute trades automatically based on the strategy signals. Ideal Skills and Experience: - Proficiency in Python and experience with trading libraries. - Strong understanding of options trading and market dynamics. - Experience in developing and backtesting trading algorithms. - Familiarity with integrating real-time data feeds and order execution systems. Please provide examples of previous work and relev...
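As a sketch of the backtesting piece only: a toy event-driven loop, assuming daily close prices in a CSV and a placeholder signal function. The poster's actual strategy, and real options mechanics (chains, greeks, expiry), are deliberately not modeled here.

```python
import pandas as pd

def signal(window: pd.Series) -> int:
    """Placeholder strategy: enter above the 20-day mean, exit below."""
    if len(window) < 20:
        return 0
    return 1 if window.iloc[-1] > window.tail(20).mean() else -1

prices = pd.read_csv("history.csv", parse_dates=["date"]).set_index("date")
cash, position = 10_000.0, 0
for i in range(len(prices)):
    close = prices["close"].iloc[i]
    s = signal(prices["close"].iloc[: i + 1])   # no look-ahead: past data only
    if s == 1 and position == 0:                # enter
        position = int(cash // close)
        cash -= position * close
    elif s == -1 and position > 0:              # exit
        cash += position * close
        position = 0
print(f"final equity: {cash + position * prices['close'].iloc[-1]:.2f}")
```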
I need assistance in setting up a learning connection between Supabase and Qwen 2.5, specifically for data synchronization in both directions. Key Requirements: - Implement data synchronization between Supabase and Qwen 2.5. - Ensure the connection supports both directions for data flow. - Initial focus on user data, with potential to expand to course content and usage statistics. Ideal Skills and Experience: - Proficiency in Supabase and Qwen 2.5. - Strong understanding of data synchronization techniques. - Experience with real-time data updates and authentication integration. - Ability to document the setup process for future reference. Please provide relevant experience in your bids.
I have an ODB database sourced from the Statistics Canada page ( ) that lists every non-residential building across Canada. I need the entire dataset— every province, every building— exported into a single, clean .xlsx workbook. In addition, I will need to have every residential low-, medium- and high-rise building in the country. This data will be sourced from the same suite and additional sources which you will be required to research. Low-rise buildings have 4 floors or more and should start there. Again, all buildings are divided into the noted segments by province and then by city. Please place all fields in one master sheet and activate an auto-filter row so I can sort and slice instantly. At minimum each record must include: building name (if present), str...
I need a propensity-modelling software package that plugs directly into our CRM system, website analytics, and sales database, unifying those streams into a clean, continuously updated dataset. On top of that data layer, the build must train, evaluate, and deploy the best-performing predictive models—whether regression, decision-tree, neural-network, or any other technique that proves superior—then surface the results through a lightweight web interface and an API our teams can call in real time. Key deliverables • Automated ETL jobs and data-quality checks for the three sources mentioned above • Modular training pipeline with experiment tracking, lift/ROC reporting, and feature-importance visuals • Scoring service exposed via REST (or GraphQL) endpoints plu...
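A minimal sketch of the "train, evaluate, surface" loop this brief asks for, assuming a unified feature table with a binary converted label; the CRM/analytics/sales feature names are placeholders.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("unified_customers.csv")        # output of the ETL layer (illustrative)
X = df.drop(columns=["customer_id", "converted"])
y = df["converted"]

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)
model = GradientBoostingClassifier().fit(X_tr, y_tr)

# ROC reporting and feature-importance visuals, per the deliverables list.
print("ROC AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
print(pd.Series(model.feature_importances_, index=X.columns)
        .sort_values(ascending=False).head(10))
```

In the full build, the same evaluation step would run for each candidate model family (regression, tree, neural network) so the best performer can be promoted to the scoring service.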
AI Automation for Finance Analytics AI / Machine Learning DO NOT BID IF BIDDING FOR A 40-HOUR WORK WEEK. WE ARE LOOKING FOR A CONSULTANT / BUILDER / TUTOR TO WORK WITH OUR TEAM 3-10 HOURS A WEEK TO BUILD THE SYSTEM JOINTLY. DO NOT BID FOR LONGER THAN THOSE HOURS. DO NOT BID FOR FULL-TIME WORK. DETAILS OF WHAT I NEED HELP WITH I run a real estate private equity and hotel development platform. We want to replace manual analysis and reporting with a practical AI workflow. This is about extracting, comparing, and interpreting data. Excel and PowerPoint remain the source of truth. What we need: - Compare PowerPoint vs Excel and flag mismatches - Explain underwriting models and trace outputs - Compare legal/term sheets vs financial assumptions - Track document versions and changes - Summarize dea...
Our organisation’s data assets sit in multiple platforms and formats, and the absence of a common language is starting to hinder everything from analytics to audit response times. I want to put in place a unified metadata management framework that improves governance across the board, with a special emphasis on meeting strict Compliance requirements. The scope covers information architecture, controlled vocabularies, and an enterprise-wide taxonomy that standardises terminology between systems. The work will culminate in a reusable model that makes stewardship activities—classification, retention, lineage tracking—far easier and more defensible. Key deliverables • A logical and physical data taxonomy covering all major subject areas • Controlled vocabu...
I have several disparate sources holding our inventory information and I want it all pulled together into one clean, well-structured Microsoft SQL Server database file. The job is straightforward in principle: extract every piece of inventory data you can reach, reconcile duplicate SKUs or mismatched item descriptions, then load the final result into a single .sql or .bak file that I can restore through SQL Server Management Studio. Please preserve every existing field that carries operational value—quantities on hand, reorder thresholds, location codes, cost figures, and any audit timestamps—and normalise the schema where it clearly improves reporting speed or data integrity. If a field appears under different names in different systems, map it consistently; if you uncover ga...
AI Systems Engineer — Audit-Grade Investment Research Platform (Indian Equities) We are building an autonomous, institutional-grade “Investment Committee” system for Indian equities (NSE/BSE). This is NOT a trading bot and NOT a dashboard project. It is a backend-first, audit-grade research platform designed to perform forensic-level analysis of company filings, fundamentals, governance, and supply-chain signals. The system must operate with ZERO manual data handling and full traceability. ________________________________________ Core Objective Design and implement a fully automated, multi-agent research platform that: • Discovers, ingests, validates, and parses official exchange filings • Enforces evidence-based decision logic in backend code • Maintains c...
I am about to start a migration with Informatica PowerCenter and need direct support building the ETL processes. The dataset we will be working with is in Oracle and we will extract it from a SQL database; the goal is a clean, controlled and auditable migration to the new schema. What I expect from you • Design and develop mappings, sessions and workflows in PowerCenter covering the entire extraction, transformation and load flow. • Include error handling, quality validations and detailed event logging. • Document each mapping in the repository and deliver a short deployment/rollback manual. We will accept the work when: - All processes execute...
I already have an automation running across Google Forms, Google Sheets, Autocrat, my self-hosted n8n instance, and ZeptoMail. Autocrat is doing its job and dropping a neatly-named PDF into Google Drive every time a form is submitted, and n8n is successfully firing off an email through the ZeptoMail API. Where things fall short is in the last mile: • the freshly-generated PDF never makes it into the outgoing message, and • the body of the email can’t yet greet each recipient by name or include any other personalized text. What I need from you is a tidy update to the n8n workflow (and, if necessary, a tweak to the ZeptoMail API call) so that: 1. The PDF is pulled from Drive and sent as a direct attachment—not just a link. 2. The email body can be dynami...
I have two payroll spreadsheets—one shows what we planned to spend, the other records what we actually paid, both already broken out by department. I need a solution that lets me compare the two quickly, spotlight variances, and surface trends without wrestling with formulas every month. Here’s what I’m after: 1. A refined spreadsheet template (Excel or Google Sheets is fine) that I can drop fresh data into at any time. When I paste or import the budget and actual files, the template should instantly calculate departmental variances, highlight over- or under-spend, and roll everything up into a clean executive summary. 2. A lightweight web app that performs the same comparison online. I want to upload the two files (CSV/XLSX), hit “run,” and see: • ...
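One way to express the core variance calculation that both the template and the web app would share, assuming two files keyed by department with an amount column (names are illustrative):

```python
import pandas as pd

budget = pd.read_csv("budget.csv")   # columns: department, amount (illustrative)
actual = pd.read_csv("actual.csv")   # columns: department, amount (illustrative)

summary = budget.merge(actual, on="department", suffixes=("_budget", "_actual"))
summary["variance"] = summary["amount_actual"] - summary["amount_budget"]
summary["variance_pct"] = 100 * summary["variance"] / summary["amount_budget"]
summary["flag"] = summary["variance"].apply(lambda v: "over" if v > 0 else "under")

# Executive summary: departments ranked by relative over/under-spend.
print(summary.sort_values("variance_pct", ascending=False).to_string(index=False))
```

In the spreadsheet version the same logic would live in formulas against the pasted-in data ranges, so the monthly routine is paste-and-read rather than rebuild.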
I need a Google Cloud / BigQuery specialist to stand up an end-to-end, webhook-driven data ingestion pipeline running in our production environment. When our external form system fires a webhook, your Cloud Function (or equivalent service) should capture the JSON payload, write the untouched record to a raw BigQuery table, then immediately process it. The processing step must • parse any nested JSON, • flatten and clean each answer field, • split the results into two purpose-built reporting tables, and • guarantee idempotency through a hashing technique that blocks duplicates. All components have to be secure, version-controlled, and able to scale with traffic spikes. This is strictly a backend/data-engineering job—no UI work is involved.
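A hedged sketch of only the raw-capture step, assuming the Functions Framework and google-cloud-bigquery; the table ID and payload shape are placeholders. Idempotency comes from hashing the canonicalised payload: duplicate webhook deliveries produce the same hash, so downstream processing can filter or MERGE on it.

```python
import hashlib
import json

import functions_framework
from google.cloud import bigquery

client = bigquery.Client()
RAW_TABLE = "my_project.forms.raw_submissions"   # placeholder table ID

@functions_framework.http
def ingest(request):
    payload = request.get_json(silent=True) or {}
    # Deterministic hash of the payload: identical deliveries collide here,
    # which is what blocks duplicates downstream.
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    row = {"payload": json.dumps(payload), "payload_hash": digest}
    errors = client.insert_rows_json(RAW_TABLE, [row])
    if errors:
        return (f"insert failed: {errors}", 500)
    return ("ok", 200)
```

Flattening the nested JSON and fanning out to the two reporting tables would follow as a separate step keyed on payload_hash.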
I have four spreadsheets in total—three inventory feeds from separate dropshippers plus my own master product list. Your task is to merge these files, align every item by its SKU, part number, or manufacturer part number (whichever matches first), and refresh the quantity on my master list. Here’s exactly what I need done: • Consolidate all four files in Excel. • Use the SKU/part-number fields as the sole matching key. • Whenever the same item appears in more than one feed, set the quantity to the highest figure found. • Keep every column that exists in any of the source sheets so nothing is lost; I want a single, fully populated file back. • Flag anything that doesn’t find a match so I can review quickly. Please deliver the finished w...
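A sketch of the merge rule described above, assuming sku, part_number and mpn columns plus a quantity column (all names illustrative); the multi-key fallback is reduced to a single coalesced key.

```python
import pandas as pd

frames = [pd.read_excel(f) for f in
          ["master.xlsx", "feed1.xlsx", "feed2.xlsx", "feed3.xlsx"]]
combined = pd.concat(frames, ignore_index=True)

# Coalesce the possible match keys into one column: SKU first, then
# part number, then manufacturer part number.
for col in ["sku", "part_number", "mpn"]:
    if col not in combined:
        combined[col] = pd.NA
combined["match_key"] = (
    combined["sku"].fillna(combined["part_number"]).fillna(combined["mpn"])
)

# Highest quantity wins when an item appears in more than one feed;
# 'first' keeps every other column so nothing is lost.
agg = {c: "first" for c in combined.columns if c not in ("match_key", "quantity")}
agg["quantity"] = "max"
merged = combined.groupby("match_key", dropna=False, as_index=False).agg(agg)

# Flag rows whose key never resolved, for manual review.
merged["needs_review"] = merged["match_key"].isna()
merged.to_excel("merged_inventory.xlsx", index=False)
```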
I'm looking for a skilled developer to build a comprehensive customer database platform. The platform is intended to serve multiple purposes including: - Storing contact information - Managing customer interactions - Tracking sales and orders Key features of the database should include: - Ability to add new contacts via manual entry or importing from files - Storing a variety of information such as: - Personal details (name, address, phone) - Customer preferences and history - Contract information - Sales information Ideal skills and experience: - Experience in database design and development - Familiarity with data import functionalities - Strong understanding of data privacy and security - Ability to create user-friendly interfaces for manual data entry Please provide e...
I want to enrich my online shoe store with an AI-powered recommendation engine that studies each shopper’s purchase history and instantly serves up the pairs they are most likely to buy next. The model can draw on three data streams—user account data, my e-commerce platform records, and any third-party customer datasets I supply—to build a unified profile and surface truly personal suggestions. Here is what the finished job looks like from my side: • A trained model (Python preferred, TensorFlow or PyTorch are both fine) that ingests the above data sources, updates itself regularly, and outputs ranked product recommendations in real time. • An API or embeddable snippet I can drop into the product and home pages to display “You might also like” sh...
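As a deliberately small illustration of the underlying idea, an item-to-item co-occurrence recommender over an orders table of (user_id, product_id) pairs; a production build would use a proper framework (TensorFlow or PyTorch, as the brief allows) and the richer unified profile.

```python
import pandas as pd

orders = pd.read_csv("orders.csv")   # columns: user_id, product_id (illustrative)

# Count how often each pair of products is bought by the same user.
pairs = orders.merge(orders, on="user_id")
pairs = pairs[pairs["product_id_x"] != pairs["product_id_y"]]
cooccur = (pairs.groupby(["product_id_x", "product_id_y"])
                .size().rename("score").reset_index())

def recommend(product_id: str, k: int = 5) -> list[str]:
    """Top-k products most often co-purchased with the given one."""
    rows = cooccur[cooccur["product_id_x"] == product_id]
    return rows.nlargest(k, "score")["product_id_y"].tolist()

print(recommend("SKU-123"))   # hypothetical product ID
```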
We want to do this in a consulting / facilitator / builder format in which we work with the facilitator / consultant / trainer for 3-6 hours a week for 3-6 months to help us collaboratively create various agents for our private equity business. The only billed time will be the time spent on the video call with our team, unless specifically approved otherwise. We want to be able to create a screen-scrape tool to average certain cost items of specific real estate projects. We also want to compare legal documents vs term sheets and Excel spreadsheets. Data sources • Company databases (SQL, flat files, Excel exports) - all our files are in Dropbox • Extensive web scraping for competitor benchmarks and investment-market signals If you have ideas for safely add...
I need help to tabulate data from 1-5 Excel spreadsheets. The data is to be used for customer productivity and incentive calculation. Key tasks include: - Tabulating sales figures - Tabulating performance metrics Ideal skills and experience: - Proficiency in Excel - Experience with data tabulation and automation - Attention to detail
Looking for an experienced OT / Industrial IIoT freelancer to develop a small on-premise technical demo (PoC) related to an automotive Paint Shop (Pre-Treatment & E-Coat). The goal is to demonstrate a realistic Industry 4.0 data flow, fully on-premise, without modifying PLC control logic, and aligned with industrial cybersecurity best practices.
AIRDIT is looking for an experienced Databricks Architect / Expert to support our team on a part-time basis (5–6 hours). Key Responsibilities • Set up and configure the Databricks environment • Guide and mentor the technical team • Assist with Databricks sizing and architecture planning • Recommend best practices for performance, scalability, and cost optimization Experience Level Architect / Expert (hands-on Databricks experience required) Engagement Model Part-time
Freelancer Job Brief: AI-Driven Product Builder (Design, Development & Systems Thinking) No Placeholder bids We are building a first-of-its-kind storytelling and book-creation platform that combines AI interviewing, intelligent content structuring, and high-quality design output. This is not a simple app or a single product. It is a scalable system designed to generate multiple types of books and digital outputs from human experiences, using AI as the engine. We are looking for a high-calibre developer / designer team capable of thinking in systems, not just screens. High-Level Product Overview The platform captures human stories through guided interaction, enriches them using existing profile data, and transforms them into professionally designed outputs (digital, PDF, and physic...
Samsung phone has data problems. Can you help? The telco has not helped much. You will need to work with me over the phone in English.
We are looking for a skilled Power Automate Desktop RPA developer to implement a complex automation for a Windows Terminal Server environment. The target system is a legacy desktop ERP (TriData) with accessible UI Automation elements (verified via Accessibility Insights). Key tasks: • Launch and control a desktop ERP application • Navigate customer orders • Apply filters and sorting • Execute price assignment and updates • Handle confirmation dialogs • Iterate through order lines • Evaluate supplier availability and pricing • Write results to log files • Prepare (but not submit) purchase orders Environment: • Windows Server 2019 (Terminal Server) • RDP-based execution • User session remains logged in Deliverables: ...
Help wanted: daily/multi-daily comparison of supplier prices and stock levels (B2B webshop) Text: We operate a B2B webshop where business customers can place orders or commission items on request. Most of the goods are sourced directly from manufacturers. For most suppliers we have access to their stock levels and current prices; for some, no login is required, while others require login credentials. We are looking for a solution or a skilled professional who can help us retrieve supplier prices and stock levels daily — ideally multiple times per day — and compare them with our internal purchase prices so we stay up to date. No automatic syncing with our system or automatic price changes are required. It is sufficient if discrepancies between supplier prices and our system pu...
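A sketch of the daily comparison step only, assuming the supplier feeds are already downloaded to CSV with sku, price and stock columns (login-protected retrieval is a separate concern; all names are illustrative):

```python
import pandas as pd

ours = pd.read_csv("internal_prices.csv")    # columns: sku, purchase_price
supplier = pd.read_csv("supplier_feed.csv")  # columns: sku, price, stock

report = ours.merge(supplier, on="sku", how="left", indicator=True)
report["price_diff"] = report["price"] - report["purchase_price"]

# Flag SKUs the supplier no longer lists, plus any price discrepancy.
discrepancies = report[
    (report["_merge"] != "both") | (report["price_diff"].abs() > 0.01)
]
discrepancies.to_csv("discrepancy_report.csv", index=False)
print(f"{len(discrepancies)} SKUs need attention")
```

Since no automatic syncing is wanted, a scheduled run that emails or stores this report is all the automation the brief requires.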
Project Overview Demo of project we would like to use as example provided below. We are seeking an experienced developer to build a low-latency, real-time stock screener using the QuoteMedia API. The screener will monitor all actively traded U.S. stocks and display key metrics with updates occurring every second. The ideal candidate should have prior experience building financial data tools, stock scanners, or trading platforms and be comfortable working with high-frequency data updates. Core Requirements Data Source QuoteMedia API will be used as the primary data provider. The system must pull and maintain a live list of all U.S. equities. Required Metrics For each stock, the screener must calculate and display: Current price Volume Relative volume (daily) Relative volume (5-m...
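For the relative-volume metric, here is the calculation as such screeners usually define it: today's volume divided by the average volume over a lookback window. QuoteMedia's actual API calls are not reproduced here; the inputs are stand-ins.

```python
import statistics

def relative_volume(today_volume: int, past_volumes: list[int]) -> float:
    """Today's volume vs. the mean volume of the lookback window."""
    avg = statistics.mean(past_volumes)
    return today_volume / avg if avg else 0.0

# Example: 5-day lookback (illustrative numbers)
print(relative_volume(1_200_000, [800_000, 950_000, 700_000, 900_000, 850_000]))
```

The per-second update requirement means this arithmetic runs over a streaming snapshot of all tickers, so the real engineering effort is in the data plumbing rather than the formula.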
I already run my content calendar across several social and blog channels and now want the entire publishing chain handled automatically through Make (formerly Integromat). Together we’ll map the exact workflow, but at a high level I need a scenario that can: • Pull fresh copy, images, and short clips generated by ChatGPT, Google AI, or IBM Watson—choosing the best tool per step. • Re-format that content on the fly (proper aspect ratios, text length, metadata, alt tags). • Publish simultaneously to the social networks and blogging platform accounts I’ll share with you. • Return clear logs plus error notifications in Slack or email so nothing slips through the cracks. What I’m missing is the Make scenario(s) and any helper scripts that g...
I run a growing remote insurance sales agency that operates entirely inside GoHighLevel. We onboard new agents weekly, and I am moving all technical setup and troubleshooting away from myself and into a dedicated support role. This is not a one-time project. This is an ongoing position where you become the primary technical contact for agents during onboarding and daily usage of the platform. Agents will message you directly when they need help inside the CRM. Your job is to guide them, troubleshoot issues, and make sure they are operational. You are NOT responsible for sales, coaching, or insurance — only platform support. What You’ll Help Agents With • CRM login issues • Connecting Google calendar & email • LeadConnector mobile app setup • Phone...
I have a Siemens IOT2040 already running Node-RED and two flow-meters that expose their readings over Modbus registers. I need a clean flow that polls those two devices, keeps the register map exactly as it is, and republishes the same data over Modbus TCP so a Crimson-based PLC on the other side can read it without extra parsing or scaling. There is no data pre-processing involved: whatever the meters expose is what the Crimson PLC must see. All I’m after is a reliable bridge with tidy, documented Node-RED nodes and comments so future edits are easy. Deliverables and Remote Work: 1- I will provide access to Node-RED and the computer connected to the IOT2040 - work can be done over a remote connection, or on your workstation and then imported here, 2- There will be UIs on Node-RED to see wh...
We use "insync" as connector between Sage100 and Magento, need an expert to make bidirectional data flowing more seamlessly. If you are not familiar with Sage ERP, this is not for you.
Looking for help to create and set up an account in Power BI Pro.
I need every bit of information currently stored in my Tally company—masters, vouchers, inventory, bank transactions, statutory ledgers, the lot—pulled out once and delivered in a clean, tabular Excel workbook. The extraction must be fully automated (TDL, ODBC, or any method you’re comfortable with) so I can rerun it later, but this engagement covers a single execution and hand-over. Deliverables • An Excel file where each dataset appears as a properly labeled table, with field names matching Tally, dates and numbers intact, ready for analysis or import elsewhere. We will provide Tally data file. Let me know which approach you prefer (TDL, ODBC, etc.) and how quickly you can turn the finished workbook around. Please also advise your working days and hours.
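A hedged sketch of the ODBC route, assuming Tally's ODBC server is enabled; the DSN name, collection and field names below are illustrative and vary by Tally version and configuration.

```python
import pandas as pd
import pyodbc

# Placeholder DSN: whatever name the Tally ODBC driver registers locally.
conn = pyodbc.connect("DSN=TallyODBC64_9000")

# Tally exposes collections queryable via SQL-like syntax; Ledger and the
# $-prefixed fields here are examples, not a complete extraction.
ledgers = pd.read_sql("SELECT $Name, $Parent FROM Ledger", conn)

with pd.ExcelWriter("tally_export.xlsx") as writer:
    ledgers.to_excel(writer, sheet_name="Ledgers", index=False)
```

Each dataset (masters, vouchers, inventory, and so on) would land on its own labeled sheet the same way, which keeps the extraction rerunnable as the brief asks.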
I have an Excel workbook that stores my customer, product and order information. I want the data exposed through a clean GraphQL endpoint running on Mule 4 so that any client can query or update it without touching the raw spreadsheets. The job is centred on two things I specifically need: data transformation and data integration. First, the data sitting in the Excel sheets must be converted—via DataWeave—into tidy JSON objects that match the GraphQL schema. Second, that transformed output has to be integrated into a GraphQL service inside Anypoint, with resolvers wired up and ready to run. What I expect from you • A Mule 4 project (exportable from Anypoint Studio) that reads the Excel file, performs the DataWeave transformation and exposes the GraphQL endpoint. •...
Need an existing database of European countries which you scraped earlier.
We are looking for a contract Data Engineer with hands-on experience in Microsoft Fabric to support pipeline development and data movement activities. You will help me: 1) Develop and manage data pipelines using Microsoft Fabric 2) Perform data loading and data migration activities 3) Ensure reliable and efficient data flow across systems Required Skills: 1) Strong experience with Microsoft Fabric 2) Proven experience in pipeline development 3) Hands-on exposure to data loading and data migration projects
I have an Excel file in my local disk that stores user information in eight columns, and I need a repeatable Power Automate flow that moves this data straight into a Dataverse table. The goal is to eliminate manual data entry completely. Here’s what I already have: • A structured Excel sheet stored in my local disk. • A Dataverse environment with the destination table created and the columns mapped 1-to-1 with the spreadsheet. What I need from you: • Build and configure a Power Automate flow that takes the file, reads every row, and inserts or updates the matching records in Dataverse. • Make sure errors—such as duplicate rows or missing required fields—are handled gracefully with clear email or Teams notifications. • Provide a short walkt...
My service company relies on SupplyPro for job intake and Housecall Pro for field operations. To eliminate double-entry, I need a lightweight interface that pulls new or updated Job details and Scheduling data from SupplyPro every day and posts them into Housecall Pro through their public API. Scope • Build or configure a script, micro-service, or low-maintenance middleware that can be scheduled to run daily. • Map SupplyPro fields to the corresponding objects in Housecall Pro—job name, description, appointment date/time, technician notes, and any other essentials we identify together. • Handle basic data-format compatibility so nothing breaks if SupplyPro formats dates, phone numbers, or addresses differently. • Provide clear, commented source code plus ...
I need a Google Sheet that automatically pulls historical data for any Indian stock I type in, lets me set a custom date range, and then shows: • Raw daily figures for any date range for traded volume, delivery volume, and the calculated delivery % • An auto-updating chart that visualises that percentage across the same period The flow I have in mind is simple: I enter a ticker (NSE symbol) and choose From / To dates; the sheet (perhaps through IMPORTHTML, IMPORTXML, an API, or Apps Script—whatever is most reliable) grabs the numbers, fills the table, and refreshes the line/column chart in one click. I should be able to repeat this on additional tabs for other symbols without rewriting code. Please build the sheet, wire up the data-fetch logic, create the delivery % fo...
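The delivery % calculation itself is simple; shown here in Python for clarity, though in the sheet it would live in formulas or Apps Script. The column names and date range are assumptions based on the brief.

```python
import pandas as pd

df = pd.read_csv("nse_history.csv", parse_dates=["date"])   # illustrative input

# Custom From / To range, as the sheet's two date cells would supply.
mask = (df["date"] >= "2024-01-01") & (df["date"] <= "2024-03-31")
out = df.loc[mask, ["date", "traded_volume", "delivery_volume"]].copy()
out["delivery_pct"] = 100 * out["delivery_volume"] / out["traded_volume"]
print(out.round(2).to_string(index=False))
```

The chart then just plots delivery_pct against date over the same filtered range, refreshing whenever the ticker or dates change.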
I have two Excel report sheets and need help merging and comparing data between them. The final output should be a clean, automated dashboard that presents the results clearly.
I’m looking for a reliable way to mirror my on-premise SQL database to AWS RDS in real time. The web service you build will capture every change that hits the local tables—customer information, inventory data, and transaction records—then push those changes up to the cloud with minimal latency. My priorities are: • Zero data loss and consistent state between the two stores, even when connectivity drops temporarily. • Latency low enough to feel immediate to downstream apps that rely on the cloud copy. • A straightforward deployment path: I should be able to install the service on a Windows or Linux host beside the local database, point it at the RDS instance, and watch it run. Deliverables 1. Source code for the sync service (language and framewor...
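One common shape for such a service, sketched with a polling loop over a SQL Server rowversion column for change capture; a true zero-loss build would add durable checkpoints and transactional writes. DSNs, table and column names are placeholders.

```python
import time
import pyodbc

LOCAL = pyodbc.connect("DSN=LocalSQL")   # placeholder DSNs for the two stores
CLOUD = pyodbc.connect("DSN=AwsRds")
last_version = b"\x00" * 8               # persist this across restarts for zero loss

while True:
    # rowversion increases monotonically on every change, so anything above
    # the checkpoint is new or updated since the last pass.
    rows = LOCAL.execute(
        "SELECT id, name, row_ver FROM customers "
        "WHERE row_ver > ? ORDER BY row_ver",
        last_version,
    ).fetchall()
    for r in rows:
        # Naive upsert: replace the cloud row with the latest local state.
        CLOUD.execute("DELETE FROM customers WHERE id = ?", r.id)
        CLOUD.execute("INSERT INTO customers (id, name) VALUES (?, ?)", r.id, r.name)
        last_version = r.row_ver
    CLOUD.commit()
    time.sleep(1)                        # near-real-time polling interval
```

Because the checkpoint only advances after a successful push, a dropped connection simply replays the missed changes on reconnect.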
I want to replace several manual reporting routines with an end-to-end AI workflow that ingests data from our internal finance databases and live web sources, then produces clear, timely analytics for management. Reporting and analytics are the sole focus—no transaction execution—so the system must excel at pulling, cleaning, and interpreting numbers rather than booking them. We also want to compare legal documents vs term sheets and Excel spreadsheets. Data sources • Company databases (SQL, flat files, Excel exports) - all our files are in Dropbox • Extensive web scraping for competitor benchmarks and investment-market signals If you have ideas for safely adding external financial APIs later, let me know, but the two feeds above are mandatory. - Th...
I’m rolling out Joblogic across a multi-site facility management operation and need an experienced hand to make the transition smooth from day one. You should already know the ins-and-outs of Joblogic, be comfortable steering several moving parts at once, and communicate clearly with everyone from senior leadership to on-site technicians. Scope of work • Set up and configure our new Joblogic environment so it reflects existing workflows, asset hierarchies, and mobile field requirements. • Migrate historical data and integrate current data feeds without interrupting day-to-day service. • Deliver role-based training (remote and/or on-site) and concise support documentation so the team can hit the ground running. Acceptance criteria – Live Joblogic instanc...
I need an experienced SQL professional who can take customer data sitting on a remote SQL Server and refresh the corresponding tables on our in-house server every day. The transfer must be reliable, consistent, and handled through SQL Server Management Studio (SSMS), as that is already part of our workflow and permissions model. Your job will be to analyse the source schema, map it cleanly to our destination tables, set up the daily pull / push routine, and keep concise logs so I can trace each run at a glance. Data consistency checks, graceful error handling, and verification reports are essential because the customer records drive several downstream apps on our side. The engagement is straightforward: configure the process, prove a few consecutive successful runs, document the step...
I have several Excel/CSV files filled entirely with numerical figures that I need migrated into a single Google Sheets workbook. Before the transfer, the data must be cleaned and reshaped—column headings reorganised, date and currency fields standardised, and redundant rows removed—so it lands in Sheets ready for instant analysis. You’ll work directly in Google Sheets, but you may need to manipulate the source files first in Excel (or another tool you prefer) to achieve the required structure. Accuracy is essential; formulas and totals in the final sheet should match the originals after reformatting. Deliverable: • A Google Sheets file containing all numbers from the source files, consistently formatted and fully validated. I’ll share the files and a brief m...
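A sketch of the clean-and-reshape pass, assuming the usual offenders (inconsistent headers, currency strings, duplicate rows); column names are illustrative. The cleaned frame can then be pushed or pasted into Google Sheets.

```python
import pandas as pd

df = pd.read_csv("source.csv")   # one of the supplied files (illustrative)

# Normalise headers: trimmed, lowercase, underscores instead of spaces.
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

# Standardise date and currency fields, then drop redundant rows.
df["date"] = pd.to_datetime(df["date"], errors="coerce")
df["amount"] = (df["amount"].astype(str)
                .str.replace(r"[^0-9.\-]", "", regex=True)
                .astype(float))
df = df.drop_duplicates().dropna(subset=["date"])

# Validation hook: totals should match the originals after reformatting.
print("row count:", len(df), "| amount total:", df["amount"].sum())
```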
Title: Model-Driven Power App - Book Order Management Budget: $200 - $250 USD (Fixed Price) Description: I am looking for a developer with experience in Power Apps (model-driven) and Dataverse to build a simple book-order management application. What do I need? I run a book sales and distribution business. Until now I have managed everything in an Excel file, but I have outgrown it. I need a Power App (preferably model-driven) that lets me keep a register of my customers and their orders, prepare purchase orders for my suppliers, and record the shipments I make to customers (shipping log). I will share my Excel file so you can see the data I currently handle...
Job Description: AI Implementation Specialist & Python Developer (Freelance) Project Context SoldiExpert SCF, an independent financial advisory firm, is developing a proprietary Generative AI system to optimise its business processes. The goal is to integrate Google's LLM models into the company infrastructure to manage complex databases and automate specialist tasks. Professional Profile We are looking for a freelance consultant with expertise in Artificial Intelligence and Python development to support the implementation of advanced Generative AI solutions within an existing company infrastructure. The ideal candidate has solid experience in configuring...
I have an ODB database sourced from the Statistics Canada page ( ) that lists every non-residential building across Canada. I need the entire dataset— every province, every building— exported into a single, clean .xlsx workbook. Please place all fields in one master sheet and activate an auto-filter row so I can sort and slice instantly. At minimum each record must include: building name (if present), street address, city, province, latitude, longitude, number of floors, and every other column that appears in the original ODB file. Both the full address string and the explicit lat/long coordinates have to be preserved so I can map or geocode later. The ODB occasionally stores lookup tables in separate relations; bring those values across so the sheet is human-readable—...
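A sketch of the export step with the requested auto-filter row, assuming the ODB data and its lookup relations have already been joined into one flat frame; file names are placeholders.

```python
import pandas as pd

# One row per building, lookup values already merged in (illustrative file).
buildings = pd.read_csv("odb_joined.csv")

with pd.ExcelWriter("odb_master.xlsx", engine="xlsxwriter") as writer:
    buildings.to_excel(writer, sheet_name="Master", index=False)
    ws = writer.sheets["Master"]
    # Auto-filter across the full header row so every column sorts and slices.
    ws.autofilter(0, 0, len(buildings), len(buildings.columns) - 1)
```

Joining the lookup tables before export is what keeps the sheet human-readable, per the brief's note about the ODB's separate relations.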
I’m wrapping up a new AI-driven automation service at TechMarX Pvt Ltd, and all core workflows must be live and demo-ready before 15 February. The single most critical piece is ERP workflow automation, but the full roll-out also covers lead research, LinkedIn outreach sequences, enrichment, and internal LinkedIn resource handling. You’ll step straight into an organised environment: tasks are already broken down, our interns are on hand for data collection or manual checks, and I’ll be available to unblock you quickly. All work is reviewed in a short demo first; once the solution is proven between 10:00 AM and 12:00 PM on or before the deadline, payment follows immediately. Deliverables • ERP workflow automation – stable, documented, and easily extendab...
I need a pathology expert to evaluate digital histopathological images. The primary focus is on identifying tumors or cancerous cells. Key Tasks: - Expert Evaluation: Assess digital histological specimens to identify biologically and clinically relevant structures, particularly tumors. - Image Annotation: Create expert annotations of selected structures using appropriate digital tools. - Integration Support: Provide consultancy on linking histological images with other data layers (spatial, molecular, or multi-omic data) and biological interpretation. Ideal skills and experience: - Expertise in pathology with a focus on histopathology - Proficiency in digital image analysis and annotation - Experience in data integration and biological interpretation Preferred Qualifications: - Certif...