Elasticsearch Jobs
Elasticsearch is a powerful search engine built to help even the most complex sites handle incredibly large amounts of data. It allows development teams to slice and dice big data from varied sources, analyze it, and present it meaningfully in an organized manner. Search results can be made more relevant to users based on the query and the available data. A professional Elasticsearch consultant can help bring out the best of your business's existing and new data: analyzing, segmenting and visualizing it to increase search relevance and make it work for you.
Here are some projects that our expert Elasticsearch Professionals made real:
- Developing software to integrate Elasticsearch and MySQL databases
- Creating a REST API for the OpenSearch engine with JavaScript or Python
- Ingesting CSV files through HTTP with PySpark or Pandas
- Automating log file transfer from AWS CloudWatch into Elasticsearch
- Setting up AWS Lambda functions
Whether your team needs assistance visualizing existing data, integrating multiple sources in your cloud, or transferring files from AWS, an expert Elasticsearch Professional can help. Let us help you unlock the power of Big Data using Elasticsearch. Post your project on Freelancer.com today and start exploring what’s possible for your business.
From 20,619 reviews, clients rate our Elasticsearch Professionals 4.86 out of 5 stars.
I am looking for an experienced MongoDB expert to convert my data files to a readable format. The data is less than 1 GB, and there is no specific data processing or transformation required during the conversion. The ideal candidate should have expertise in MongoDB and data conversion, with a proven track record of successful project delivery. The project involves the following: - Converting the files to a readable format - Ensuring data accuracy and integrity - Delivering the project within the specified timeline. If you have the skills and experience required for this project, please submit your proposal with your relevant work samples.
I'm looking for someone who can assist with deleting data from Elasticsearch using Logstash. I have logs from Logstash and a file; I run this on my local machine, but I have AnyDesk and can send files.
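The listing doesn't say whether deletion is driven by a query or by document IDs taken from the files, and it asks for Logstash specifically. As one possibility only, here is a minimal sketch using delete_by_query via the official Python Elasticsearch client; the index name, the matched field, and the connection settings are assumptions, not details from the listing.

```python
# Hedged sketch: remove matching documents from an Elasticsearch index with
# delete_by_query. The index name "app-logs", the "status" field, and the
# connection settings are illustrative assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # add basic_auth/certs for a secured cluster

resp = es.delete_by_query(
    index="app-logs",
    query={"match": {"status": "obsolete"}},  # replace with the real selection criteria
)
print(f"Deleted {resp['deleted']} documents")
```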
I am looking for a talented and experienced developer to assist in building a location-based shopping application centered around a chatbot that helps users find products. The platform will help users discover products available at nearby retailers and provide relevant recommendations based on their preferences. Key responsibilities include: developing an interactive frontend in React, with the primary focus on a chatbot feature for product search and recommendations; creating a search interface for users who prefer direct search functionality; building a robust backend in Python (Django or Flask) that manages chatbot operations, handles user requests, and interacts with retailer data; and collaborating on the design and user experience of the app, ensuring it is intuitive and ...
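As a rough illustration of the direct-search path only, a minimal Flask endpoint might look like the sketch below; the route name, request payload fields, and the in-memory product list are all hypothetical stand-ins for whatever retailer data the poster actually has.

```python
# Hedged sketch: a minimal product-search endpoint for a backend of this shape.
# The route, payload fields and the in-memory catalogue are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

PRODUCTS = [  # stand-in for real retailer data
    {"name": "running shoes", "retailer": "SportCo", "city": "Paris"},
    {"name": "trail shoes", "retailer": "OutdoorHub", "city": "Lyon"},
]

@app.post("/api/search")
def search_products():
    payload = request.get_json(force=True)
    query = payload.get("query", "").lower()
    city = payload.get("city")
    results = [
        p for p in PRODUCTS
        if query in p["name"] and (city is None or p["city"] == city)
    ]
    return jsonify(results)

if __name__ == "__main__":
    app.run(debug=True)
```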
I am looking for a skilled Python developer to build a database update system using RabbitMQ and MongoDB. The purpose of the system is to update our database on demand so that it is always up to date. The objective is to merge candidate application data from three data sources: LinkedIn Jobs, Naukri and Naukri RMS. Each job will have a unique job_id, and all applicants from each data source will be tagged to one unique job_id. 1. A service will capture the data from each data source and upsert it into its own collection, e.g. linkedin_jobs, naukri, naukri_rms. Since this service can run any number of times, the operation will always be an upsert so that duplicates are avoided. 2. Upon completion of the upsert, a collection called merged_data will merge data f...
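As a hedged sketch of the upsert stage only (the collection names follow the listing, but the connection string, the applicant_id key, and the record fields are assumptions), per-source upserts with pymongo keyed on (job_id, applicant_id) keep reruns idempotent:

```python
# Hedged sketch of stage 1: upsert applicants from one data source into its own
# collection, keyed on (job_id, applicant_id) so repeated runs create no duplicates.
# The connection string, applicant_id key and record fields are assumptions.
from pymongo import MongoClient, UpdateOne

client = MongoClient("mongodb://localhost:27017")
db = client["recruiting"]

def upsert_applicants(source_collection: str, records: list[dict]) -> None:
    ops = [
        UpdateOne(
            {"job_id": r["job_id"], "applicant_id": r["applicant_id"]},
            {"$set": r},
            upsert=True,
        )
        for r in records
    ]
    if ops:
        db[source_collection].bulk_write(ops)

# Example: the same batch can be replayed any number of times without duplicates.
upsert_applicants("linkedin_jobs", [
    {"job_id": "J-1001", "applicant_id": "A-42", "name": "Jane Doe"},
])
```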
I want to create a Linux server on Google Cloud with some security patches, and install MySQL, Elasticsearch, Bitcoin Core and Apache.
Our API is set up incorrectly, and the search function on our home page times out almost every time we search.
Good morning, I am trying to put an ELK stack (Elasticsearch, Logstash and Kibana) into practice in a lab. I am following a YouTube channel, "@evermighttech", for training. I'm stuck on the Fleet Server implementation. I need you to solve the problem and explain to me why. I will surely need help on ELK as I progress in my lab. I have a working Elasticsearch + Kibana with a self-signed SSL certificate on Ubuntu 22.04. I will provide TeamViewer or AnyDesk access to a workstation for investigation if needed. Thanks
I am looking for a developer who can integrate OpenAI's text classification functionality with my existing MongoDB database. The ideal candidate should have experience working with both OpenAI and MongoDB. The project requires the ability to process large amounts of data, exceeding 10GB. The scope of the project includes: - Integrating OpenAI's text classification functionality with our existing MongoDB databases - Ensuring optimal performance and scalability of the integration - Implementing data processing and analysis tools to work with the integrated system - Providing documentation and support for the integrated system post-deployment. If you have experience with OpenAI and MongoDB, and are comfortable working with large volumes of data, please apply.
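A minimal sketch of one possible integration is shown below, using the openai and pymongo Python clients: pull unlabeled documents from MongoDB, ask a model for a label, and write it back. The database/collection names, field names, label set, prompt, and model name are all placeholder assumptions, and a real 10 GB workload would need batching and error handling beyond this.

```python
# Hedged sketch: classify MongoDB documents with an OpenAI model and store the label.
# Collection, field names, labels, prompt and model name are illustrative assumptions.
from openai import OpenAI
from pymongo import MongoClient

mongo = MongoClient("mongodb://localhost:27017")
docs = mongo["mydb"]["documents"]
ai = OpenAI()  # reads OPENAI_API_KEY from the environment

LABELS = ["billing", "support", "sales", "other"]

for doc in docs.find({"label": {"$exists": False}}).limit(100):
    resp = ai.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": f"Classify the text into one of: {', '.join(LABELS)}. Reply with the label only."},
            {"role": "user", "content": doc["text"][:4000]},
        ],
    )
    label = resp.choices[0].message.content.strip().lower()
    docs.update_one({"_id": doc["_id"]}, {"$set": {"label": label}})
```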
I already have a script written, but it's up to you whether to use it. Need to: 1. Collect over 100,000 URLs from MongoDB (script already written). 2. Check whether each URL is indexed in Google or not. 3. Check and report the HTTP status of each website. 4. Update MongoDB. Will need proxies to avoid 429 Too Many Requests issues. The tool is designed to tell clients whether their URLs are live and indexed. Urgent deadline. More details: What is the main goal of the script? To monitor changes in Google indexing. Do you have a preferred programming language for the script? Python. What specific data do you want to track for Google indexing changes? Whether the URL exists or not.
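The Google-indexing check depends on whichever approach the poster's existing script takes, so it is not reproduced here. As a hedged sketch of only the HTTP-status and database-update steps, with requests routed through a proxy (the proxy address, database, collection, and field names are assumptions):

```python
# Hedged sketch: check the HTTP status of each stored URL through a proxy and
# record the result. Database/collection/field names and the proxy address are
# illustrative assumptions; the Google-indexing check is not reproduced here.
import requests
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
urls = client["seo"]["urls"]

PROXIES = {"http": "http://user:pass@proxy.example.com:8080",
           "https": "http://user:pass@proxy.example.com:8080"}

for doc in urls.find({}, {"url": 1}):
    try:
        resp = requests.head(doc["url"], proxies=PROXIES, timeout=10,
                             allow_redirects=True)
        status = resp.status_code
    except requests.RequestException:
        status = None  # unreachable
    urls.update_one({"_id": doc["_id"]},
                    {"$set": {"http_status": status, "is_live": status == 200}})
```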
I am looking for a freelancer to set up an LLM search engine for research and development purposes. The preferred search engine is Algolia or Elasticsearch. I have data to be indexed but no existing database or index. The ideal candidate would have experience in setting up search engines and be proficient in Algolia or Elasticsearch.
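As an illustration of the Elasticsearch option only (Algolia would use its own client), indexing a handful of documents and running a query with the Python client might look like the sketch below; the index name, mapping, and sample documents are made up.

```python
# Hedged sketch: create a small index and run a match query with the Python
# Elasticsearch client. Index name, mapping and sample documents are assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

if not es.indices.exists(index="research-docs"):
    es.indices.create(
        index="research-docs",
        mappings={"properties": {"title": {"type": "text"},
                                 "body": {"type": "text"}}},
    )

for i, doc in enumerate([
    {"title": "Paper A", "body": "Retrieval-augmented generation overview."},
    {"title": "Paper B", "body": "Vector search and hybrid ranking."},
]):
    es.index(index="research-docs", id=str(i), document=doc)

es.indices.refresh(index="research-docs")
hits = es.search(index="research-docs", query={"match": {"body": "vector search"}})
print(hits["hits"]["total"]["value"], "matches")
```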
I am looking for a freelancer who can help with my Nutch configuration, as I need both web crawling and data extraction. Specifically, I need to extract text, links, and images from specific websites that I have in mind. Ideal skills for this project include experience with Nutch configuration, web crawling, and data extraction. The freelancer should also have knowledge of website architecture and be able to work efficiently and accurately. Nutch and Solr are already running locally; they just need final configurations. Items to be indexed in Solr cores: 1st core - web crawl on the topic of food/recipes for 100 websites; 2nd core - unlimited docs or your suggestion, but at least 300 patents; 3rd core - site crawler, 50 documents crawled - https://www.^^^^^^^^^.com