
Postgres Expert | Full Stack

$15-25 USD / hour

Closed
Posted about 5 years ago


Project Canary is committed to helping the oil and gas industry reduce methane/VOC emissions. We are an IoT company that deploys real-time sensors to continuously analyze air quality at remote sites. We are looking for a talented, committed developer to help integrate a Postgres data backend and implement a web UI for user administration and IoT metadata.

Question: I have telemetry data flowing into a table called "events". New payloads arrive every few seconds. Say there are 5000 devices reporting. Now I want a table that shows ONLY the latest event from each device. How would you populate the data in this table?
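For context, the standard query-side answer to the question above uses PostgreSQL's `SELECT DISTINCT ON (device_id) ... ORDER BY device_id, ts DESC`, or the portable window-function form sketched below. The table and column names (`events`, `device_id`, `ts`, `payload`) are assumptions, and the sketch runs on SQLite for self-containment:

```python
import sqlite3

# Minimal sketch of the "latest event per device" query the posting asks about.
# Table/column names are assumptions; Postgres also offers the more concise
# SELECT DISTINCT ON (device_id) ... ORDER BY device_id, ts DESC.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (device_id TEXT, ts INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("dev1", 1, "a"), ("dev1", 2, "b"), ("dev2", 1, "c")],
)

# Rank each device's events newest-first, then keep only rank 1.
latest = conn.execute("""
    SELECT device_id, ts, payload FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY device_id ORDER BY ts DESC
        ) AS rn
        FROM events
    ) WHERE rn = 1
""").fetchall()
print(latest)  # one row per device, each with its newest ts
```

The bids below propose write-side alternatives (upserts, triggers, separate latest-row tables) that avoid re-running this scan on every read.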
Project ID: 19222410

About the project

22 proposals
Remote project
Active 5 years ago

22 freelancers are bidding an average of $23 USD/hour for this job
Can we discuss this further so I can get a detailed understanding of the project? I have some technical questions, so let me know when you have time to talk and clear up any doubts. As a Postgres developer I have the skills and experience your project needs, and I can share a demo in further chat. You can also check my profile page: I have a repeat-hire ratio of more than 33%, so I tend to work with clients on a long-term basis.
$20 USD in 40 days
5.0 (87 reviews)
8.6
Hello, how are you? I am a senior full-stack developer with extensive experience building websites that manage big data using AWS Elasticsearch and Google BigQuery, and I am very familiar with PostgreSQL. As for your question, the answer depends on your table structure; I would be happy to walk through it via call or chat. Please check my profile to help you decide whether to work with me. I can bring you the best result thanks to my skills and experience. Looking forward to hearing from you. Thanks and regards, Peng
$25 USD in 40 days
4.9 (42 reviews)
7.5
Dear client, nice to meet you. I have read your request and I am interested in your project. I am a full-stack DB manager with over 5 years of experience, including plenty of work with PostgreSQL and Elasticsearch. I think we can discuss the details further in chat. I look forward to your reply. Best regards, Yeoo.
$22 USD in 40 days
4.8 (20 reviews)
6.7
Hi there, I can do this quickly and effectively. I have more than 9 years of web development experience. Looking forward to working with you! Thanks!
$22 USD in 40 days
4.8 (73 reviews)
6.6
Hi, in answer to your question: to show the latest data, you state there are 5000 devices, and I am assuming you only want to see the latest result for each device. Assuming each device has a name or id, you just need a separate table where the primary key is the device name or id, which then links to the latest data.

As for updating this data: you say payloads arrive every few seconds. The issue is that in a multi-threaded environment you might process the payloads in the wrong order. There are several possible solutions; the primary one is to process the payloads one at a time through a queue. Since there are so many devices, we would use multiple queues, with different queues handling different devices. I'm also assuming here that each event carries a timestamp.

About me: I'm a senior Python programmer with 12+ years of experience, mostly with web apps, and specifically with business-to-business apps and analysis rather than customer-facing systems. Thanks, Marc Nealer
$27 USD in 40 days
4.9 (14 reviews)
6.0
Hello, how are you? I'm very interested in your job. I have extensive experience with PostgreSQL and MySQL. Please tell me more about your devices. I'm ready to work; let's discuss via chat. Kind regards
$20 USD in 40 days
4.7 (12 reviews)
5.9
I would add a unique constraint to the device id field and use PostgreSQL's upsert feature, so that when a new event is inserted, it only updates the existing row for that device. Please send me a message so we can discuss your requirements in detail. I'm available online from 10pm-1am GMT+8.
$17 USD in 20 days
4.8 (5 reviews)
5.5
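The upsert approach described above can be sketched as below: a unique `device_id` plus `INSERT ... ON CONFLICT DO UPDATE`, so the table always holds exactly one (the newest) row per device. The sketch runs on SQLite, whose upsert syntax mirrors PostgreSQL's; table and column names are assumptions, and the `WHERE` guard (an addition not in the bid) discards out-of-order payloads:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE latest_events (
        device_id TEXT PRIMARY KEY,
        ts        INTEGER,
        payload   TEXT
    )
""")

def record_event(device_id, ts, payload):
    # Insert a fresh row, or overwrite the device's existing row, but only
    # when the incoming payload is newer than what is already stored.
    conn.execute("""
        INSERT INTO latest_events (device_id, ts, payload)
        VALUES (?, ?, ?)
        ON CONFLICT(device_id) DO UPDATE SET
            ts = excluded.ts,
            payload = excluded.payload
        WHERE excluded.ts > latest_events.ts
    """, (device_id, ts, payload))

record_event("dev1", 1, "a")
record_event("dev1", 2, "b")      # newer: overwrites
record_event("dev1", 1, "stale")  # older: ignored by the WHERE guard
rows = conn.execute("SELECT * FROM latest_events").fetchall()
print(rows)  # [('dev1', 2, 'b')]
```

With this pattern, reads are a plain `SELECT` over at most one row per device; no grouping or self-join is ever needed.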
I have read your job description carefully. Higher quality and faster delivery are promised; your job seems tailor-made for an expert like me. I have worked with Postgres for 6+ years, so I know it as well as I know myself. I have done this kind of work many times, so I know how to complete this job and make you fully happy with my work. I can start right now. I hope to work with you. Thanks.
$25 USD in 40 days
5.0 (11 reviews)
5.0
DESCRIPTION READ AND QUESTION ANSWERED BELOW | SQL EXPERT WITH 8+ YEARS

Hi there! Your Project Canary sounds incredibly exciting! I've always wanted to work directly with IoT startups - I didn't think I'd get the chance to do so with SUCH a cool company. Compliments aside, let me prove my competence by answering your question: I would write the data to a separate table (since I believe we should keep all the data we receive), say "distinctDeviceEvents", using the device ID as the primary key. If a key matches, the old data is overwritten by the new data with the current server timestamp, so only the newest data is stored. We could apply this logic to the "events" table itself - but only if you don't want to store historical data.

If the above doesn't give you enough insight into my skills, here's why you should choose me:
1) 8+ years of experience as a web/mobile/desktop/cross-platform developer (meaning I can solve problems faster than a generic developer in a very specialized field)
2) 3+ years of experience in chatbot development (before drag-and-drop chatbots were a thing, so rest assured I can design features that are not generally found)
3) Clients include big banks and small businesses (meaning I will ALWAYS understand and put YOUR needs first, and a communication gap will NEVER be an issue)
4) I ALWAYS include a FREE sample of my work - so YOU know what you're getting into when working with me
$22 USD in 40 days
5.0 (1 review)
4.5
Hello sir, I have read your requirements and understand that you are looking for an experienced Postgres Expert | Full Stack. I have sent your question and answer via personal chat. I am pleased to inform you that we have an experienced developer team in Elasticsearch, MySQL, Node.js, NoSQL (Couch & Mongo), and Python. Can you please share your best time for a chat so we can discuss further and move ahead? Awaiting your response. Best regards, Kajal Rajput
$15 USD in 40 days
5.0 (1 review)
3.7
I would use a one-to-many relation between two tables, one containing information for each sensor and the related table containing the timestamped data for each reading joined on the sensor id. When each new reading arrives, insert it in the readings table and update the sensor table with the id of the latest reading. This could be accomplished by either performing the insert and update using a stored procedure, creating a trigger on the readings table to perform the update, or simply using separate insert and update queries in the application. This approach avoids the expense of performing a 'select latest' query which would involve self-joining the readings table on a group-by subquery to determine the most recent reading. With 5000 sensors streaming data, the query approach would be intolerably slow and resource intensive.
$22 USD in 40 days
5.0 (5 reviews)
2.8
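The trigger variant from the proposal above can be sketched as below: each insert into the readings table fires a trigger that stamps the sensor row with the id of its latest reading, so reads never need the expensive group-by self-join. The sketch runs on SQLite (PostgreSQL would use a trigger function plus `CREATE TRIGGER`), and all names (`sensors`, `readings`, `latest_reading_id`) are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sensors (
        id INTEGER PRIMARY KEY,
        latest_reading_id INTEGER
    );
    CREATE TABLE readings (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        sensor_id INTEGER REFERENCES sensors(id),
        ts INTEGER,
        value REAL
    );
    -- After every insert, point the sensor at its newest reading.
    CREATE TRIGGER track_latest AFTER INSERT ON readings
    BEGIN
        UPDATE sensors SET latest_reading_id = NEW.id
        WHERE id = NEW.sensor_id;
    END;
    INSERT INTO sensors (id) VALUES (1);
""")
conn.execute("INSERT INTO readings (sensor_id, ts, value) VALUES (1, 10, 0.5)")
conn.execute("INSERT INTO readings (sensor_id, ts, value) VALUES (1, 11, 0.7)")

# Reading the latest value is now a cheap primary-key join, not a scan.
row = conn.execute("""
    SELECT r.ts, r.value FROM sensors s
    JOIN readings r ON r.id = s.latest_reading_id
    WHERE s.id = 1
""").fetchone()
print(row)  # (11, 0.7)
```

Unlike the upsert approach, this keeps the full reading history while still giving O(1) access to the latest row per sensor.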
Can we discuss this project further so we can plan accordingly and start working on it? We can use Postgres hosted on AWS as the backing storage for your data. Thanks
$15 USD in 30 days
0.0 (0 reviews)
0.0
I've worked on IoT and link-click data streams before and can easily build a solution that provides an aggregated view for your BI tool and/or application.
$55 USD in 40 days
0.0 (0 reviews)
0.0
I have built IoT solutions using Golang, Kafka, gRPC, Node.js, PostgreSQL, and MySQL. Logs are normally inserted with a GUID primary key, so we need two tables: both have the exact same structure, but the device-log table is keyed by device. When a new payload arrives, insert the new data into the log table while updating the corresponding row in the device-log table. This matters especially for PostgreSQL. For better performance, the log table PK can be a bigserial, since with 5000 devices reporting every second it would take roughly 58 million years to exhaust. Finally, the latest device event can be queried from the device-log table, which holds only 5000 rows instead of millions. To speed things up further, we sometimes need an in-memory cache for those 5000 rows, since pulling data from the database while inserting and updating slows everything down. Looking forward to working for you. Thanks.
$25 USD in 40 days
0.0 (0 reviews)
0.0
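The in-memory cache idea from the proposal above can be sketched as below: keep the ~5000 latest-per-device rows in a plain dict so reads never hit the database, writing through to the durable table only when a payload is genuinely newer. All names are illustrative; a real deployment might use Redis or a materialized table instead:

```python
class LatestEventCache:
    """In-memory front for the latest-event-per-device table."""

    def __init__(self):
        self._latest = {}          # device_id -> (ts, payload)

    def ingest(self, device_id, ts, payload):
        cur = self._latest.get(device_id)
        if cur is None or ts > cur[0]:   # drop stale/out-of-order payloads
            self._latest[device_id] = (ts, payload)
            return True                  # caller should also upsert the DB row
        return False

    def latest(self, device_id):
        return self._latest.get(device_id)

cache = LatestEventCache()
cache.ingest("dev1", 1, "a")
cache.ingest("dev1", 3, "c")
cache.ingest("dev1", 2, "b")   # stale: ignored
print(cache.latest("dev1"))    # (3, 'c')
```

The return value of `ingest` lets the caller skip the database write entirely for stale payloads, which is where most of the claimed speedup would come from.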

About the client

Cuernavaca, Mexico
5.0
4
Member since Feb 19, 2019

Client verified
