Name - Generation of crawler/bot/spider (robot) data in a web server log file. Details - External traffic open to the internet is needed; for this purpose, any website's log file can be used. The web server log file should contain crawling data collected over 10 to 13 days from the requests of several web robots. The size of the related access
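As a sketch of what extracting robot traffic from such a log could look like, the snippet below parses lines in the common Apache/Nginx "combined" format and keeps only entries whose user-agent contains a known crawler keyword. The log format, the keyword list, and the sample lines are all assumptions for illustration, not part of the original brief.

```python
import re

# Regex for the Apache/Nginx "combined" log format (an assumption about
# the server's configuration; adjust if the real format differs).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Illustrative substrings that identify well-known crawlers.
BOT_KEYWORDS = ("bot", "crawler", "spider", "slurp", "archiver")

def is_bot(user_agent):
    """Return True if the user-agent string looks like a web robot."""
    ua = user_agent.lower()
    return any(k in ua for k in BOT_KEYWORDS)

def extract_bot_requests(lines):
    """Yield parsed log entries whose user-agent matches a bot keyword."""
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m and is_bot(m.group("agent")):
            yield m.groupdict()

# Two hypothetical log lines: one crawler, one ordinary browser.
sample = [
    '66.249.66.1 - - [10/Oct/2024:13:55:36 +0000] "GET / HTTP/1.1" 200 512 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Oct/2024:13:56:01 +0000] "GET /about HTTP/1.1" 200 1024 '
    '"-" "Mozilla/5.0 (Windows NT 10.0) Firefox/130.0"',
]
bots = list(extract_bot_requests(sample))
print(len(bots))  # only the Googlebot line matches
```

Over a real 10-13 day log, the same filter would run over the file line by line rather than an in-memory list.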
Need an ICE/STUN/TURN server installed on a CentOS 7 server in order...Asterisk. Need someone to check the setup and explain to me how to configure Asterisk and a WebRTC script (like doubango) to work when the client is behind NAT. I have coturn installed but not configured. I see RTP packets, but only in one direction. I have server A with Asterisk and server B with the WebRTC script.
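One-way RTP behind NAT is typically addressed by giving coturn its public address and pointing Asterisk's RTP engine at the STUN/TURN server. The fragment below is a minimal sketch under assumptions: the IP address, realm, hostnames, and credentials are placeholders, not values from the brief, and the relay port range must also be opened in the server's firewall.

```
# /etc/turnserver.conf (coturn) -- placeholder values, adjust to your host
listening-port=3478
external-ip=203.0.113.10        # public IP of the TURN host (assumption)
realm=example.com               # assumption
lt-cred-mech                    # long-term credential mechanism
user=webrtc:strongpassword      # assumption; use real credentials
min-port=49152                  # relay port range; open these in the firewall
max-port=65535
```

On the Asterisk side (server A), the matching ICE settings go in `rtp.conf`:

```
; /etc/asterisk/rtp.conf -- hostnames/credentials are assumptions
[general]
icesupport=yes
stunaddr=turn.example.com:3478
turnaddr=turn.example.com:3478
turnusername=webrtc
turnpassword=strongpassword
```

If media still flows one way after this, the usual suspects are a missing `external-ip` in coturn or the relay UDP range blocked by firewalld.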
We have an existing remotely operated crawler. We need to redesign it to improve its performance and specifications, such as increasing the depth rating, redesigning the diving wheel and belts, and increasing the motor torque.
I need a PHP crawler built. I need a PHP coder with good skills in nested loops. I need this at a LOW budget and for the LONG term.