I have a simple Node.js crawler that runs in a browser and scrapes a particular set of data from a website, the URL being: localhost?url=
This is what I need:
read a txt file full of URLs and pass them as arguments to the program
concurrency (multiple websites crawled in parallel); this should be a parameter I can modify inside the JS file
output the failed URLs to a txt file (appended to the file, not overwriting it)
Hi,
I am Shafayat, an M.E.A.N. stack developer.
I believe I am one of the best candidates for this job.
If you have a minute, I would ask you to send me a message so we can discuss the specifications further :)
Regards
Hi, I can do this for you.
I can start immediately and can finish it within 3 days.
I have a vast amount of experience with Node.js and advanced JavaScript. I can easily do this for you.
I have more than 2 years of experience working with Node.js and have worked on such projects before. I worked for Target Corporation as a software engineer, where I built a front-end performance monitoring tool for them. Given my extensive Node.js experience, I should be able to complete your requirements fairly easily.
Dear Sir/Madam,
I have been working with Node.js in recent weeks, and such a task will come easily to me.
I am looking forward to working with you,
Best regards!
Failures can be written to the text file, provided that your crawler gives some indication of whether each crawl was successful or not. I can complete this in one day's time, and I know exactly what needs to be done here.