I wrote some code to check URLs, but it runs really slowly. I want to check several URLs at the same time, for example 10 at once, or at least make it as fast as possible.
I know you will probably say that I only have 10 cores and that limits how much can run in parallel, but there must be a way to make it much faster. I know of tools that check 50 URLs in one second. How do they do it, and how can I do the same?
My code:
Parallel.ForEach(urls, new ParallelOptions { MaxDegreeOfParallelism = 10 }, s =>
{
    try
    {
        using (HttpRequest httpRequest = new HttpRequest())
        {
            httpRequest.UserAgent = "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:52.0) Gecko/20100101 Firefox/52.0";
            httpRequest.Cookies = new CookieDictionary(false);
            httpRequest.ConnectTimeout = 10000;
            httpRequest.ReadWriteTimeout = 10000;
            httpRequest.KeepAlive = true;
            httpRequest.IgnoreProtocolErrors = true;

            // request the URL with a single quote appended and read the response body
            string check = httpRequest.Get(s + "'", null).ToString();

            // keep the URL if the response contains any of the known error strings
            if (errors.Any(check.Contains))
            {
                Valid.Add(s);
                Console.WriteLine(s);
                File.WriteAllLines(Environment.CurrentDirectory + "/Good.txt", Valid);
            }
        }
    }
    catch
    {
        // ignore URLs that time out or throw
    }
});
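For comparison, here is a rough sketch of what I think those faster tools do: async requests with HttpClient and a SemaphoreSlim to cap how many are in flight at once. The class name FastChecker, the method CheckAllAsync, the limit of 50 and the ConcurrentBag are just my own assumptions, not something I have benchmarked.

// Rough sketch only: async HttpClient with a concurrency cap of 50 (my assumption).
// On .NET Framework you may also need to raise ServicePointManager.DefaultConnectionLimit.
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class FastChecker
{
    static readonly HttpClient Client = new HttpClient { Timeout = TimeSpan.FromSeconds(10) };

    static async Task CheckAllAsync(string[] urls, string[] errors)
    {
        var valid = new ConcurrentBag<string>();   // thread-safe collection for hits
        var throttle = new SemaphoreSlim(50);      // at most 50 requests in flight

        var tasks = urls.Select(async url =>
        {
            await throttle.WaitAsync();
            try
            {
                // GetAsync does not throw on non-2xx status codes, so error pages
                // can still be inspected, similar to IgnoreProtocolErrors in my code
                using (HttpResponseMessage resp = await Client.GetAsync(url + "'"))
                {
                    string body = await resp.Content.ReadAsStringAsync();
                    if (errors.Any(body.Contains))
                    {
                        valid.Add(url);
                        Console.WriteLine(url);
                    }
                }
            }
            catch
            {
                // ignore failed requests, same as in my current code
            }
            finally
            {
                throttle.Release();
            }
        }).ToList();

        await Task.WhenAll(tasks);

        // write the results once at the end instead of on every hit
        File.WriteAllLines(Path.Combine(Environment.CurrentDirectory, "Good.txt"), valid);
    }
}

Is this the right direction, or do those tools do something else to get that kind of throughput?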