Hello, everyone,
I have an application that consumes several web services. It retrieves many users and creates entries in a .csv file for various projects.
I would like to process the users in batches: every 100 or 200 users, the application should wait about 30 seconds and then continue its execution, repeating this until the run is complete.
I want to avoid overloading the database with queries and prevent the traffic from saturating, since I make several web calls.
How can I do that?
Thank you very much in advance.
Here is part of my code:
UserData[] userIds = GetUserID();
if (userIds.Length > 0)
{
    List<string> multiUserIDs = new List<string>();
    foreach (var userdata in userIds)
    {
        multiUserIDs.Add(userdata.List[0].ToString());
    }

    string testName = "test_";
    string outCsvFile = string.Format(@"D:\TasksForAllProjects\{0}.csv", testName + DateTime.Now.ToString("_yyyyMMdd HHmms"));
    var stream = File.CreateText(outCsvFile);
    stream.WriteLine("ProjectName,UserFirstName,UserLastName,TaskStatus");

    var AllProjectIds = proj.ProjectID; // This comes from another call made further up
    string[] projectIDs = new string[] { AllProjectIds }; // all projects
    string singleUserID = "";
    string[] taskStatus = new string[] { "notcompleted" };

    TaskEntry[] result = GetTasks(projectIDs, singleUserID, multiUserIDs.ToArray(), taskStatus);
    foreach (var item in result)
    {
        string newLine = string.Format("\"{0}\",\"{1}\",\"{2}\"",
            item.ProjectTitle,
            item.UserID,
            status);
        // Writing each entry found to the file.
        stream.WriteLine(newLine);
    }
    stream.Close();
}
I think the problem may come from the variable multiUserIDs, which holds all the users, so the execution should run over batches of users taken from that variable.
You have to sleep the thread with Thread.Sleep(); the parameter it receives is in milliseconds.
However, doing this is sometimes not a good idea, and you should ask yourself why you have to sleep the application for so long. If this is something that will be maintained over time, you should consider another design.
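A minimal sketch of that batching, assuming the multiUserIDs list and the GetTasks call from the question (the batch size, the delay, and the commented-out call are illustrative):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading;

class BatchSketch
{
    const int BatchSize = 100;   // process 100 users at a time
    const int DelayMs = 30_000;  // wait about 30 seconds between batches

    static void Run(List<string> multiUserIDs)
    {
        for (int i = 0; i < multiUserIDs.Count; i += BatchSize)
        {
            // Take the next slice of user IDs
            string[] batch = multiUserIDs.Skip(i).Take(BatchSize).ToArray();

            // Call the web service with only this batch (assumed signature):
            // TaskEntry[] result = GetTasks(projectIDs, "", batch, taskStatus);
            // ...write the returned entries to the csv here...

            // Pause before the next batch, except after the last one
            if (i + BatchSize < multiUserIDs.Count)
                Thread.Sleep(DelayMs);
        }
    }
}
```

Each web call then carries at most 100 user IDs, which is what keeps the database from being hit with everything at once.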
If you are generating a csv, do not do it that way; use a library such as
CsvHelper
so you can map the entire data set without going through it record by record.
I understand that you have a collection of objects in the variable
result
and in that case the export is direct. What's more, if you look at the documentation you can customize the mapping and generate the file in a single operation, so you wouldn't need to pause the code at all.
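A minimal sketch with CsvHelper, assuming result is the TaskEntry[] array from the question (the TaskEntry shape below is inferred from the original code and is an assumption):

```csharp
using System.Globalization;
using System.IO;
using CsvHelper;

// Stand-in for the type returned by GetTasks (assumed shape)
public class TaskEntry
{
    public string ProjectTitle { get; set; }
    public string UserID { get; set; }
    public string TaskStatus { get; set; }
}

class CsvExport
{
    static void Write(TaskEntry[] result, string outCsvFile)
    {
        using (var writer = new StreamWriter(outCsvFile))
        using (var csv = new CsvWriter(writer, CultureInfo.InvariantCulture))
        {
            // WriteRecords emits the header row and every record in one call,
            // so there is no manual string.Format per line
            csv.WriteRecords(result);
        }
    }
}
```

The header names can be customized with a ClassMap if they must differ from the property names.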
>> if I execute the call, most of the time I get a timeout or the database freezes
I can't see how you are retrieving the records, but that problem is not the generation of the csv; it is how you retrieve the data. You might evaluate using a
DataReader.
Either way, stopping the code doesn't seem like a good alternative to me.
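A minimal sketch of streaming rows with a SqlDataReader instead of loading the whole result set at once; the connection string, query, and column names are placeholders, not from the question:

```csharp
using System.Data.SqlClient;

class ReaderSketch
{
    static void Stream(string connectionString)
    {
        // Hypothetical query; replace with the real one
        const string sql =
            "SELECT ProjectTitle, UserID, TaskStatus FROM Tasks WHERE TaskStatus = 'notcompleted'";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            // ExecuteReader streams one row at a time instead of
            // materializing everything in memory
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    string project = reader.GetString(0);
                    string user = reader.GetString(1);
                    string status = reader.GetString(2);
                    // write the csv line for this row here
                }
            }
        }
    }
}
```

Because each row is written out as it is read, memory stays flat no matter how many records the query returns.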